Sample records for exploring ant-based algorithms

  1. Fuzzy Rules for Ant Based Clustering Algorithm

    Amira Hamdi


    Full Text Available This paper provides a new intelligent technique for the semisupervised data clustering problem that combines the Ant System (AS) algorithm with the fuzzy c-means (FCM) clustering algorithm. Our proposed approach, called the F-ASClass algorithm, is a distributed algorithm inspired by the foraging behavior observed in ant colonies. The ability of ants to find the shortest path forms the basis of our proposed approach. In the first step, several colonies of cooperating entities, called artificial ants, are used to find shortest paths in a complete graph that we call the graph-data. The number of colonies used in F-ASClass is equal to the number of clusters in the dataset. In the second step, the partition matrix of the dataset found by the artificial ants is given to the fuzzy c-means technique in order to assign the objects left unclassified in the first step. The proposed approach is tested on artificial and real datasets, and its performance is compared with those of the K-means, K-medoid, and FCM algorithms. The experimental section shows that F-ASClass performs better according to classification error rate, accuracy, and separation index.
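
    As a minimal illustration of the second step described above, the sketch below assumes the ant colonies have already produced cluster centroids from the partition they found; a plain-numpy fuzzy c-means membership update then assigns the remaining objects. The function names, the fuzzifier m, and the toy data are illustrative assumptions, not the paper's implementation.

      import numpy as np

      def fcm_memberships(points, centroids, m=2.0):
          """Fuzzy c-means membership update: u[i, k] is the degree to which
          point i belongs to cluster k, given fixed centroids."""
          d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)  # (n, c)
          d = np.fmax(d, 1e-12)                      # avoid division by zero
          ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
          return 1.0 / ratio.sum(axis=2)             # rows sum to 1

      # toy usage: the centroids stand in for the ant-derived partition
      pts = np.random.rand(10, 2)
      cents = np.array([[0.2, 0.2], [0.8, 0.8]])
      u = fcm_memberships(pts, cents)
      labels = u.argmax(axis=1)                      # hard assignment of unclassified objects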

  2. Effective ANT based Routing Algorithm for Data Replication in MANETs

    N.J. Nithya Nandhini


    Full Text Available In a mobile ad hoc network, the nodes often move and the topology keeps changing. Data packets can be forwarded from one node to another on demand. To increase data accessibility, data are replicated at nodes and made sharable to other nodes. It is usually assumed that all mobile hosts cooperate, sharing their memory and forwarding data packets. In reality, however, not all nodes share their resources for the benefit of others. Such nodes may act selfishly in sharing memory and in forwarding data packets. This paper focuses on the selfishness of mobile nodes in replica allocation and on a routing protocol based on the ant colony algorithm to improve efficiency. The ant colony algorithm is used to reduce the overhead in the mobile network, so that data access is more efficient than with other routing protocols. The results show the efficiency of the ant-based routing algorithm in replica allocation.


    Haoxiang XIA; Shuguang WANG; Taketoshi YOSHIDA


    Ant-based text clustering is a promising technique that has attracted great research attention. This paper attempts to improve the standard ant-based text-clustering algorithm in two dimensions. On one hand, an ontology-based semantic similarity measure is used in conjunction with the traditional vector-space-model-based measure to provide a more accurate assessment of the similarity between documents. On the other hand, the ant behavior model is modified to pursue better algorithmic performance. In particular, the ant movement rule is adjusted so as to direct a laden ant toward a dense area of items of the same type as the ant's carried item, and to direct an unladen ant toward an area that contains an item dissimilar to the surrounding items within its Moore neighborhood. Using WordNet as the base ontology for assessing the semantic similarity between documents, the proposed algorithm is tested with a sample set of documents excerpted from the Reuters-21578 corpus, and the experimental results partly indicate that the proposed algorithm performs better than the standard ant-based text-clustering algorithm and the k-means algorithm.
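
    The movement/drop rule described above can be sketched in a Lumer-Faieta style; the blending weight alpha between the vector-space and ontology-based similarities, the neighborhood-density form, and the thresholds below are illustrative assumptions rather than the paper's actual parameters.

      def combined_similarity(d1, d2, vsm_sim, onto_sim, alpha=0.5):
          """Blend the vector-space-model similarity with the ontology-based
          (e.g. WordNet-derived) semantic similarity of two documents."""
          return alpha * vsm_sim(d1, d2) + (1.0 - alpha) * onto_sim(d1, d2)

      def neighborhood_density(item, neighbors, sim, s2=25.0, scale=0.5):
          """Lumer-Faieta style local density of items similar to `item`
          within the Moore neighborhood of the current grid cell."""
          if not neighbors:
              return 0.0
          f = sum(1.0 - (1.0 - sim(item, n)) / scale for n in neighbors) / s2
          return max(f, 0.0)

      def drop_probability(f, k2=0.3):
          """A laden ant drops its item with high probability in a dense area
          of similar items (the area the modified rule directs it toward)."""
          return 2.0 * f if f < k2 else 1.0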

  4. Ants-Based On-Demand Routing Algorithm for Mobile Ad Hoc Networks


    An ants-based on-demand routing algorithm (AORA) specialized for mobile ad hoc networks is proposed. AORA measures the network's traffic information, including delivery time and route energy, through the continuous delivery of data packets, then calculates a composite parameter for each route, which can be seen as the stigmergy, and uses it to choose the comparatively optimal route in real time. By adjusting the weight of each traffic measurement, the algorithm can meet the different demands of the network's users. The multipath source self-repair routing (MSSRR) algorithm and dynamic source routing (DSR) can be seen as special cases of AORA. The routing overhead is not increased by this algorithm. Simulation shows that the performance of AORA is clearly better than that of DSR in all scenarios; in particular, the delivery fraction is increased by more than 100%.
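
    A rough sketch of how such a composite per-route parameter might be formed from weighted, normalized traffic measurements; the metric names and weights are assumptions for illustration only.

      def route_score(metrics, weights):
          """Combine normalized per-route traffic measurements into one value
          that plays the role of the pheromone-like score used to pick a route.
          Higher inverse delay / higher residual energy give a higher score."""
          # metrics, weights: dicts keyed by metric name, metrics already in [0, 1]
          return sum(weights[name] * value for name, value in metrics.items())

      routes = {
          "A": {"inv_delay": 0.8, "residual_energy": 0.4},
          "B": {"inv_delay": 0.6, "residual_energy": 0.9},
      }
      weights = {"inv_delay": 0.7, "residual_energy": 0.3}   # tuned to the user's demand
      best = max(routes, key=lambda r: route_score(routes[r], weights))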

  5. Energy Efficiency Performance Improvements for Ant-Based Routing Algorithm in Wireless Sensor Networks

    Adamu Murtala Zungeru


    Full Text Available The main problem for event gathering in wireless sensor networks (WSNs) is the restricted communication range of each node. Due to the restricted communication range and high network density, event forwarding in WSNs is very challenging and requires multihop data forwarding. Currently, the energy-efficient ant based routing (EEABR) algorithm, based on the ant colony optimization (ACO) metaheuristic, is one of the state-of-the-art energy-aware routing protocols. In this paper, we propose three improvements to the EEABR algorithm to further improve its energy efficiency. The improvements to the original EEABR are based on the following: (1) a new scheme to intelligently initialize the routing tables, giving priority to neighboring nodes that could simultaneously be the destination; (2) intelligent updating of routing tables in case of a node or link failure; and (3) reducing the flooding ability of ants for congestion control. The energy efficiency improvements are significant, particularly for dynamic routing environments. Experimental results using the RMASE simulation environment show that the proposed method increases the energy efficiency by up to 9% and 64% in converge-cast and target-tracking scenarios, respectively, over the original EEABR without incurring a significant increase in complexity. The method is also compared with, and found to outperform, other swarm-based routing protocols such as sensor-driven and cost-aware ant routing (SC) and BeeSensor.
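
    A minimal sketch of improvement (1), assuming a pheromone table keyed by (neighbor, destination) pairs in which a neighbor that is itself the destination receives a boosted initial value; the constants are illustrative.

      def init_pheromone_table(neighbors, destinations, base=0.1, boost=1.0):
          """Initialize routing (pheromone) entries, giving priority to a
          neighbor that is simultaneously the destination of the entry."""
          table = {}
          for dest in destinations:
              for nb in neighbors:
                  table[(nb, dest)] = boost if nb == dest else base
          return table

      table = init_pheromone_table(neighbors=["n1", "n2", "sink"],
                                   destinations=["sink"])
      # {('n1', 'sink'): 0.1, ('n2', 'sink'): 0.1, ('sink', 'sink'): 1.0}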

  6. Performance Evaluation of Ant-Based Routing Protocols for Wireless Sensor Networks

    Adamu Murtala Zungeru


    Full Text Available Highly efficient routing is an important issue in the design of wireless sensor networks (WSNs) with limited energy resources. The characteristics of the environment in which sensor nodes operate, coupled with severe resource constraints (on-board energy, transmission power, processing capability, and storage), call for careful resource management and new routing protocols to counteract these challenges. To this end, we present an Improved Energy-Efficient Ant-Based Routing (IEEABR) algorithm for wireless sensor networks. Compared to state-of-the-art ant-based routing protocols, namely the Basic Ant-Based Routing (BABR) algorithm, Sensor-driven and Cost-aware ant routing (SC), Flooded Forward ant routing (FF), Flooded Piggybacked ant routing (FP), and Energy-Efficient Ant-Based Routing (EEABR), the proposed IEEABR approach has advantages in terms of reduced energy usage, effectively balancing the power consumption of WSN nodes, and high energy efficiency. The performance evaluations of the algorithms on a real application are conducted in a well-known MATLAB-based WSN simulator (RMASE) using both static and dynamic scenarios.

  7. Exploration of new multivariate spectral calibration algorithms.

    Van Benthem, Mark Hilary; Haaland, David Michael; Melgaard, David Kennett; Martin, Laura Elizabeth; Wehlburg, Christine Marie; Pell, Randy J. (The Dow Chemical Company, Midland, MI); Guenard, Robert D. (Merck & Co. Inc., West Point, PA)


    A variety of multivariate calibration algorithms for quantitative spectral analyses were investigated and compared, and new algorithms were developed in the course of this Laboratory Directed Research and Development project. We were able to demonstrate the ability of the hybrid classical least squares/partial least squares (CLS/PLS) calibration algorithms to maintain calibrations in the presence of spectrometer drift and to transfer calibrations between spectrometers from the same or different manufacturers. These methods were found to be as good as or better than the commonly used partial least squares (PLS) method in prediction ability. We also present the theory for an entirely new class of algorithms labeled augmented classical least squares (ACLS) methods. New factor selection methods are developed and described for the ACLS algorithms. These factor selection methods are demonstrated using near-infrared spectra collected from a system of dilute aqueous solutions. The ACLS algorithm is also shown to provide improved ease of use and better prediction ability than PLS when transferring calibrations between near-infrared calibrations from the same manufacturer. Finally, simulations incorporating either ideal or realistic errors in the spectra were used to compare the prediction abilities of the new ACLS algorithm with that of PLS. We found that in the presence of realistic errors with non-uniform spectral error variance across spectral channels, or with spectral errors correlated between frequency channels, ACLS methods generally outperformed the more commonly used PLS method. These results demonstrate the need for realistic error structure in simulations when the prediction abilities of various algorithms are compared. The combination of equal or superior prediction ability and ease of use makes the new ACLS methods the preferred algorithms for multivariate spectral calibrations.
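
    A schematic numpy reading of the CLS prediction step and of the augmentation idea, in which the estimated pure-component spectra are extended with residual-derived spectral shapes before prediction; this is a sketch of the general ACLS concept under stated assumptions, not the authors' implementation or factor selection method.

      import numpy as np

      def cls_calibrate(C, S):
          """Classical least squares: estimate pure-component spectra K from
          known concentrations C (samples x components) and spectra S
          (samples x channels), via S ~= C @ K."""
          return np.linalg.pinv(C) @ S

      def acls_predict(C_cal, S_cal, S_new, n_aug=2):
          """Augmented CLS (schematic): add a few residual-derived spectral
          shapes to K so un-modelled variation is absorbed during prediction."""
          K = cls_calibrate(C_cal, S_cal)
          R = S_cal - C_cal @ K                      # calibration residual
          _, _, Vt = np.linalg.svd(R, full_matrices=False)
          K_aug = np.vstack([K, Vt[:n_aug]])         # augment with residual shapes
          C_all = S_new @ np.linalg.pinv(K_aug)      # regress new spectra on K_aug
          return C_all[:, :K.shape[0]]               # keep only the real components

      # toy usage with synthetic data
      rng = np.random.default_rng(0)
      K_true = rng.random((2, 50))
      C_cal = rng.random((20, 2))
      S_cal = C_cal @ K_true + 0.01 * rng.standard_normal((20, 50))
      print(acls_predict(C_cal, S_cal, S_cal)[:3])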

  8. Coordinating Exploration and Exploitation To Construct Genetic Algorithms

    江瑞; 罗予频; 胡东成; 司徒国业


    A new genetic algorithm is proposed based on careful coordination of exploration in the solution space of the given problem and exploitation of the information from the previous search. In the new algorithm architecture, the population in each generation consists of three sub-populations: a preserved part, a reproduced part, and a randomized part. Two parameters are incorporated into the algorithm to efficiently control the percentage of each sub-population, achieving a good balance between the exploration and exploitation processes during the optimization. By modeling the algorithm as a homogeneous finite Markov chain, the new genetic algorithm is shown to converge towards the global optimum of the problem at hand. Experiments were designed to test the algorithm using the Rastrigin function, the Griewangk function, and the Schaffer function. Data analyses using the average success ratio, the average number of objective function evaluations, the average first passage time to solution, and the standard deviation of the first passage time were compared with those of the canonical genetic algorithm, the elitist genetic algorithm, and the steady-state genetic algorithm. The results show strong evidence that our algorithm is superior in terms of economy, robustness and efficiency.
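
    A compact sketch of the generation structure described above (preserved, reproduced, and randomized sub-populations controlled by two fractions), using the Rastrigin function mentioned in the abstract as the objective; the variation operators and parameter values are illustrative assumptions.

      import numpy as np

      def rastrigin(x):
          return 10 * x.size + float(np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))

      def evolve(pop_size=60, dim=5, gens=200, p_keep=0.2, p_repro=0.6,
                 bound=5.12, sigma=0.1, seed=1):
          """Each generation = preserved elite + offspring bred from the elite
          + fresh random individuals; p_keep and p_repro set the three fractions."""
          rng = np.random.default_rng(seed)
          n_keep, n_repro = int(p_keep * pop_size), int(p_repro * pop_size)
          pop = rng.uniform(-bound, bound, (pop_size, dim))
          for _ in range(gens):
              pop = pop[np.argsort([rastrigin(x) for x in pop])]
              elite = pop[:n_keep]                                   # preserved part
              parents = elite[rng.integers(0, n_keep, (n_repro, 2))]
              w = rng.random((n_repro, dim))
              children = (w * parents[:, 0] + (1 - w) * parents[:, 1]
                          + sigma * rng.standard_normal((n_repro, dim)))  # reproduced part
              fresh = rng.uniform(-bound, bound, (pop_size - n_keep - n_repro, dim))  # randomized part
              pop = np.clip(np.vstack([elite, children, fresh]), -bound, bound)
          return min(pop, key=rastrigin)

      print(rastrigin(evolve()))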

  9. Efficient algorithms to explore conformation spaces of flexible protein loops.

    Yao, Peggy; Dhanik, Ankur; Marz, Nathan; Propper, Ryan; Kou, Charles; Liu, Guanfeng; van den Bedem, Henry; Latombe, Jean-Claude; Halperin-Landsberg, Inbal; Altman, Russ Biagio


    Several applications in biology - e.g., incorporation of protein flexibility in ligand docking algorithms, interpretation of fuzzy X-ray crystallographic data, and homology modeling - require computing the internal parameters of a flexible fragment (usually, a loop) of a protein in order to connect its termini to the rest of the protein without causing any steric clash. One must often sample many such conformations in order to explore and adequately represent the conformational range of the studied loop. While sampling must be fast, it is made difficult by the fact that two conflicting constraints - kinematic closure and clash avoidance - must be satisfied concurrently. This paper describes two efficient and complementary sampling algorithms to explore the space of closed clash-free conformations of a flexible protein loop. The "seed sampling" algorithm samples broadly from this space, while the "deformation sampling" algorithm uses seed conformations as starting points to explore the conformation space around them at a finer grain. Computational results are presented for various loops ranging from 5 to 25 residues. More specific results also show that the combination of the sampling algorithms with a functional site prediction software (FEATURE) makes it possible to compute and recognize calcium-binding loop conformations. The sampling algorithms are implemented in a toolkit (LoopTK), which is available at

  10. Ant-based extraction of rules in simple decision systems over ontological graphs

    Pancerz Krzysztof


    Full Text Available In the paper, the problem of extracting complex decision rules in simple decision systems over ontological graphs is considered. The extracted rules are consistent with the dominance principle, similar to that applied in the dominance-based rough set approach (DRSA). In our study, we propose to use a heuristic algorithm, utilizing the ant-based clustering approach, to search the semantic spaces of concepts presented by means of ontological graphs. Concepts included in the semantic spaces are values of attributes describing objects in simple decision systems.

  11. Web Structure Mining: Exploring Hyperlinks and Algorithms for Information Retrieval

    P. R. Kumar


    Full Text Available Problem statement: A study of hyperlink analysis and the algorithms used for link analysis in web information retrieval was carried out. Approach: This research was motivated by the dependence on search engines for information retrieval on the web. The aim was to understand web structure mining and determine the importance of hyperlinks in web information retrieval, particularly in the Google search engine. Hyperlink analysis is an important methodology used by Google to rank pages. Results: The different algorithms used for link analysis, namely PageRank (PR), Weighted PageRank (WPR) and Hyperlink-Induced Topic Search (HITS), are discussed and compared. The PageRank algorithm was implemented in a Java program and the convergence of the PageRank values is shown in a chart. Conclusion: This study was done to explore the link-structure algorithms used for ranking and to compare them. Further research in this area will address the problems facing the PageRank algorithm and how to handle them.
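
    The abstract refers to a Java implementation; the following is a minimal Python sketch of the same PageRank power iteration with damping, included only to make the convergence computation concrete.

      def pagerank(links, d=0.85, tol=1e-8, max_iter=100):
          """links: dict node -> list of outgoing neighbours.
          Returns the PageRank vector computed by power iteration."""
          nodes = list(links)
          n = len(nodes)
          pr = {u: 1.0 / n for u in nodes}
          for _ in range(max_iter):
              new = {u: (1.0 - d) / n for u in nodes}
              for u in nodes:
                  out = links[u]
                  share = pr[u] / len(out) if out else pr[u] / n  # dangling node
                  for v in (out if out else nodes):
                      new[v] += d * share
              if sum(abs(new[u] - pr[u]) for u in nodes) < tol:
                  return new
              pr = new
          return pr

      print(pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]}))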

  12. Oil exploration oriented multi-sensor image fusion algorithm

    Xiaobing, Zhang; Wei, Zhou; Mengfei, Song


    In order to accurately forecast fractures and the dominant fracture direction in oil exploration, we propose in this paper a novel multi-sensor image fusion algorithm. The main innovations of this paper lie in the introduction of the dual-tree complex wavelet transform (DTCWT) into data fusion and the division of an image into several regions before fusion. DTCWT is a newer type of wavelet transform, designed to solve the problem of signal decomposition and reconstruction based on two parallel transforms of real wavelets. We utilize DTCWT to segment the features of the input images and generate a region map, and then exploit the normalized Shannon entropy of each region to design the priority function. To test the effectiveness of our proposed multi-sensor image fusion algorithm, four standard pairs of images are used to construct the dataset. Experimental results demonstrate that the proposed algorithm achieves high accuracy in multi-sensor image fusion, especially for oil-exploration images.
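
    A sketch of the region-priority idea: the normalized Shannon entropy of each region's intensity histogram decides which source image contributes that region to the fused result. The histogram binning and the winner-take-all rule are assumptions for illustration, not the paper's exact priority function.

      import numpy as np

      def normalized_entropy(region, bins=64):
          """Shannon entropy of a region's intensity histogram, scaled to [0, 1].
          Assumes float images with intensities in [0, 1]."""
          hist, _ = np.histogram(region, bins=bins, range=(0.0, 1.0))
          p = hist / max(hist.sum(), 1)
          p = p[p > 0]
          return float(-(p * np.log2(p)).sum() / np.log2(bins))

      def fuse_regions(img_a, img_b, region_map):
          """For every region label, copy pixels from whichever source image has
          the higher normalized entropy (i.e. carries more detail) in that region."""
          fused = np.empty_like(img_a)
          for label in np.unique(region_map):
              mask = region_map == label
              pick = img_a if normalized_entropy(img_a[mask]) >= normalized_entropy(img_b[mask]) else img_b
              fused[mask] = pick[mask]
          return fused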

  13. Daytime Ionosphere Retrieval Algorithm for the Ionospheric Connection Explorer (ICON)

    Stephan, Andrew W.; Korpela, Eric J.; Sirk, Martin M.; England, Scott L.; Immel, Thomas J.


    The NASA Ionospheric Connection Explorer Extreme Ultraviolet spectrograph, ICON EUV, will measure altitude profiles of the daytime extreme-ultraviolet (EUV) OII emission near 83.4 and 61.7 nm that are used to determine density profiles and state parameters of the ionosphere. This paper describes the algorithm concept and approach to inverting these measured OII emission profiles to derive the associated O+ density profile from 150-450 km as a proxy for the electron content in the F-region of the ionosphere. The algorithm incorporates a bias evaluation and feedback step, developed at the U.S. Naval Research Laboratory using data from the Special Sensor Ultraviolet Limb Imager (SSULI) and the Remote Atmospheric and Ionospheric Detection System (RAIDS) missions, that is able to effectively mitigate the effects of systematic instrument calibration errors and inaccuracies in the original photon source within the forward model. Results are presented from end-to-end simulations that convolved simulated airglow profiles with the expected instrument measurement response to produce profiles that were inverted with the algorithm to return data products for comparison to truth. Simulations of measurements over a representative ICON orbit show the algorithm is able to reproduce hmF2 values to better than 5 km accuracy, and NmF2 to better than 12% accuracy over a 12-second integration, and demonstrate that the ICON EUV instrument and daytime ionosphere algorithm can meet the ICON science objectives which require 20 km vertical resolution in hmF2 and 18% precision in NmF2.

  14. Adaptive ant-based routing in wireless sensor networks using Energy Delay metrics

    Yao-feng WEN; Yu-quan CHEN; Min PAN


    Finding the optimal routing is always an important topic in wireless sensor networks (WSNs). Considering a WSN where the nodes have limited energy, we propose a novel Energy*Delay model based on ant algorithms ("E&D ANTS" for short) to minimize the time delay in transferring a fixed number of data packets in an energy-constrained manner in one round. Our goal is not only to maximize the lifetime of the network but also to provide real-time data transmission services. However, because of the trade-off between energy and delay in wireless network systems, a reinforcement learning (RL) algorithm is introduced to train the model. In this study, the paradigm of E&D ANTS is explicated and compared to other ant-based routing algorithms such as AntNet and AntChain with respect to routing information, routing overhead and adaptation. Simulation results show that our method performs about seven times better than AntNet and also outperforms AntChain by more than 150% in terms of energy cost and delay per round.

  15. Planning Readings: A Comparative Exploration of Basic Algorithms

    Piater, Justus H.


    Conventional introductions to computer science present individual algorithmic paradigms in the context of specific, prototypical problems. To complement this algorithm-centric instruction, this study additionally advocates problem-centric instruction. I present an original problem drawn from students' lives that is simply stated but provides rich…

  16. Entropic algorithms and the lid method as exploration tools for complex landscapes

    Barettin, Daniele; Sibani, Paolo


    to a single valley, are key to understanding the dynamical properties of such systems. In this paper we combine the lid algorithm, a tool for landscape exploration previously applied to a range of models, with the Wang-Swendsen algorithm. To test this improved exploration tool, we consider a paradigmatic complex...

  17. Fungi use efficient algorithms for the exploration of microfluidic networks.

    Hanson, Kristi L; Nicolau, Dan V; Filipponi, Luisa; Wang, Lisen; Lee, Abraham P; Nicolau, Dan V


    Fungi, in particular, basidiomycetous fungi, are very successful in colonizing microconfined mazelike networks (for example, soil, wood, leaf litter, plant and animal tissues), a fact suggesting that they may be efficient solving agents of geometrical problems. We therefore evaluated the growth behavior and optimality of fungal space-searching algorithms in microfluidic mazes and networks. First, we found that fungal growth behavior was indeed strongly modulated by the geometry of microconfinement. Second, the fungus used a complex growth and space-searching strategy comprising two algorithmic subsets: 1) long-range directional memory of individual hyphae and 2) inducement of branching by physical obstruction. Third, stochastic simulations using experimentally measured parameters showed that this strategy maximizes both survival and biomass homogeneity in microconfined networks and produces optimal results only when both algorithms are synergistically used. This study suggests that even simple microorganisms have developed adequate strategies to solve nontrivial geometrical problems.

  18. A curious robot: An explorative-exploitive inference algorithm

    Pedersen, Kim Steenstrup; Johansen, Peter


    We propose a sequential learning algorithm with a focus on robot control. It is initialised by a teacher who directs the robot through a series of example solutions of a problem. Left alone, the control chooses its next action by prediction based on a variable order Markov chain model selected...

  19. A curious robot: An explorative-exploitive inference algorithm

    Pedersen, Kim Steenstrup; Johansen, Peter


    We propose a sequential learning algorithm with a focus on robot control. It is initialised by a teacher who directs the robot through a series of example solutions of a problem. Left alone, the control chooses its next action by prediction based on a variable order Markov chain model selected to...

  20. Exploring New Clustering Algorithms for the CMS Tracker FED

    Gamboa Alvarado, Jose Leandro


    In the current Front End (FE) firmware, clusters of hits within the APV frames are found using a simple threshold comparison (made between the data and a 3- or 5-sigma strip noise cut) on reordered, pedestal- and common-mode (CM) noise-subtracted data. In addition, the CM noise subtraction requires the baseline of each APV frame to be approximately uniform. Therefore, the current algorithm will fail if the APV baseline exhibits large-scale non-uniform behavior. Under very high luminosity conditions the assumption of a uniform APV baseline breaks down and the FED is unable to maintain a high efficiency of cluster finding.
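
    A simplified sketch of the threshold-comparison clustering described above: after pedestal and common-mode subtraction, consecutive strips whose signal exceeds an n-sigma noise cut are grouped into clusters. The variable names and the fixed cut are illustrative.

      def find_clusters(adc, noise, n_sigma=5.0):
          """Group consecutive strips whose (pedestal- and CM-subtracted) signal
          exceeds n_sigma times the per-strip noise into clusters."""
          clusters, current = [], []
          for strip, (signal, sigma) in enumerate(zip(adc, noise)):
              if signal > n_sigma * sigma:
                  current.append(strip)
              elif current:
                  clusters.append(current)
                  current = []
          if current:
              clusters.append(current)
          return clusters

      # toy example: strips 3-4 and 9 fire above threshold
      adc   = [1, 2, 0, 30, 28, 1, 0, 2, 1, 40]
      noise = [2.0] * 10
      print(find_clusters(adc, noise))   # [[3, 4], [9]]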

  1. Fast algorithms for glassy materials: methods and explorations

    Middleton, A. Alan


    Glassy materials with frozen disorder, including random magnets such as spin glasses and interfaces in disordered materials, exhibit striking non-equilibrium behavior such as the ability to store a history of external parameters (memory). Precisely due to their glassy nature, direct simulation of models of these materials is very slow. In some fortunate cases, however, algorithms exist that exactly compute thermodynamic quantities. Such cases include spin glasses in two dimensions and interfaces and random field magnets in arbitrary dimensions at zero temperature. Using algorithms built using ideas developed by computer scientists and mathematicians, one can even directly sample equilibrium configurations in very large systems, as if one picked the configurations out of a "hat" of all configurations weighted by their Boltzmann factors. This talk will provide some of the background for these methods and discuss the connections between physics and computer science, as used by a number of groups. Recent applications of these methods to investigating phase transitions in glassy materials and to answering qualitative questions about the free energy landscape and memory effects will be discussed. This work was supported in part by NSF grant DMR-1006731. Creighton Thomas and David Huse also contributed to much of the work to be presented.

  2. Exploration Of Deep Learning Algorithms Using Openacc Parallel Programming Model

    Hamam, Alwaleed A.


    Deep learning is based on a set of algorithms that attempt to model high-level abstractions in data. Specifically, RBM is a deep learning algorithm used in this project, whose time performance is improved through an efficient parallel implementation with the OpenACC tool, applying the best possible optimizations to the RBM to harness the massively parallel power of NVIDIA GPUs. GPU development in the last few years has contributed to the growth of deep learning. OpenACC is a directive-based approach to computing in which directives provide compiler hints to accelerate code. The traditional Restricted Boltzmann Machine is a stochastic neural network that essentially performs a binary version of factor analysis. RBM is a useful neural network basis for larger modern deep learning models, such as the Deep Belief Network. RBM parameters are estimated using an efficient training method called Contrastive Divergence. Parallel implementations of RBM are available using different models such as OpenMP and CUDA. However, this project has been the first attempt to apply the OpenACC model to RBM.
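
    A plain-numpy sketch of one contrastive-divergence (CD-1) update for a binary RBM, i.e. the computation the project accelerates with OpenACC; dimensions, batch and learning rate are illustrative.

      import numpy as np

      rng = np.random.default_rng(0)
      sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

      def cd1_step(v0, W, b_vis, b_hid, lr=0.1):
          """One contrastive-divergence (CD-1) update on a batch of binary
          visible vectors v0 (batch x n_visible)."""
          h_prob = sigmoid(v0 @ W + b_hid)                   # positive phase
          h_samp = (rng.random(h_prob.shape) < h_prob).astype(float)
          v_prob = sigmoid(h_samp @ W.T + b_vis)             # reconstruction
          h_prob2 = sigmoid(v_prob @ W + b_hid)              # negative phase
          W += lr * (v0.T @ h_prob - v_prob.T @ h_prob2) / len(v0)
          b_vis += lr * (v0 - v_prob).mean(axis=0)
          b_hid += lr * (h_prob - h_prob2).mean(axis=0)
          return W, b_vis, b_hid

      n_vis, n_hid = 6, 3
      W = 0.01 * rng.standard_normal((n_vis, n_hid))
      b_vis, b_hid = np.zeros(n_vis), np.zeros(n_hid)
      batch = (rng.random((8, n_vis)) < 0.5).astype(float)
      W, b_vis, b_hid = cd1_step(batch, W, b_vis, b_hid)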

  3. Exploration Of The Dendritic Cell Algorithm Using The Duration Calculus

    Gu, Feng; Aickelin, Uwe


    As one of the newest members of the Artificial Immune Systems (AIS) family, the Dendritic Cell Algorithm (DCA) has been applied to a range of problems. These applications mainly belong to the field of anomaly detection. However, real-time detection, a new challenge for anomaly detection, requires improvement in the real-time capability of the DCA. To assess such capability, formal methods from the research on real-time systems can be employed. The findings of the assessment can provide a guideline for the future development of the algorithm. Therefore, in this paper we use an interval-logic-based method, named the Duration Calculus (DC), to specify a simplified single-cell model of the DCA. Based on the DC specifications with further induction, we find that each individual cell in the DCA can perform its function as a detector in real time. Since the DCA can be seen as many such cells operating in parallel, it is potentially capable of performing real-time detection. However, the analysis process of the standard DCA constrict...

  4. A parallel systematic-Monte Carlo algorithm for exploring conformational space.

    Perez-Riverol, Yasset; Vera, Roberto; Mazola, Yuliet; Musacchio, Alexis


    Computational algorithms to explore the conformational space of small molecules form a complex and computationally demanding field in chemoinformatics. In this paper, a hybrid algorithm to explore the conformational space of organic molecules is presented. This hybrid algorithm is based on a systematic search approach combined with a Monte Carlo method in order to obtain an ensemble of low-energy conformations simulating the flexibility of small chemical compounds. The Monte Carlo method uses the Metropolis criterion to accept or reject a conformation, with an in-house implementation of the MMFF94s force field to calculate the conformational energy. A parallel design of this algorithm, based on the message passing interface (MPI) paradigm, was implemented. The results showed a performance increase in terms of speed and efficiency.
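
    A skeletal Monte Carlo loop showing the Metropolis acceptance criterion used in the hybrid search; the energy function is a stand-in for the MMFF94s force field and the torsion-perturbation move is an illustrative assumption.

      import math
      import random

      def metropolis_search(torsions, energy, n_steps=1000, step_deg=30.0, kT=0.6):
          """Perturb torsion angles at random and accept/reject each move with
          the Metropolis criterion; returns the list of accepted conformations."""
          current = list(torsions)
          e_cur = energy(current)
          accepted = [(e_cur, list(current))]
          for _ in range(n_steps):
              trial = list(current)
              i = random.randrange(len(trial))
              trial[i] = (trial[i] + random.uniform(-step_deg, step_deg)) % 360.0
              e_trial = energy(trial)
              if e_trial <= e_cur or random.random() < math.exp(-(e_trial - e_cur) / kT):
                  current, e_cur = trial, e_trial
                  accepted.append((e_cur, list(current)))
          return accepted

      # toy energy surface standing in for the force field
      toy_energy = lambda t: sum(1.0 + math.cos(math.radians(x)) for x in t)
      ensemble = metropolis_search([0.0, 120.0, 240.0], toy_energy)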


    Xinchao ZHAO; Junling HAO


    In order to trade off exploration and exploitation, and inspired by the cellular genetic algorithm, a cell-shift crossover operator for evolutionary algorithms (EA) is proposed in this paper. The definition domain is divided into n-dimensional cubic sub-domains (cells), and each individual is located in an n-dimensional cube. Cell-shift crossover first exchanges the cell numbers of the crossover pair if they are in different cells (exploration) and subsequently shifts the first individual from its initial place to the other individual's cell. If they are already in the same cell, heuristic crossover (exploitation) is used. Cell-shift/heuristic crossover adaptively executes exploration/exploitation search as the genetic diversity varies. The cell-shift EA shows excellent performance in terms of efficiency and efficacy on ten commonly used optimization benchmarks when compared with the recent well-known FEP evolutionary algorithm.
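
    A sketch of the cell-shift/heuristic crossover rule as described: cell indices are computed on a grid over the domain; if the pair lies in different cells the first individual is shifted into the second's cell (exploration), otherwise a heuristic crossover toward the fitter parent is applied (exploitation). The grid size and the heuristic-crossover form are assumptions.

      import numpy as np

      def cell_index(x, lo, hi, n_cells):
          """Integer cell coordinates of a point on a grid with n_cells per dimension."""
          return np.clip(((x - lo) / (hi - lo) * n_cells).astype(int), 0, n_cells - 1)

      def cell_shift_crossover(x1, x2, f, lo=-5.0, hi=5.0, n_cells=10):
          """If the pair lies in different cells, shift x1 into x2's cell
          (exploration); otherwise pull a child toward the fitter parent
          (exploitation, a simple heuristic crossover)."""
          rng = np.random.default_rng()
          c1, c2 = cell_index(x1, lo, hi, n_cells), cell_index(x2, lo, hi, n_cells)
          width = (hi - lo) / n_cells
          if np.any(c1 != c2):
              return x1 + (c2 - c1) * width, x2          # x1 shifted into x2's cell
          better, worse = (x1, x2) if f(x1) < f(x2) else (x2, x1)
          child = better + rng.random(x1.shape) * (better - worse)
          return np.clip(child, lo, hi), better

      sphere = lambda x: float(np.sum(x**2))
      print(cell_shift_crossover(np.array([2.0, -3.0]), np.array([0.5, 1.0]), sphere))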

  6. Performance of humans vs. exploration algorithms on the Tower of London Test.

    Eric Fimbel

    Full Text Available The Tower of London Test (TOL), used to assess executive functions, was inspired by Artificial Intelligence tasks used to test problem-solving algorithms. In this study, we compare the performance of humans and of exploration algorithms. Instead of absolute execution times, we focus on how the execution time varies with the task and/or the number of moves. This approach, used in algorithmic complexity, provides a fair comparison between humans and computers, although humans are several orders of magnitude slower. On easy tasks (1 to 5 moves), healthy elderly persons performed like exploration algorithms using bounded memory resources, i.e., the execution time grew exponentially with the number of moves. This result was replicated with a group of healthy young participants. However, for difficult tasks (5 to 8 moves) the execution time of young participants did not increase significantly, whereas for exploration algorithms the execution time keeps increasing exponentially. A pre- and post-test control task showed a 25% improvement in visuo-motor skills, but this was insufficient to explain the result. The findings suggest that naive participants used systematic exploration to solve the problem but, under the effect of practice, developed markedly more efficient strategies using the information acquired during the test.

  7. Entropic algorithms and the lid method as exploration tools for complex landscapes

    Barettin, Daniele; Sibani, Paolo


    to a single valley, are key to understanding the dynamical properties of such systems. In this paper we combine the lid algorithm, a tool for landscape exploration previously applied to a range of models, with the Wang-Swendsen algorithm. To test this improved exploration tool, we consider a paradigmatic complex...... system, the Edwards-Anderson model in two and three spatial dimensions. We find a striking difference between the energy dependence of the local density of states in the two cases: nearly flat in the first case, and nearly exponential in the second. The lid dependence of the data is analyzed to estimate...

  8. EAGLE: 'EAGLE 'Is an' Algorithmic Graph Library for Exploration'


    The Resource Description Framework (RDF) and the SPARQL Protocol and RDF Query Language (SPARQL) were introduced about a decade ago to enable flexible, schema-free data interchange on the Semantic Web. Today data scientists use the framework as a scalable graph representation for integrating, querying, exploring and analyzing data sets hosted at different sources. With increasing adoption, the need for graph mining capabilities for the Semantic Web has emerged. Today there are no tools to conduct graph mining on RDF-standard data sets. We address that need through the implementation of popular iterative graph mining algorithms (triangle count, connected component analysis, degree distribution, diversity degree, PageRank, etc.). We implement these algorithms as SPARQL queries wrapped within Python scripts, and call our software tool EAGLE. In RDF style, EAGLE stands for "EAGLE 'Is an' algorithmic graph library for exploration." EAGLE is like 'MATLAB' for 'Linked Data.'

  9. Algorithmic level power and energy optimization for DSP applications: SoftExplorer


    We present SoftExplorer, a tool to estimate the power and energy consumption of an algorithm directly from the C program. Four models of processor are available, from the simple RISC ARM7 to the very complex VLIW DSP TIC67. Important phenomena are taken into account, like cache misses, pipeline stalls, and internal / external memory accesses. We show how to use SoftExplorer to find the best data mapping for a DSP application, and to choose a processor and its operating frequency for a MPEG-1 ...

  10. A Novel Path Planning for Robots Based on Rapidly-Exploring Random Tree and Particle Swarm Optimizer Algorithm

    Zhou Feng


    Full Text Available A path-planning method for robots based on the Rapidly-exploring Random Tree (RRT) and the Particle Swarm Optimizer (PSO) is proposed. First, the grid method is used to describe the working space of the mobile robot; then the Rapidly-exploring Random Tree algorithm is used to obtain a global navigation path, and the Particle Swarm Optimizer algorithm is adopted to obtain a better path. Computer experiment results demonstrate that this novel algorithm can plan an optimal path rapidly in a cluttered environment. Successful obstacle avoidance is achieved, and the model is robust and performs reliably.
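
    A compact sketch of the RRT stage used for the global path (the PSO refinement stage is omitted); the obstacle check, step size, and goal tolerance are illustrative assumptions.

      import math
      import random

      def rrt(start, goal, is_free, step=0.5, goal_tol=0.5, max_iter=5000, bounds=(0, 10)):
          """Grow a rapidly-exploring random tree from start until the goal is
          within goal_tol; returns the path as a list of (x, y) points."""
          nodes, parent = [start], {0: None}
          for _ in range(max_iter):
              sample = goal if random.random() < 0.05 else (
                  random.uniform(*bounds), random.uniform(*bounds))
              near_i = min(range(len(nodes)), key=lambda i: math.dist(nodes[i], sample))
              nx, ny = nodes[near_i]
              d = math.dist((nx, ny), sample)
              new = (nx + step * (sample[0] - nx) / d,
                     ny + step * (sample[1] - ny) / d) if d > step else sample
              if not is_free(new):
                  continue
              parent[len(nodes)] = near_i
              nodes.append(new)
              if math.dist(new, goal) < goal_tol:
                  path, i = [], len(nodes) - 1
                  while i is not None:
                      path.append(nodes[i])
                      i = parent[i]
                  return path[::-1]
          return None

      free = lambda p: not (3 < p[0] < 4 and 0 < p[1] < 7)   # one rectangular obstacle
      print(rrt((1.0, 1.0), (9.0, 9.0), free))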

  11. Exploring high dimensional data with Butterfly: a novel classification algorithm based on discrete dynamical systems.

    Geraci, Joseph; Dharsee, Moyez; Nuin, Paulo; Haslehurst, Alexandria; Koti, Madhuri; Feilotter, Harriet E; Evans, Ken


    We introduce a novel method for visualizing high dimensional data via a discrete dynamical system. This method provides a 2D representation of the relationship between subjects according to a set of variables without geometric projections, transformed axes or principal components. The algorithm exploits a memory-type mechanism inherent in a certain class of discrete dynamical systems collectively referred to as the chaos game that are closely related to iterative function systems. The goal of the algorithm was to create a human readable representation of high dimensional patient data that was capable of detecting unrevealed subclusters of patients from within anticipated classifications. This provides a mechanism to further pursue a more personalized exploration of pathology when used with medical data. For clustering and classification protocols, the dynamical system portion of the algorithm is designed to come after some feature selection filter and before some model evaluation (e.g. clustering accuracy) protocol. In the version given here, a univariate features selection step is performed (in practice more complex feature selection methods are used), a discrete dynamical system is driven by this reduced set of variables (which results in a set of 2D cluster models), these models are evaluated for their accuracy (according to a user-defined binary classification) and finally a visual representation of the top classification models are returned. Thus, in addition to the visualization component, this methodology can be used for both supervised and unsupervised machine learning as the top performing models are returned in the protocol we describe here. Butterfly, the algorithm we introduce and provide working code for, uses a discrete dynamical system to classify high dimensional data and provide a 2D representation of the relationship between subjects. We report results on three datasets (two in the article; one in the appendix) including a public lung cancer

  12. Distributed Clustering Algorithm to Explore Selection Diversity in Wireless Sensor Networks

    Kong, Hyung-Yun; Asaduzzaman, Hyung-Yun

    This paper presents a novel cross-layer approach to explore selection diversity for distributed clustering based wireless sensor networks (WSNs) by selecting a proper cluster-head. We develop and analyze an instantaneous channel state information (CSI) based cluster-head selection algorithm for a distributed, dynamic and randomized clustering based WSN. The proposed cluster-head selection scheme is also random and capable to distribute the energy uses among the nodes in the network. We present an analytical approach to evaluate the energy efficiency and system lifetime of our proposal. Analysis shows that the proposed scheme outperforms the performance of additive white Gaussian noise (AWGN) channel under Rayleigh fading environment. This proposal also outperforms the existing cooperative diversity protocols in terms of system lifetime and implementation complexity.

  13. An Ant Based Framework for Preventing DDoS Attack in Wireless Sensor Networks

    Juneja, Dimple


    Security and privacy are two important parameters that need to be considered when dealing with wireless sensor networks, as WSNs operate in an unattended environment and carry sensitive information critical to the application. However, applying security techniques that consume minimum resources is still a challenge, and this paper makes an attempt to address the same. One of the major attacks in sensor networks is the denial of service attack, which not only diminishes the network capacity but also affects the reliability of the information being transmitted. This work is an extension of our previous work, which could successfully detect DDoS using ants but placed no emphasis on a prevention mechanism. In this paper, an ant-based framework is proposed that exploits the significance of stateless and stateful signatures to preserve only the legitimate packets, thereby discarding the contaminated ones.

  14. Using rapidly-exploring random tree-based algorithms to find smooth and optimal trajectories

    Matebese, B


    Full Text Available feasible solution faster than other algorithms. The drawback of using RRT is that, as the number of samples increases, the probability that the algorithm converges to a sub-optimal solution increases. Furthermore, the path generated by this algorithm...

  15. Penetrator reliability investigation and design exploration : from conventional design processes to innovative uncertainty-capturing algorithms.

    Martinez-Canales, Monica L.; Heaphy, Robert (Sandia National Laboratories, Albuquerque, NM); Gramacy, Robert B. (University of Cambridge); Taddy, Matt (University of California, Santa Cruz, CA); Chiesa, Michael L.; Thomas, Stephen W. (Sandia National Laboratories, Albuquerque, NM); Swiler, Laura Painton (Sandia National Laboratories, Albuquerque, NM); Hough, Patricia Diane; Lee, Herbert K. H. (University of California, Santa Cruz, CA); Trucano, Timothy Guy (Sandia National Laboratories, Albuquerque, NM); Gray, Genetha Anne


    This project focused on research and algorithmic development in optimization under uncertainty (OUU) problems driven by earth penetrator (EP) designs. While taking into account uncertainty, we addressed three challenges in current simulation-based engineering design and analysis processes. The first challenge required leveraging small local samples, already constructed by optimization algorithms, to build effective surrogate models. We used Gaussian Process (GP) models to construct these surrogates. We developed two OUU algorithms using 'local' GPs (OUU-LGP) and one OUU algorithm using 'global' GPs (OUU-GGP) that appear competitive with or better than current methods. The second challenge was to develop a methodical design process based on multi-resolution, multi-fidelity models. We developed a Multi-Fidelity Bayesian Auto-regressive process (MF-BAP). The third challenge involved the development of tools that are computationally feasible and accessible. We created MATLAB® and initial DAKOTA implementations of our algorithms.

  16. Further Exploration of the Dendritic Cell Algorithm: Antigen Multiplier and Time Windows

    Gu, Feng; Aickelin, Uwe


    As an immune-inspired algorithm, the Dendritic Cell Algorithm (DCA) produces promising performance in the field of anomaly detection. This paper presents the application of the DCA to a standard data set, the KDD 99 data set. The results of different implementation versions of the DCA, including the antigen multiplier and moving time windows, are reported. The real-valued Negative Selection Algorithm (NSA) using constant-sized detectors and the C4.5 decision tree algorithm are used to conduct a baseline comparison. The results suggest that the DCA is applicable to the KDD 99 data set, and that the antigen multiplier and moving time windows have the same effect on the DCA for this particular data set. The real-valued NSA with constant-sized detectors is not applicable to the data set, while the C4.5 decision tree algorithm provides a benchmark of classification performance for this data set.

  17. Exploring Design Tradeoffs Of A Distributed Algorithm For Cosmic Ray Event Detection

    Yousaf, Suhail; van Steen, Maarten; Voulgaris, Spyros; Kelley, John L


    Many sensor networks, including large particle detector arrays measuring high-energy cosmic-ray air showers, traditionally rely on centralised trigger algorithms to find spatial and temporal coincidences of individual nodes. Such schemes suffer from scalability problems, especially if the nodes communicate wirelessly or have bandwidth limitations. However, nodes which instead communicate with each other can, in principle, use a distributed algorithm to find coincident events themselves without communication with a central node. We present such an algorithm and consider various design tradeoffs involved, in the context of a potential trigger for the Auger Engineering Radio Array (AERA).

  18. Optimal Parameter Exploration for Online Change-Point Detection in Activity Monitoring Using Genetic Algorithms

    Naveed Khan


    Full Text Available In recent years, smartphones with inbuilt sensors have become popular devices to facilitate activity recognition. The sensors capture a large amount of data, containing meaningful events, in a short period of time. The change points in these data are used to specify transitions between distinct events and can be used in various scenarios, such as identifying changes in a patient's vital signs in the medical domain or requesting activity labels for generating real-world labeled activity datasets. Our work focuses on change-point detection to identify a transition from one activity to another. Within this paper, we extend our previous work on the multivariate exponentially weighted moving average (MEWMA) algorithm by using a genetic algorithm (GA) to identify the optimal set of parameters for online change-point detection. The proposed technique finds the maximum accuracy and F_measure by optimizing the different parameters of the MEWMA, which subsequently identifies the exact location of the change point from an existing activity to a new one. Optimal parameter selection facilitates an algorithm to detect accurate change points and minimize false alarms. Results have been evaluated based on two real datasets of accelerometer data collected from a set of different activities from two users, with a high degree of accuracy from 99.4% to 99.8% and F_measure of up to 66.7%.
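
    A small sketch of the MEWMA statistic whose parameters (such as the smoothing weight lambda and the control limit h) the GA searches; the covariance handling is the standard asymptotic form and the constants are placeholders, not the optimized values from the paper.

      import numpy as np

      def mewma_change_points(X, lam=0.2, h=10.0, baseline=50):
          """Multivariate EWMA monitoring of a feature stream X (samples x features).
          The in-control mean/covariance are estimated from the first `baseline`
          samples; indices where the T^2 statistic exceeds h are flagged."""
          mu = X[:baseline].mean(axis=0)
          sigma = np.cov(X[:baseline], rowvar=False)
          sigma_z_inv = np.linalg.inv(lam / (2.0 - lam) * sigma)
          z = np.zeros(X.shape[1])
          alarms = []
          for i, x in enumerate(X):
              z = lam * (x - mu) + (1.0 - lam) * z
              if float(z @ sigma_z_inv @ z) > h:
                  alarms.append(i)
          return alarms

      # toy stream: a mean shift (activity change) halfway through
      rng = np.random.default_rng(3)
      X = np.vstack([rng.normal(0, 1, (100, 3)), rng.normal(2, 1, (100, 3))])
      print(mewma_change_points(X)[:5])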

  19. Exploring profit - Sustainability trade-offs in cropping systems using evolutionary algorithms

    DeVoil, P.; Rossing, W.A.H.; Hammer, G.L.


    Models that implement the bio-physical components of agro-ecosystems are ideally suited for exploring sustainability issues in cropping systems. Sustainability may be represented as a number of objectives to be maximised or minimised. However, the full decision space of these objectives is usually v

  20. Exploring Task Mappings on Heterogeneous MPSoCs using a Bias-Elitist Genetic Algorithm

    Quan, W.; Pimentel, A.D.


    Exploration of task mappings plays a crucial role in achieving high performance in heterogeneous multi-processor system-on-chip (MPSoC) platforms. The problem of optimally mapping a set of tasks onto a set of given heterogeneous processors for maximal throughput has been known, in general, to be

  1. A spatial orthogonal allocation and heterogeneous cultural hybrid algorithm for multirobot exploration mission planning


    A spatial orthogonal allocation method is devised for multirobot task allocation. A 3D space model is adopted to describe the exploration mission; meanwhile, spatial orthogonal tentative technology is utilized to update the attractor position for load balancing. A heterogeneous interactive cultural hybrid architecture is proposed to solve the robot route planning problem; it utilizes a good-point-set to initialize the population spaces, redefines a novel evolution model and particle evolution ability, and introduces near-neighbor lo...

  2. Application of the Viterbi Algorithm in Hidden Markov Models for Exploring Irrigation Decision Series

    Andriyas, S.; McKee, M.


    Anticipating farmers' irrigation decisions can provide the possibility of improving the efficiency of canal operations in on-demand irrigation systems. Although multiple factors are considered during irrigation decision making, for any given farmer there might be one factor playing a major role. Identification of that biophysical factor which led to a farmer deciding to irrigate is difficult because of high variability of those factors during the growing season. Analysis of the irrigation decisions of a group of farmers for a single crop can help to simplify the problem. We developed a hidden Markov model (HMM) to analyze irrigation decisions and explore the factor and level at which the majority of farmers decide to irrigate. The model requires observed variables as inputs and the hidden states. The chosen model inputs were relatively easily measured, or estimated, biophysical data, including such factors (i.e., those variables which are believed to affect irrigation decision-making) as cumulative evapotranspiration, soil moisture depletion, soil stress coefficient, and canal flows. Irrigation decision series were the hidden states for the model. The data for the work comes from the Canal B region of the Lower Sevier River Basin, near Delta, Utah. The main crops of the region are alfalfa, barley, and corn. A portion of the data was used to build and test the model capability to explore that factor and the level at which the farmer takes the decision to irrigate for future irrigation events. Both group and individual level behavior can be studied using HMMs. The study showed that the farmers cannot be classified into certain classes based on their irrigation decisions, but vary in their behavior from irrigation-to-irrigation across all years and crops. HMMs can be used to analyze what factor and, subsequently, what level of that factor on which the farmer most likely based the irrigation decision. The study shows that the HMM is a capable tool to study a process
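
    A minimal Viterbi decoder of the kind applied above, recovering the most likely hidden decision sequence from observed factor levels; the two-state toy model is an illustrative assumption, not the fitted HMM from the study.

      import numpy as np

      def viterbi(obs, pi, A, B):
          """Most likely hidden-state path for an observation sequence.
          pi: initial state probs, A: state transition matrix,
          B: emission matrix (states x observation symbols)."""
          n_states, T = A.shape[0], len(obs)
          logd = np.log(pi) + np.log(B[:, obs[0]])
          back = np.zeros((T, n_states), dtype=int)
          for t in range(1, T):
              scores = logd[:, None] + np.log(A)           # scores[i, j]: i -> j
              back[t] = scores.argmax(axis=0)
              logd = scores.max(axis=0) + np.log(B[:, obs[t]])
          path = [int(logd.argmax())]
          for t in range(T - 1, 0, -1):
              path.append(int(back[t, path[-1]]))
          return path[::-1]

      # toy model: states 0 = "wait", 1 = "irrigate"; observations = depletion level (0 low, 1 high)
      pi = np.array([0.8, 0.2])
      A  = np.array([[0.7, 0.3], [0.4, 0.6]])
      B  = np.array([[0.9, 0.1], [0.2, 0.8]])
      print(viterbi([0, 0, 1, 1, 0], pi, A, B))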

  3. Exploring new bands in modified multichannel regression SST algorithms for the next-generation infrared sensors at NOAA

    Petrenko, B.; Ignatov, A.; Kramar, M.; Kihai, Y.


    Multichannel regression algorithms are widely used to retrieve sea surface temperature (SST) from infrared observations with satellite radiometers. Their theoretical foundations were laid in the 1980s-1990s, during the era of the Advanced Very High Resolution Radiometers (AVHRR), which have been flown onboard NOAA satellites since 1981. Consequently, the multichannel and non-linear SST algorithms employ the bands centered at 3.7, 11 and 12 μm, similar to those available in AVHRR. More recent radiometers carry new bands located in the windows near 4 μm, 8.5 μm and 10 μm, which may also be used for SST. Involving these bands in SST retrieval requires modifications to the regression SST equations. The paper describes a general approach to constructing SST regression equations for an arbitrary number of radiometric bands and explores the benefits of using the extended sets of bands available with the Visible Infrared Imager Radiometer Suite (VIIRS) flown onboard the Suomi National Polar-orbiting Partnership (SNPP) and to be flown onboard the follow-on Joint Polar Satellite System (JPSS) satellites, J1-J4, to be launched from 2017-2031; the Moderate Resolution Imaging Spectroradiometers (MODIS) flown onboard the Aqua and Terra satellites; and the Advanced Himawari Imager (AHI) flown onboard the Japanese Himawari-8 satellite (which in turn is a close proxy of the Advanced Baseline Imager (ABI) to be flown onboard the future Geostationary Operational Environmental Satellites - R Series (GOES-R) planned for launch in October 2016).

  4. KohonAnts: A Self-Organizing Ant Algorithm for Clustering and Pattern Classification

    Fernandes, C; Merelo, J J; Ramos, V; Laredo, J L J


    In this paper we introduce a new ant-based method that takes advantage of the cooperative self-organization of ant colony systems to create a naturally inspired clustering and pattern recognition method. The approach considers each data item as an ant, which moves inside a grid changing the cells it goes through, in a fashion similar to Kohonen's Self-Organizing Maps. The resulting algorithm is conceptually simpler, takes fewer free parameters than other ant-based clustering algorithms, and, after some parameter tuning, yields very good results on some benchmark problems.

  5. Exploring New Ways to Deliver Value to Healthcare Organizations: Algorithmic Testing, Data Integration, and Diagnostic E-consult Service.

    Risin, Semyon A; Chang, Brian N; Welsh, Kerry J; Kidd, Laura R; Moreno, Vanessa; Chen, Lei; Tholpady, Ashok; Wahed, Amer; Nguyen, Nghia; Kott, Marylee; Hunter, Robert L


    As the US health care system undergoes transformation and transitions to value-based models, it is critical for laboratory medicine/clinical pathology physicians to explore opportunities and find new ways to deliver value and become an integral part of the healthcare team. This is also essential for ensuring the financial health and stability of the profession when the payment paradigm changes from fee-for-service to fee-for-performance. About 5 years ago we started searching for ways to achieve this goal. Among other approaches, the search included addressing the laboratory work-ups for specialists' referrals in the Harris Health System, a major safety-net health care organization serving a mostly indigent and underserved population of Harris County, TX. We present here our experience in improving the efficiency of laboratory testing for the referral process and in building a prototype of a diagnostic e-consult service, using rheumatologic diseases as a starting point. The service incorporates algorithmic testing; integration of clinical, laboratory and imaging data; issuing structured, comprehensive consultation reports incorporating all the relevant information; and maintaining personal contacts and an e-line of communication with the primary providers and referral center personnel. An ongoing survey of providers affords testimony of the service's value in terms of facilitating their work and increasing productivity. Analysis of cost effectiveness and of other value indicators is currently underway. We also discuss our pioneering experience in building pathology residents' and fellows' training in integrated diagnostic consulting service.

  6. Exploring the velocity distribution of debris flows: An iteration algorithm based approach for complex cross-sections

    Han, Zheng; Chen, Guangqi; Li, Yange; Wang, Wei; Zhang, Hong


    The estimation of debris-flow velocity in a cross-section is of primary importance due to its correlation to impact force, run up and superelevation. However, previous methods sometimes neglect the observed asymmetric velocity distribution, and consequently underestimate the debris-flow velocity. This paper presents a new approach for exploring the debris-flow velocity distribution in a cross-section. The presented approach uses an iteration algorithm based on the Riemann integral method to search an approximate solution to the unknown flow surface. The established laws for vertical velocity profile are compared and subsequently integrated to analyze the velocity distribution in the cross-section. The major benefit of the presented approach is that natural channels typically with irregular beds and superelevations can be taken into account, and the resulting approximation by the approach well replicates the direct integral solution. The approach is programmed in MATLAB environment, and the code is open to the public. A well-documented debris-flow event in Sichuan Province, China, is used to demonstrate the presented approach. Results show that the solutions of the flow surface and the mean velocity well reproduce the investigated results. Discussion regarding the model sensitivity and the source of errors concludes the paper.

  7. Environment exploration of multi-robots based on ant colony algorithm

    蔡标; 戴学丰; 姜来浩


    To address the environment exploration problem for multiple robots, the ant colony algorithm is adopted to solve the target allocation and area coverage of multi-robots. A comparison of the coverage percentage between the ant colony algorithm and the waiting-auction algorithm in two different environment models shows that the ant colony algorithm requires fewer steps to reach the same coverage.

  8. Ant-based distributed protocol for coordination of a swarm of robots in demining mission

    De Rango, Floriano; Palmieri, Nunzia


    Coordination among multiple robots has been extensively studied, since a number of practical real-world problems can be addressed using an effective approach. In this paper, a collective task is investigated that requires a multi-robot system to search for randomly distributed mines in an unknown environment and disarm them cooperatively. The communication among the swarm of robots influences the overall performance in terms of time to execute the task or consumed energy. To address this problem, a new distributed recruiting protocol to coordinate a swarm of robots in a demining mission is described. This is a multi-objective problem and two bio-inspired strategies are used. The novelty of this approach lies in the combination of direct and indirect communication: on one hand, indirect communication among robots is used for the exploration of the environment; on the other hand, a novel protocol is used to accomplish the recruiting and coordination of the robots for the demining task. This protocol attempts to tackle the question of how autonomous robots can coordinate themselves in an unknown environment relying on simple low-level capabilities. The strategy is able to adapt to the current system dynamics if the number of robots, the environment structure, or both change. The proposed approach has been implemented and evaluated in several simulated environments. We analyzed the impact of our approach on the overall performance of a robot team. Experimental results indicate the effectiveness and efficiency of the proposed protocol in spreading the robots throughout the environment.

  9. Accessing primary care Big Data: the development of a software algorithm to explore the rich content of consultation records.

    MacRae, J; Darlow, B; McBain, L; Jones, O; Stubbe, M; Turner, N; Dowell, A


    To develop a natural language processing software inference algorithm to classify the content of primary care consultations using electronic health record Big Data, and subsequently test the algorithm's ability to estimate the prevalence and burden of childhood respiratory illness in primary care. Algorithm development and validation study. To classify consultations, the algorithm is designed to interrogate the clinical narrative entered as free text, the diagnostic (Read) codes created and the medications prescribed on the day of the consultation. Thirty-six consenting primary care practices from a mixed urban and semirural region of New Zealand. Three independent sets of 1200 child consultation records were randomly extracted from a data set of all general practitioner consultations in participating practices between 1 January 2008 and 31 December 2013 for children under 18 years of age (n=754,242). Each consultation record within these sets was independently classified by two expert clinicians as respiratory or non-respiratory, and subclassified according to respiratory diagnostic categories, to create three 'gold standard' sets of classified records. These three gold standard record sets were used to train, test and validate the algorithm. Sensitivity, specificity, positive predictive value and F-measure were calculated to illustrate the algorithm's ability to replicate the judgements of expert clinicians within the 1200-record gold standard validation set. The algorithm was able to identify respiratory consultations in the 1200-record validation set with a sensitivity of 0.72 (95% CI 0.67 to 0.78) and a specificity of 0.95 (95% CI 0.93 to 0.98). The positive predictive value of the algorithm's respiratory classification was 0.93 (95% CI 0.89 to 0.97). The positive predictive value of the algorithm classifying consultations as being related to specific respiratory diagnostic categories ranged from 0.68 (95% CI 0.40 to 1.00; other respiratory conditions) to 0.91 (95% CI 0.79 to 1
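
    For reference, the reported measures reduce to simple confusion-matrix arithmetic; a minimal sketch follows (the label names are placeholders).

      def classification_metrics(gold, predicted, positive="respiratory"):
          """Sensitivity, specificity, positive predictive value and F-measure
          of the algorithm's labels against the clinicians' gold standard."""
          tp = sum(g == positive and p == positive for g, p in zip(gold, predicted))
          fp = sum(g != positive and p == positive for g, p in zip(gold, predicted))
          fn = sum(g == positive and p != positive for g, p in zip(gold, predicted))
          tn = sum(g != positive and p != positive for g, p in zip(gold, predicted))
          sensitivity = tp / (tp + fn)
          specificity = tn / (tn + fp)
          ppv = tp / (tp + fp)
          f_measure = 2 * ppv * sensitivity / (ppv + sensitivity)
          return sensitivity, specificity, ppv, f_measure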

  10. Exploring the role of decoherence in condensed-phase nonadiabatic dynamics: a comparison of different mixed quantum/classical simulation algorithms for the excited hydrated electron.

    Larsen, Ross E; Bedard-Hearn, Michael J; Schwartz, Benjamin J


    Mixed quantum/classical (MQC) molecular dynamics simulation has become the method of choice for simulating the dynamics of quantum mechanical objects that interact with condensed-phase systems. There are many MQC algorithms available, however, and in cases where nonadiabatic coupling is important, different algorithms may lead to different results. Thus, it has been difficult to reach definitive conclusions about relaxation dynamics using nonadiabatic MQC methods because one is never certain whether any given algorithm includes enough of the necessary physics. In this paper, we explore the physics underlying different nonadiabatic MQC algorithms by comparing and contrasting the excited-state relaxation dynamics of the prototypical condensed-phase MQC system, the hydrated electron, calculated using different algorithms, including: fewest-switches surface hopping, stationary-phase surface hopping, and mean-field dynamics with surface hopping. We also describe in detail how a new nonadiabatic algorithm, mean-field dynamics with stochastic decoherence (MF-SD), is to be implemented for condensed-phase problems, and we apply MF-SD to the excited-state relaxation of the hydrated electron. Our discussion emphasizes the different ways quantum decoherence is treated in each algorithm and the resulting implications for hydrated-electron relaxation dynamics. We find that for three MQC methods that use Tully's fewest-switches criterion to determine surface hopping probabilities, the excited-state lifetime of the electron is the same. Moreover, the nonequilibrium solvent response function of the excited hydrated electron is the same with all of the nonadiabatic MQC algorithms discussed here, so that all of the algorithms would produce similar agreement with experiment. Despite the identical solvent response predicted by each MQC algorithm, we find that MF-SD allows much more mixing of multiple basis states into the quantum wave function than do other methods. This leads to an

  11. Algorithmic Adventures

    Hromkovic, Juraj


    Explores the science of computing. This book starts with the development of computer science, algorithms and programming, and then explains and shows how to exploit the concepts of infinity, computability, computational complexity, nondeterminism and randomness.

  12. Path planning of UAV based on the improved rapidly-exploring random tree algorithm

    崔挺; 李俨; 张明庄


    To improve the operational efficiency of a UAV, the path planning system must design a threat-avoidance trajectory with a high safety factor, low energy consumption and short processing time, while also satisfying the physical constraints of the aircraft itself. With this aim, this paper adopts the rapidly-exploring random tree (RRT) algorithm as the core of the path planner and improves it by combining it with Dijkstra's algorithm, so as to design a flight trajectory with minimum path cost.
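
    The following Python fragment sketches the basic RRT extension loop that this kind of planner builds on. The obstacle check is reduced to a user-supplied predicate and the Dijkstra-based cost refinement described in the abstract is not reproduced, so all names and parameters are illustrative assumptions.

```python
# Minimal 2-D RRT sketch of the kind of planner the abstract builds on.
# Obstacle checking is a user-supplied predicate; the Dijkstra-based cost
# refinement from the paper is not shown. Parameters are illustrative.
import math
import random

def rrt(start, goal, is_free, step=0.5, iters=2000, goal_tol=0.5, bounds=(0, 10)):
    nodes = [start]            # sampled tree vertices
    parent = {0: None}         # index of each vertex's parent
    for _ in range(iters):
        # Sample a random point (occasionally biasing toward the goal).
        q = goal if random.random() < 0.05 else (
            random.uniform(*bounds), random.uniform(*bounds))
        # Find the nearest tree vertex and step toward the sample.
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], q))
        near = nodes[i]
        d = math.dist(near, q)
        new = (near[0] + step * (q[0] - near[0]) / d,
               near[1] + step * (q[1] - near[1]) / d) if d > step else q
        if not is_free(new):
            continue
        parent[len(nodes)] = i
        nodes.append(new)
        if math.dist(new, goal) < goal_tol:
            # Walk back through parents to recover the path.
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return list(reversed(path))
    return None

path = rrt((0.0, 0.0), (9.0, 9.0), is_free=lambda p: True)
print(len(path) if path else "no path found")
```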

  13. Vision Algorithm for the Solar Aspect System of the High Energy Replicated Optics to Explore the Sun Mission

    Cramer, Alexander Krishnan


    This work covers the design and test of a machine vision algorithm for generating high- accuracy pitch and yaw pointing solutions relative to the sun on a high altitude balloon. It describes how images were constructed by focusing an image of the sun onto a plate printed with a pattern of small cross-shaped fiducial markers. Images of this plate taken with an off-the-shelf camera were processed to determine relative position of the balloon payload to the sun. The algorithm is broken into four problems: circle detection, fiducial detection, fiducial identification, and image registration. Circle detection is handled by an "Average Intersection" method, fiducial detection by a matched filter approach, and identification with an ad-hoc method based on the spacing between fiducials. Performance is verified on real test data where possible, but otherwise uses artificially generated data. Pointing knowledge is ultimately verified to meet the 20 arcsecond requirement.

  14. A Search and Exploration of Multi-Exoplanet Systems Via Transit Timing Variation (TTV) Algorithms for the K2 Mission

    Dholakia, Shishir; Dholakia, Shashank; Cody, Ann Marie


    We use the K2 mission to search for and analyze multi-planet systems with the goal of performing a scalable search for multi-planet systems using the transit timing variation (TTV) method. We developed an algorithm in Python to perform a search for synodic TTVs from multi-planet systems. The algorithm analyzes images taken by the K2 mission, creates light curves, and searches for TTVs on the order of a few minutes for every star in the images. We detected 4 potential TTV signals of which 3 are possible new discoveries. One of the systems has known multiple transiting planets and exhibits TTVs consistent with theoretical and previously published TTVs from n-body simulations. Another exoplanet system exhibits possible TTVs consistent with at least two giant planets. Our results demonstrate that a search for TTVs with the K2 mission is possible, though difficult.
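
    A minimal sketch of the core of a TTV search is shown below: fit a linear ephemeris to measured transit mid-times and inspect the observed-minus-calculated (O-C) residuals. The toy data and period are illustrative assumptions, not values from the K2 analysis.

```python
# Sketch of the core of a synodic-TTV search: fit a linear ephemeris to the
# measured transit mid-times and inspect the O-C residuals for a periodic
# signal. Data and values are illustrative, not from the K2 analysis.
import numpy as np

def oc_residuals(transit_times):
    """Return epochs, O-C residuals (days), and the fitted period."""
    t = np.asarray(transit_times)
    epochs = np.round((t - t[0]) / np.median(np.diff(t)))  # integer transit numbers
    period, t0 = np.polyfit(epochs, t, 1)                  # linear ephemeris fit
    oc = t - (t0 + period * epochs)
    return epochs, oc, period

# Toy example: a 3.5-day planet with a sinusoidal TTV of ~2 minutes.
n = np.arange(20)
times = 2457000.0 + 3.5 * n + (2.0 / 1440.0) * np.sin(2 * np.pi * n / 7.0)
epochs, oc, period = oc_residuals(times)
print(period, oc.max() * 1440.0, "min")
```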

  15. Exploration of the optimisation algorithms used in the implementation of adaptive optics in confocal and multiphoton microscopy.

    Wright, Amanda J; Burns, David; Patterson, Brett A; Poland, Simon P; Valentine, Gareth J; Girkin, John M


    We report on the introduction of active optical elements into confocal and multiphoton microscopes in order to reduce the sample-induced aberration. Using a flexible membrane mirror as the active element, the beam entering the rear of the microscope objective is altered to produce the smallest point spread function once it is brought to a focus inside the sample. The conventional approach to adaptive optics, commonly used in astronomy, is to utilise a wavefront sensor to determine the required mirror shape. We have developed a technique that uses optimisation algorithms to improve the returned signal without the use of a wavefront sensor. We have investigated a number of possible optimisation methods, covering hill climbing, genetic algorithms, and more random search methods. The system has demonstrated a significant enhancement in the axial resolution of a confocal microscope when imaging at depth within a sample. We discuss the trade-offs of the various approaches adopted, comparing speed with resolution enhancement.
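
    The hill-climbing variant mentioned above can be sketched in a few lines: perturb one deformable-mirror actuator at a time and keep the change only if the measured image metric improves. The metric function, actuator count and step size below are assumptions for illustration.

```python
# A minimal sensorless hill-climbing sketch in the spirit of the optimisation
# described above: perturb one actuator at a time and keep the change if the
# measured image metric improves. Metric and actuator count are assumptions.
import random

def hill_climb(measure_metric, n_actuators=37, step=0.05, iterations=200):
    """measure_metric(voltages) -> float; larger is better."""
    voltages = [0.0] * n_actuators
    best = measure_metric(voltages)
    for _ in range(iterations):
        i = random.randrange(n_actuators)           # pick one actuator
        delta = random.choice((-step, step))        # try a small perturbation
        voltages[i] += delta
        score = measure_metric(voltages)
        if score > best:
            best = score                            # keep the improvement
        else:
            voltages[i] -= delta                    # revert the change
    return voltages, best

# Toy metric: peak at a known set of voltages (stands in for the microscope signal).
target = [0.3] * 37
metric = lambda v: -sum((a - b) ** 2 for a, b in zip(v, target))
print(hill_climb(metric)[1])
```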

  16. The Cyborg Astrobiologist: testing a novelty detection algorithm on two mobile exploration systems at Rivas Vaciamadrid in Spain and at the Mars Desert Research Station in Utah

    McGuire, P. C.; Gross, C.; Wendt, L.; Bonnici, A.; Souza-Egipsy, V.; Ormö, J.; Díaz-Martínez, E.; Foing, B. H.; Bose, R.; Walter, S.; Oesker, M.; Ontrup, J.; Haschke, R.; Ritter, H.


    In previous work, a platform was developed for testing computer-vision algorithms for robotic planetary exploration. This platform consisted of a digital video camera connected to a wearable computer for real-time processing of images at geological and astrobiological field sites. The real-time processing included image segmentation and the generation of interest points based upon uncommonness in the segmentation maps. Also in previous work, this platform for testing computer-vision algorithms has been ported to a more ergonomic alternative platform, consisting of a phone camera connected via the Global System for Mobile Communications (GSM) network to a remote-server computer. The wearable-computer platform has been tested at geological and astrobiological field sites in Spain (Rivas Vaciamadrid and Riba de Santiuste), and the phone camera has been tested at a geological field site in Malta. In this work, we (i) apply a Hopfield neural-network algorithm for novelty detection based upon colour, (ii) integrate a field-capable digital microscope on the wearable computer platform, (iii) test this novelty detection with the digital microscope at Rivas Vaciamadrid, (iv) develop a Bluetooth communication mode for the phone-camera platform, in order to allow access to a mobile processing computer at the field sites, and (v) test the novelty detection on the Bluetooth-enabled phone camera connected to a netbook computer at the Mars Desert Research Station in Utah. This systems engineering and field testing have together allowed us to develop a real-time computer-vision system that is capable, for example, of identifying lichens as novel within a series of images acquired in semi-arid desert environments. We acquired sequences of images of geologic outcrops in Utah and Spain consisting of various rock types and colours to test this algorithm. The algorithm robustly recognized previously observed units by their colour, while requiring only a single image or a few images to

  17. Exploring the Role of Genetic Algorithms and Artificial Neural Networks for Interpolation of Elevation in Geoinformation Models

    Bagheri, H.; Sadjadi, S. Y.; Sadeghian, S.


    One of the most significant tools for many engineering projects is three-dimensional modelling of the Earth, which has many applications in Geospatial Information Systems (GIS), e.g. creating Digital Terrain Models (DTM). DTMs have numerous applications in science, engineering, design and project administration. A key step in the DTM technique is the interpolation of elevation to create a continuous surface. There are several interpolation methods, whose results depend on the environmental conditions and the input data. In this study, the usual interpolation methods, consisting of polynomials and the Inverse Distance Weighting (IDW) method, are optimised with Genetic Algorithms (GA). Artificial Intelligence (AI) techniques such as GA and Neural Networks (NN) are applied to the samples to optimise the interpolation methods and the production of a Digital Elevation Model (DEM). The aim is to evaluate the accuracy of the interpolation methods. Universal interpolation over entire neighbouring regions can be suggested for larger regions, which can be divided into smaller regions. The results obtained from applying GA and ANN individually are compared with the typical interpolation method for creating elevations. The results show that AI methods have high potential in the interpolation of elevations, and that using artificial neural network algorithms for interpolation and optimising the IDW method with GA can estimate elevations with high precision.
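
    For readers unfamiliar with IDW, the sketch below shows the plain inverse-distance-weighting interpolation that the study optimises; the power parameter p is the sort of quantity a GA could tune. The sample points and elevations are illustrative.

```python
# A short sketch of Inverse Distance Weighting (IDW) interpolation; the power
# parameter p is the kind of quantity a GA could tune. Data are illustrative.
import numpy as np

def idw(xy_known, z_known, xy_query, p=2.0, eps=1e-12):
    """Interpolate elevations at xy_query from scattered (xy_known, z_known)."""
    xy_known = np.asarray(xy_known, float)
    z_known = np.asarray(z_known, float)
    out = []
    for q in np.asarray(xy_query, float):
        d = np.linalg.norm(xy_known - q, axis=1)
        if d.min() < eps:                       # query coincides with a sample
            out.append(z_known[d.argmin()])
            continue
        w = 1.0 / d ** p                        # inverse-distance weights
        out.append(np.sum(w * z_known) / np.sum(w))
    return np.array(out)

pts = [(0, 0), (0, 10), (10, 0), (10, 10)]
elev = [100.0, 105.0, 102.0, 110.0]
print(idw(pts, elev, [(5, 5), (1, 1)]))
```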

  18. Rapidly-exploring Random Tree Algorithm Based on Dynamic Step

    王道威; 朱明富; 刘慧


    Although the traditional Rapidly-exploring Random Tree (RRT) algorithm has many good features, its path planning involves a great deal of randomness because of the random selection of vertices. Building on the RRT algorithm, a new RRT path planning algorithm with a dynamic step size is proposed in this paper; the step size is the minimum unit length by which the RRT grows. Adding a dynamic step size to the traditional RRT reduces the uncertainty of the rapidly-exploring random tree and improves its obstacle avoidance capability, so that the RRT path planner combines high determinism with strong obstacle avoidance. Simulation results show that the algorithm produces deterministic paths, runs quickly and avoids obstacles well in path planning.

  19. Genetic Algorithm Supported by Graphical Processing Unit Improves the Exploration of Effective Connectivity in Functional Brain Imaging

    Lawrence Wing Chi Chan


    Brain regions of human subjects exhibit certain levels of associated activation upon specific environmental stimuli. Functional Magnetic Resonance Imaging (fMRI) detects regional signals, based on which we can infer the direct or indirect neuronal connectivity between the regions. Structural Equation Modeling (SEM) is an appropriate mathematical approach for analyzing the effective connectivity using fMRI data. A maximum likelihood (ML) discrepancy function is minimized against some constrained coefficients of a path model. The minimization is an iterative process, and the computing time is very long as the number of iterations increases geometrically with the number of path coefficients. Using a regular quad-core Central Processing Unit (CPU) platform, up to three months is required for the iterations from 0 to 30 path coefficients. This study demonstrates the application of a Graphical Processing Unit (GPU) with a parallel Genetic Algorithm (GA) that replaces the Powell minimization in the standard program code of the analysis software package. It was found in the same example that the GA under GPU reduced the duration to 20 hours and provided a more accurate solution than the standard program code under CPU.

  20. The Cyborg Astrobiologist: Testing a Novelty-Detection Algorithm on Two Mobile Exploration Systems at Rivas Vaciamadrid in Spain and at the Mars Desert Research Station in Utah

    McGuire, P C; Wendt, L; Bonnici, A; Souza-Egipsy, V; Ormo, J; Diaz-Martinez, E; Foing, B H; Bose, R; Walter, S; Oesker, M; Ontrup, J; Haschke, R; Ritter, H


    (ABRIDGED)In previous work, two platforms have been developed for testing computer-vision algorithms for robotic planetary exploration (McGuire et al. 2004b,2005; Bartolo et al. 2007). The wearable-computer platform has been tested at geological and astrobiological field sites in Spain (Rivas Vaciamadrid and Riba de Santiuste), and the phone-camera has been tested at a geological field site in Malta. In this work, we (i) apply a Hopfield neural-network algorithm for novelty detection based upon color, (ii) integrate a field-capable digital microscope on the wearable computer platform, (iii) test this novelty detection with the digital microscope at Rivas Vaciamadrid, (iv) develop a Bluetooth communication mode for the phone-camera platform, in order to allow access to a mobile processing computer at the field sites, and (v) test the novelty detection on the Bluetooth-enabled phone-camera connected to a netbook computer at the Mars Desert Research Station in Utah. This systems engineering and field testing hav...

  1. Arc Based Ant Colony Optimization Algorithm for optimal design of gravitational sewer networks

    R. Moeini


    In this paper, constrained and unconstrained versions of a new formulation of the Ant Colony Optimization Algorithm (ACOA), named the Arc Based Ant Colony Optimization Algorithm (ABACOA), are augmented with the Tree Growing Algorithm (TGA) and used for the optimal layout and pipe size design of gravitational sewer networks. The main advantages offered by the proposed ABACOA formulation are a proper definition of heuristic information, a useful component of ant-based algorithms, and a proper trade-off between the two conflicting search attributes of exploration and exploitation. In both formulations, the TGA is used to incrementally construct feasible tree-like layouts out of the base layout. In the first formulation, the unconstrained version of ABACOA is used to determine the nodal cover depths of sewer pipes, while in the second formulation, a constrained version of ABACOA is used to determine nodal cover depths that satisfy the pipe slope constraint. Three different methods of cut determination are also proposed here to complete the construction of a tree-like network containing all base layout pipes. The proposed formulations are used to solve three test examples of different scales, and the results are presented and compared with other available results in the literature. The comparison shows that the best results are obtained using the third cutting method in both formulations. In addition, the results indicate the ability of the proposed methods, and in particular the constrained version of ABACOA equipped with the TGA, to solve the sewer network design optimization problem. To be specific, the constrained version of ABACOA produced results 0.1%, 1% and 2.1% cheaper than those obtained by the unconstrained version of ABACOA for the first, second and third test examples, respectively.
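
    Although the authors' arc-based formulation is not reproduced here, the generic ACO ingredients it relies on, pheromone-and-heuristic arc selection and evaporation-plus-deposit pheromone update, can be sketched as follows; alpha, beta, rho and the toy data are assumptions.

```python
# Generic ACO fragment: probabilistic arc selection driven by pheromone and
# heuristic information (exploration vs. exploitation), and pheromone update
# with evaporation. Not the authors' ABACOA formulation; values illustrative.
import random

def choose_arc(arcs, pheromone, heuristic, alpha=1.0, beta=2.0):
    """Sample one arc with probability proportional to tau^alpha * eta^beta."""
    weights = [pheromone[a] ** alpha * heuristic[a] ** beta for a in arcs]
    total = sum(weights)
    r, acc = random.uniform(0.0, total), 0.0
    for arc, w in zip(arcs, weights):
        acc += w
        if r <= acc:
            return arc
    return arcs[-1]

def update_pheromone(pheromone, used_arcs, cost, rho=0.1, q=1.0):
    """Evaporate everywhere, then deposit on the arcs of the constructed solution."""
    for a in pheromone:
        pheromone[a] *= (1.0 - rho)
    for a in used_arcs:
        pheromone[a] += q / cost     # cheaper solutions deposit more

arcs = ["a", "b", "c"]
tau = {a: 1.0 for a in arcs}
eta = {"a": 0.5, "b": 1.0, "c": 0.8}
picked = choose_arc(arcs, tau, eta)
update_pheromone(tau, [picked], cost=10.0)
print(picked, tau)
```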



    20072798 Chen Fengyun(China University of Mining and Technology,Xuzhou 221008,China);Hang Yuan Algorithm and Application of the Coherency/Variance Cube Technique(Geophysical and Geochemical Exploration,ISSN1000-8918,CN11-1906/P,30(3),2006,p.250-253,257,7 illus.,7 refs.)Key words:seismic exploration The coherency/variance cube technique has been developed in recent years as a new technique of seismic data interpretation.

  3. Exploring Fractals.

    Dewdney, A. K.


    Explores the subject of fractal geometry focusing on the occurrence of fractal-like shapes in the natural world. Topics include iterated functions, chaos theory, the Lorenz attractor, logistic maps, the Mandelbrot set, and mini-Mandelbrot sets. Provides appropriate computer algorithms, as well as further sources of information. (JJK)

  4. Algorithmic Self

    Markham, Annette

    This paper takes an actor network theory approach to explore some of the ways that algorithms co-construct identity and relational meaning in contemporary use of social media. Based on intensive interviews with participants as well as activity logging and data tracking, the author presents a richly layered set of accounts to help build our understanding of how individuals relate to their devices, search systems, and social network sites. This work extends critical analyses of the power of algorithms in implicating the social self by offering narrative accounts from multiple perspectives. It also contributes an innovative method for blending actor network theory with symbolic interaction to grapple with the complexity of everyday sensemaking practices within networked global information flows.

  5. Exploration of Military Logistics Database Encryption and AES Algorithm

    王凤忠; 吕亚飞; 邹饶邦彦


    In this paper, in light of the severe challenges to database security in military logistics information systems, we analyze database encryption technology, the encryption and decryption processes of the AES algorithm and its round-key generation process, use Java to implement and test the algorithm in MyEclipse, and finally demonstrate the validity of the algorithm through a numerical example.
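
    As a hedged illustration of AES-based field encryption of the kind discussed above (not the paper's Java/MyEclipse implementation), the sketch below uses the third-party Python `cryptography` package with AES-GCM; key management is deliberately omitted.

```python
# Hedged sketch of AES encryption/decryption such as a database field
# encryptor might use. Relies on the third-party `cryptography` package
# (AES-GCM); key handling and the paper's Java specifics are not reproduced.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)   # 128-bit AES key
aesgcm = AESGCM(key)

def encrypt_field(plaintext: bytes) -> bytes:
    nonce = os.urandom(12)                  # unique per encryption
    return nonce + aesgcm.encrypt(nonce, plaintext, None)

def decrypt_field(blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return aesgcm.decrypt(nonce, ciphertext, None)

token = encrypt_field(b"example logistics record")
print(decrypt_field(token))
```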

  6. Exploring Effects of High School Students' Mathematical Processing Skills and Conceptual Understanding of Chemical Concepts on Algorithmic Problem Solving

    Gultepe, Nejla; Yalcin Celik, Ayse; Kilic, Ziya


    The purpose of the study was to examine the effects of students' conceptual understanding of chemical concepts and mathematical processing skills on algorithmic problem-solving skills. The sample (N = 554) included grades 9, 10, and 11 students in Turkey. Data were collected using the instrument "MPC Test" and with interviews. The…

  7. Exploring quadrangulations

    Peng, Chihan


    Here we presented a framework to explore quad mesh topologies. The core of our work is a systematic enumeration algorithm that can generate all possible quadrangular meshes inside a defined boundary with an upper limit of v3-v5 pairs. The algorithm is orders of magnitude more efficient than previous work. The combination of topological enumeration and shape-space exploration demonstrates that mesh topology has a powerful influence on geometry; topological variations have distinctive, interesting patterns of mesh lines.

  8. Group Leaders Optimization Algorithm

    Daskin, Anmer


    The complexity of global optimization algorithms makes their implementation difficult and causes them to require more computer resources for the optimization process. The ability to explore the whole solution space without increasing the complexity of the algorithm is of great importance, not only to obtain reliable results but also to make the implementation of these algorithms more convenient for higher-dimensional and complex real-world problems in science and engineering. In this paper, we present a new global optimization algorithm in which the influence of leaders in social groups is used as an inspiration for an evolutionary technique designed with a group architecture similar to that of Cooperative Coevolutionary Algorithms. We present the implementation method and experimental results for single- and multidimensional optimization test problems and a scientific real-world problem: the energies and geometric structures of Lennard-Jones clusters.

  9. Exploring a Better Multi-Channel Scheduling Algorithm for Fast Data Collection in Wireless Multimedia Sensor Networks (WMSNs)

    张龙妹; 史浩山; 杨俊刚; 陆伟


    To improve the efficiency of data collection from multiple source nodes to the sink node in WMSNs, this paper proposes a multi-channel fast data collection algorithm based on a tree topology. The algorithm has three main features: a receiver-based channel assignment effectively eliminates inter-channel interference; a TDMA mechanism, implemented with a breadth-first-search time-slot assignment, eliminates contention and collisions between nodes; and the construction of a degree-constrained balanced routing tree removes the scheduling bottleneck caused by a single node with too many children. Simulations at different node densities show that, compared with the tree-based multi-channel protocol TMCP, the proposed multi-channel scheduling algorithm collects data faster, and the balanced routing tree further shortens the collection schedule length.

  10. Exploration of a capability-focused aerospace system of systems architecture alternative with bilayer design space, based on RST-SOM algorithmic methods.

    Li, Zhifei; Qin, Dongliang; Yang, Feng


    In defense-related programs, the use of capability-based analysis, design, and acquisition has been significant. To confront one of the most challenging features of capability-based analysis (CBA), its huge design space, a literature review of design space exploration was first examined. Then, for the design space exploration of an aerospace system of systems, a bilayer mapping method was put forward, based on existing experimental and operating data. Finally, the feasibility of the approach was demonstrated with an illustrative example. With the data mining techniques of rough set theory (RST) and self-organizing maps (SOM), the aerospace system-of-systems architecture alternative was mapped from P-space (performance space) to C-space (configuration space), and then from C-space to D-space (design space), respectively. Ultimately, the performance space was mapped to the design space, which completed the exploration and preliminary reduction of the entire design space. This method provides a computational analysis and implementation scheme for large-scale simulation.

  11. Parallel Architectures and Bioinspired Algorithms

    Pérez, José; Lanchares, Juan


    This monograph presents examples of best practices when combining bioinspired algorithms with parallel architectures. The book includes recent work by leading researchers in the field and offers a map with the main paths already explored and new ways towards the future. Parallel Architectures and Bioinspired Algorithms will be of value to both specialists in Bioinspired Algorithms, Parallel and Distributed Computing, as well as computer science students trying to understand the present and the future of Parallel Architectures and Bioinspired Algorithms.

  12. Algorithm design

    Kleinberg, Jon


    Algorithm Design introduces algorithms by looking at the real-world problems that motivate them. The book teaches students a range of design and analysis techniques for problems that arise in computing applications. The text encourages an understanding of the algorithm design process and an appreciation of the role of algorithms in the broader field of computer science.

  13. Genetic algorithms

    Wang, Lui; Bayer, Steven E.


    Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithms concepts are introduced, genetic algorithm applications are introduced, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.
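
    A minimal genetic-algorithm sketch illustrating the concepts summarised above (selection, crossover and mutation over a bit-string population) is given below; the one-max fitness function and all parameters are toy assumptions.

```python
# Minimal genetic algorithm on a bit-string population: selection, one-point
# crossover, mutation and a touch of elitism. Fitness and parameters are toys.
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=100,
                      p_cross=0.9, p_mut=0.02):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]                                 # elitism: keep the two best
        while len(next_pop) < pop_size:
            a, b = random.sample(scored[:pop_size // 2], 2)   # select from the fitter half
            if random.random() < p_cross:                     # one-point crossover
                cut = random.randrange(1, n_bits)
                child = a[:cut] + b[cut:]
            else:
                child = a[:]
            child = [bit ^ (random.random() < p_mut) for bit in child]  # bit-flip mutation
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# Toy problem: maximise the number of ones ("one-max").
best = genetic_algorithm(fitness=sum)
print(sum(best), best)
```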

  14. 无偿献血者血液筛查策略探讨%To Explore the Screening Algorithm for Non-remunerate Voluntary Blood Donations

    刘宇宁; 阮玉琦; 苏志华


    目的 评估不同血液筛查模式对筛查效果的影响,为适应新版《血站技术操作规程》,制订适合本地区的献血者血液筛查策略提供依据.方法 选择2010~2011年上海市奉贤区血站37 136例无偿献血标本,其中预约献血标本13 693例,预约后采血标本10 389例,无预约采血标本13 054例.同时在无预约采血者中选择了2 705例在采血前先行ALT快速法,合格后再采血,分析不同的采血模式对筛查不合格率的影响.结果 无论是预约还是无预约献血,ALT为主要不合格原因.预约后献血者所有检测指标不合格率(2.70%)明显较无预约者低(5.62%),采血前筛查ALT能有效降低ALT不合格率(从6.33%降为2.88%),而预约后采血能显著降低HBsAg和抗-HCV等其他指标的不合格率(从1.85%降为0.73%).结论 对于计划性的团体无偿献血者的血液筛查,以预约献血结合采血前ALT干化学法筛查的策略为佳.%Objective To assess the effect of different blood screening algorithms on the screening results, to stipulate for a proper blood screening policy under the guidance of the new version of Blood Bank Technique Regulations of China. Methods Blood screening results of 37 136 ran-remunerate voluntary blood donations from Shanghai Fengxian Blood Bank in 2010 to 2011 were selected, among which 13 693 were pre-donated blood samples from appointment blood donors, 10 398 were donation samples from appointment blood donors, and 13 054 were donation samples from un-appointment blood donors. 2 705 donation samples from un-appointment donors with ALT pre-screened were also been selected. The blood screening results were analyzed to enclose the effect of different screening algorithms. Results Abnormal ALT were the main unqualified reasons both in appointment and in un-appointment blood donations. The unqualified rate of appointment donations (2. 70%) was significant lower than that of un-appointment donations (5. 62%). ALT

  15. Algorithmic cryptanalysis

    Joux, Antoine


    Illustrating the power of algorithms, Algorithmic Cryptanalysis describes algorithmic methods with cryptographically relevant examples. Focusing on both private- and public-key cryptographic algorithms, it presents each algorithm either as a textual description, in pseudo-code, or in a C code program.Divided into three parts, the book begins with a short introduction to cryptography and a background chapter on elementary number theory and algebra. It then moves on to algorithms, with each chapter in this section dedicated to a single topic and often illustrated with simple cryptographic applic

  16. Algorithmic mathematics

    Hougardy, Stefan


    Algorithms play an increasingly important role in nearly all fields of mathematics. This book allows readers to develop basic mathematical abilities, in particular those concerning the design and analysis of algorithms as well as their implementation. It presents not only fundamental algorithms like the sieve of Eratosthenes, the Euclidean algorithm, sorting algorithms, algorithms on graphs, and Gaussian elimination, but also discusses elementary data structures, basic graph theory, and numerical questions. In addition, it provides an introduction to programming and demonstrates in detail how to implement algorithms in C++. This textbook is suitable for students who are new to the subject and covers a basic mathematical lecture course, complementing traditional courses on analysis and linear algebra. Both authors have given this "Algorithmic Mathematics" course at the University of Bonn several times in recent years.
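
    Two of the elementary algorithms named above can be written out in a few self-contained lines; the sketch below (sieve of Eratosthenes and the Euclidean algorithm) is purely illustrative and is not code from the book.

```python
# Two of the elementary algorithms mentioned above, written as short,
# self-contained functions. Purely illustrative; not code from the book.
def sieve(n):
    """Return all primes <= n using the sieve of Eratosthenes."""
    is_prime = [True] * (n + 1)
    is_prime[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return [i for i, flag in enumerate(is_prime) if flag]

def gcd(a, b):
    """Greatest common divisor via the Euclidean algorithm."""
    while b:
        a, b = b, a % b
    return a

print(sieve(30))        # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
print(gcd(252, 105))    # 21
```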

  17. Quantum Algorithms

    Abrams, D.; Williams, C.


    This thesis describes several new quantum algorithms. These include a polynomial time algorithm that uses a quantum fast Fourier transform to find eigenvalues and eigenvectors of a Hamiltonian operator, and that can be applied in cases for which all known classical algorithms require exponential time.

  18. Total algorithms

    Tel, G.


    We define the notion of total algorithms for networks of processes. A total algorithm enforces that a "decision" is taken by a subset of the processes, and that participation of all processes is required to reach this decision. Total algorithms are an important building block in the design of distri


    ZHAO Xinchao


    The greedy algorithm is a strong local search algorithm. The genetic algorithm is generally applied to global optimization problems. In this paper, we combine the greedy idea and the genetic algorithm to propose the greedy genetic algorithm, which incorporates the global exploring ability of the genetic algorithm and the local convergence ability of the greedy algorithm. Experimental results show that the greedy genetic algorithm gives much better results than the classical genetic algorithm.

  20. A Collaborative Solving Algorithm for the Dynamic Distributed Constraint Optimization Problem

    葛方振; 魏臻; 陆阳; 邱述威; 李丽香


    A large number of problems in the multiagent collaboration process can be modeled in the framework of the distributed constraint optimization problem (DCOP). However, the DCOP framework is limited to planning problems, and the agents in a DCOP generally require a complete and accurate reward function. To resolve this issue, the dynamic distributed constraint optimization problem (DDCOP) is defined and its two crucial operations, exploration and exploitation, are analyzed. Furthermore, a chaotic-ant-based collaborative solving algorithm for the dynamic distributed constraint optimization problem (CA-DDCOP) is proposed. The CA-DDCOP algorithm is built on the chaotic behavior of a single ant and the self-organizing behavior of the ant colony, through which exploration and exploitation are realized, and it balances the two according to the Boltzmann distribution. The channel allocation problem in multi-radio multi-channel Ad Hoc networks is then solved with the CA-DDCOP algorithm, and simulation results show that it performs effectively.
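
    The Boltzmann-distribution rule mentioned above for balancing exploration and exploitation can be sketched as a temperature-controlled softmax over candidate assignments; the utilities and temperatures below are illustrative assumptions.

```python
# Sketch of Boltzmann (softmax) selection: candidates are sampled with
# probability proportional to exp(value / T), so a high temperature explores
# and a low temperature exploits. Values and temperatures are illustrative.
import math
import random

def boltzmann_select(values, temperature):
    """values: dict mapping candidate -> estimated utility."""
    m = max(values.values())                       # subtract max for numerical stability
    weights = {c: math.exp((v - m) / temperature) for c, v in values.items()}
    total = sum(weights.values())
    r, acc = random.uniform(0.0, total), 0.0
    for candidate, w in weights.items():
        acc += w
        if r <= acc:
            return candidate
    return candidate

utilities = {"channel_1": 0.9, "channel_6": 0.7, "channel_11": 0.2}
print(boltzmann_select(utilities, temperature=1.0))   # fairly exploratory
print(boltzmann_select(utilities, temperature=0.05))  # nearly greedy
```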

  1. Algorithms and parallel computing

    Gebali, Fayez


    There is a software gap between the hardware potential and the performance that can be attained using today's software parallel program development tools. The tools need manual intervention by the programmer to parallelize the code. Programming a parallel computer requires closely studying the target algorithm or application, more so than in the traditional sequential programming we have all learned. The programmer must be aware of the communication and data dependencies of the algorithm or application. This book provides the techniques to explore the possible ways to

  2. Diversity-Guided Evolutionary Algorithms

    Ursem, Rasmus Kjær


    Population diversity is undoubtedly a key issue in the performance of evolutionary algorithms. A common hypothesis is that high diversity is important to avoid premature convergence and to escape local optima. Various diversity measures have been used to analyze algorithms, but so far few algorithms have used a measure to guide the search. The diversity-guided evolutionary algorithm (DGEA) uses the well-known distance-to-average-point measure to alternate between phases of exploration (mutation) and phases of exploitation (recombination and selection). The DGEA showed remarkable results...
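
    The distance-to-average-point measure that drives the DGEA's phase switching can be sketched as follows; the population data and the switching thresholds d_low and d_high are illustrative assumptions, not the values used in the paper.

```python
# Sketch of the "distance-to-average-point" diversity measure and a simple
# phase-switching rule in the spirit of the DGEA. Thresholds are assumptions.
import numpy as np

def diversity(population, search_range):
    """Mean Euclidean distance of individuals to the population centroid,
    normalised by the diagonal of the search space."""
    pop = np.asarray(population, float)
    centroid = pop.mean(axis=0)
    diag = np.linalg.norm(search_range[1] - search_range[0])
    return np.mean(np.linalg.norm(pop - centroid, axis=1)) / diag

def choose_phase(div, d_low=0.01, d_high=0.25):
    return "exploration (mutation)" if div < d_low else (
           "exploitation (recombination/selection)" if div > d_high else "keep phase")

lo, hi = np.full(10, -5.0), np.full(10, 5.0)
pop = np.random.uniform(lo, hi, size=(50, 10))
d = diversity(pop, (lo, hi))
print(d, choose_phase(d))
```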

  3. Learning Intelligent Genetic Algorithms Using Japanese Nonograms

    Tsai, Jinn-Tsong; Chou, Ping-Yi; Fang, Jia-Cen


    An intelligent genetic algorithm (IGA) is proposed to solve Japanese nonograms and is used as a method in a university course to learn evolutionary algorithms. The IGA combines the global exploration capabilities of a canonical genetic algorithm (CGA) with effective condensed encoding, improved fitness function, and modified crossover and…


    陶志勇; 蒋守凤


    Based on the ant colony algorithm, we introduce the concept of fuzzy theory and propose a routing algorithm based on fuzzy theory and ant colony (BFTAC). In the algorithm, the forward ant selects the next-hop node using a fuzzy comprehensive evaluation method when exploring the path. In the pheromone update process, backward ants converted from forward ants that successfully reach the sink node reinforce the path pheromone according to the network information carried by the forward ants, while unsuccessful ants weaken the pheromone. During data transmission, the algorithm uses a dormancy mechanism for low-energy nodes to balance the energy consumption of network nodes. Simulation results show that, compared with the energy-efficient ant-based routing (EEABR) algorithm under the same conditions, this algorithm effectively reduces the average energy consumption of the network and increases the survival rate of network nodes.

  6. The enhancement algorithms and their implementation for abnormal response characteristics in electrical exploration

    戴前伟; 陈德鹏; 陈勇雄; 侯智超


    In electrical exploration, the abnormal response of a geological body may suffer from problems such as a weak signal, indistinct features or a fuzzy contour, which affect the correctness and accuracy of geological interpretation. In this paper, the first or second derivative of the abnormal response contour is computed along the direction of steepest gradient, and gradient and Laplacian operators are applied, to enhance the features of the abnormal response. Numerical forward modelling shows that this algorithm makes the anomaly centre of the abnormal response contour more definite and the boundary of the anomaly clearer, which has a positive effect on the correctness and accuracy of geological interpretation.
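
    The enhancement idea can be illustrated on a gridded anomaly map with NumPy's gradient and Laplacian operators, as in the sketch below; the synthetic Gaussian anomaly stands in for a forward-modelled section and is purely illustrative.

```python
# Sketch of the enhancement idea above: sharpen a gridded anomaly map with
# gradient-magnitude and Laplacian operators. The synthetic anomaly grid is an
# illustrative stand-in for a forward-modelled resistivity section.
import numpy as np

def enhance(grid):
    gy, gx = np.gradient(grid)                     # first derivatives
    grad_mag = np.hypot(gx, gy)                    # gradient magnitude
    gyy, _ = np.gradient(gy)
    _, gxx = np.gradient(gx)
    laplacian = gxx + gyy                          # second-derivative operator
    return grad_mag, laplacian

# Synthetic smooth anomaly centred in the section.
y, x = np.mgrid[0:50, 0:80]
anomaly = np.exp(-(((x - 40) / 10.0) ** 2 + ((y - 25) / 6.0) ** 2))
grad_mag, lap = enhance(anomaly)
print(grad_mag.max(), lap.min())                   # edges light up; the centre is a Laplacian low
```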

  7. Combinatorial algorithms

    Hu, T C


    Newly enlarged, updated second edition of a valuable, widely used text presents algorithms for shortest paths, maximum flows, dynamic programming and backtracking. Also discussed are binary trees, heuristics and near optima, matrix multiplication, and NP-complete problems. 153 black-and-white illustrations, 23 tables. New to this edition: Chapter 9

  8. Autodriver algorithm

    Anna Bourmistrova


    The autodriver algorithm is an intelligent method to eliminate the need for steering by a driver on a well-defined road. The proposed method performs best on a four-wheel-steering (4WS) vehicle, though it is also applicable to two-wheel-steering (TWS) vehicles. The algorithm is based on making the actual vehicle center of rotation coincide with the road center of curvature by adjusting the kinematic center of rotation. The road center of curvature is assumed to be prior information for a given road, while the dynamic center of rotation is the output of the dynamic equations of motion of the vehicle using steering angle and velocity measurements as inputs. We use the kinematic condition of steering to set the steering angles in such a way that the kinematic center of rotation of the vehicle sits at a desired point. At low speeds the ideal and actual paths of the vehicle are very close. With increasing forward speed, the road and tire characteristics, along with the motion dynamics of the vehicle, cause the vehicle to turn about time-varying points. By adjusting the steering angles, our algorithm controls the dynamic turning center of the vehicle so that it coincides with the road curvature center, hence keeping the vehicle on a given road autonomously. The position and orientation errors are used as feedback signals in a closed-loop control to adjust the steering angles. The application of the presented autodriver algorithm demonstrates reliable performance under different driving conditions.

  9. Architectural exploration for the design of a robotic vision system-on-chip: algorithm-architecture adequacy of a real-time embedded system

    Lefebvre, Thomas


    This Ph.D thesis stands at the crossroads of three scientific domains: algorithm-architecture adequacy, bio-inspired vision systems in mobile robotics, and image processing. The goal is to make a robot autonomous in its visual perception, by integrating into the robot this cognitive task, usually executed on remote processing servers. To achieve this goal, the design approach follows a path of algorithm-architecture adequacy, where the different image processing steps of the vision system...

  10. Parallel algorithms

    Casanova, Henri; Robert, Yves


    ""…The authors of the present book, who have extensive credentials in both research and instruction in the area of parallelism, present a sound, principled treatment of parallel algorithms. … This book is very well written and extremely well designed from an instructional point of view. … The authors have created an instructive and fascinating text. The book will serve researchers as well as instructors who need a solid, readable text for a course on parallelism in computing. Indeed, for anyone who wants an understandable text from which to acquire a current, rigorous, and broad vi

  11. Algorithm 865

    Gustavson, Fred G.; Reid, John K.; Wasniewski, Jerzy


    We present subroutines for the Cholesky factorization of a positive-definite symmetric matrix and for solving corresponding sets of linear equations. They exploit cache memory by using the block hybrid format proposed by the authors in a companion article. The matrix is packed into n(n + 1)/2 real variables, and the speed is usually better than that of the LAPACK algorithm that uses full storage (n^2 variables). Included are subroutines for rearranging a matrix whose upper or lower-triangular part is packed by columns to this format and for the inverse rearrangement. Also included is a kernel...

  12. Exploration technology

    Roennevik, H.C. [Saga Petroleum A/S, Forus (Norway)


    The paper evaluates exploration technology. Topics discussed are: Visions; the subsurface challenge; the creative tension; the exploration process; seismic; geology; organic geochemistry; seismic resolution; integration; drilling; value creation. 4 refs., 22 figs.

  13. Exploration Review

    Wilburn, D.R.; Stanley, K.A.


    This summary of international mineral exploration activities for 2012 draws upon information from industry sources, published literature and U.S. Geological Survey (USGS) specialists. The summary provides data on exploration budgets by region and mineral commodity, identifies significant mineral discoveries and areas of mineral exploration, discusses government programs affecting the mineral exploration industry and presents analyses of exploration activities performed by the mineral industry. Three sources of information are reported and analyzed in this annual review of international exploration for 2012: 1) budgetary statistics expressed in U.S. nominal dollars provided by SNL Metals Economics Group (MEG) of Halifax, Nova Scotia; 2) regional and site-specific exploration activities that took place in 2012 as compiled by the USGS and 3) regional events including economic, social and political conditions that affected exploration activities, which were derived from published sources and unpublished discussions with USGS and industry specialists.

  14. Exploration Geophysics

    Savit, Carl H.


    Expansion of activity and confirmation of new technological directions characterized several fields of exploration geophysics in 1977. Advances in seismic-reflection exploration have been especially important. (Author/MA)

  15. Efficient Learning Algorithms with Limited Information

    De, Anindya


    The thesis explores efficient learning algorithms in settings which are more restrictive than the PAC model of learning (Valiant) in one of the following two senses: (i) The learning algorithm has a very weak access to the unknown function, as in, it does not get labeled samples for the unknown function (ii) The error guarantee required from the…

  16. Architectural exploration for the design of a robotic vision system-on-chip: algorithm-architecture adequacy of a real-time embedded system

    Lefebvre, Thomas


    This Ph.D thesis stands at the crossroads of three scientific domains: algorithm-architecture adequacy, bio-inspired vision systems in mobile robotics, and image processing. The goal is to make a robot autonomous in its visual perception, by integrating into the robot this cognitive task, usually executed on remote processing servers. To achieve this goal, the design approach follows a path of algorithm-architecture adequacy, where the different image processing steps of the vision syste...

  17. Comparison of fast discrete wavelet transform algorithms

    MENG Shu-ping; TIAN Feng-chun; XU Xin


    This paper presents an analysis and experimental comparison of several typical fast algorithms for the discrete wavelet transform (DWT) and their implementation in image compression, particularly the Mallat algorithm, the FFT-based algorithm, the short-length-based algorithm and the lifting algorithm. The principles, structures and computational complexity of these algorithms are explored in detail. The results of the comparison experiments are consistent with those simulated in MATLAB. It is found that there are limitations in the implementation of the DWT: some algorithms are workable only for special wavelet transforms and lack generality. Above all, the speed of the wavelet transform, as the governing element of the speed of image processing, is in fact the retarding factor for real-time image processing.
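
    As a small illustration of the lifting scheme mentioned above, the sketch below writes one level of the Haar wavelet transform as split/predict/update steps together with its exact inverse; the input signal is illustrative and assumed to have even length.

```python
# One level of the Haar wavelet transform written as lifting steps
# (split / predict / update), with its exact inverse. Data are illustrative
# and the input length is assumed even.
def haar_lifting_forward(signal):
    even, odd = signal[0::2], signal[1::2]                 # split
    detail = [o - e for e, o in zip(even, odd)]            # predict: odd - even
    approx = [e + d / 2.0 for e, d in zip(even, detail)]   # update: preserve the mean
    return approx, detail

def haar_lifting_inverse(approx, detail):
    even = [a - d / 2.0 for a, d in zip(approx, detail)]
    odd = [e + d for e, d in zip(even, detail)]
    out = []
    for e, o in zip(even, odd):
        out.extend((e, o))                                 # merge
    return out

x = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
a, d = haar_lifting_forward(x)
print(a, d)
print(haar_lifting_inverse(a, d) == x)                     # True: perfect reconstruction
```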

  18. Evolutionary Graph Drawing Algorithms

    Huang Jing-wei; Wei Wen-fang


    In this paper, graph drawing algorithms based on genetic algorithms are designed for general undirected graphs and directed graphs. As shown, graph drawing algorithms designed with genetic algorithms have the following advantages: the frameworks of the algorithms are unified, the method is simple, and different algorithms may be obtained by designing different objective functions, which enhances the reuse of the algorithms. Also, aesthetics or constraints may be added to satisfy different requirements.

  19. Algorithmic chemistry

    Fontana, W.


    In this paper complex adaptive systems are defined by a self-referential loop in which objects encode functions that act back on these objects. A model for this loop is presented. It uses a simple recursive formal language, derived from the lambda-calculus, to provide a semantics that maps character strings into functions that manipulate symbols on strings. The interaction between two functions, or algorithms, is defined naturally within the language through function composition, and results in the production of a new function. An iterated map acting on sets of functions and a corresponding graph representation are defined. Their properties are useful to discuss the behavior of a fixed-size ensemble of randomly interacting functions. This 'function gas', or 'Turing gas', is studied under various conditions, and evolves cooperative interaction patterns of considerable intricacy. These patterns adapt under the influence of perturbations consisting in the addition of new random functions to the system. Different organizations emerge depending on the availability of self-replicators.

  20. Creative Exploration


    Children are naturally curious and explore in order to make sense of the world; play and exploration are vital to their learning and development. Space and support for children to think, ask questions, make predictions, experiment, look for explanations and draw conclusions is essential in primary science. This ‘children’s science’ emerges naturally as they seek to learn about the world around them (Johnston 2008) and develop creative explanations of natural phenomena. Adopting such an explor...



    20072782 Dong Sheng(East China Academy of Metallurgical Geological Exploration,Hefei 230022,China)Regional Geochemical Characteristics of Guichi Area in Anhui Province and Their Ore-Prospecting Significance(Geophysical and Geochemical Exploration,ISSN1000-8918,CN11-1906/P,30(3),2006,p.215-219,223,3 illus.,7 refs.)Key words:polymetallic deposits,regional geological exploration,Anhui Province Controlled by unique geological conditions,



    20082879 Chen Yaoyu(No.3 Geology and Mineral Exploration Team,Gansu Provincial Bureau of Geology and Mineral Exploration and Development,Lanzhou 730050,China); Gong Quansheng Discussion on the Division of Deposit Scale and the Index of Ore Prospecting(Gansu Geology,ISSN 1004-4116,CN62-1191/P,16(3),2007,p.6-11,4 tables,6 refs.) Key words:prospecting and exploration of mineral



    20122626 Li Dongfeng ( Liaoning Institute of Mineral Resources Exploration,Shenyang 110032,China ) Application of Comprehensive Geophysical-Geochemical Method in Toudao-yingzi Gold Field ( Journal of Liaoning Technical University ( Natural Sciences ), ISSN1008-0562,CN21-1379 / N,30 ( 6 ), 2011,p.849-852,1illus.,2tables,10refs. ) Key words:gold ores,geophysical exploration,geochemical exploration,Liaoning Province

  4. On quantum algorithms for noncommutative hidden subgroups

    Ettinger, M. [Los Alamos National Lab., NM (United States); Hoeyer, P. [Odense Univ. (Denmark)


    Quantum algorithms for factoring and discrete logarithm have previously been generalized to finding hidden subgroups of finite Abelian groups. This paper explores the possibility of extending this general viewpoint to finding hidden subgroups of noncommutative groups. The authors present a quantum algorithm for the special case of dihedral groups which determines the hidden subgroup in a linear number of calls to the input function. They also explore the difficulties of developing an algorithm to process the data to explicitly calculate a generating set for the subgroup. A general framework for the noncommutative hidden subgroup problem is discussed and they indicate future research directions.

  5. Distributed Algorithms for Time Optimal Reachability Analysis

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand


    We propose distributed computing to accelerate time optimal reachability analysis. We develop five distributed state exploration algorithms and implement them in Uppaal, enabling it to exploit the compute resources of a dedicated model-checking cluster. We experimentally evaluate the implemented algorithms with four models in terms of their ability to compute near- or proven-optimal solutions, their scalability, time and memory consumption and communication overhead. Our results show that distributed algorithms work much faster than sequential algorithms and have good speedup in general.



    20142564 Chen Mingxing(Beijing Research Institute of Survey and Design,China Hydropower Engineering Consulting Group Co.,Beijing 100024,China);Chen Baoguo Application of Drilling Deviation Correcting and Deflecting Techniques in Geological Exploration at Songta Hydropower Station(Exploration Engineering,ISSN1672-7428,CN11-5063/TD,

  7. Algorithm Animation with Galant.

    Stallmann, Matthias F


    Although surveys suggest positive student attitudes toward the use of algorithm animations, it is not clear that they improve learning outcomes. The Graph Algorithm Animation Tool, or Galant, challenges and motivates students to engage more deeply with algorithm concepts, without distracting them with programming language details or GUIs. Even though Galant is specifically designed for graph algorithms, it has also been used to animate other algorithms, most notably sorting algorithms.

  8. Clonal Selection Based Memetic Algorithm for Job Shop Scheduling Problems

    Jin-hui Yang; Liang Sun; Heow Pueh Lee; Yun Qian; Yan-chun Liang


    A clonal selection based memetic algorithm is proposed for solving job shop scheduling problems in this paper. In the proposed algorithm, the clonal selection and the local search mechanism are designed to enhance exploration and exploitation. In the clonal selection mechanism, clonal selection, hypermutation and receptor edit theories are presented to construct an evolutionary searching mechanism which is used for exploration. In the local search mechanism, a simulated annealing local search algorithm based on Nowicki and Smutnicki's neighborhood is presented to exploit local optima. The proposed algorithm is examined using some well-known benchmark problems. Numerical results validate the effectiveness of the proposed algorithm.
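
    A generic simulated-annealing local search of the kind combined with clonal selection above can be sketched as follows; the pairwise-swap move stands in for the Nowicki-Smutnicki neighbourhood, and the cost function, cooling schedule and parameters are illustrative assumptions.

```python
# Generic simulated-annealing local-search sketch. The pairwise swap on a
# permutation stands in for the Nowicki-Smutnicki neighbourhood used in the
# paper; cost function, schedule and parameters are illustrative assumptions.
import math
import random

def simulated_annealing(cost, solution, t0=10.0, cooling=0.995, steps=5000):
    current, best = list(solution), list(solution)
    c_cur = c_best = cost(current)
    t = t0
    for _ in range(steps):
        i, j = random.sample(range(len(current)), 2)
        candidate = current[:]
        candidate[i], candidate[j] = candidate[j], candidate[i]   # swap move
        c_new = cost(candidate)
        # Accept improvements always, worse moves with Boltzmann probability.
        if c_new < c_cur or random.random() < math.exp((c_cur - c_new) / t):
            current, c_cur = candidate, c_new
            if c_cur < c_best:
                best, c_best = current[:], c_cur
        t *= cooling
    return best, c_best

# Toy cost: how far a permutation is from being sorted.
perm = random.sample(range(20), 20)
cost = lambda p: sum(abs(v - i) for i, v in enumerate(p))
print(simulated_annealing(cost, perm)[1])
```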

  9. Algorithmic Reflections on Choreography

    Pablo Ventura


    In 1996, Pablo Ventura turned his attention to the choreography software Life Forms to find out whether the then-revolutionary new tool could lead to new possibilities of expression in contemporary dance. During the next 2 decades, he devised choreographic techniques and custom software to create dance works that highlight the operational logic of computers, accompanied by computer-generated dance and media elements. This article provides a firsthand account of how Ventura’s engagement with algorithmic concepts guided and transformed his choreographic practice. The text describes the methods that were developed to create computer-aided dance choreographies. Furthermore, the text illustrates how choreography techniques can be applied to correlate formal and aesthetic aspects of movement, music, and video. Finally, the text emphasizes how Ventura’s interest in the wider conceptual context has led him to explore with choreographic means fundamental issues concerning the characteristics of humans and machines and their increasingly profound interdependencies.

  10. Space exploration

    Chris Moore


    Here, Moore presents a year in review of space exploration programs. In 2012, NASA's strategy of stimulating the development of commercial capabilities to launch crew and cargo to the ISS began to pay off...



    20141074 Bao Xijie(Research Institute of Exploration and Development,Daqing Oilfield Company,PetroChina,Daqing 163712,China)Gather Optimal Processing and Application Effect of Prestack AVA Instantaneous Inversion



    20090712 Ge Mingjun(General Institution of Mineral Exploration & Development in Qiqihaer of Heilongjiang Province,Qiqihaer 161006,China) Application of Emulsified Diesel Oil Drilling Fluid in Under-Balanced Drilling(Exploration Engineering(Rock & Soil Drilling and Tunneling),ISSN1672-7428,CN11-5063/TD,34(11),2007,p.43-45,1 illus.,2 tables,4 refs.)



    20072109 An Yong(Key Lab of Geophysics Exploration under CNPC,China University of Petroleum,Beijing 102249,China);Wei Lichun Most Homogeneous Dip-Scanning Method Using Edge Preserving Smoothing for Seismic Noise Attenuation(Applied Geophysics,ISSN1672-7975,CN11-5212/O,3(4),2006,p.210-217,17 illus.,3 refs.)Key words:seismic exploration,denoising

  14. Exploration of main-melody track extraction from MIDI files based on the H-K algorithm

    马青阁; 曹西征; 赵宛; 汪旭彬; 关金晨


    The main melody is an important component of the melodic information in music. By extracting feature vectors that characterize the musical melody, this paper uses the H-K classification algorithm to build a track classifier model that separates the melody tracks and accompaniment tracks of a MIDI piece, and then extracts the main melody track from the candidate tracks. The study shows that this method raises the accuracy of main-melody extraction from 75% to 90% for files with up to 10 tracks, and reaches 95% accuracy for files with no more than 2 tracks.

  15. Chaotic ant based decentralized task allocation in wireless sensor networks

    葛方振; 魏臻; 陆阳; 吴其林; 李丽香


    Inspired by the chaotic and self-organizing behaviors of ants, we present a decentralized task allocation algorithm for wireless sensor networks based on a chaotic ant swarm (CAS-DTA), so as to prolong the network lifetime, reduce energy consumption and balance the network load effectively. The objective function of the algorithm is established according to the energy consumption and reliability of the entire task execution. The optimal solution is obtained through task mapping, communication routing and task allocation selection within the framework of the chaotic ant swarm: task mapping is carried out with the ants' chaotic behavior, communication routing is established with a neighbor selection method and searched with the A* algorithm, and task allocation selection is implemented with the self-organizing capability of the ant colony. The performance of the algorithm is evaluated through simulation and an application example, and compared with that of other task processing methods; experimental results show the superiority of the algorithm in terms of both load balancing and the lifetime of wireless sensor networks.

  16. The algorithm design manual

    Skiena, Steven S


    Explaining how to design algorithms and analyze their efficacy and efficiency, this book covers combinatorial algorithm technology, stressing design over analysis. It presents instruction on methods for designing and analyzing computer algorithms, and contains a catalog of algorithmic resources, implementations and a bibliography

  17. Hamiltonian Algorithm Sound Synthesis

    大矢, 健一


    The Hamiltonian Algorithm (HA) is an algorithm for searching for solutions to optimization problems. This paper introduces a sound synthesis technique using the Hamiltonian Algorithm and shows a simple example. "Hamiltonian Algorithm Sound Synthesis" uses the phase transition effect in HA; because of this transition effect, totally new waveforms are produced.



    20091853 An Jinzhen (School of Earth and Space Sciences, Peking University, Beijing 100871, China); Zhou Pinggen Experiments on Exploring and Monitoring Landslip-Mass Using Geoelectric Resistivity Observations (Acta Seismologica Sinica, ISSN0253-3782, CN11-2021/P, 30(3), 2008, p.254-261, 6 illus., 1 table, 19 refs.) Key words: resistivity methods, landslides. In the experiments, a high-density resistivity method is used to explore the electric structure of the landslip mass, and a resistivity-changing anisotropy method is used to monitor the orientation and speed of the main fracture extending in the landslip mass. The results are as follows: 1) the exploring experiments have verified a part of the creep deformation borderline, the depth and thickness of the groundwater horizon, and the property of the super strata in the landslip mass investigated formerly, which have proved that the landslip belts contain rich groundwater

  19. Farside explorer

    Mimoun, David; Wieczorek, Mark A.; Alkalai, Leon


    Farside Explorer is a proposed Cosmic Vision medium-size mission to the farside of the Moon consisting of two landers and an instrumented relay satellite. The farside of the Moon is a unique scientific platform in that it is shielded from terrestrial radio-frequency interference, it recorded...... the primary differentiation and evolution of the Moon, it can be continuously monitored from the Earth-Moon L2 Lagrange point, and there is a complete lack of reflected solar illumination from the Earth. Farside Explorer will exploit these properties and make the first radio-astronomy measurements from...... the most radio-quiet region of near-Earth space, determine the internal structure and thermal evolution of the Moon, from crust to core, and quantify impact hazards in near-Earth space by the measurement of flashes generated by impact events. The Farside Explorer flight system includes two identical solar...

  20. The Algorithmic Imaginary

    Bucher, Taina


    of algorithms affect people's use of these platforms, if at all? To help answer these questions, this article examines people's personal stories about the Facebook algorithm through tweets and interviews with 25 ordinary users. To understand the spaces where people and algorithms meet, this article develops...... the notion of the algorithmic imaginary. It is argued that the algorithmic imaginary – ways of thinking about what algorithms are, what they should be and how they function – is not just productive of different moods and sensations but plays a generative role in moulding the Facebook algorithm itself...

  1. The BR eigenvalue algorithm

    Geist, G.A. [Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.; Howell, G.W. [Florida Inst. of Tech., Melbourne, FL (United States). Dept. of Applied Mathematics; Watkins, D.S. [Washington State Univ., Pullman, WA (United States). Dept. of Pure and Applied Mathematics


    The BR algorithm, a new method for calculating the eigenvalues of an upper Hessenberg matrix, is introduced. It is a bulge-chasing algorithm like the QR algorithm, but, unlike the QR algorithm, it is well adapted to computing the eigenvalues of the narrowband, nearly tridiagonal matrices generated by the look-ahead Lanczos process. This paper describes the BR algorithm and gives numerical evidence that it works well in conjunction with the Lanczos process. On the biggest problems run so far, the BR algorithm beats the QR algorithm by a factor of 30--60 in computing time and a factor of over 100 in matrix storage space.
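    For orientation only, the sketch below runs a plain unshifted QR iteration on a small symmetric tridiagonal (hence Hessenberg) matrix and compares the resulting diagonal with numpy's eigenvalue routine. This is the textbook baseline that shifted QR and BR improve upon, not the BR algorithm itself; the matrix size, random seed and iteration count are arbitrary assumptions.

    # Unshifted QR iteration on a symmetric tridiagonal (Hessenberg) test matrix.
    # Baseline for comparison only -- not the BR algorithm described above.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 6
    d = rng.uniform(1.0, 2.0, n)        # diagonal entries
    e = rng.uniform(0.1, 0.3, n - 1)    # off-diagonal entries
    A = np.diag(d) + np.diag(e, 1) + np.diag(e, -1)

    T = A.copy()
    for _ in range(500):                # A_{k+1} = R_k Q_k preserves the eigenvalues
        Q, R = np.linalg.qr(T)
        T = R @ Q

    qr_eigs = np.sort(np.diag(T))       # diagonal converges to the eigenvalues
    np_eigs = np.sort(np.linalg.eigvalsh(A))
    print(np.max(np.abs(qr_eigs - np_eigs)))   # expected to be very small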

  2. Safe Exploration in Markov Decision Processes

    Moldovan, Teodor Mihai


    In environments with uncertain dynamics exploration is necessary to learn how to perform well. Existing reinforcement learning algorithms provide strong exploration guarantees, but they tend to rely on an ergodicity assumption. The essence of ergodicity is that any state is eventually reachable from any other state by following a suitable policy. This assumption allows for exploration algorithms that operate by simply favoring states that have rarely been visited before. For most physical systems this assumption is impractical as the systems would break before any reasonable exploration has taken place, i.e., most physical systems don't satisfy the ergodicity assumption. In this paper we address the need for safe exploration methods in Markov decision processes. We first propose a general formulation of safety through ergodicity. We show that imposing safety by restricting attention to the resulting set of guaranteed safe policies is NP-hard. We then present an efficient algorithm for guaranteed safe, but pot...



    20122758 Chen Huiming (No.8 Geology Team of Fujian Province, Longyan 364000, China) Application Research on Drilling Technology Process Combination for Deep Exploration in an Iron Mine of Fujian Province (Exploration Engineering, ISSN1672-7428, CN11-5063/TD, 38(9), 2011, p.6-9, 8 tables, 6 refs.) Key words: drilling in complicated formation. According to the drilling technical problems in deep complex formations of the iron mine surrounding Makeng of Fujian Province,



    20110462 Chen Furong (Anhui Institute of Geological Survey, Hefei 230001, China) Ore-Search Prospects of Gold and Tungsten Geochemical Anomalies in Ningdun Area, Anhui Province (Geophysical and Geochemical Exploration, ISSN1000-8918, CN11-1906/P, 34(2), 2010, p.150-153, 5 illus., 2 tables, 6 refs.) Key words: gold ores, tungsten ores, geochemical exploration, Anhui. Geochemical anomalies of gold and tungsten in the Ningdun area are dominated by the element association of Au-As-W-Bi. These anomalies are well coincident with



    20112102 Chen Yiying (Shijiazhuang University of Economics, Shijiazhuang 050031, China); Li Wenbin Automatic Generation of Complicated Fault in Geological Section (Coal Geology & Exploration, ISSN1001-1986, CN61-1155/P, 38(5), 2010, p.7-12, 8 illus., 13 refs.) Key words: faults, map compilation. The researches of this paper are the basic theories and essential techniques of simulating complicated faults, and a series of approaches are proposed. Based on practical geological exploration, data types are analyzed and the database is normalized. The strata recovering technique is

  6. Algorithmically specialized parallel computers

    Snyder, Lawrence; Gannon, Dennis B


    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer. This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  7. Censored Exploration and the Dark Pool Problem

    Ganchev, Kuzman; Nevmyvaka, Yuriy; Vaughan, Jennifer Wortman


    We introduce and analyze a natural algorithm for multi-venue exploration from censored data, which is motivated by the Dark Pool Problem of modern quantitative finance. We prove that our algorithm converges in polynomial time to a near-optimal allocation policy; prior results for similar problems in stochastic inventory control guaranteed only asymptotic convergence and examined variants in which each venue could be treated independently. Our analysis bears a strong resemblance to that of efficient exploration/exploitation schemes in the reinforcement learning literature. We describe an extensive experimental evaluation of our algorithm on the Dark Pool Problem using real trading data.



    20070497 Wang Shuangqing (National Research Center of Geoanalysis, Beijing 100037, China); Sun Weilin Review on Methodology in Oil and Gas Geochemical Exploration (Rock and Mineral Analysis, ISSN0254-5357, CN11-2131/TD, 24(4), 2005, p.271-276, 40 refs.) Key words: geochemical prospecting of oil and gas

  9. Exploring Size.

    Brand, Judith, Ed.


    "Exploring" is a magazine of science, art, and human perception that communicates ideas museum exhibits cannot demonstrate easily by using experiments and activities for the classroom. This issue concentrates on size, examining it from a variety of viewpoints. The focus allows students to investigate and discuss interconnections among…



    20131193 Bing Pingping (Key Lab. of Geophysical Exploration of CNPC, China University of Petroleum, Beijing 102249, China); Cao Siyuan Non-Linear AVO Inversion Based on Support Vector Machine (Chinese Journal of Geophysics, ISSN0001-5733, CN11-2074/P, 55(3), 2012, p.1025-1032, 4 illus., 26 tables, 2 refs.)



    20051144 Gu Jun (Petroleum University, Beijing); Gao Deli Analysis of Mechanic Characteristics for Coal Bed and Drilling Countermeasure in Tuha Basin, Xinjiang, China (Exploration Engineering (Rock & Soil Drilling and Tunneling), ISSN1672-7428, CN11-5063/TD, 31(5), 2004, p.51-52, 55, 3 tables, 1 ref., with English abstract) Key words: coal seams, drilling



    20131973 Luo Zhili (Chengdu University of Technology, Chengdu 610059, China); Sun Wei Reviews of the Exploration History of Stratigraphic Wells in the Sichuan Basin and Analysis of the Obtained Geological Effects (Natural Gas Industry, ISSN1000-0976, CN51-1179/TE, 32(4), 2012, p.9-12, 1 illus., 10)



    20151884 An Guoying (China Aero Geophysical Survey and Remote Sensing Center for Land and Resources, Beijing 100083, China) Regional Geochemistry of Sanjiang Region in Yunnan Province and Its Copper-Polymetallic Prospecting Significance (Geophysical and Geochemical Exploration, ISSN1000-8918,



    20131784 An Guoying (China Aero Geophysical Survey and Remote Sensing Center for Land and Resources, Beijing 100083, China); Lei Yingping Geochemical Characteristics and Metallogenic Prospecting Areas in Yunkai Area, Guangxi (Geophysical and Geochemical Exploration, ISSN1000-8918, CN11-1906/P, 36

  15. The Georgi Algorithms of Jet Clustering

    Ge, Shao-Feng


    We reveal the direct link between the jet clustering algorithms recently proposed by Howard Georgi and parton shower kinematics, providing firm foundation from the theoretical side. The kinematics of this class of elegant algorithms is explored systematically for partons with arbitrary masses, and the jet function is generalized to $J^{(n)}_\beta$ with a jet function index $n$ in order to achieve more degrees of freedom. Based on three basic requirements that, the result of jet clustering is p...

  16. Improving Search Algorithms by Using Intelligent Coordinates

    Wolpert, David H.; Tumer, Kagan; Bandari, Esfandiar


    We consider algorithms that maximize a global function G in a distributed manner, using a different adaptive computational agent to set each variable of the underlying space. Each agent η is self-interested; it sets its variable to maximize its own function g_η. Three factors govern such a distributed algorithm's performance, related to exploration/exploitation, game theory, and machine learning. We demonstrate how to exploit all three factors by modifying a search algorithm's exploration stage: rather than random exploration, each coordinate of the search space is now controlled by a separate machine-learning-based player engaged in a noncooperative game. Experiments demonstrate that this modification improves simulated annealing (SA) by up to an order of magnitude for bin packing and for a model of an economic process run over an underlying network. These experiments also reveal interesting small-world phenomena.
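    Because the paper's modification targets the random exploration stage of simulated annealing, a plain SA baseline is sketched below for orientation. It minimizes a toy objective over a binary vector with single-bit-flip moves and geometric cooling; the objective, move operator and schedule are illustrative assumptions, not the authors' bin-packing or network models.

    # Plain simulated-annealing baseline (the stage the paper replaces with
    # learning-based players). Toy objective and parameters; illustrative only.
    import math
    import random

    random.seed(1)

    def objective(x):
        return (sum(x) - 7) ** 2        # toy cost: hit exactly seven ones

    def anneal(n_bits=20, steps=5000, t0=2.0, cooling=0.999):
        x = [random.randint(0, 1) for _ in range(n_bits)]
        cost, temp = objective(x), t0
        for _ in range(steps):
            i = random.randrange(n_bits)            # random exploration: flip one bit
            x[i] ^= 1
            new_cost = objective(x)
            if new_cost <= cost or random.random() < math.exp((cost - new_cost) / temp):
                cost = new_cost                     # accept the move
            else:
                x[i] ^= 1                           # undo the rejected move
            temp *= cooling                         # geometric cooling schedule
        return x, cost

    best_x, best_cost = anneal()
    print(best_cost)    # expect 0 for this easy toy problem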

  17. Uses of clinical algorithms.

    Margolis, C Z


    The clinical algorithm (flow chart) is a text format that is specially suited for representing a sequence of clinical decisions, for teaching clinical decision making, and for guiding patient care. A representative clinical algorithm is described in detail; five steps for writing an algorithm and seven steps for writing a set of algorithms are outlined. Five clinical education and patient care uses of algorithms are then discussed, including a map for teaching clinical decision making and protocol charts for guiding step-by-step care of specific problems. Clinical algorithms are compared as to their clinical usefulness with decision analysis. Three objections to clinical algorithms are answered, including the one that they restrict thinking. It is concluded that methods should be sought for writing clinical algorithms that represent expert consensus. A clinical algorithm could then be written for any area of medical decision making that can be standardized. Medical practice could then be taught more effectively, monitored accurately, and understood better.

  18. Adaptive Routing Algorithm for MANET:TERMITE

    Sharvani G S


    Full Text Available A mobile ad hoc network consists of a set of mobile nodes. This network does not have any infrastructure or central administration, hence it is called an infrastructure-less network. As the nodes are mobile, it is very difficult to find the path between two end points. This paper presents a solution for finding a path between nodes in a mobile ad hoc network. Termite is an innovative algorithm for packet routing in communication networks. Termite is an adaptive, distributed, mobile-agents-based algorithm which was inspired by recent work on the ant colony metaphor. According to this algorithm, a group of mobile agents (or artificial termites) build paths between pairs of nodes, exploring the network concurrently and exchanging the obtained information to update the routing tables. Some of the parameters used to evaluate its performance are packet delay and throughput. The results of this algorithm show better throughput as compared to existing algorithms. So, the Termite algorithm is a promising alternative for routing of data in commercial networks.
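    As a rough illustration of the pheromone-table idea behind termite- and ant-style routing (not the published Termite protocol), the sketch below keeps a per-destination pheromone table on a node, reinforces the link that delivered a packet, lets all entries evaporate, and chooses the next hop by roulette-wheel selection. The node names, constants and update rule are assumptions made for this example.

    # Pheromone-table sketch in the spirit of ant/termite routing (illustrative only).
    import random

    random.seed(2)

    class Node:
        def __init__(self, name, neighbours):
            self.name = name
            self.neighbours = list(neighbours)
            self.pheromone = {}    # pheromone[dest][next_hop] -> pheromone amount

        def _table(self, dest):
            return self.pheromone.setdefault(dest, {n: 1.0 for n in self.neighbours})

        def reinforce(self, dest, via, amount=1.0, decay=0.05):
            table = self._table(dest)
            for hop in table:                  # evaporation on every update
                table[hop] *= (1.0 - decay)
            table[via] += amount               # reward the hop the packet arrived on

        def next_hop(self, dest):
            table = self._table(dest)
            r, acc = random.uniform(0, sum(table.values())), 0.0
            for hop, tau in table.items():     # roulette-wheel selection
                acc += tau
                if r <= acc:
                    return hop
            return self.neighbours[-1]

    node_a = Node("A", neighbours=["B", "C"])
    for _ in range(50):                        # traffic from D keeps arriving via B
        node_a.reinforce(dest="D", via="B")
    print(node_a.next_hop("D"))                # now very likely to be "B"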

  19. Dolphin swarm algorithm

    Tian-qi WU; Min YAO; Jian-hua YANG


    By adopting the distributed problem-solving strategy, swarm intelligence algorithms have been successfully applied to many optimization problems that are difficult to deal with using traditional methods. At present, there are many well-implemented algorithms, such as particle swarm optimization, genetic algorithm, artificial bee colony algorithm, and ant colony optimization. These algorithms have already shown favorable performances. However, with the objects becoming increasingly complex, it is becoming gradually more difficult for these algorithms to meet human's demand in terms of accuracy and time. Designing a new algorithm to seek better solutions for optimization problems is becoming increasingly essential. Dolphins have many noteworthy biological characteristics and living habits such as echolocation, information exchanges, cooperation, and division of labor. Combining these biological characteristics and living habits with swarm intelligence and bringing them into optimization problems, we propose a brand new algorithm named the ‘dolphin swarm algorithm’ in this paper. We also provide the definitions of the algorithm and specific descriptions of the four pivotal phases in the algorithm, which are the search phase, call phase, reception phase, and predation phase. Ten benchmark functions with different properties are tested using the dolphin swarm algorithm, particle swarm optimization, genetic algorithm, and artificial bee colony algorithm. The convergence rates and benchmark function results of these four algorithms are compared to testify the effect of the dolphin swarm algorithm. The results show that in most cases, the dolphin swarm algorithm performs better. The dolphin swarm algorithm possesses some great features, such as first-slow-then-fast convergence, periodic convergence, local-optimum-free, and no specific demand on benchmark functions. Moreover, the dolphin swarm algorithm is particularly appropriate to optimization problems, with more

  20. Data structures and algorithm analysis in Java

    Shaffer, Clifford A


    With its focus on creating efficient data structures and algorithms, this comprehensive text helps readers understand how to select or design the tools that will best solve specific problems. It uses Java as the programming language and is suitable for second-year data structure courses and computer science courses in algorithm analysis. Techniques for representing data are presented within the context of assessing costs and benefits, promoting an understanding of the principles of algorithm analysis and the effects of a chosen physical medium. The text also explores tradeoff issues, familiari

  1. Data structures and algorithm analysis in C++

    Shaffer, Clifford A


    With its focus on creating efficient data structures and algorithms, this comprehensive text helps readers understand how to select or design the tools that will best solve specific problems. It uses Microsoft C++ as the programming language and is suitable for second-year data structure courses and computer science courses in algorithm analysis.Techniques for representing data are presented within the context of assessing costs and benefits, promoting an understanding of the principles of algorithm analysis and the effects of a chosen physical medium. The text also explores tradeoff issues, f



    20132654 Bi Xiaojia (Chengdu University of Technology, Chengdu 610059, China); Miao Fang Lithology Identification and Mapping by Hyperion Hyperspectral Remote Sensing (Computing Techniques for Geophysical and Geochemical Exploration, ISSN1001-1749, CN51-1242/P, 34(5), 2012, p.599-603, 2 illus., 14 refs.) Key words: geologic mapping, hyperspectral remote sensing, Qinghai Province



    20072985 Bai Mingzhou (Beijing Jiaotong University, Beijing 100044, China); Du Yongqiang Study on Application Technology of Geology Horizontal Drilling in Qiyueshan Tunnel at Yiwan Railway (Exploration Engineering (Rock & Soil Drilling and Tunneling), ISSN1672-7428, CN11-5063/TD, 33(4), 2006, p.59-61, 1 table, 3 refs.) Key words: tunnels, horizontal drilling

  4. New focused crawling algorithm

    Su Guiyang; Li Jianhua; Ma Yinghua; Li Shenghong; Song Juping


    Focused crawling is a new research approach for search engines. It restricts information retrieval to, and provides search services within, a specific topic area. The focused-crawling search algorithm is a key technique of a focused crawler and directly affects the search quality. This paper first introduces several traditional topic-specific crawling algorithms; then an inverse-link-based topic-specific crawling algorithm is put forward. A comparison experiment shows that this algorithm has good recall, obviously better than the traditional Breadth-First and Shark-Search algorithms. The experiment also shows that this algorithm has good precision.

  5. Symplectic algebraic dynamics algorithm


    Based on the algebraic dynamics solution of ordinary differential equations and integration of  , the symplectic algebraic dynamics algorithm sn is designed, which preserves the local symplectic geometric structure of a Hamiltonian system and possesses the same precision as the naive algebraic dynamics algorithm n. Computer experiments for the 4th order algorithms are made for five test models and the numerical results are compared with the conventional symplectic geometric algorithm, indicating that sn has higher precision, the algorithm-induced phase shift of the conventional symplectic geometric algorithm can be reduced, and the dynamical fidelity can be improved by one order of magnitude.
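    The record does not reproduce the scheme itself, so as a generic illustration of why preserving symplectic structure matters, the sketch below compares the energy drift of explicit Euler with first-order symplectic Euler on a unit harmonic oscillator. The test system, step size and step count are arbitrary assumptions, and the sketch is not the algorithm described above.

    # Energy drift: explicit Euler vs. symplectic Euler on a unit harmonic oscillator.
    # Generic illustration of symplectic behaviour; not the algorithm in the record.
    def energy(q, p):
        return 0.5 * (p * p + q * q)

    def explicit_euler(q, p, h, steps):
        for _ in range(steps):
            q, p = q + h * p, p - h * q    # both updates use the old state
        return q, p

    def symplectic_euler(q, p, h, steps):
        for _ in range(steps):
            p = p - h * q                  # update the momentum first...
            q = q + h * p                  # ...then the position with the new momentum
        return q, p

    q0, p0, h, steps = 1.0, 0.0, 0.01, 100_000
    for name, method in (("explicit", explicit_euler), ("symplectic", symplectic_euler)):
        q, p = method(q0, p0, h, steps)
        print(name, abs(energy(q, p) - energy(q0, p0)))
    # Expected: the explicit-Euler energy error grows without bound, while the
    # symplectic-Euler error stays bounded and small (roughly of order h).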

  6. Adaptive cockroach swarm algorithm

    Obagbuwa, Ibidun C.; Abidoye, Ademola P.


    An adaptive cockroach swarm optimization (ACSO) algorithm is proposed in this paper to strengthen the existing cockroach swarm optimization (CSO) algorithm. The ruthless component of the CSO algorithm is modified by employing a blend-crossover predator-prey evolution method, which helps the algorithm prevent any possible population collapse, maintain population diversity and create an adaptive search in each iteration. The performance of the proposed algorithm on 16 global optimization benchmark function problems was evaluated and compared with the existing CSO, cuckoo search, differential evolution, particle swarm optimization and artificial bee colony algorithms.

  7. Decoherence in Search Algorithms

    Abal, G; Marquezino, F L; Oliveira, A C; Portugal, R


    Recently several quantum search algorithms based on quantum walks were proposed. Those algorithms differ from Grover's algorithm in many aspects. The goal is to find a marked vertex in a graph faster than classical algorithms. Since the implementation of those new algorithms in quantum computers or in other quantum devices is error-prone, it is important to analyze their robustness under decoherence. In this work we analyze the impact of decoherence on quantum search algorithms implemented on two-dimensional grids and on hypercubes.

  8. Software For Genetic Algorithms

    Wang, Lui; Bayer, Steve E.


    SPLICER computer program is genetic-algorithm software tool used to solve search and optimization problems. Provides underlying framework and structure for building genetic-algorithm application program. Written in Think C.

  9. Space exploration


    Space Exploration, is one book in the Britannica Illustrated Science Library Series that is correlated to the science curriculum in grades 5-8. The Britannica Illustrated Science Library is a visually compelling set that covers earth science, life science, and physical science in 16 volumes.  Created for ages 10 and up, each volume provides an overview on a subject and thoroughly explains it through detailed and powerful graphics-more than 1,000 per volume-that turn complex subjects into information that students can grasp.  Each volume contains a glossary with full definitions for vocabulary help and an index.

  10. Space exploration

    90, Sol


    Space Exploration, is one book in the Britannica Illustrated Science Library Series that is correlated to the science curriculum in grades 5-8. The Britannica Illustrated Science Library is a visually compelling set that covers earth science, life science, and physical science in 16 volumes.  Created for ages 10 and up, each volume provides an overview on a subject and thoroughly explains it through detailed and powerful graphics-more than 1,000 per volume-that turn complex subjects into information that students can grasp.  Each volume contains a glossary with full definitions for vocabulary



    20110471 Cai Shaokun (Mechatronics and Automation College, National University of Defense Technology, Changsha 410073, China); Wu Meiping A Comparison of Digital Lowpass FIR-Filters in Airborne Gravimetry (Geophysical and Geochemical Exploration, ISSN1000-8918, CN11-1906/P, 34(1), 2010, p.74-78, 8 illus., 3 tables, 14 refs.) Key words: aerogravity surveys, filters. There is a lot of noise in the data observed by airborne gravimeter. Digital lowpass FIR-filter i

  12. Progressive geometric algorithms

    Sander P.A. Alewijnse


    Full Text Available Progressive algorithms are algorithms that, on the way to computing a complete solution to the problem at hand, output intermediate solutions that approximate the complete solution increasingly well. We present a framework for analyzing such algorithms, and develop efficient progressive algorithms for two geometric problems: computing the convex hull of a planar point set, and finding popular places in a set of trajectories.

  13. Approximate iterative algorithms

    Almudevar, Anthony Louis


    Iterative algorithms often rely on approximate evaluation techniques, which may include statistical estimation, computer simulation or functional approximation. This volume presents methods for the study of approximate iterative algorithms, providing tools for the derivation of error bounds and convergence rates, and for the optimal design of such algorithms. Techniques of functional analysis are used to derive analytical relationships between approximation methods and convergence properties for general classes of algorithms. This work provides the necessary background in functional analysis a

  14. Grover search algorithm

    Borbely, Eva


    A quantum algorithm is a set of instructions for a quantum computer; however, unlike algorithms in classical computer science, their results cannot be guaranteed. A quantum system can undergo two types of operation, measurement and quantum state transformation, and the operations themselves must be unitary (reversible). Most quantum algorithms involve a series of quantum state transformations followed by a measurement. Currently very few quantum algorithms are known and no general design methodology e...
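    To make the structure of one concrete quantum algorithm tangible, here is a small classical state-vector simulation of Grover's search over N = 16 items using plain numpy (no quantum hardware or quantum library). The marked index and problem size are arbitrary assumptions; the optimal number of iterations is roughly (pi/4)*sqrt(N).

    # Classical state-vector simulation of Grover's search over N = 16 items.
    # Illustrative only; the marked item and the size are arbitrary choices.
    import math
    import numpy as np

    N, marked = 16, 11
    state = np.full(N, 1.0 / math.sqrt(N))       # uniform superposition |s>

    oracle = np.ones(N)
    oracle[marked] = -1.0                        # phase flip on the marked item

    iterations = int(round(math.pi / 4 * math.sqrt(N)))   # 3 iterations for N = 16
    for _ in range(iterations):
        state = oracle * state                   # oracle step
        state = 2 * state.mean() - state         # diffusion: inversion about the mean

    probs = state ** 2
    print(int(np.argmax(probs)), round(float(probs[marked]), 3))   # 11, ~0.961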

  15. Competing Sudakov Veto Algorithms

    Kleiss, Ronald


    We present a way to analyze the distribution produced by a Monte Carlo algorithm. We perform these analyses on several versions of the Sudakov veto algorithm, adding a cutoff, a second variable and competition between emission channels. The analysis allows us to prove that multiple, seemingly different competition algorithms, including those that are currently implemented in most parton showers, lead to the same result. Finally, we test their performance and show that there are significantly faster alternatives to the commonly used algorithms.
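    For readers unfamiliar with the basic method being analyzed, the sketch below implements the simplest Sudakov veto algorithm: trial scales are generated from an overestimate g(t) >= f(t) and accepted with probability f(t)/g(t), which reproduces the f-Sudakov first-emission distribution. The choices f(t) = t, g(t) = 1, the evolution range and the cutoff are toy assumptions, not the competing-channel variants treated in the paper.

    # Basic Sudakov veto algorithm (toy example; not the competing variants in the
    # paper). Target: the first-emission scale t distributed as f(t)*exp(-int_t^tmax f)
    # with f(t) = t, evolved downward from t_max = 1 using the overestimate g(t) = 1.
    import math
    import random

    random.seed(3)

    def f(t):
        return t          # true emission density; f(t) <= g(t) = 1 on [0, 1]

    def veto_emission(t_max=1.0, t_min=0.01):
        t = t_max
        while True:
            t += math.log(1.0 - random.random())   # next trial scale from the g-Sudakov
            if t < t_min:
                return None                        # evolved below the cutoff: no emission
            if random.random() < f(t):             # accept with probability f(t)/g(t)
                return t                           # otherwise veto and keep evolving down

    trials = 200_000
    no_emission = sum(veto_emission() is None for _ in range(trials)) / trials
    analytic = math.exp(-(1.0 ** 2 - 0.01 ** 2) / 2)   # exp(-integral of f) ~ 0.607
    print(round(no_emission, 3), round(analytic, 3))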

  16. Autonomous Star Tracker Algorithms

    Betto, Maurizio; Jørgensen, John Leif; Kilsgaard, Søren


    Proposal, in response to an ESA R.f.P., to design algorithms for autonomous star tracker operations. The proposal also included the development of a star tracker breadboard to test the algorithms' performances.

  17. Graph colouring algorithms

    Husfeldt, Thore


    This chapter presents an introduction to graph colouring algorithms. The focus is on vertex-colouring algorithms that work for general classes of graphs with worst-case performance guarantees in a sequential model of computation. The presentation aims to demonstrate the breadth of available techniques and is organized by algorithmic paradigm.
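    As a concrete entry point to the topic, here is the standard greedy vertex-colouring heuristic (assign each vertex the smallest colour unused by its already-coloured neighbours), which needs at most Delta + 1 colours on a graph of maximum degree Delta. The adjacency-list example is made up for illustration; the chapter itself covers much stronger algorithms.

    # Greedy vertex colouring: give each vertex the smallest colour not used by its
    # already-coloured neighbours. Uses at most (maximum degree + 1) colours.
    def greedy_colouring(adj):
        colour = {}
        for v in adj:                                  # fixed vertex order
            taken = {colour[u] for u in adj[v] if u in colour}
            c = 0
            while c in taken:                          # smallest free colour
                c += 1
            colour[v] = c
        return colour

    # Small made-up graph: a 5-cycle with one chord.
    graph = {0: [1, 4], 1: [0, 2, 3], 2: [1, 3], 3: [2, 4, 1], 4: [3, 0]}
    print(greedy_colouring(graph))    # e.g. {0: 0, 1: 1, 2: 0, 3: 2, 4: 1}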

  18. Optimal Mixing Evolutionary Algorithms

    Thierens, D.; Bosman, P.A.N.; Krasnogor, N.


    A key search mechanism in Evolutionary Algorithms is the mixing or juxtaposing of partial solutions present in the parent solutions. In this paper we look at the efficiency of mixing in genetic algorithms (GAs) and estimation-of-distribution algorithms (EDAs). We compute the mixing probabilities of

  19. Implementation of Parallel Algorithms


    Lecture Notes in Computer Science, Warwick, England, July 16-20... Lecture Notes in Computer Science, Springer-Verlag, Bangalore, India, December 1990. J. Reif, J. Canny, and A. Page, "An Exact Algorithm for Kinodynamic... Parallel Algorithms and its Impact on Computational Geometry, in Optimal Algorithms, H. Djidjev editor, Springer-Verlag Lecture Notes in Computer Science

  20. Semiclassical Shor's Algorithm

    Giorda, P; Sen, S; Sen, S; Giorda, Paolo; Iorio, Alfredo; Sen, Samik; Sen, Siddhartha


    We propose a semiclassical version of Shor's quantum algorithm to factorize integer numbers, based on spin-1/2 SU(2) generalized coherent states. Surprisingly, we find numerical evidence that the algorithm's success probability is not too severely modified by our semiclassical approximation. This suggests that it is worth pursuing practical implementations of the algorithm on semiclassical devices.

  1. Nature-inspired optimization algorithms

    Yang, Xin-She


    Nature-Inspired Optimization Algorithms provides a systematic introduction to all major nature-inspired algorithms for optimization. The book's unified approach, balancing algorithm introduction, theoretical background and practical implementation, complements extensive literature with well-chosen case studies to illustrate how these algorithms work. Topics include particle swarm optimization, ant and bee algorithms, simulated annealing, cuckoo search, firefly algorithm, bat algorithm, flower algorithm, harmony search, algorithm analysis, constraint handling, hybrid methods, parameter tuning

  2. Ouroboros: A Tool for Building Generic, Hybrid, Divide & Conquer Algorithms

    Johnson, J R; Foster, I


    A hybrid divide and conquer algorithm is one that switches from a divide and conquer to an iterative strategy at a specified problem size. Such algorithms can provide significant performance improvements relative to alternatives that use a single strategy. However, the identification of the optimal problem size at which to switch for a particular algorithm and platform can be challenging. We describe an automated approach to this problem that first conducts experiments to explore the performance space on a particular platform and then uses the resulting performance data to construct an optimal hybrid algorithm on that platform. We implement this technique in a tool, "Ouroboros", that automatically constructs a high-performance hybrid algorithm from a set of registered algorithms. We present results obtained with this tool for several classical divide and conquer algorithms, including matrix multiply and sorting, and report speedups of up to six times achieved over non-hybrid algorithms.
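    The switching idea can be seen in miniature in the sketch below: a merge sort that hands subproblems to insertion sort once they fall under a cutoff size. The cutoff of 16 is a placeholder; choosing the best value per platform is exactly the tuning problem the tool automates, and the sketch is not Ouroboros itself.

    # Hybrid divide-and-conquer sketch: merge sort that switches to insertion sort
    # below a cutoff size. The cutoff of 16 is a placeholder, not a tuned value.
    import random

    def insertion_sort(a):
        for i in range(1, len(a)):
            key, j = a[i], i - 1
            while j >= 0 and a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key
        return a

    def hybrid_mergesort(a, cutoff=16):
        if len(a) <= cutoff:                   # switch to the iterative strategy
            return insertion_sort(a)
        mid = len(a) // 2
        left = hybrid_mergesort(a[:mid], cutoff)
        right = hybrid_mergesort(a[mid:], cutoff)
        merged, i, j = [], 0, 0                # standard two-way merge
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        return merged + left[i:] + right[j:]

    data = [random.randint(0, 999) for _ in range(200)]
    assert hybrid_mergesort(data[:]) == sorted(data)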

  3. New Hybrid Genetic Algorithm for Vertex Cover Problems

    霍红卫; 许进


    This paper presents a new hybrid genetic algorithm for the vertex cover problem in which scan-repair and local improvement techniques are used for local optimization. With the hybrid approach, genetic algorithms are used to perform global exploration in a population, while neighborhood search methods are used to perform local exploitation around the chromosomes. The experimental results indicate that hybrid genetic algorithms can obtain solutions of excellent quality for problem instances of different sizes. Pure genetic algorithms are outperformed by the neighborhood search heuristic procedures combined with genetic algorithms.

  4. A Unified Differential Evolution Algorithm for Global Optimization

    Qiang, Ji; Mitchell, Chad


    In this paper, we propose a new unified differential evolution (uDE) algorithm for single objective global optimization. Instead of selecting among multiple mutation strategies as in the conventional differential evolution algorithm, this algorithm employs a single equation as the mutation strategy. It has the virtue of mathematical simplicity and also provides users the flexibility for broader exploration of different mutation strategies. Numerical tests using twelve basic unimodal and multimodal functions show promising performance of the proposed algorithm in comparison to conventional differential evolution algorithms.
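    The record does not reproduce the unified mutation equation, so the sketch below shows only the conventional DE/rand/1/bin scheme that uDE generalizes, applied to the sphere function. The population size, F, CR, generation count and test function are arbitrary illustrative assumptions.

    # Conventional DE/rand/1/bin (the scheme the uDE paper generalizes), applied to
    # the sphere function. All parameters are arbitrary illustrative choices.
    import numpy as np

    rng = np.random.default_rng(4)

    def sphere(x):
        return float(np.sum(x * x))

    def de_rand_1_bin(dim=10, pop_size=30, F=0.5, CR=0.9, generations=300):
        pop = rng.uniform(-5.0, 5.0, (pop_size, dim))
        fitness = np.array([sphere(x) for x in pop])
        for _ in range(generations):
            for i in range(pop_size):
                r1, r2, r3 = rng.choice([k for k in range(pop_size) if k != i],
                                        size=3, replace=False)
                mutant = pop[r1] + F * (pop[r2] - pop[r3])     # DE/rand/1 mutation
                cross = rng.random(dim) < CR
                cross[rng.integers(dim)] = True                # keep at least one gene
                trial = np.where(cross, mutant, pop[i])        # binomial crossover
                f_trial = sphere(trial)
                if f_trial <= fitness[i]:                      # greedy selection
                    pop[i], fitness[i] = trial, f_trial
        return float(fitness.min())

    print(de_rand_1_bin())    # expected to be very close to 0 for the sphere function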

  5. Algorithms for Quantum Computers

    Smith, Jamie


    This paper surveys the field of quantum computer algorithms. It gives a taste of both the breadth and the depth of the known algorithms for quantum computers, focusing on some of the more recent results. It begins with a brief review of quantum Fourier transform based algorithms, followed by quantum searching and some of its early generalizations. It continues with a more in-depth description of two more recent developments: algorithms developed in the quantum walk paradigm, followed by tensor network evaluation algorithms (which include approximating the Tutte polynomial).

  6. Parallel sorting algorithms

    Akl, Selim G


    Parallel Sorting Algorithms explains how to use parallel algorithms to sort a sequence of items on a variety of parallel computers. The book reviews the sorting problem, the parallel models of computation, parallel algorithms, and the lower bounds on the parallel sorting problems. The text also presents twenty different algorithms, such as linear arrays, mesh-connected computers, cube-connected computers. Another example where algorithm can be applied is on the shared-memory SIMD (single instruction stream multiple data stream) computers in which the whole sequence to be sorted can fit in the



    20092040 Chen Jing (College of Petroleum Engineering, Yangtze University, Jingzhou 434023, China); Xiong Qingshan Technology of Well Cementing with Expandable Tube and Its Application (Exploration Engineering, ISSN1672-7428, CN11-5063/TD, 35(8), 2008, p.19-21, 4 illus., 2 tables, 5 refs.) Key words: cementing. Expandable tube is a new technology that has been developed overseas. It can be applied in well drilling and completion for deep water, deep wells, extended reach wells and multilateral wells, as well as in oil extraction and workover. This paper briefly introduces the technology of well cementing with

  8. Geoelectrical exploration

    Mostafa Said Barseem


    Full Text Available Sinai development is a goal of successive governments in Egypt. The present study is a geoelectrical exploration to find appropriate solutions to the problems affecting the land of a research station in Southeast Al Qantara. This research station is one of the Desert Research Center stations set up to facilitate the development of desert land for agriculture by introducing applied research. It suffers from some problems, which can be summarized as the shortage of irrigation water and waterlogging. The appropriate solutions to these problems have been delineated by the results of 1D and 2D geoelectrical measurements. Electrical resistivity (ER) revealed the subsurface sedimentary sequences and the extension of subsurface layers in the horizontal and vertical directions, especially the water-bearing layer. Additionally, it helped to choose the most suitable places to drill productive wells in good condition.



    20111381 Geng Tao (Xi’an Center of Geological Survey of CGS, Xi’an 710054, China); Liu Kuanhou Application of Accurate Inspection of CQG2000 Quasi-Geoid Model to Regional Gravity Survey in Qinghai-Tibetan Plateau (Northwestern Geology, ISSN1009-6248, CN61-1149/P, 43(2), 2010, p.1-7, 1 illus., 2 tables, 4 refs.) Key words: gravity exploration, Global Positioning System, Qinghai-Tibetan Plateau. During a regional gravity survey, the accuracy of orthometric height may directly affect the accuracy of the gravity survey. The field test in the Qinghai-Tibetan Plateau shows that the accuracy of the CQG2000 quasi-geoid model can satisfy the accuracy of orthometric height required for the 1∶200 000 regional gravity survey. Based on the test, the authors summarize the method how the accuracy of height measurement

  10. Exploring ESASky

    De Marchi, Guido; ESASky Team


    ESASky is a science-driven discovery portal for all ESA space astronomy missions. It also includes missions from international partners such as Suzaku and Chandra. The first public release of ESASky features interfaces for sky exploration and for single and multiple target searches. Using the application requires no prior knowledge of any of the missions involved, and it gives users world-wide simplified access to high-level science-ready data products from space-based astronomy missions, plus a number of ESA-produced source catalogues, including the Gaia Data Release 1 catalogue. We highlight here the latest features to be developed, including one that allows the user to project onto the sky the footprints of the JWST instruments, at any chosen position and orientation. This tool has been developed to aid JWST astronomers when they are defining observing proposals. We aim to include other missions and instruments in the near future.

  11. Modified Clipped LMS Algorithm

    Lotfizad Mojtaba


    Full Text Available A new algorithm is proposed for updating the weights of an adaptive filter. The proposed algorithm is a modification of an existing method, namely the clipped LMS, and uses a three-level quantization scheme that involves threshold clipping of the input signals in the filter weight update formula. Mathematical analysis shows the convergence of the filter weights to the optimum Wiener filter weights. Also, it can be proved that the proposed modified clipped LMS (MCLMS) algorithm has better tracking than the LMS algorithm. In addition, this algorithm has reduced computational complexity relative to the unmodified one. By using a suitable threshold, it is possible to increase the tracking capability of the MCLMS algorithm compared to the LMS algorithm, but this causes slower convergence. Computer simulations confirm the mathematical analysis presented.
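    The sketch below illustrates the general flavour of a clipped LMS update, in which the regressor is passed through a three-level (sign with dead zone) quantizer before multiplying the error in the weight update. The threshold, step size and test signal are made-up values, and the recursion is a generic sketch rather than the exact MCLMS formula from the paper.

    # Generic clipped-LMS-style adaptive filter sketch (not the paper's exact MCLMS
    # recursion). The input regressor is quantized to {-1, 0, +1} with a dead zone.
    import numpy as np

    rng = np.random.default_rng(5)

    def three_level(x, threshold):
        # +1 above the threshold, -1 below minus the threshold, 0 in the dead zone.
        return np.where(x > threshold, 1.0, np.where(x < -threshold, -1.0, 0.0))

    n_taps, n_samples, mu, thr = 8, 5000, 0.01, 0.1
    w_true = rng.normal(size=n_taps)               # unknown system to identify
    x = rng.normal(size=n_samples)
    w = np.zeros(n_taps)

    for k in range(n_taps, n_samples):
        u = x[k - n_taps:k][::-1]                  # current regressor vector
        d = w_true @ u + 0.01 * rng.normal()       # noisy desired signal
        e = d - w @ u                              # a-priori error
        w = w + mu * e * three_level(u, thr)       # clipped update: quantized input

    print(np.round(np.abs(w - w_true), 2))         # entries should be close to zero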

  12. Entropy Message Passing Algorithm

    Ilic, Velimir M; Branimir, Todorovic T


    Message passing over a factor graph can be considered a generalization of many well-known algorithms for efficient marginalization of a multivariate function. A specific instance of the algorithm is obtained by choosing an appropriate commutative semiring for the range of the function to be marginalized. Some examples are the Viterbi algorithm, obtained on the max-product semiring, and the forward-backward algorithm, obtained on the sum-product semiring. In this paper, the Entropy Message Passing algorithm (EMP) is developed. It operates over the entropy semiring, previously introduced in automata theory. It is shown how EMP extends the use of message passing over factor graphs to probabilistic model algorithms such as the Expectation Maximization algorithm, gradient methods and computation of model entropy, unifying the work of different authors.

  13. A Hybrid Evolutionary Algorithm for Discrete Optimization

    J. Bhuvana


    Full Text Available Most real-world multi-objective problems demand that we choose one Pareto optimal solution out of a finite set of choices. The flexible job shop scheduling problem is one such problem, whose solutions are required to be selected from a discrete solution space. In this study we have designed a hybrid genetic algorithm to solve this scheduling problem. Hybrid genetic algorithms combine both aspects of the search: exploration and exploitation of the search space. The proposed algorithm, Hybrid GA with Discrete Local Search, performs global search through the GA and exploits locality through discrete local search. The proposed hybrid algorithm not only has the ability to generate Pareto optimal solutions but also identifies them with less computation. Five different benchmark test instances are used to evaluate the performance of the proposed algorithm. The results show that the proposed algorithm produces the known Pareto optimal solutions through exploration and exploitation of the search space with a smaller number of functional evaluations.

  14. Control algorithms for autonomous robot navigation

    Jorgensen, C.C.


    This paper examines control algorithm requirements for autonomous robot navigation outside laboratory environments. Three aspects of navigation are considered: navigation control in explored terrain, environment interactions with robot sensors, and navigation control in unanticipated situations. Major navigation methods are presented and relevance of traditional human learning theory is discussed. A new navigation technique linking graph theory and incidental learning is introduced.

  15. Genetic Algorithm for Chinese Postman Problems

    Jiang Hua; Kang Li-shan


    The Chinese Postman Problem is an unsettled graph problem that has seldom been approached by evolutionary computation. We use a genetic algorithm to solve the Chinese Postman Problem in undirected graphs and obtain good results; the approach could be extended to solve the Chinese Postman Problem in directed graphs. We make these efforts as an exploration toward optimizing the mixed Chinese Postman Problem.



    20111167 Cao Zhonghuang (Wuhan Iron & Steel Group Minerals Company, Wuhan 430063, China); Luo Xianrong Comparative Study of Copper-Nickel Deposit Exploration by the Geoelectro-chemical Extraction Method in Different Overburden Areas (Geology and Prospecting, ISSN0495-5331, CN11-2043/P, 46(3), 2010, p.476-482, 4 illus., 5 tables, 20 refs.) Key words: geo-electrochemical methods, copper ores, nickel ores, Gansu Province, Jilin Province. The authors have made a comparative study of quantitative and qualitative analysis and application of the geoelectro-chemical extraction method in different overburden areas in the southward extension of Jinchuan in Gansu Province and Hongqiling in Jilin Province. The authors found that this method extracted very few ions in arid areas covered with debris, but the prospecting effect was almost the same as that in moist areas covered with thick overburden. And this method could show objectively differences of geochemical characters



    20102820 Chen Zhongyun (CNOOC Ltd. Shanghai, Shanghai 200030, China); Chen Hua Using Surfer Automation to Plot Contour Maps (Computing Techniques for Geophysical and Geochemical Exploration, ISSN1001-1749, CN51-1242/P, 31(4), 2009, p.409-412, 2 illus., 10 refs., with English abstract) Key words: digital cartography, isopleth maps. 20102821 Hu Daogong (Institute of Geomechanics, Chinese Academy of Geological Sciences, Beijing 100081, China); Patrick J. Barosh Inspirations from the Sino-U.S. Cooperative Geological Mapping in the East Kunlun Orogenic Belt: Ideas and Methods (Geological Bulletin of China, ISSN1671-2552, CN11-4648/P, 28(10), 2009, p.1411-1418, 5 illus., 14 refs.) Key words: geologic mapping, China, United States. On the basis of the practice of the Sino-U.S. cooperative geological mapping in the East Kunlun Orogenic Belt, and through the comparative analysis of several geological mapping examples completed recently by the USGS, the authors have gained a further knowledge of the methods and ideas of American geological mapping. The concept of "mapping all lithological units" hasn’t changed within a difficult course of 130 years along with the USGS’s evolution. The mapping method of "geological features guid

  18. Project Explorer

    Dannenberg, K. K.; Henderson, A.; Lee, J.; Smith, G.; Stluka, E.


    PROJECT EXPLORER is a program that will fly student-developed experiments onboard the Space Shuttle in NASA's Get-Away Special (GAS) containers. The program is co-sponsored by the Alabama Space and Rocket Center, the Alabama-Mississippi Section of the American Institute of Aeronautics and Astronautics, Alabama A&M University and requires extensive support by the University of Alabama in Huntsville. A unique feature of this project will demonstrate transmissions to ground stations on amateur radio frequencies in English language. Experiments Nos. 1, 2, and 3 use the microgravity of space flight to study the solidification of lead-antimony and aluminum-copper alloys, the growth of potassium-tetracyanoplatinate hydrate crystals in an aqueous solution, and the germination of radish seeds. Flight results will be compared with Earth-based data. Experiment No. 4 features radio transmission and will also provide timing for the start of all other experiments. A microprocessor will obtain real-time data from all experiments as well as temperature and pressure measurements taken inside the canister. These data will be transmitted on previously announced amateur radio frequencies after they have been converted into the English language by a digitalker for general reception.

  19. Space Exploration

    Gallagher, Dennis


    The Newgrange Passage Tomb may be the first structure with known astronomical significance. It was built around 3,200 B.C. in Ireland. Its central passage admits light end-to-end for about 2 weeks around the winter solstice. The Sun, Moon, planets, and stars held significance in early times due to the seasons, their importance for food crops, and mythology. Citation: Corel Photography and Windows to the Universe. The Greeks may have been among the first to pursue analytical interpretations of what they saw in the sky. In about 280 B.C. Aristarchus suggested that Earth revolves around the Sun and estimated the distance between them. Around 130 B.C. Hipparchus developed the first accurate star map. Today we still seek to understand how the universe formed, how we came to be, and whether we are alone. Understanding the causes and consequences of climate change using advanced space missions with major Earth science and applications research. • Fire the public imagination and inspire students to pursue STEM fields. Train college and graduate students to create a U.S. technical workforce with employees that embody the values of competence, innovation, and service. • Drive the technical innovations that enable exploration and become the engine of National economic growth. • Partner domestically and internationally to leverage resources to extend the reach of research.

  20. Distributed Algorithms for Time Optimal Reachability Analysis

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand


    Time optimal reachability analysis is a novel model based technique for solving scheduling and planning problems. After modeling them as reachability problems using timed automata, a real-time model checker can compute the fastest trace to the goal states, which constitutes a time optimal schedule. We propose distributed computing to accelerate time optimal reachability analysis. We develop five distributed state exploration algorithms and implement them in Uppaal, enabling it to exploit the compute resources of a dedicated model-checking cluster. We experimentally evaluate the implemented algorithms with four models in terms of their ability to compute near- or proven-optimal solutions, their scalability, time and memory consumption and communication overhead. Our results show that distributed algorithms work much faster than sequential algorithms and have good speedup in general.

  1. Glowworm swarm optimization theory, algorithms, and applications

    Kaipa, Krishnanand N


    This book provides a comprehensive account of the glowworm swarm optimization (GSO) algorithm, including details of the underlying ideas, theoretical foundations, algorithm development, various applications, and MATLAB programs for the basic GSO algorithm. It also discusses several research problems at different levels of sophistication that can be attempted by interested researchers. The generality of the GSO algorithm is evident in its application to diverse problems ranging from optimization to robotics. Examples include computation of multiple optima, annual crop planning, cooperative exploration, distributed search, multiple source localization, contaminant boundary mapping, wireless sensor networks, clustering, knapsack, numerical integration, solving fixed point equations, solving systems of nonlinear equations, and engineering design optimization. The book is a valuable resource for researchers as well as graduate and undergraduate students in the area of swarm intelligence and computational intellige...

  2. Improved Tiled Bitmap Forensic Analysis Algorithm

    C. D. Badgujar, G. N. Dhanokar


    Full Text Available In the computer network world, the need for security and proper systems of control is obvious, as is the need to find out the intruders who performed modifications and the data they modified. Nowadays frauds that occur in companies are committed not only by outsiders but also by insiders. An insider may perform an illegal activity and try to hide it. Companies would like to be assured that such illegal activity, i.e. tampering, has not occurred, or if it does, that it is quickly discovered. Mechanisms now exist that detect tampering of a database through the use of cryptographically strong hash functions. This paper contains a survey which explores the various approaches to database forensics through different methodologies using forensic algorithms and tools for investigations. Forensic analysis algorithms are used to determine who, when, and what data had been tampered with. The Tiled Bitmap Algorithm introduces the notion of a candidate set (all possible locations of detected tampering(s)) and provides a complete characterization of the candidate set and its cardinality. The improved tiled bitmap algorithm overcomes the drawbacks of the existing tiled bitmap algorithm.

  3. Differential evolution algorithm for global optimizations in nuclear physics

    Qi, Chong


    We explore the applicability of the differential evolution algorithm in finding the global minima of three typical nuclear structure physics problems: the global deformation minimum in the nuclear potential energy surface, the optimization of mass model parameters and the lowest eigenvalue of a nuclear Hamiltonian. The algorithm works very effectively and efficiently in identifying the minima in all problems we have tested. We also show that the algorithm can be parallelized in a straightforward way.

  4. Nonlinear evaluations of unconditionally stable explicit algorithms

    Shuenn-Yih Chang


    Two explicit integration algorithms with unconditional stability for linear elastic systems have been successfully developed for pseudodynamic testing. Their numerical properties in the solution of a linear elastic system have been well explored and their applications to the pseudodynamic testing of a nonlinear system have been shown to be feasible. However, their numerical properties in the solution of a nonlinear system are not apparent. Therefore, the performance of both algorithms for use in the solution of a nonlinear system has been analytically evaluated after introducing an instantaneous degree of nonlinearity. The two algorithms have roughly the same accuracy for a small value of the product of the natural frequency and step size. Meanwhile, the first algorithm is unconditionally stable when the instantaneous degree of nonlinearity is less than or equal to 1, and it becomes conditionally stable when it is greater than 1. The second algorithm is conditionally stable as the instantaneous degree of nonlinearity is less than 1/9, and becomes unstable when it is greater than 1. It can have unconditional stability for the range between 1/9 and 1. Based on these evaluations, it was concluded that the first algorithm is superior to the second one. Also, both algorithms were found to require commensurate computational efforts, which are much less than needed for the Newmark explicit method in general structural dynamic problems.

  5. Image segmentation using an improved differential algorithm

    Gao, Hao; Shi, Yujiao; Wu, Dongmei


    Among all the existing segmentation techniques, the thresholding technique is one of the most popular due to its simplicity, robustness, and accuracy (e.g. the maximum entropy method, Otsu's method, and K-means clustering). However, the computation time of these algorithms grows exponentially with the number of thresholds due to their exhaustive searching strategy. As a population-based optimization algorithm, the differential algorithm (DE) uses a population of potential solutions and decision-making processes. It has shown considerable success in solving complex optimization problems within a reasonable time limit. Thus, applying this method to the segmentation algorithm should be a good choice due to its fast computational ability. In this paper, we first propose a new differential algorithm with a balance strategy, which seeks a balance between the exploration of new regions and the exploitation of the already sampled regions. Then, we apply the new DE to the traditional Otsu's method to shorten the computation time. Experimental results of the new algorithm on a variety of images show that, compared with the EA-based thresholding methods, the proposed DE algorithm gets more effective and efficient results. It also shortens the computation time of the traditional Otsu method.
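    Since Otsu's method is the baseline being accelerated, a compact single-threshold Otsu implementation is sketched below, using an exhaustive scan over 256 grey levels on a synthetic bimodal image (the exhaustive cost only becomes prohibitive in the multi-threshold case that the DE variant targets). The synthetic image and its parameters are assumptions made for demonstration.

    # Single-threshold Otsu's method by exhaustive search over 256 grey levels.
    # Classical baseline only; the paper replaces the exhaustive search with a
    # differential-evolution search for the multi-threshold case.
    import numpy as np

    rng = np.random.default_rng(6)

    # Synthetic bimodal "image": two grey-level populations around 60 and 170.
    image = np.concatenate([rng.normal(60, 10, 20_000),
                            rng.normal(170, 15, 30_000)]).clip(0, 255).astype(np.uint8)

    hist = np.bincount(image, minlength=256).astype(float)
    prob = hist / hist.sum()

    best_t, best_var = 0, -1.0
    for t in range(1, 256):                        # candidate threshold
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        between_var = w0 * w1 * (mu0 - mu1) ** 2   # between-class variance to maximise
        if between_var > best_var:
            best_t, best_var = t, between_var

    print(best_t)    # expected to fall between the two modes (roughly 100-120)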

  6. Exploring Mars

    Breuil, Stéphanie


    Mars is our neighbour planet and has always fascinated humans as it has been seen as a potential abode for life. Knowledge about Mars is huge and was constructed step by step through numerous missions. It could be difficult to describe these missions, the associated technology, the results, the questions they raise, that's why an activity is proposed, that directly interests students. Their production is presented in the poster. Step 1: The main Mars feature and the first Mars explorations using telescope are presented to students. It should be really interesting to present "Mars Canals" from Percival Lowell as it should also warn students against flawed interpretation. Moreover, this study has raised the big question about extra-terrestrial life on Mars for the first time. Using Google Mars is then a good way to show the huge knowledge we have on the planet and to introduce modern missions. Step 2: Students have to choose and describe one of the Mars mission from ESA and NASA. They should work in pairs. Web sites from ESA and NASA are available and the teacher makes sure the main missions will be studied. Step 3: Students have to collect different pieces of information about the mission - When? Which technology? What were the main results? What type of questions does it raise? They prepare an oral presentation in the form they want (role play, academic presentation, using a poster, PowerPoint). They also have to produce playing cards about the mission that could be put on a timeline. Step 4: As a conclusion, the different cards concerning different missions are mixed. Groups of students receive cards and they have to put them on a timeline as fast as possible. It is also possible to play the game "timeline".

  7. Firefly Algorithm for Cardinality Constrained Mean-Variance Portfolio Optimization Problem with Entropy Diversity Constraint

    Nebojsa Bacanin


    portfolio model with entropy constraint. Firefly algorithm is one of the latest, very successful swarm intelligence algorithm; however, it exhibits some deficiencies when applied to constrained problems. To overcome lack of exploration power during early iterations, we modified the algorithm and tested it on standard portfolio benchmark data sets used in the literature. Our proposed modified firefly algorithm proved to be better than other state-of-the-art algorithms, while introduction of entropy diversity constraint further improved results.

  8. Adaptive Alternating Minimization Algorithms

    Niesen, Urs; Wornell, Gregory


    The classical alternating minimization (or projection) algorithm has been successful in the context of solving optimization problems over two variables or equivalently of finding a point in the intersection of two sets. The iterative nature and simplicity of the algorithm has led to its application to many areas such as signal processing, information theory, control, and finance. A general set of sufficient conditions for the convergence and correctness of the algorithm is quite well-known when the underlying problem parameters are fixed. In many practical situations, however, the underlying problem parameters are changing over time, and the use of an adaptive algorithm is more appropriate. In this paper, we study such an adaptive version of the alternating minimization algorithm. As a main result of this paper, we provide a general set of sufficient conditions for the convergence and correctness of the adaptive algorithm. Perhaps surprisingly, these conditions seem to be the minimal ones one would expect in ...

  9. Exploration and extension of an improved Riemann track fitting algorithm

    Strandlie, A.; Frühwirth, R.


    Recently, a new Riemann track fit which operates on translated and scaled measurements has been proposed. This study shows that the new Riemann fit is virtually as precise as popular approaches such as the Kalman filter or an iterative non-linear track fitting procedure, and significantly more precise than other, non-iterative circular track fitting approaches over a large range of measurement uncertainties. The fit is then extended in two directions: first, the measurements are allowed to lie on plane sensors of arbitrary orientation; second, the full error propagation from the measurements to the estimated circle parameters is computed. The covariance matrix of the estimated track parameters can therefore be computed without recourse to asymptotic properties, and is consequently valid for any number of observations. It does, however, assume normally distributed measurement errors. The calculations are validated on a simulated track sample and show excellent agreement with the theoretical expectations.

  10. Fast algorithm for exploring and compressing of large hyperspectral images

    Kucheryavskiy, Sergey


    A new method for calculation of latent variable space for exploratory analysis and dimension reduction of large hyperspectral images is proposed. The method is based on significant downsampling of image pixels with preservation of pixels’ structure in feature (variable) space. To achieve this, in...

  11. Exploring the consequences of pairing algorithms for binary stars

    Kouwenhoven, M B N; Goodwin, S P; Zwart, S F Portegies; Kaper, L


    Knowledge of the binary population in stellar groupings provides important information about the outcome of the star forming process in different environments (see, e.g., Blaauw 1991, and references therein). Binarity is also a key ingredient in stellar population studies, and is a prerequisite to calibrate the binary evolution channels. In this paper we present an overview of several commonly used methods to pair individual stars into binary systems, which we refer to as pairing functions. These pairing functions are frequently used by observers and computational astronomers, either for their mathematical convenience, or because they roughly describe the expected outcome of the star forming process. We discuss the consequences of each pairing function for the interpretation of observations and numerical simulations. The binary fraction and mass ratio distribution generally depend strongly on the selection of the range in primary spectral type in a sample. The mass ratio distribution and binary fraction deriv...

  12. Fast algorithm for exploring and compressing of large hyperspectral images

    Kucheryavskiy, Sergey


    A new method for calculation of latent variable space for exploratory analysis and dimension reduction of large hyperspectral images is proposed. The method is based on significant downsampling of image pixels with preservation of pixels’ structure in feature (variable) space. To achieve this, in...... can be used first of all for fast compression of large data arrays with principal component analysis or similar projection techniques....

  13. Recursive forgetting algorithms

    Parkum, Jens; Poulsen, Niels Kjølstad; Holst, Jan


    In the first part of the paper, a general forgetting algorithm is formulated and analysed. It contains most existing forgetting schemes as special cases. Conditions are given ensuring that the basic convergence properties will hold. In the second part of the paper, the results are applied...... to a specific algorithm with selective forgetting. Here, the forgetting is non-uniform in time and space. The theoretical analysis is supported by a simulation example demonstrating the practical performance of this algorithm...
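
    For concreteness, the hedged Python sketch below shows the simplest member of this family, recursive least squares with a scalar exponential forgetting factor lam; the selective, non-uniform forgetting analysed in the paper is not reproduced.

        import numpy as np

        def rls_forgetting_step(theta, P, x, y, lam=0.98):
            """One recursive least-squares update with forgetting factor lam."""
            Px = P @ x
            K = Px / (lam + x @ Px)              # gain vector
            theta = theta + K * (y - x @ theta)  # correct by the prediction error
            P = (P - np.outer(K, Px)) / lam      # discount old information
            return theta, P

        rng = np.random.default_rng(1)
        theta_true = np.array([2.0, -1.0])
        theta, P = np.zeros(2), 1000.0 * np.eye(2)
        for _ in range(200):
            x = rng.standard_normal(2)
            y = x @ theta_true + 0.01 * rng.standard_normal()
            theta, P = rls_forgetting_step(theta, P, x, y)
        print(theta)   # close to theta_true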

  14. SIMAS ADM XBT Algorithm


    Technical memorandum, Naval Underwater Systems Center, New London Laboratory, New London, Connecticut 06320, dated 05-12-1984. An algorithm has been developed for the detection and correction of surface ship launched expendable bathythermograph (XBT) data.

  15. Static Analysis Numerical Algorithms


    STATIC ANALYSIS OF NUMERICAL ALGORITHMS. Kestrel Technology, LLC, April 2016. Final technical report, approved for public release; distribution ... Dates covered: November 2013 to November 2015. Contract number FA8750-14-C-... The work addressed static analysis of numerical algorithms, linear digital filters and integrating accumulators, modifying existing versions of Honeywell's HiLiTE model-based development system and ...

  16. Recursive forgetting algorithms

    Parkum, Jens; Poulsen, Niels Kjølstad; Holst, Jan


    In the first part of the paper, a general forgetting algorithm is formulated and analysed. It contains most existing forgetting schemes as special cases. Conditions are given ensuring that the basic convergence properties will hold. In the second part of the paper, the results are applied...... to a specific algorithm with selective forgetting. Here, the forgetting is non-uniform in time and space. The theoretical analysis is supported by a simulation example demonstrating the practical performance of this algorithm...

  17. Fingerprint Feature Extraction Algorithm

    Mehala. G


    Full Text Available The goal of this paper is to design an efficient Fingerprint Feature Extraction (FFE) algorithm to extract fingerprint features for Automatic Fingerprint Identification Systems (AFIS). The FFE algorithm consists of two major subdivisions: fingerprint image preprocessing and fingerprint image postprocessing. A few of the challenges presented in an earlier work are consequently addressed in this paper. The proposed algorithm is able to enhance the fingerprint image and also extract true minutiae.

  18. Fingerprint Feature Extraction Algorithm

    Mehala. G


    The goal of this paper is to design an efficient Fingerprint Feature Extraction (FFE) algorithm to extract fingerprint features for Automatic Fingerprint Identification Systems (AFIS). The FFE algorithm consists of two major subdivisions: fingerprint image preprocessing and fingerprint image postprocessing. A few of the challenges presented in an earlier work are consequently addressed in this paper. The proposed algorithm is able to enhance the fingerprint image and also extractin...

  19. Redesigning linear algebra algorithms

    Dongarra, J.J.


    Many of the standard algorithms in linear algebra as implemented in FORTRAN do not achieve maximum performance on today's large-scale vector computers. The author examines the problem and constructs alternative formulations of algorithms that do not lose the clarity of the original algorithm or sacrifice the FORTRAN portable environment, but do gain the performance attainable on these supercomputers. The resulting implementation not only performs well on vector computers but also increases performance on conventional sequential computers. 13 references.

  20. Redesigning linear algebra algorithms

    Dongarra, J.J.


    Many of the standard algorithms in linear algebra as implemented in FORTRAN do not achieve maximum performance on today's large-scale vector computers. In this paper we examine the problem and construct alternative formulations of algorithms that do not lose the clarity of the original algorithm or sacrifice the Fortran portable environment, but do gain the performance attainable on these supercomputers. The resulting implementation not only performs well on vector computers but also increases performance on conventional sequential computers.

  1. Structural Analysis Extended with Active Fault Isolation - Methods and Algorithms

    Gelso, Esteban R.; Blanke, Mogens


    on system inputs can considerably enhance fault isolability. This paper investigates this possibility of active fault isolation from a structural point of view. While such extension of the structural analysis approach was suggested earlier, algorithms and case studies were needed to explore this theory....... The paper develops algorithms for investigation of the possibilities of active structural isolation and it offers illustrative examples and a larger case study to explore the properties of active structural isolability ideas....

  2. Algorithms in Algebraic Geometry

    Dickenstein, Alicia; Sommese, Andrew J


    In the last decade, there has been a burgeoning of activity in the design and implementation of algorithms for algebraic geometric computation. Some of these algorithms were originally designed for abstract algebraic geometry, but now are of interest for use in applications and some of these algorithms were originally designed for applications, but now are of interest for use in abstract algebraic geometry. The workshop on Algorithms in Algebraic Geometry that was held in the framework of the IMA Annual Program Year in Applications of Algebraic Geometry by the Institute for Mathematics and Its

  3. Distributed Minimum Hop Algorithms


    On acknowledgement, node d starts iteration i+1; otherwise the algorithm terminates. A detailed description of the algorithm is given in pidgin Algol; the precise behavior of the algorithm under these circumstances is described by the pidgin Algol program in the appendix, which is executed by each node. An inductive counting argument over the neighbor lists completes the proof of the stated bound, and the algorithm itself, Algorithm D1, is then listed in pidgin Algol.

  4. Explaining algorithms using metaphors

    Forišek, Michal


    There is a significant difference between designing a new algorithm, proving its correctness, and teaching it to an audience. When teaching algorithms, the teacher's main goal should be to convey the underlying ideas and to help the students form correct mental models related to the algorithm. This process can often be facilitated by using suitable metaphors. This work provides a set of novel metaphors identified and developed as suitable tools for teaching many of the 'classic textbook' algorithms taught in undergraduate courses worldwide. Each chapter provides exercises and didactic notes fo

  5. License plate detection algorithm

    Broitman, Michael; Klopovsky, Yuri; Silinskis, Normunds


    A novel algorithm for vehicle license plate localization is proposed. The algorithm is based on analysis of pixel intensity transition gradients. Nearly 2500 natural-scene gray-level vehicle images with different backgrounds and ambient illumination were tested. The best set of algorithm parameters produces a detection rate of up to 0.94. Taking into account the abnormal camera location during our tests, and therefore geometrical distortion and interference from trees, this result can be considered acceptable. Correlations between source data, such as license plate dimensions and texture or camera location, and the parameters of the algorithm were also determined.

  6. Spectral Decomposition Algorithm (SDA)

    National Aeronautics and Space Administration — Spectral Decomposition Algorithm (SDA) is an unsupervised feature extraction technique similar to PCA that was developed to better distinguish spectral features in...

  7. An Adaptive Unified Differential Evolution Algorithm for Global Optimization

    Qiang, Ji; Mitchell, Chad


    In this paper, we propose a new adaptive unified differential evolution algorithm for single-objective global optimization. Instead of the multiple mutation strategies proposed in conventional differential evolution algorithms, this algorithm employs a single equation unifying multiple strategies into one expression. It has the virtue of mathematical simplicity and also provides users the flexibility for broader exploration of the space of mutation operators. By making all control parameters in the proposed algorithm self-adaptively evolve during the process of optimization, it frees the application users from the burden of choosing appropriate control parameters and also improves the performance of the algorithm. In numerical tests using thirteen basic unimodal and multimodal functions, the proposed adaptive unified algorithm shows promising performance in comparison to several conventional differential evolution algorithms.
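
    The hedged Python sketch below shows one way a single mutation expression can blend several classical differential-evolution strategies through weight parameters; it illustrates the idea of a unified mutation only and is not the exact expression of the paper.

        import numpy as np

        def unified_mutation(pop, i, best, F1=0.5, F2=0.5, F3=0.5, rng=None):
            """Blend best-directed and random differential terms in one expression."""
            rng = rng or np.random.default_rng()
            candidates = [j for j in range(len(pop)) if j != i]
            r1, r2, r3 = rng.choice(candidates, size=3, replace=False)
            xi = pop[i]
            # F1 pulls toward the current best (exploitation); F2 and F3 add
            # random differential perturbations (exploration).
            return (xi + F1 * (pop[best] - xi)
                       + F2 * (pop[r1] - xi)
                       + F3 * (pop[r2] - pop[r3]))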

  8. Analysis of Diversity-Preserving Mechanisms for Global Exploration

    Friedrich, Tobias; Oliveto, Pietro S.; Sudholt, Dirk


    Maintaining diversity is important for the performance of evolutionary algorithms. Diversity-preserving mechanisms can enhance global exploration of the search space and enable crossover to find dissimilar individuals for recombination. We focus on the global exploration capabilities of mutation......-based algorithms. Using a simple bimodal test function and rigorous runtime analyses, we compare well-known diversity-preserving mechanisms like deterministic crowding, fitness sharing, and others with a plain algorithm without diversification. We show that diversification is necessary for global exploration...

  9. An Effective Hybrid Artificial Bee Colony Algorithm for Nonnegative Linear Least Squares Problems

    Xiangyu Kong


    Full Text Available An effective hybrid artificial bee colony algorithm is proposed in this paper for nonnegative linear least squares problems. To further improve the performance of the algorithm, an orthogonal initialization method is employed to generate the initial swarm. Furthermore, to balance the exploration and exploitation abilities, a new search mechanism is designed. The performance of this algorithm is verified using 27 benchmark functions and 5 nonnegative linear least squares test problems, and comparisons are given between the proposed algorithm and other swarm intelligence algorithms. Numerical results demonstrate that the proposed algorithm displays high performance compared with other algorithms on global optimization problems and nonnegative linear least squares problems.

  10. Network-Oblivious Algorithms

    Bilardi, Gianfranco; Pietracaprina, Andrea; Pucci, Geppino


    A framework is proposed for the design and analysis of network-oblivious algorithms, namely algorithms that can run unchanged, yet efficiently, on a variety of machines characterized by different degrees of parallelism and communication capabilities. The framework prescribes that a network-oblivi...

  11. Shape formation algorithm


    This project concerns the implementation of a decentralized algorithm for shape formation. The first idea was to test this algorithm with a swarm of autonomous drones but, due to the lack of time and the complexity of the project, the work was developed only in 2D and in simulation.

  12. Graph Colouring Algorithms

    Husfeldt, Thore


    This chapter presents an introduction to graph colouring algorithms. The focus is on vertex-colouring algorithms that work for general classes of graphs with worst-case performance guarantees in a sequential model of computation. The presentation aims to demonstrate the breadth of available...
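
    As a minimal example of a vertex-colouring algorithm with a worst-case guarantee, the hedged Python sketch below implements the textbook greedy routine, which uses at most Delta + 1 colours on a graph of maximum degree Delta; it is an illustration only and is not taken from the chapter.

        def greedy_colouring(adj):
            """adj maps each vertex to an iterable of its neighbours."""
            colour = {}
            for v in adj:                                  # any fixed vertex order
                used = {colour[u] for u in adj[v] if u in colour}
                c = 0
                while c in used:                           # smallest free colour
                    c += 1
                colour[v] = c
            return colour

        print(greedy_colouring({0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [0]}))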

  13. Algorithms for finite rings

    Ciocanea Teodorescu I.,


    In this thesis we are interested in describing algorithms that answer questions arising in ring and module theory. Our focus is on deterministic polynomial-time algorithms and rings and modules that are finite. The first main result of this thesis is a solution to the module isomorphism problem in

  14. Implementing Vehicle Routing Algorithms


    The report builds on earlier work including "Multiple Depot Vehicle Dispatch Problem," presented at the ORSA/TIMS meeting, San Juan, Puerto Rico, Oct. 1974; reference 28, Gillett, B., and Miller, L., "A Heuristic Algorithm for ..."; and reference 45, Lam, T., "Comments on a Heuristic Algorithm for the Multiple Terminal Delivery Problem," Transportation Science, Vol. 4, No. 4, Nov. 1970, p. 403.

  15. Parallel scheduling algorithms

    Dekel, E.; Sahni, S.


    Parallel algorithms are given for scheduling problems such as scheduling to minimize the number of tardy jobs, job sequencing with deadlines, scheduling to minimize earliness and tardiness penalties, channel assignment, and minimizing the mean finish time. The shared memory model of parallel computers is used to obtain fast algorithms. 26 references.

  16. Algorithms for finite rings

    Ciocanea Teodorescu I.,


    In this thesis we are interested in describing algorithms that answer questions arising in ring and module theory. Our focus is on deterministic polynomial-time algorithms and rings and modules that are finite. The first main result of this thesis is a solution to the module isomorphism problem in

  17. More On Grover's Algorithm

    Loo, K


    The goals of this paper are to show the following. First, Grover's algorithm can be viewed as a digital approximation to the analog quantum algorithm proposed in "An Analog Analogue of a Digital Quantum Computation", by E. Farhi and S. Gutmann, Phys.Rev. A 57, 2403 - 2406 (1998), quant-ph/9612026. We will call the above analog algorithm the Grover-Farhi-Gutmann or GFG algorithm. Second, the propagator of the GFG algorithm can be written as a sum-over-paths formula and given a sum-over-path interpretation, i.e., a Feynman path sum/integral. We will use nonstandard analysis to do this. Third, in the semi-classical limit $\\hbar\\to 0$, both the Grover and the GFG algorithms (viewed in the setting of the approximation in this paper) must run instantaneously. Finally, we will end the paper with an open question. In "Semiclassical Shor's Algorithm", by P. Giorda, et al, Phys. Rev.A 70, 032303 (2004), quant-ph/0303037, the authors proposed building semi-classical quantum computers to run Shor's algorithm because the ...

  18. Governance by algorithms

    Francesca Musiani


    Full Text Available Algorithms are increasingly often cited as one of the fundamental shaping devices of our daily, immersed-in-information existence. Their importance is acknowledged, their performance scrutinised in numerous contexts. Yet, a lot of what constitutes 'algorithms' beyond their broad definition as “encoded procedures for transforming input data into a desired output, based on specified calculations” (Gillespie, 2013) is often taken for granted. This article seeks to contribute to the discussion about 'what algorithms do' and in which ways they are artefacts of governance, providing two examples drawing from the internet and ICT realm: search engine queries and e-commerce websites’ recommendations to customers. The question of the relationship between algorithms and rules is likely to occupy an increasingly central role in the study and the practice of internet governance, in terms of both institutions’ regulation of algorithms, and algorithms’ regulation of our society.

  19. Network-Oblivious Algorithms

    Bilardi, Gianfranco; Pietracaprina, Andrea; Pucci, Geppino


    A framework is proposed for the design and analysis of network-oblivious algorithms, namely algorithms that can run unchanged, yet efficiently, on a variety of machines characterized by different degrees of parallelism and communication capabilities. The framework prescribes that a network......-oblivious algorithm be specified on a parallel model of computation where the only parameter is the problem’s input size, and then evaluated on a model with two parameters, capturing parallelism granularity and communication latency. It is shown that for a wide class of network-oblivious algorithms, optimality...... of cache hierarchies, to the realm of parallel computation. Its effectiveness is illustrated by providing optimal network-oblivious algorithms for a number of key problems. Some limitations of the oblivious approach are also discussed....

  20. A New Modified Firefly Algorithm

    Medha Gupta


    Full Text Available Nature-inspired meta-heuristic algorithms study the emergent collective intelligence of groups of simple agents. The Firefly Algorithm is one such new swarm-based metaheuristic algorithm, inspired by the flashing behavior of fireflies. The algorithm was first proposed in 2008 and has since been successfully used for solving various optimization problems. In this work, we propose a new modified version of the Firefly algorithm (MoFA) and compare its performance with the standard firefly algorithm and various other meta-heuristic algorithms. Numerical studies and results demonstrate that the proposed algorithm is superior to the existing algorithms.
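
    For reference, the hedged Python sketch below shows the standard firefly move that such variants modify: a dimmer firefly is attracted to a brighter one with an attractiveness that decays with distance, plus a small random step. Parameter values are illustrative and are not those of the proposed MoFA variant.

        import numpy as np

        def firefly_move(xi, xj, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
            """Move the firefly at xi toward the brighter firefly at xj."""
            rng = rng or np.random.default_rng()
            r2 = np.sum((xi - xj) ** 2)              # squared distance
            beta = beta0 * np.exp(-gamma * r2)       # attractiveness
            return xi + beta * (xj - xi) + alpha * (rng.random(xi.shape) - 0.5)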

  1. Mapped Landmark Algorithm for Precision Landing

    Johnson, Andrew; Ansar, Adnan; Matthies, Larry


    A report discusses a computer vision algorithm for position estimation to enable precision landing during planetary descent. The Descent Image Motion Estimation System for the Mars Exploration Rovers has been used as a starting point for creating code for precision, terrain-relative navigation during planetary landing. The algorithm is designed to be general because it handles images taken at different scales and resolutions relative to the map, and can produce mapped landmark matches for any planetary terrain of sufficient texture. These matches provide a measurement of horizontal position relative to a known landing site specified on the surface map. Multiple mapped landmarks generated per image allow for automatic detection and elimination of bad matches. Attitude and position can be generated from each image; this image-based attitude measurement can be used by the onboard navigation filter to improve the attitude estimate, which will improve the position estimates. The algorithm uses normalized correlation of grayscale images, producing precise, sub-pixel images. The algorithm has been broken into two sub-algorithms: (1) FFT Map Matching (see figure), which matches a single large template by correlation in the frequency domain, and (2) Mapped Landmark Refinement, which matches many small templates by correlation in the spatial domain. Each relies on feature selection, the homography transform, and 3D image correlation. The algorithm is implemented in C++ and is rated at Technology Readiness Level (TRL) 4.
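
    The hedged Python sketch below illustrates the core idea of the FFT Map Matching step, locating a template inside a larger map by correlation computed in the frequency domain; the normalized correlation and sub-pixel refinement of the flight algorithm are omitted for brevity.

        import numpy as np

        def fft_match(map_img, template):
            """Return the (row, col) offset of the best correlation peak."""
            t = np.zeros_like(map_img)
            h, w = template.shape
            t[:h, :w] = template - template.mean()      # zero-mean template
            m = map_img - map_img.mean()
            corr = np.real(np.fft.ifft2(np.fft.fft2(m) * np.conj(np.fft.fft2(t))))
            return np.unravel_index(np.argmax(corr), corr.shape)

        rng = np.random.default_rng(0)
        map_img = rng.random((128, 128))
        print(fft_match(map_img, map_img[40:56, 30:46]))   # approximately (40, 30)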

  2. A new algorithm of brain volume contours segmentation

    吴建明; 施鹏飞


    This paper explores brain CT slice segmentation techniques and some related problems, including contour segmentation algorithms, edge detection, algorithm evaluation and experimental results. The article describes a method for contour-based segmentation of anatomical structures in 3D medical data sets. With this method, the user manually traces one or more 2D contours of an anatomical structure of interest on parallel planes arbitrarily cutting the data set. The experimental results show segmentation based on 3D brain volumes and 2D CT slices. The main contributions of this paper are: (1) a contour segmentation algorithm; (2) an edge detector; (3) an algorithm evaluation.


    王燕军; 张可村


    In this paper we present a transformation path algorithm for Unconstrained Signomial Geometric Programming (USGP). The algorithm is proposed from a new point of view, based on exploring the characteristics of the USGP problem. First, through some stable transformations, a particular subproblem is derived which is very easy to solve. Second, a special path is conveniently formed. Each step of the algorithm then consists in finding a "good" point relative to the current iterate by choosing it along the special path and within a trust region. It is proved that the algorithm is globally convergent.

  4. A New Class of Hybrid Particle Swarm Optimization Algorithm

    Da-Qing Guo; Yong-Jin Zhao; Hui Xiong; Xiao Li


    A new class of hybrid particle swarm optimization (PSO) algorithm is developed to address the premature convergence caused by particles in standard PSO falling into stagnation. In this algorithm, the linearly decreasing inertia weight technique (LDIW) and the mutative-scale chaos optimization algorithm (MSCOA) are combined with standard PSO; they are used to balance the global and local exploration abilities and to enhance the local searching ability, respectively. In order to evaluate the performance of the new method, three benchmark functions are used. The simulation results confirm that the proposed algorithm can greatly enhance the searching ability and effectively alleviate premature convergence.
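
    The hedged Python sketch below shows the linearly decreasing inertia weight schedule and the standard PSO velocity update it plugs into; the mutative-scale chaotic local search of the hybrid algorithm is not reproduced here, and the parameter values are the usual textbook defaults rather than the paper's.

        import numpy as np

        def ldiw(t, t_max, w_max=0.9, w_min=0.4):
            """Inertia weight falls linearly: large early (global), small late (local)."""
            return w_max - (w_max - w_min) * t / t_max

        def pso_velocity(v, x, pbest, gbest, w, c1=2.0, c2=2.0, rng=None):
            """Standard PSO velocity update with inertia weight w."""
            rng = rng or np.random.default_rng()
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            return w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)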

  5. Algorithms in Singular

    Hans Schonemann


    Full Text Available Some algorithms for singularity theory and algebraic geometry. The use of Gröbner basis computations for treating systems of polynomial equations has become an important tool in many areas. This paper introduces the concept of standard bases (a generalization of Gröbner bases) and their application to some problems from algebraic geometry. The examples are presented as SINGULAR commands. A general introduction to Gröbner bases can be found in the textbook [CLO], and an introduction to syzygies in [E] and [St1]. SINGULAR is a computer algebra system for computing information about singularities, for use in algebraic geometry. The basic algorithms in SINGULAR are several variants of a general standard basis algorithm for general monomial orderings (see [GG]). This includes well-orderings (Buchberger algorithm, [B1], [B2]) and tangent cone orderings (Mora algorithm, [M1], [MPT]) as special cases: it is able to work with non-homogeneous and homogeneous input and also to compute in the localization of the polynomial ring at 0. Recent versions include algorithms to factorize polynomials and a factorizing Gröbner basis algorithm. For a complete description of SINGULAR see [Si].

  6. Optimal Pid Controller Design Using Adaptive Vurpso Algorithm

    Zirkohi, Majid Moradi


    The purpose of this paper is to improve the Velocity Update Relaxation Particle Swarm Optimization (VURPSO) algorithm. The improved algorithm is called the Adaptive VURPSO (AVURPSO) algorithm. An optimal design of a Proportional-Integral-Derivative (PID) controller is then obtained using the AVURPSO algorithm. An adaptive momentum factor is used to regulate the trade-off between the global and the local exploration abilities in the proposed algorithm. This operation helps the system reach the optimal solution quickly and saves computation time. Comparisons on the optimal PID controller design confirm the superiority of the AVURPSO algorithm over the optimization algorithms mentioned in this paper, namely the VURPSO algorithm, the Ant Colony algorithm, and the conventional approach. Comparisons on the speed of convergence confirm that the proposed algorithm converges faster, in less computation time, to a global optimum value. The proposed AVURPSO can be used in diverse areas of optimization problems such as industrial planning, resource allocation, scheduling, decision making, pattern recognition and machine learning. The proposed AVURPSO algorithm is efficiently used to design an optimal PID controller.

  7. Middle matching mining algorithm

    GUO Ping; CHEN Li


    A new algorithm for fast discovery of sequential patterns, referred to as the middle matching algorithm, is presented to address the problem of the excessive number of candidate sets generated by SPADE. Experiments on a large customer transaction database consisting of customer_id, transaction time, and transaction items demonstrate that the proposed algorithm performs better than SPADE, owing to its strategy of generating a candidate set by matching two sequences at the middle position so as to reduce the number of candidate sets.

  8. Kidney-inspired algorithm for optimization problems

    Jaddi, Najmeh Sadat; Alvankarian, Jafar; Abdullah, Salwani


    In this paper, a population-based algorithm inspired by the kidney process in the human body is proposed. In this algorithm, solutions are filtered at a rate that is calculated from the mean of the objective values of all solutions in the current population of each iteration. The filtered solutions, as the better solutions, are moved to the filtered blood, and the rest are transferred to the waste, representing the worse solutions; this simulates the glomerular filtration process in the kidney. The waste solutions are reconsidered in later iterations if, after applying a defined movement operator, they satisfy the filtration rate; otherwise they are expelled from the waste, simulating the reabsorption and excretion functions of the kidney. In addition, a solution assigned as a better solution is secreted if it is not better than the worst solutions, simulating the secretion of blood in the kidney. After placement of all the solutions in the population, the best of them is ranked, the waste and filtered blood are merged to become a new population, and the filtration rate is updated. Filtration provides the required exploitation while generating a new solution, and reabsorption gives the necessary exploration for the algorithm. The algorithm is assessed by applying it to eight well-known benchmark test functions and comparing the results with other algorithms in the literature. The performance of the proposed algorithm is better on seven out of eight test functions when compared with the most recent research in the literature. The proposed kidney-inspired algorithm is able to find the global optimum with fewer function evaluations on six out of eight test functions. A statistical analysis further confirms the ability of this algorithm to produce good-quality results.
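
    A minimal Python sketch of the filtration step as described above, assuming minimization: the filtration rate is the mean objective value of the current population, solutions at or below the rate go to the filtered blood and the rest to the waste. It illustrates the described mechanism only and is not the authors' code.

        import numpy as np

        def filtration(population, objective):
            """Split a population into filtered blood and waste by the mean objective."""
            values = np.array([objective(s) for s in population])
            rate = values.mean()                                   # filtration rate
            filtered = [s for s, v in zip(population, values) if v <= rate]
            waste = [s for s, v in zip(population, values) if v > rate]
            return filtered, waste, rate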

  9. Real-Coded Quantum-Inspired Genetic Algorithm-Based BP Neural Network Algorithm

    Jianyong Liu


    Full Text Available A method in which the real-coded quantum-inspired genetic algorithm (RQGA) is used to optimize the weights and thresholds of a BP neural network is proposed to overcome the defect that the gradient descent method makes the algorithm easily fall into a local optimum during the learning process. The quantum genetic algorithm (QGA) has good directed global optimization ability, but the conventional QGA is based on binary coding, and the speed of calculation is reduced by the coding and decoding processes. Therefore, RQGA is introduced to explore the search space, and an improved variable learning rate is adopted to train the BP neural network. Simulation tests show that the proposed algorithm rapidly converges to a solution that satisfies the constraint conditions.

  10. Exploration vs Exploitation in Bayesian Optimization

    Jalali, Ali; Fern, Xiaoli


    The problem of optimizing unknown costly-to-evaluate functions has been studied for a long time in the context of Bayesian Optimization. Algorithms in this field aim to find the optimizer of the function by asking only a few function evaluations at locations carefully selected based on a posterior model. In this paper, we assume the unknown function is Lipschitz continuous. Leveraging the Lipschitz property, we propose an algorithm with a distinct exploration phase followed by an exploitation phase. The exploration phase aims to select samples that shrink the search space as much as possible. The exploitation phase then focuses on the reduced search space and selects samples closest to the optimizer. Considering the Expected Improvement (EI) as a baseline, we empirically show that the proposed algorithm significantly outperforms EI.
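
    For reference, the hedged Python sketch below computes the Expected Improvement baseline for a minimization problem from the posterior mean mu and standard deviation sigma at a candidate point; xi is the usual small exploration margin and is an assumption of the example.

        import numpy as np
        from scipy.stats import norm

        def expected_improvement(mu, sigma, f_best, xi=0.01):
            """EI of sampling a point with posterior N(mu, sigma^2), minimizing f."""
            if sigma <= 0:
                return 0.0
            z = (f_best - mu - xi) / sigma
            return (f_best - mu - xi) * norm.cdf(z) + sigma * norm.pdf(z)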

  11. Conceptual exploration package for data fusion

    Jousselme, Anne-Laure; Grenier, Dominic; Bosse, Eloi


    In this paper, we present a software package designed to explore the data fusion area as applied to different contexts. This tool, called CEPfuse (Conceptual Exploration Package for Data Fusion), provides good support for becoming familiar with the concepts and vocabulary of data fusion. Developed with Matlab 5.2, it is also a good tool to test, compare and analyze algorithms. Although the core of this package is evidential reasoning and identity information fusion, it has been conceived to cover all the interesting parts of a Multi-Sensor Data Fusion system. Because we concentrate our research work on identity information fusion, the principal included algorithms are the Dempster-Shafer rules of combination, the Shafer-Logan algorithms for hierarchical structures, and several decision rules.
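
    The hedged Python sketch below shows the central operation such a package revolves around, Dempster's rule of combination for two mass functions over a common frame of discernment; masses are dictionaries keyed by frozensets of hypotheses, and the example hypotheses are illustrative assumptions.

        def dempster_combine(m1, m2):
            """Combine two basic belief assignments with Dempster's rule."""
            combined, conflict = {}, 0.0
            for a, ma in m1.items():
                for b, mb in m2.items():
                    inter = a & b
                    if inter:
                        combined[inter] = combined.get(inter, 0.0) + ma * mb
                    else:
                        conflict += ma * mb          # mass falling on the empty set
            k = 1.0 - conflict                       # normalization constant
            return {s: v / k for s, v in combined.items()}

        m1 = {frozenset({'friend'}): 0.6, frozenset({'friend', 'hostile'}): 0.4}
        m2 = {frozenset({'friend'}): 0.5, frozenset({'hostile'}): 0.3,
              frozenset({'friend', 'hostile'}): 0.2}
        print(dempster_combine(m1, m2))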

  12. Unsupervised learning algorithms

    Aydin, Kemal


    This book summarizes the state-of-the-art in unsupervised learning. The contributors discuss how with the proliferation of massive amounts of unlabeled data, unsupervised learning algorithms, which can automatically discover interesting and useful patterns in such data, have gained popularity among researchers and practitioners. The authors outline how these algorithms have found numerous applications including pattern recognition, market basket analysis, web mining, social network analysis, information retrieval, recommender systems, market research, intrusion detection, and fraud detection. They present how the difficulty of developing theoretically sound approaches that are amenable to objective evaluation have resulted in the proposal of numerous unsupervised learning algorithms over the past half-century. The intended audience includes researchers and practitioners who are increasingly using unsupervised learning algorithms to analyze their data. Topics of interest include anomaly detection, clustering,...

  13. Diagnostic Algorithm Benchmarking

    Poll, Scott


    A poster for the NASA Aviation Safety Program Annual Technical Meeting. It describes empirical benchmarking on diagnostic algorithms using data from the ADAPT Electrical Power System testbed and a diagnostic software framework.

  14. Inclusive Flavour Tagging Algorithm

    Likhomanenko, Tatiana; Derkach, Denis; Rogozhnikov, Alex


    Identifying the flavour of neutral B mesons at production is one of the most important components needed in the study of time-dependent CP violation. The harsh environment of the Large Hadron Collider makes it particularly hard to succeed in this task. We present an inclusive flavour-tagging algorithm as an upgrade of the algorithms currently used by the LHCb experiment. Specifically, a probabilistic model which efficiently combines information from reconstructed vertices and tracks using machine learning is proposed. The algorithm does not use information about the underlying physics process. It reduces the dependence on the performance of lower-level identification capacities and thus increases the overall performance. The proposed inclusive flavour-tagging algorithm is applicable to tag the flavour of B mesons in any proton-proton experiment.

  15. Algorithmic phase diagrams

    Hockney, Roger


    Algorithmic phase diagrams are a neat and compact representation of the results of comparing the execution time of several algorithms for the solution of the same problem. As an example, the recent results of Gannon and Van Rosendale on the solution of multiple tridiagonal systems of equations are shown in the form of such diagrams. The act of preparing these diagrams has revealed an unexpectedly complex relationship between the best algorithm and the number and size of the tridiagonal systems, which was not evident from the algebraic formulae in the original paper. Even so, for a particular computer, one diagram suffices to predict the best algorithm for all problems that are likely to be encountered, the prediction being read directly from the diagram without complex calculation.

  16. Web Classification Using DYN FP Algorithm

    Bhanu Pratap Singh


    Full Text Available Web mining is the application of data mining techniques to extract knowledge from the Web. Web mining has been explored to a vast degree, and different techniques have been proposed for a variety of applications that include Web search, classification and personalization. The primary goal of a web site is to provide relevant information to its users. Web mining techniques are used to categorize users and pages by analyzing user behavior, the content of pages and the order of URLs accessed. This paper proposes an auto-classification algorithm for web pages using data mining techniques. It considers the problem of discovering association rules between terms in a set of web pages belonging to a category in a search engine database, and presents an auto-classification algorithm for solving this problem that is fundamentally based on the FP-growth algorithm.

  17. Concurrent bisimulation algorithm

    Kułakowski, Konrad


    The coarsest bisimulation-finding problem plays an important role in the formal analysis of concurrent systems. For example, solving this problem allows the behavior of different processes to be compared or specifications to be verified. Hence, in this paper an efficient concurrent bisimulation algorithm is presented. It is based on the sequential Paige and Tarjan algorithm and the concept of the state signatures. The original solution follows Hopcroft's principle "process the smaller half". ...

  18. Parallel Wolff Cluster Algorithms

    Bae, S.; Ko, S. H.; Coddington, P. D.

    The Wolff single-cluster algorithm is the most efficient method known for Monte Carlo simulation of many spin models. Due to the irregular size, shape and position of the Wolff clusters, this method does not easily lend itself to efficient parallel implementation, so that simulations using this method have thus far been confined to workstations and vector machines. Here we present two parallel implementations of this algorithm, and show that one gives fairly good performance on a MIMD parallel computer.

  19. Implementation of Parallel Algorithms


    ... their social relations or to achieve some goals. For example, we define a pair-wise force law of repulsion and attraction for a group of identical ... quantization-based compression schemes. Photo-refractive crystals, which provide high-density recording in real time, are used as our holographic media. ... of Parallel Algorithms (J. Reif, ed.), Kluwer Academic Publishers, 1993. (4) "A Dynamic Separator Algorithm", D. Armon and J. Reif. To appear in ...

  20. Quantum algorithmic information theory

    Svozil, Karl


    The agenda of quantum algorithmic information theory, ordered `top-down,' is the quantum halting amplitude, followed by the quantum algorithmic information content, which in turn requires the theory of quantum computation. The fundamental atoms processed by quantum computation are the quantum bits which are dealt with in quantum information theory. The theory of quantum computation will be based upon a model of universal quantum computer whose elementary unit is a two-port interferometer capa...

  1. On Quantum Algorithms

    Cleve, R; Henderson, L; Macchiavello, C; Mosca, M


    Quantum computers use the quantum interference of different computational paths to enhance correct outcomes and suppress erroneous outcomes of computations. In effect, they follow the same logical paradigm as (multi-particle) interferometers. We show how most known quantum algorithms, including quantum algorithms for factorising and counting, may be cast in this manner. Quantum searching is described as inducing a desired relative phase between two eigenvectors to yield constructive interference on the sought elements and destructive interference on the remaining terms.

  2. BESⅢ track fitting algorithm

    WANG Ji-Ke; MAO Ze-Pu; BIAN Jian-Ming; CAO Guo-Fu; CAO Xue-Xiang; CHEN Shen-Jian; DENG Zi-Yan; FU Cheng-Dong; GAO Yuan-Ning; HE Kang-Lin; HE Miao; HUA Chun-Fei; HUANG Bin; HUANG Xing-Tao; JI Xiao-Sin; LI Fei; LI Hai-Bo; LI Wei-Dong; LIANG Yu-Tie; LIU Chun-Xiu; LIU Huai-Min; LIU Suo; LIU Ying-Jie; MA Qiu-Mei; MA Xiang; MAO Ya-Jun; MO Xiao-Hu; PAN Ming-Hua; PANG Cai-Ying; PING Rong-Gang; QIN Ya-Hong; QIU Jin-Fa; SUN Sheng-Sen; SUN Yong-Zhao; WANG Liang-Liang; WEN Shuo-Pin; WU Ling-Hui; XIE Yu-Guang; XU Min; YAN Liang; YOU Zheng-Yun; YUAN Chang-Zheng; YUAN Ye; ZHANG Bing-Yun; ZHANG Chang-Chun; ZHANG Jian-Yong; ZHANG Xue-Yao; ZHANG Yao; ZHENG Yang-Heng; ZHU Ke-Jun; ZHU Yong-Sheng; ZHU Zhi-Li; ZOU Jia-Heng


    A track fitting algorithm based on the Kalman filter method has been developed for BESⅢ at BEPCⅡ. The effects of multiple scattering and energy loss when charged particles pass through the detector, non-uniformity of the magnetic field (NUMF), wire sag, etc., have been carefully handled. This algorithm works well, and its performance satisfies the physics requirements as tested with simulation data.

  3. Optimization algorithms and applications

    Arora, Rajesh Kumar


    Choose the correct solution method for your optimization problem. Optimization: Algorithms and Applications presents a variety of solution techniques for optimization problems, emphasizing concepts rather than rigorous mathematical details and proofs. The book covers both gradient and stochastic methods as solution techniques for unconstrained and constrained optimization problems. It discusses the conjugate gradient method, Broyden-Fletcher-Goldfarb-Shanno algorithm, Powell method, penalty function, augmented Lagrange multiplier method, sequential quadratic programming, method of feasible direc

  4. MAKHA—A New Hybrid Swarm Intelligence Global Optimization Algorithm

    Ahmed M.E. Khalil


    Full Text Available The search for efficient and reliable bio-inspired optimization methods continues to be an active topic of research due to the wide application of the developed methods. In this study, we developed a reliable and efficient optimization method via the hybridization of two bio-inspired swarm intelligence optimization algorithms, namely, the Monkey Algorithm (MA) and the Krill Herd Algorithm (KHA). The hybridization made use of the efficient steps in each of the two original algorithms and provided a better balance between the exploration/diversification steps and the exploitation/intensification steps. The new hybrid algorithm, MAKHA, was rigorously tested with 27 benchmark problems and its results were compared with the results of the two original algorithms. MAKHA proved to be considerably more reliable and more efficient on the tested problems.

  5. The Motif Tracking Algorithm


    The search for patterns or motifs in data represents a problem area of key interest to finance and economic researchers. In this paper, we introduce the motif tracking algorithm (MTA), a novel immune inspired (IS) pattern identification tool that is able to identify unknown motifs of a non specified length which repeat within time series data. The power of the algorithm comes from the fact that it uses a small number of parameters with minimal assumptions regarding the data being examined or the underlying motifs. Our interest lies in applying the algorithm to financial time series data to identify unknown patterns that exist. The algorithm is tested using three separate data sets. Particular suitability to financial data is shown by applying it to oil price data. In all cases, the algorithm identifies the presence of a motif population in a fast and efficient manner due to the utilization of an intuitive symbolic representation. The resulting population of motifs is shown to have considerable potential value for other applications such as forecasting and algorithm seeding.

  6. Fast Local Computation Algorithms

    Rubinfeld, Ronitt; Vardi, Shai; Xie, Ning


    For input $x$, let $F(x)$ denote the set of outputs that are the "legal" answers for a computational problem $F$. Suppose $x$ and members of $F(x)$ are so large that there is not time to read them in their entirety. We propose a model of local computation algorithms which, for a given input $x$, support queries by a user to values of specified locations $y_i$ in a legal output $y \in F(x)$. When more than one legal output $y$ exists for a given $x$, the local computation algorithm should output in a way that is consistent with at least one such $y$. Local computation algorithms are intended to distill the common features of several concepts that have appeared in various algorithmic subfields, including local distributed computation, local algorithms, locally decodable codes, and local reconstruction. We develop a technique, based on known constructions of small sample spaces of $k$-wise independent random variables and Beck's analysis in his algorithmic approach to the Lovász Local Lemma, which und...

  7. RFID Location Algorithm

    Wang Zi Min


    Full Text Available With the development of social services and the further improvement of people's living standards, there is an urgent need for positioning technology that can adapt to complex new situations. In recent years, RFID technology has found a wide range of applications in many aspects of life and production, such as logistics tracking, car alarms and the security of other items. Using RFID technology for localization is a new research direction in the eyes of various research institutions and scholars. RFID positioning systems offer stability, small errors and low cost, and their location algorithms are the focus of this study. This article analyzes RFID positioning methods and algorithms layer by layer. First, several common basic RFID methods are introduced; secondly, a higher-accuracy location method is described; finally, the LANDMARC algorithm is presented. From this it can be seen that advanced and efficient algorithms play an important role in increasing RFID positioning accuracy. Finally, the location algorithms of RFID technology are summarized, deficiencies in the algorithms are pointed out, and requirements for follow-up study are put forward, together with a vision of better future RFID positioning technology.
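
    As an illustration, the hedged Python sketch below shows the k-nearest reference-tag weighting usually associated with LANDMARC: reference tags whose signal-strength vectors are closest to the tracked tag's vector contribute most to the position estimate. The array shapes and parameter values are assumptions for the example.

        import numpy as np

        def landmarc_estimate(tag_rss, ref_rss, ref_pos, k=4):
            """tag_rss: (n_readers,), ref_rss: (n_refs, n_readers), ref_pos: (n_refs, 2)."""
            e = np.linalg.norm(ref_rss - tag_rss, axis=1)   # distance in signal space
            idx = np.argsort(e)[:k]                         # k nearest reference tags
            w = 1.0 / (e[idx] ** 2 + 1e-9)                  # closer tags weigh more
            return (w / w.sum()) @ ref_pos[idx]             # weighted position estimate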

  8. An algorithm for visual fields.

    Trobe, J D; Acosta, P C


    The authors present an algorithm, or sequential strategy, for the performance and interpretation of visual fields. Based on the principle that particular areas of the visual field are relatively vulnerable and that particular defect configurations are relatively more diagnostic, the strategy proposes certain "core maneuvers" for all initial examinations, followed by selective exploration of the vertical meridian "rectangle" and central "keyhole." This constitutes the "qualitative" portion of perimetry and combines threshold kinetic and suprathreshold static techniques. If time and patience permit, qualitative perimetry is followed by quantitative definition of defect size, depth and slope, using the requisite number of stimuli. This approach may be adapted to all visual field instruments. The interpretation of the field defects is based on assessing their localizing features.

  9. Predicting mining activity with parallel genetic algorithms

    Talaie, S.; Leigh, R.; Louis, S.J.; Raines, G.L.; Beyer, H.G.; O'Reilly, U.M.; Banzhaf, Arnold D.; Blum, W.; Bonabeau, C.; Cantu-Paz, E.W.


    We explore several different techniques in our quest to improve the overall model performance of a genetic algorithm calibrated probabilistic cellular automata. We use the Kappa statistic to measure correlation between ground truth data and data predicted by the model. Within the genetic algorithm, we introduce a new evaluation function sensitive to spatial correctness and we explore the idea of evolving different rule parameters for different subregions of the land. We reduce the time required to run a simulation from 6 hours to 10 minutes by parallelizing the code and employing a 10-node cluster. Our empirical results suggest that using the spatially sensitive evaluation function does indeed improve the performance of the model and our preliminary results also show that evolving different rule parameters for different regions tends to improve overall model performance. Copyright 2005 ACM.
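
    For concreteness, the hedged Python sketch below computes the Kappa statistic from two equal-length label arrays (predicted and ground-truth classes); it shows the standard chance-corrected agreement formula, not the authors' implementation, and the example labels are made up.

        import numpy as np

        def kappa(pred, truth):
            """Cohen's kappa: chance-corrected agreement between two labelings."""
            pred, truth = np.asarray(pred), np.asarray(truth)
            labels = np.unique(np.concatenate([pred, truth]))
            po = np.mean(pred == truth)                                          # observed agreement
            pe = sum(np.mean(pred == c) * np.mean(truth == c) for c in labels)   # chance agreement
            return (po - pe) / (1.0 - pe)

        print(kappa([1, 0, 1, 1, 0, 1], [1, 0, 0, 1, 0, 1]))   # about 0.67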

  10. An Ant-Based Model for Multiple Sequence Alignment

    Guinand, Frédéric


    Multiple sequence alignment is a key process in today's biology, and finding a relevant alignment of several sequences is much more challenging than just optimizing some improbable evaluation functions. Our approach for addressing multiple sequence alignment focuses on the building of structures in a new graph model: the factor graph model. This model relies on block-based formulation of the original problem, formulation that seems to be one of the most suitable ways for capturing evolutionary aspects of alignment. The structures are implicitly built by a colony of ants laying down pheromones in the factor graphs, according to relations between blocks belonging to the different sequences.
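
    The hedged Python sketch below shows the generic pheromone update that drives this kind of ant-based construction, evaporation followed by deposits on the components used by good solutions; it illustrates the mechanism only and does not reproduce the factor-graph model of the paper.

        def update_pheromone(tau, solutions, qualities, rho=0.1):
            """tau: dict component -> pheromone; solutions: lists of components."""
            for c in tau:
                tau[c] *= (1.0 - rho)                   # evaporation
            for sol, q in zip(solutions, qualities):
                for c in sol:
                    tau[c] = tau.get(c, 0.0) + q        # reinforce good solutions
            return tau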

  11. DAMEWARE - Data Mining & Exploration Web Application Resource

    Brescia, Massimo; Esposito, Francesco; Fiore, Michelangelo; Garofalo, Mauro; Guglielmo, Magda; Longo, Giuseppe; Manna, Francesco; Nocella, Alfonso; Vellucci, Civita


    Astronomy is undergoing a methodological revolution triggered by an unprecedented wealth of complex and accurate data. DAMEWARE (DAta Mining & Exploration Web Application REsource) is a general purpose, Web-based, Virtual Observatory compliant, distributed data mining framework specialized in the exploration of massive data sets with machine learning methods. It allows the scientific community to perform data mining and exploratory experiments on massive data sets by using a simple web browser. DAMEWARE offers several tools which can be seen as working environments in which to choose data analysis functionalities such as clustering, classification, regression and feature extraction, together with models and algorithms.

  12. Adaptive symbiotic organisms search (SOS) algorithm for structural design optimization

    Ghanshyam G. Tejani


    Full Text Available The symbiotic organisms search (SOS) algorithm is an effective metaheuristic developed in 2014, which mimics the symbiotic relationships among living beings, such as mutualism, commensalism, and parasitism, that help them survive in the ecosystem. In this study, three modified versions of the SOS algorithm are proposed by introducing adaptive benefit factors into the basic SOS algorithm to improve its efficiency. The basic SOS algorithm only considers benefit factors, whereas the proposed variants of the SOS algorithm consider effective combinations of adaptive benefit factors and benefit factors to study their ability to strike a good balance between exploration and exploitation of the search space. The proposed algorithms are tested on engineering structures subjected to dynamic excitation, which may lead to undesirable vibrations. Structure optimization problems become more challenging if the shape and size variables are taken into account along with the frequency. To check the feasibility and effectiveness of the proposed algorithms, six different planar and space trusses are subjected to experimental analysis. The results obtained using the proposed methods are compared with those obtained using other optimization methods well established in the literature. The results reveal that the adaptive SOS algorithm is more reliable and efficient than the basic SOS algorithm and other state-of-the-art algorithms.
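
    For reference, the hedged Python sketch below shows the mutualism phase of the basic SOS algorithm for minimization, with the ordinary benefit factors drawn from {1, 2}; the adaptive benefit factors proposed in the paper are not reproduced.

        import numpy as np

        def mutualism(pop, fitness, i, j, best, objective, rng=None):
            """Organisms i and j move toward the best one through a mutual vector."""
            rng = rng or np.random.default_rng()
            mutual = (pop[i] + pop[j]) / 2.0
            bf1, bf2 = rng.integers(1, 3, size=2)       # benefit factors in {1, 2}
            xi_new = pop[i] + rng.random(pop[i].shape) * (pop[best] - mutual * bf1)
            xj_new = pop[j] + rng.random(pop[j].shape) * (pop[best] - mutual * bf2)
            for k, x_new in ((i, xi_new), (j, xj_new)): # greedy replacement
                f_new = objective(x_new)
                if f_new < fitness[k]:
                    pop[k], fitness[k] = x_new, f_new
            return pop, fitness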

  13. Algebraic dynamics algorithm: Numerical comparison with Runge-Kutta algorithm and symplectic geometric algorithm

    WANG ShunJin; ZHANG Hua


    Based on the exact analytical solution of ordinary differential equations, a truncation of the Taylor series of the exact solution to the Nth order leads to the Nth order algebraic dynamics algorithm. A detailed numerical comparison is presented with Runge-Kutta algorithm and symplectic geometric algorithm for 12 test models. The results show that the algebraic dynamics algorithm can better preserve both geometrical and dynamical fidelity of a dynamical system at a controllable precision, and it can solve the problem of algorithm-induced dissipation for the Runge-Kutta algorithm and the problem of algorithm-induced phase shift for the symplectic geometric algorithm.

  14. Algebraic dynamics algorithm: Numerical comparison with Runge-Kutta algorithm and symplectic geometric algorithm


    Based on the exact analytical solution of ordinary differential equations, a truncation of the Taylor series of the exact solution to the Nth order leads to the Nth order algebraic dynamics algorithm. A detailed numerical comparison is presented with Runge-Kutta algorithm and symplectic geometric algorithm for 12 test models. The results show that the algebraic dynamics algorithm can better preserve both geometrical and dynamical fidelity of a dynamical system at a controllable precision, and it can solve the problem of algorithm-induced dissipation for the Runge-Kutta algorithm and the problem of algorithm-induced phase shift for the symplectic geometric algorithm.

  15. Parallelization of game theoretic centrality algorithms

    M Vishnu Sankar; Balaraman Ravindran


    Communication has become a lot easier with the advent of easy and cheap means of reaching people across the globe. This has allowed the development of large networked communities and, with the technology available to track them, has opened up the study of social networks at unprecedented scales. This has necessitated the scaling up of the various network analysis algorithms that have been proposed earlier in the literature. While some algorithms can be readily adapted to large networks, in many cases the adaptation is not trivial. In this work, we explore the scaling up of a class of node centrality algorithms based on cooperative game theory, which were proposed earlier as efficient alternatives to traditional measures of information diffusion centrality. We present distributed versions of these algorithms in a Map-Reduce framework, currently the most popular distributed computing paradigm. We empirically demonstrate the scaling behavior of our algorithms on very large synthetic networks, thereby establishing the utility of these methods in settings such as online social networks.

  16. Focused Crawler Optimization Using Genetic Algorithm

    Hartanto Kusuma Wardana


    Full Text Available As the size of the Web continues to grow, searching it for useful information has become more difficult. A focused crawler intends to explore the Web conforming to a specific topic. This paper discusses the problems caused by local searching algorithms: the crawler can be trapped within a limited Web community and overlook suitable Web pages outside its track. A genetic algorithm, as a global searching algorithm, is modified to address these problems. The genetic algorithm is used to optimize Web crawling and to select more suitable Web pages to be fetched by the crawler. Several evaluation experiments are conducted to examine the effectiveness of the approach. The crawler delivers collections consisting of 3396 Web pages from 5390 visited links, a Roulette-Wheel selection filtering rate of 63% and a precision level of 93% in 5 different categories. The results show that the use of the genetic algorithm empowered the focused crawler to traverse the Web comprehensively, despite its relatively small collections. Furthermore, it shows great potential for building exemplary collections compared to traditional focused crawling methods.

  17. Hybridizing rapidly exploring random trees and basin hopping yields an improved exploration of energy landscapes.

    Roth, Christine-Andrea; Dreyfus, Tom; Robert, Charles H; Cazals, Frédéric


    The number of local minima of the potential energy landscape (PEL) of molecular systems generally grows exponentially with the number of degrees of freedom, so that a crucial property of PEL exploration algorithms is their ability to identify local minima which are low lying and diverse. In this work, we present a new exploration algorithm, retaining the ability of basin hopping (BH) to identify local minima, and that of transition based rapidly exploring random trees (T-RRT) to foster the exploration of yet unexplored regions. This ability is obtained by interleaving calls to the extension procedures of BH and T-RRT, and we show that tuning the balance between these two types of calls allows the algorithm to focus on low lying regions. Computational efficiency is obtained using state-of-the-art data structures, in particular for searching approximate nearest neighbors in metric spaces. We present results for BLN69, a protein model whose conformational space has dimension 207 and whose PEL has been studied exhaustively. On this system, we show that the propensity of our algorithm to explore low lying regions of the landscape significantly outperforms those of BH and T-RRT.

  18. Dynamic Harmony Search with Polynomial Mutation Algorithm for Valve-Point Economic Load Dispatch

    M. Karthikeyan


    mutation (DHSPM) algorithm to solve the ORPD problem. In the DHSPM algorithm, the key parameters of the HS algorithm, namely the harmony memory considering rate (HMCR) and the pitch adjusting rate (PAR), are changed dynamically, and there is no need to predefine these parameters. Additionally, polynomial mutation is inserted into the updating step of the HS algorithm to favor exploration and exploitation of the search space. The DHSPM algorithm is tested on three power system cases consisting of 3, 13, and 40 thermal units. The computational results show that the DHSPM algorithm is more effective in finding better solutions than other computational intelligence based methods.
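
    The hedged Python sketch below shows one harmony-search improvisation with HMCR and PAR changed linearly with the iteration counter, illustrating the dynamic-parameter idea; the particular schedules are assumptions, and the polynomial mutation of DHSPM is not reproduced.

        import numpy as np

        def improvise(memory, lower, upper, t, t_max, bw=0.01, rng=None):
            """Build one new harmony from the memory with time-varying HMCR and PAR."""
            rng = rng or np.random.default_rng()
            hmcr = 0.70 + 0.25 * t / t_max          # rely more on memory over time
            par = 0.45 - 0.25 * t / t_max           # adjust pitches less over time
            new = np.empty(memory.shape[1])
            for d in range(memory.shape[1]):
                if rng.random() < hmcr:             # take a value from the memory
                    new[d] = memory[rng.integers(memory.shape[0]), d]
                    if rng.random() < par:          # small pitch adjustment
                        new[d] += bw * (2 * rng.random() - 1) * (upper[d] - lower[d])
                else:                               # random re-initialization
                    new[d] = lower[d] + rng.random() * (upper[d] - lower[d])
            return np.clip(new, lower, upper)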

  19. Agency and Algorithms

    Hanns Holger Rutz


    Full Text Available Although the concept of algorithms was established a long time ago, their current topicality indicates a shift in the discourse. Classical definitions based on logic seem to be inadequate to describe their aesthetic capabilities. New approaches stress their involvement in material practices as well as their incompleteness. Algorithmic aesthetics can no longer be tied to the static analysis of programs, but must take into account the dynamic and experimental nature of coding practices. It is suggested that the aesthetic objects thus produced articulate something that could be called algorithmicity or the space of algorithmic agency. This is the space or the medium – following Luhmann’s form/medium distinction – where human and machine undergo mutual incursions. In the resulting coupled “extimate” writing process, human initiative and algorithmic speculation cannot be clearly divided out any longer. An observation is attempted of defining aspects of such a medium by drawing a trajectory across a number of sound pieces. The operation of exchange between form and medium I call reconfiguration and it is indicated by this trajectory.

  20. The Motif Tracking Algorithm

    Wilson, William; Aickelin, Uwe; 10.1007/s11633-008-0032-0


    The search for patterns or motifs in data represents a problem area of key interest to finance and economics researchers. In this paper we introduce the Motif Tracking Algorithm, a novel immune-inspired pattern identification tool that is able to identify unknown motifs of unspecified length which repeat within time series data. The power of the algorithm comes from the fact that it uses a small number of parameters with minimal assumptions regarding the data being examined or the underlying motifs. Our interest lies in applying the algorithm to financial time series data to identify unknown patterns that exist. The algorithm is tested using three separate data sets. Particular suitability to financial data is shown by applying it to oil price data. In all cases the algorithm identifies the presence of a motif population in a fast and efficient manner due to the utilisation of an intuitive symbolic representation. The resulting population of motifs is shown to have considerable potential value for other ap...

  1. A Parallel Butterfly Algorithm

    Poulson, Jack


    The butterfly algorithm is a fast algorithm which approximately evaluates a discrete analogue of the integral transform (Equation Presented.) at large numbers of target points when the kernel, K(x, y), is approximately low-rank when restricted to subdomains satisfying a certain simple geometric condition. In d dimensions with O(N^d) quasi-uniformly distributed source and target points, when each appropriate submatrix of K is approximately rank-r, the running time of the algorithm is at most O(r^2 N^d log N). A parallelization of the butterfly algorithm is introduced which, assuming a message latency of α and per-process inverse bandwidth of β, executes in at most (Equation Presented.) time using p processes. This parallel algorithm was then instantiated in the form of the open-source DistButterfly library for the special case where K(x, y) = exp(iΦ(x, y)), where Φ(x, y) is a black-box, sufficiently smooth, real-valued phase function. Experiments on Blue Gene/Q demonstrate impressive strong-scaling results for important classes of phase functions. Using quasi-uniform sources, hyperbolic Radon transforms and an analogue of a three-dimensional generalized Radon transform were observed to strong-scale from 1 node/16 cores up to 1024 nodes/16,384 cores with greater than 90% and 82% efficiency, respectively. © 2014 Society for Industrial and Applied Mathematics.

  2. Quantum CPU and Quantum Algorithm

    Wang, An Min


    Making use of a universal quantum network (QCPU) proposed by the author [My1], the whole quantum network that can implement some of the known quantum algorithms, including Deutsch's algorithm, the quantum Fourier transform, Shor's algorithm, and Grover's algorithm, is obtained.

  3. Law and Order in Algorithmics

    Fokkinga, M.M.


    An algorithm is the input-output effect of a computer program; mathematically, the notion of algorithm comes close to the notion of function. Just as arithmetic is the theory and practice of calculating with numbers, so is ALGORITHMICS the theory and practice of calculating with algorithms. Just as

  4. The experimental results on the quality of clustering diverse set of data using a modified algorithm chameleon

    Татьяна Борисовна Шатовская


    Full Text Available In this work, results of a modified Chameleon algorithm are discussed. Hierarchical multilevel algorithms consist of several stages: building the graph, coarsening, partitioning, and recovering. The main aim of the article is to explore clustering quality for diverse data sets under different combinations of algorithms at the different stages. A further aim is to improve the construction phase by optimizing the choice of k when building the k-nearest-neighbor graph, as sketched below.
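
    The first stage, building the k-nearest-neighbor graph, can be illustrated with a brief sketch. This is a generic brute-force construction, not the modified Chameleon implementation; the data points and the value of k are invented.

```python
import numpy as np

def knn_graph(points, k):
    """Return an adjacency list of the k-nearest-neighbor graph for row vectors in `points`."""
    points = np.asarray(points, dtype=float)
    # Pairwise squared Euclidean distances.
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)
    np.fill_diagonal(d2, np.inf)            # a point is not its own neighbor
    order = np.argsort(d2, axis=1)
    return {i: order[i, :k].tolist() for i in range(len(points))}

data = [[0, 0], [0, 1], [1, 0], [5, 5], [5, 6]]
print(knn_graph(data, k=2))
```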

  5. Handbook of Memetic Algorithms

    Cotta, Carlos; Moscato, Pablo


    Memetic Algorithms (MAs) are computational intelligence structures combining multiple and various operators in order to address optimization problems. The combination and interaction amongst operators evolves and promotes the diffusion of the most successful units and generates an algorithmic behavior which can handle complex objective functions and hard fitness landscapes. “Handbook of Memetic Algorithms” organizes, in a structured way, all the most important results in the field of MAs since their earliest definition until now. A broad review including various algorithmic solutions as well as successful applications is included in this book. Each class of optimization problems, such as constrained optimization, multi-objective optimization, continuous vs combinatorial problems, and uncertainties, is analysed separately and, for each problem, memetic recipes for tackling the difficulties are given with some successful examples. Although this book contains chapters written by multiple authors, ...

  6. Kernel Affine Projection Algorithms

    José C. Príncipe


    Full Text Available The combination of the famed kernel trick and affine projection algorithms (APAs) yields powerful nonlinear extensions, named collectively here KAPA. This paper is a follow-up study of the recently introduced kernel least-mean-square algorithm (KLMS). KAPA inherits the simplicity and online nature of KLMS while reducing its gradient noise, boosting performance. More interestingly, it provides a unifying model for several neural network techniques, including kernel least-mean-square algorithms, kernel adaline, sliding-window kernel recursive-least squares (KRLS), and regularization networks. Therefore, many insights can be gained into the basic relations among them and the tradeoff between computation complexity and performance. Several simulations illustrate its wide applicability.

  7. Kernel Affine Projection Algorithms

    Liu, Weifeng; Príncipe, José C.


    The combination of the famed kernel trick and affine projection algorithms (APAs) yields powerful nonlinear extensions, named collectively here, KAPA. This paper is a follow-up study of the recently introduced kernel least-mean-square algorithm (KLMS). KAPA inherits the simplicity and online nature of KLMS while reducing its gradient noise, boosting performance. More interestingly, it provides a unifying model for several neural network techniques, including kernel least-mean-square algorithms, kernel adaline, sliding-window kernel recursive-least squares (KRLS), and regularization networks. Therefore, many insights can be gained into the basic relations among them and the tradeoff between computation complexity and performance. Several simulations illustrate its wide applicability.
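
    The KLMS building block that KAPA extends can be sketched in a few lines. The code below is a minimal online kernel least-mean-square learner with a Gaussian kernel; the step size, kernel width, and toy regression task are arbitrary choices for illustration and are not taken from the paper.

```python
import numpy as np

class KLMS:
    """Kernel least-mean-square with a Gaussian kernel (a minimal online sketch)."""

    def __init__(self, step=0.2, width=1.0):
        self.step, self.width = step, width
        self.centers, self.coeffs = [], []

    def _kernel(self, a, b):
        return np.exp(-np.sum((a - b) ** 2) / (2 * self.width ** 2))

    def predict(self, x):
        return sum(c * self._kernel(x, u) for c, u in zip(self.coeffs, self.centers))

    def update(self, x, y):
        err = y - self.predict(x)
        # Each sample becomes a new kernel center weighted by the step-scaled error.
        self.centers.append(np.asarray(x, dtype=float))
        self.coeffs.append(self.step * err)
        return err

# Learn a nonlinear mapping online.
rng = np.random.default_rng(0)
model = KLMS()
for _ in range(500):
    x = rng.uniform(-3, 3, size=1)
    model.update(x, np.sin(x[0]))
print(round(model.predict(np.array([1.0])), 3))  # roughly sin(1) ≈ 0.84
```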

  8. The Retina Algorithm

    CERN. Geneva; PUNZI, Giovanni


    Charged-particle reconstruction is one of the most demanding computational tasks found in HEP, and it becomes increasingly important to perform it in real time. We envision that HEP would greatly benefit from achieving a long-term goal of making track reconstruction happen transparently as part of the detector readout ("detector-embedded tracking"). We describe here a track-reconstruction approach based on a massively parallel pattern-recognition algorithm, inspired by studies of the processing of visual images by the brain as it happens in nature ('RETINA algorithm'). It turns out that high-quality tracking in large HEP detectors is possible with very small latencies, when this algorithm is implemented in specialized processors, based on current state-of-the-art, high-speed/high-bandwidth digital devices.

  9. Algorithms in invariant theory

    Sturmfels, Bernd


    J. Kung and G.-C. Rota, in their 1984 paper, write: "Like the Arabian phoenix rising out of its ashes, the theory of invariants, pronounced dead at the turn of the century, is once again at the forefront of mathematics". The book of Sturmfels is both an easy-to-read textbook for invariant theory and a challenging research monograph that introduces a new approach to the algorithmic side of invariant theory. The Groebner bases method is the main tool by which the central problems in invariant theory become amenable to algorithmic solutions. Students will find the book an easy introduction to this "classical and new" area of mathematics. Researchers in mathematics, symbolic computation, and computer science will get access to a wealth of research ideas, hints for applications, outlines and details of algorithms, worked out examples, and research problems.

  10. Flow Based Algorithm

    T. Karpagam


    Full Text Available Problem statement: Network topology design problems find application in several real-life scenarios. Approach: Most designs in the past optimize for a single criterion, such as shortest path, cost minimization, or maximum flow. Results: This study discusses solving a multi-objective network topology design problem for a realistic traffic model, specifically in pipeline transportation. The flow-based algorithm presented here focuses on transporting liquid goods at maximum capacity over the shortest distance; it is developed in the spirit of basic PERT and the critical path method. Conclusion/Recommendations: This flow-based algorithm helps to give an optimal result for transporting maximum capacity at minimum cost. It could be used in juice factories and the milk industry, and it is a good alternative for the vehicle routing problem.

  11. Algorithms for Global Positioning

    Borre, Kai; Strang, Gilbert

    The emergence of satellite technology has changed the lives of millions of people. In particular, GPS has brought an unprecedented level of accuracy to the field of geodesy. This text is a guide to the algorithms and mathematical principles that account for the success of GPS technology. At the heart of the matter are the positioning algorithms on which GPS technology relies, the discussion of which will affirm the mathematical contents of the previous chapters. Numerous ready-to-use MATLAB codes are included for the reader. This comprehensive guide will be invaluable for engineers...

  12. Detection of algorithmic trading

    Bogoev, Dimitar; Karam, Arzé


    We develop a new approach to reflect the behavior of algorithmic traders. Specifically, we provide an analytical and tractable way to infer patterns of quote volatility and price momentum consistent with different types of strategies employed by algorithmic traders, and we propose two ratios to quantify these patterns. Quote volatility ratio is based on the rate of oscillation of the best ask and best bid quotes over an extremely short period of time; whereas price momentum ratio is based on identifying patterns of rapid upward or downward movement in prices. The two ratios are evaluated across several asset classes. We further run a two-stage Artificial Neural Network experiment on the quote volatility ratio; the first stage is used to detect the quote volatility patterns resulting from algorithmic activity, while the second is used to validate the quality of signal detection provided by our measure.

  13. A Generalized Jacobi Algorithm

    Vissing, S.; Krenk, S.

    An algorithm is developed for the generalized eigenvalue problem (A - λB)φ = 0 where A and B are real symmetric matrices. The matrices A and B are diagonalized simultaneously by a series of generalized Jacobi transformations and all eigenvalues and eigenvectors are obtained. A criterion expressed in terms of the transformation parameters is used to omit transformations leading to very small changes. The algorithm is described in pseudo code for lower triangular matrices A and B and implemented in the programming language C.

  14. Tiled QR factorization algorithms

    Bouwmeester, Henricus; Langou, Julien; Robert, Yves


    This work revisits existing algorithms for the QR factorization of rectangular matrices composed of p-by-q tiles, where p >= q. Within this framework, we study the critical paths and performance of algorithms such as Sameh and Kuck, Modi and Clarke, Greedy, and those found within PLASMA. Although neither Modi and Clarke nor Greedy is optimal, both are shown to be asymptotically optimal for all matrices of size p = q^2 f(q), where f is any function such that f(q) → 0 as q → +∞. This novel and important complexity result applies to all matrices where p and q are proportional, p = λq with λ >= 1, thereby encompassing many important situations in practice (least squares). We provide an extensive set of experiments that show the superiority of the new algorithms for tall matrices.

  15. An Ordering Linear Unification Algorithm



    In this paper, we present an ordering linear unification algorithm (OLU). A new idea on substitution of the binding terms is introduced to the algorithm, which is able to overcome some drawbacks of other algorithms, e.g., the MM algorithm [1] and the RG1 and RG2 algorithms [2]. In particular, if we use directed cyclic graphs, the algorithm need not check the binding order; the OLU algorithm can then also be applied to infinite tree data structures, and higher efficiency can be expected. The paper focuses on the discussion of the OLU algorithm and a partial order structure with respect to the unification algorithm. The algorithm has been implemented in the GKD-PROLOG/VAX 780 interpreting system. Experimental results have shown that the algorithm is very simple and efficient.

  16. A New Augmentation Based Algorithm for Extracting Maximal Chordal Subgraphs.

    Bhowmick, Sanjukta; Chen, Tzu-Yi; Halappanavar, Mahantesh


    A graph is chordal if every cycle of length greater than three contains an edge between non-adjacent vertices. Chordal graphs are of interest both theoretically, since they admit polynomial time solutions to a range of NP-hard graph problems, and practically, since they arise in many applications including sparse linear algebra, computer vision, and computational biology. A maximal chordal subgraph is a chordal subgraph that is not a proper subgraph of any other chordal subgraph. Existing algorithms for computing maximal chordal subgraphs depend on dynamically ordering the vertices, which is an inherently sequential process and therefore limits the algorithms' parallelizability. In this paper we explore techniques to develop a scalable parallel algorithm for extracting a maximal chordal subgraph. We demonstrate that an earlier attempt at developing a parallel algorithm may induce a non-optimal vertex ordering and is therefore not guaranteed to terminate with a maximal chordal subgraph. We then give a new algorithm that first computes and then repeatedly augments a spanning chordal subgraph. After proving that the algorithm terminates with a maximal chordal subgraph, we then demonstrate that this algorithm is more amenable to parallelization and that the parallel version also terminates with a maximal chordal subgraph. That said, the complexity of the new algorithm is higher than that of the previous parallel algorithm, although the earlier algorithm computes a chordal subgraph which is not guaranteed to be maximal. We experimented with our augmentation-based algorithm on both synthetic and real-world graphs. We provide scalability results and also explore the effect of different choices for the initial spanning chordal subgraph on both the running time and on the number of edges in the maximal chordal subgraph.

  17. Explorations in computing an introduction to computer science

    Conery, John S


    Introduction: Computation; The Limits of Computation; Algorithms; A Laboratory for Computational Experiments. The Ruby Workbench (introducing Ruby and the RubyLabs environment for computational experiments): Interactive Ruby; Numbers; Variables; Methods; RubyLabs. The Sieve of Eratosthenes (an algorithm for finding prime numbers): The Sieve Algorithm; The mod Operator; Containers; Iterators; Boolean Values and the delete_if Method; Exploring the Algorithm; The sieve Method; A Better Sieve; Experiments with the Sieve. A Journey of a Thousand Miles (iteration as a strategy for solving computational problems): Searching and Sortin...
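
    The central algorithm of the sieve chapter is compact enough to sketch here. The book works in Ruby; the version below is a plain Python rendering of the same idea, not the book's code.

```python
def sieve(n):
    """Return all primes <= n using the Sieve of Eratosthenes."""
    is_prime = [True] * (n + 1)
    is_prime[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            # Cross off every multiple of p starting at p*p.
            for m in range(p * p, n + 1, p):
                is_prime[m] = False
    return [i for i, flag in enumerate(is_prime) if flag]

print(sieve(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```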

  18. Algorithm for structure constants

    Paiva, F M


    In an n-dimensional Lie algebra, random numerical values are assigned by computer to n(n-1) especially selected structure constants. An algorithm is then created which calculates, without ambiguity, the remaining constants, obeying the Jacobi conditions. Unlike others, this algorithm is suitable even for a modest personal computer.

  19. Parallel Algorithms and Patterns

    Robey, Robert W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    This is a powerpoint presentation on parallel algorithms and patterns. A parallel algorithm is a well-defined, step-by-step computational procedure that emphasizes concurrency to solve a problem. Examples of problems include: Sorting, searching, optimization, matrix operations. A parallel pattern is a computational step in a sequence of independent, potentially concurrent operations that occurs in diverse scenarios with some frequency. Examples are: Reductions, prefix scans, ghost cell updates. We only touch on parallel patterns in this presentation. It really deserves its own detailed discussion which Gabe Rockefeller would like to develop.

  20. Algorithms for Reinforcement Learning

    Szepesvari, Csaba


    Reinforcement learning is a learning paradigm concerned with learning to control a system so as to maximize a numerical performance measure that expresses a long-term objective. What distinguishes reinforcement learning from supervised learning is that only partial feedback is given to the learner about the learner's predictions. Further, the predictions may have long term effects through influencing the future state of the controlled system. Thus, time plays a special role. The goal in reinforcement learning is to develop efficient learning algorithms, as well as to understand the algorithms'

  1. Randomized Filtering Algorithms

    Katriel, Irit; Van Hentenryck, Pascal


    of AllDifferent and its generalization, the Global Cardinality Constraint. The first delayed filtering scheme is a Monte Carlo algorithm: its running time is superior, in the worst case, to that of enforcing arc consistency after every domain event, while its filtering effectiveness is analyzed in the expected sense. The second scheme is a Las Vegas algorithm using filtering triggers: its effectiveness is the same as enforcing arc consistency after every domain event, while in the expected case it is faster by a factor of m/n, where n and m are, respectively, the number of nodes and edges...

  2. Improved Chaff Solution Algorithm


    Under the Technology Demonstration Program (PDT) on the integration of sensors and shipboard weapon systems (SISWS), an algorithm was developed to automatically determine...

  3. Wireless communications algorithmic techniques

    Vitetta, Giorgio; Colavolpe, Giulio; Pancaldi, Fabrizio; Martin, Philippa A


    This book introduces the theoretical elements at the basis of various classes of algorithms commonly employed in the physical layer (and, in part, in the MAC layer) of wireless communications systems. It focuses on single-user systems, so ignoring multiple access techniques. Moreover, emphasis is put on single-input single-output (SISO) systems, although some relevant topics about multiple-input multiple-output (MIMO) systems are also illustrated. Comprehensive wireless-specific guide to algorithmic techniques. Provides a detailed analysis of channel equalization and channel coding for wi...

  4. Secondary Vertex Finder Algorithm

    Heer, Sebastian; The ATLAS collaboration


    If a jet originates from a b-quark, a b-hadron is formed during the fragmentation process. In its dominant decay modes, the b-hadron decays into a c-hadron via the electroweak interaction. Both b- and c-hadrons have lifetimes long enough, to travel a few millimetres before decaying. Thus displaced vertices from b- and subsequent c-hadron decays provide a strong signature for a b-jet. Reconstructing these secondary vertices (SV) and their properties is the aim of this algorithm. The performance of this algorithm is studied with tt̄ events, requiring at least one lepton, simulated at 13 TeV.

  5. A Subspace Algorithm

    Vissing, S.; Hededal, O.

    An algorithm is presented for computing the m smallest eigenvalues and corresponding eigenvectors of the generalized eigenvalue problem (A - λB)Φ = 0, where A and B are real n x n symmetric matrices. In an iteration scheme the matrices A and B are projected simultaneously onto an m-dimensional subspace in order to establish and solve a symmetric generalized eigenvalue problem in the subspace. The algorithm is described in pseudo code and implemented in the C programming language for lower triangular matrices A and B. The implementation includes procedures for selecting initial iteration vectors...

  6. New Optimization Algorithms in Physics

    Hartmann, Alexander K


    Many physicists are not aware of the fact that they can solve their problems by applying optimization algorithms. Since the number of such algorithms is steadily increasing, many new algorithms have not been presented comprehensively until now. This presentation of recently developed algorithms applied in physics, including demonstrations of how they work and related results, aims to encourage their application, and as such the algorithms selected cover concepts and methods from statistical physics to optimization problems emerging in theoretical computer science.

  7. Birth Control Explorer


  8. Automatic design of decision-tree algorithms with evolutionary algorithms.

    Barros, Rodrigo C; Basgalupp, Márcio P; de Carvalho, André C P L F; Freitas, Alex A


    This study reports the empirical analysis of a hyper-heuristic evolutionary algorithm that is capable of automatically designing top-down decision-tree induction algorithms. Top-down decision-tree algorithms are of great importance, considering their ability to provide an intuitive and accurate knowledge representation for classification problems. The automatic design of these algorithms seems timely, given the large literature accumulated over more than 40 years of research in the manual design of decision-tree induction algorithms. The proposed hyper-heuristic evolutionary algorithm, HEAD-DT, is extensively tested using 20 public UCI datasets and 10 microarray gene expression datasets. The algorithms automatically designed by HEAD-DT are compared with traditional decision-tree induction algorithms, such as C4.5 and CART. Experimental results show that HEAD-DT is capable of generating algorithms which are significantly more accurate than C4.5 and CART.

  9. Application of Imperialist Competitive Algorithm on Solving the Traveling Salesman Problem

    Shuhui Xu


    Full Text Available The imperialist competitive algorithm (ICA) is a new heuristic algorithm proposed for continuous optimization problems. The research about its application to solving the traveling salesman problem (TSP) is still very limited. Aiming to explore its ability on solving the TSP, we present a discrete imperialist competitive algorithm in this paper. The proposed algorithm modifies the original rules of assimilation and introduces the 2-opt algorithm into the revolution process. To examine its performance, we tested the proposed algorithm on 10 small-scale and 2 large-scale standard benchmark instances from the TSPLIB and compared the experimental results with those obtained by two other ICA-based algorithms and six other existing algorithms. The proposed algorithm shows excellent performance in the experiments and comparisons.
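
    The 2-opt move mentioned above is a standard local-search step and easy to sketch. The snippet below is a generic 2-opt improvement loop on a toy distance matrix, not the paper's discrete ICA; the matrix and starting tour are made up.

```python
def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt(tour, dist):
    """Repeatedly reverse tour segments while doing so shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                candidate = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                if tour_length(candidate, dist) < tour_length(tour, dist):
                    tour, improved = candidate, True
    return tour

# Tiny symmetric distance matrix for 4 cities.
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
print(two_opt([0, 2, 1, 3], dist))
```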

  10. An ensemble symbiosis organisms search algorithm and its application to real world problems

    Sukanta Nama


    Full Text Available In this study, an ensemble algorithm called Quasi-Oppositional Symbiosis Organisms Search (QOSOS) has been proposed by incorporating the quasi-oppositional based learning (QOBL) strategy into the recently proposed Symbiosis Organisms Search (SOS) algorithm for solving unconstrained global optimization problems. QOBL is incorporated into the basic SOS algorithm to balance the exploration capability of QOBL and the exploitation potential of the SOS algorithm. To validate the efficiency and robustness of the proposed QOSOS algorithm, it is applied to solve unconstrained global optimization problems. The proposed QOSOS algorithm is also applied to solve two real-world global optimization problems: a gas transmission compressor design optimization problem and the optimal capacity of gas production facilities optimization problem. The performance of the QOSOS algorithm is extensively evaluated and compares favorably with many progressive algorithms.
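
    The quasi-oppositional step can be illustrated on its own. The sketch below generates a quasi-opposite candidate for a real-valued solution vector under the usual QOBL definition (a uniform draw between the interval centre and the opposite point); it is not tied to the QOSOS implementation, and the bounds and vector are invented.

```python
import random

def quasi_opposite(x, lower, upper):
    """Quasi-opposite of a candidate solution, component by component.

    For each dimension the quasi-opposite value is drawn uniformly between the
    interval centre c = (a + b) / 2 and the opposite point xo = a + b - x.
    """
    qo = []
    for xi, a, b in zip(x, lower, upper):
        c = (a + b) / 2.0
        xo = a + b - xi
        lo, hi = (c, xo) if c <= xo else (xo, c)
        qo.append(random.uniform(lo, hi))
    return qo

x = [1.0, -2.5, 4.0]
print(quasi_opposite(x, lower=[-5, -5, -5], upper=[5, 5, 5]))
```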

  11. Multiple parts process planning in serial–parallel flexible flow lines: part II—solution method based on genetic algorithms with fixed- and variable-length chromosomes

    Musharavati, Farayi; Hamouda, Abdelmagid Salem


    Multiple parts process planning (MPPP) is a hard optimization problem that requires the rigor and intensity of metaheuristic-based algorithms such as simulated annealing and genetic algorithms. In this paper, a solution method for this problem is developed based on genetic algorithms. Genetic algorithms solve problems by exploring a given search space. To do this, a landscape over which the search traverses is constructed based on a number of algorithm choices. Key algorithm choices include (...

  12. Intelligent Unmanned Explorer for Deep Space Exploration

    Kubota, T


    asteroids or comets have received remarkable attention in the world. In small-body exploration, detailed in-situ surface exploration by a tiny rover is an effective and fruitful means and is expected to make strong contributions towards scientific studies. JAXA ISAS is promoting the MUSES-C mission, the world's first attempt to sample and return material from a near-Earth asteroid. The Hayabusa spacecraft in the MUSES-C mission carried the tiny rover, which was expected to perform in-situ surface exploration by hopping. This paper describes the system design, mobility, and intelligence of the developed unmanned explorer. This paper also presents the ground experimental results and the flight results.

  13. Python algorithms mastering basic algorithms in the Python language

    Hetland, Magnus Lie


    Python Algorithms, Second Edition explains the Python approach to algorithm analysis and design. Written by Magnus Lie Hetland, author of Beginning Python, this book is sharply focused on classical algorithms, but it also gives a solid understanding of fundamental algorithmic problem-solving techniques. The book deals with some of the most important and challenging areas of programming and computer science in a highly readable manner. It covers both algorithmic theory and programming practice, demonstrating how theory is reflected in real Python programs. Well-known algorithms and data struc

  14. Comprehensive eye evaluation algorithm

    Agurto, C.; Nemeth, S.; Zamora, G.; Vahtel, M.; Soliz, P.; Barriga, S.


    In recent years, several research groups have developed automatic algorithms to detect diabetic retinopathy (DR) in individuals with diabetes (DM), using digital retinal images. Studies have indicated that diabetics have 1.5 times the annual risk of developing primary open angle glaucoma (POAG) as do people without DM. Moreover, DM patients have 1.8 times the risk for age-related macular degeneration (AMD). Although numerous investigators are developing automatic DR detection algorithms, there have been few successful efforts to create an automatic algorithm that can detect other ocular diseases, such as POAG and AMD. Consequently, our aim in the current study was to develop a comprehensive eye evaluation algorithm that not only detects DR in retinal images, but also automatically identifies glaucoma suspects and AMD by integrating other personal medical information with the retinal features. The proposed system is fully automatic and provides the likelihood of each of the three eye diseases. The system was evaluated in two datasets of 104 and 88 diabetic cases. For each eye, we used two non-mydriatic digital color fundus photographs (macula and optic disc centered) and, when available, information about age, duration of diabetes, cataracts, hypertension, gender, and laboratory data. Our results show that the combination of multimodal features can increase the AUC by up to 5%, 7%, and 8% in the detection of AMD, DR, and glaucoma respectively. Marked improvement was achieved when laboratory results were combined with retinal image features.

  15. de Casteljau's Algorithm Revisited

    Gravesen, Jens


    It is demonstrated how all the basic properties of Bezier curves can be derived swiftly and efficiently without any reference to the Bernstein polynomials and essentially with only geometric arguments. This is achieved by viewing one step in de Casteljau's algorithm as an operator (the de Casteljau...
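
    Since the algorithm itself is only a few lines of repeated linear interpolation, a small sketch is included here. It is a generic textbook rendering in Python, not the paper's operator formulation; the control points are arbitrary.

```python
def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at parameter t by repeated linear interpolation."""
    points = [tuple(p) for p in control_points]
    while len(points) > 1:
        points = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
                  for p, q in zip(points, points[1:])]
    return points[0]

# Quadratic Bezier curve through three control points, evaluated at its midpoint.
print(de_casteljau([(0, 0), (1, 2), (2, 0)], 0.5))  # (1.0, 1.0)
```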

  16. Algorithmic information theory

    P.D. Grünwald; P.M.B. Vitányi


    We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain the main concepts of this quantitative approach to defining `information'. We discuss the extent to which Kolmogorov's and Shannon's information theory have a common purpose, and where they are

  17. Algorithmic information theory

    P.D. Grünwald; P.M.B. Vitányi


    We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain the main concepts of this quantitative approach to defining 'information'. We discuss the extent to which Kolmogorov's and Shannon's information theory have a common purpose, and where they are

  18. Algorithmic information theory

    Grünwald, P.D.; Vitányi, P.M.B.


    We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain the main concepts of this quantitative approach to defining `information'. We discuss the extent to which Kolmogorov's and Shannon's information theory have a common purpose, and where they are

  19. Algorithmic information theory

    Grünwald, P.D.; Vitányi, P.M.B.; Adriaans, P.; van Benthem, J.


    We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain the main concepts of this quantitative approach to defining 'information'. We discuss the extent to which Kolmogorov's and Shannon's information theory have a common purpose, and where they are

  20. Incremental algorithms on lists

    Jeuring, J.T.


    Incremental computations can improve the performance of interactive programs such as spreadsheet programs, program development environments, text editors, etc. Incremental algorithms describe how to compute a required value depending on the input, after the input has been edited. By considering the

  1. Logic of Algorithmic Knowledge

    Surowik Dariusz


    Full Text Available In this paper we consider the construction of a LAK system of temporal-epistemic logic which is used to formally describe algorithmic knowledge. We propose an axiom system of LAK and discuss the basic properties of this logic.

  2. Modular Regularization Algorithms

    Jacobsen, Michael


    The class of linear ill-posed problems is introduced along with a range of standard numerical tools and basic concepts from linear algebra, statistics and optimization. Known algorithms for solving linear inverse ill-posed problems are analyzed to determine how they can be decomposed...

  3. Modular Regularization Algorithms

    Jacobsen, Michael


    The class of linear ill-posed problems is introduced along with a range of standard numerical tools and basic concepts from linear algebra, statistics and optimization. Known algorithms for solving linear inverse ill-posed problems are analyzed to determine how they can be decomposed into independent modules. These modules are then combined to form new regularization algorithms with other properties than those we started out with. Several variations are tested using the Matlab toolbox MOORe Tools created in connection with this thesis. Object oriented programming techniques are explained and used to set up the ill-posed problems in the toolbox. Hereby, we are able to write regularization algorithms that automatically exploit structure in the ill-posed problem without being rewritten explicitly. We explain how to implement a stopping criterion for a parameter choice method based upon...

  4. The Xmath Integration Algorithm

    Bringslid, Odd


    The projects Xmath (Bringslid and Canessa, 2002) and dMath (Bringslid, de la Villa and Rodriguez, 2007) were supported by the European Commission in the so called Minerva Action (Xmath) and The Leonardo da Vinci programme (dMath). The Xmath eBook (Bringslid, 2006) includes algorithms into a wide range of undergraduate mathematical issues embedded…

  5. CORDIC Algorithm for WLAN

    Mr. Gadekar S. R


    Full Text Available This research aims to implement the CORDIC algorithm for WLAN. The design is coded using the VHDL language, and for the hardware implementation a Xilinx Spartan-3 FPGA is used. The VHDL implementation is based on results obtained from Xilinx ISE simulation.
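
    For reference, the arithmetic behind CORDIC is sketched below in floating-point Python; a hardware (VHDL/FPGA) version would replace the multiplications by 2^-i with bit shifts and work in fixed point. This is a generic rotation-mode CORDIC for sine and cosine, not the paper's design; the iteration count is arbitrary.

```python
import math

def cordic_sin_cos(angle, iterations=32):
    """Approximate (sin, cos) of `angle` (radians, |angle| <= pi/2) with CORDIC micro-rotations."""
    # Pre-computed micro-rotation angles atan(2^-i) and the constant scale factor.
    angles = [math.atan(2.0 ** -i) for i in range(iterations)]
    scale = 1.0
    for i in range(iterations):
        scale *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))

    x, y, z = 1.0, 0.0, angle
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0          # rotate toward the residual angle
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return y * scale, x * scale              # (sin, cos)

s, c = cordic_sin_cos(math.pi / 6)
print(round(s, 6), round(c, 6))              # ≈ 0.5, ≈ 0.866025
```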

  6. The Copenhagen Triage Algorithm

    Hasselbalch, Rasmus Bo; Plesner, Louis Lind; Pries-Heje, Mia


    is non-inferior to an existing triage model in a prospective randomized trial. METHODS: The Copenhagen Triage Algorithm (CTA) study is a prospective two-center, cluster-randomized, cross-over, non-inferiority trial comparing CTA to the Danish Emergency Process Triage (DEPT). We include patients ≥16 years...

  7. Convergence of DFP algorithm



    The DFP method is one of the most famous numerical algorithms for unconstrained optimization. For uniformly convex objective functions convergence properties of the DFP method are studied. Several conditions that can ensure the global convergence of the DFP method are given.
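
    For context, the DFP method maintains an approximation H of the inverse Hessian and updates it from the step s and the gradient change y. The sketch below is a textbook-style DFP iteration with a crude backtracking line search on a toy quadratic; it only illustrates the update formula and is not an implementation from the paper, and the test function is made up.

```python
import numpy as np

def dfp_minimize(f, grad, x0, iters=50):
    """Minimize a smooth convex function with the DFP quasi-Newton update."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(len(x))                       # inverse-Hessian approximation
    g = grad(x)
    for _ in range(iters):
        d = -H @ g                           # quasi-Newton direction
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d) and t > 1e-10:
            t *= 0.5                         # crude backtracking (Armijo) line search
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                       # skip update if the curvature condition fails
            Hy = H @ y
            # DFP update: H + s s^T / (s^T y) - H y y^T H / (y^T H y)
            H = H + np.outer(s, s) / sy - np.outer(Hy, Hy) / (y @ Hy)
        x, g = x_new, g_new
    return x

# Convex quadratic f(x) = x^T A x with minimum at the origin.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
f = lambda x: x @ A @ x
grad = lambda x: 2 * A @ x
print(dfp_minimize(f, grad, [4.0, -3.0]).round(6))  # ≈ [0, 0]
```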

  8. A propositional CONEstrip algorithm

    Quaeghebeur, E.; Laurent, A.; Strauss, O.; Bouchon-Meunier, B.; Yager, R.R.


    We present a variant of the CONEstrip algorithm for checking whether the origin lies in a finitely generated convex cone that can be open, closed, or neither. This variant is designed to deal efficiently with problems where the rays defining the cone are specified as linear combinations of propositi

  9. Algorithm Theory - SWAT 2006

    This book constitutes the refereed proceedings of the 10th Scandinavian Workshop on Algorithm Theory, SWAT 2006, held in Riga, Latvia, in July 2006. The 36 revised full papers presented together with 3 invited papers were carefully reviewed and selected from 154 submissions. The papers address all...

  10. Reduction of Unnecessary Dotted Rules in the Earley Algorithm



    With the highly developed hardware of today's PCs, there arise possibilities to implement programming environments on such computers. To reduce the amount of calculation time and required memory space, optimization approaches implemented in the algorithm design are demanded. The purpose of this work is to explore and analyse possibilities to reduce the required memory space through elimination of superfluous grammar rules created during the process of recognition.

  11. T-Algorithm-Based Logic Simulation on Distributed Systems

    Sundaram, S; Patnaik, LM


    Increase in the complexity of VLSI digital circuit design demands faster logic simulation techniques than those currently available. One of the ways of speeding up existing logic simulation algorithms is by exploiting the inherent parallelism in the sequential version. In this paper, we explore the possibility of mapping a T-algorithm-based logic simulation algorithm onto a cluster of workstations interconnected by an Ethernet. The set of gates at a particular level is partitioned by the...

  12. A Compositional Sweep-Line State Space Exploration Method

    Kristensen, Lars Michael; Mailund, Thomas


    State space exploration is a main approach to verification of finite-state systems. The sweep-line method exploits a certain kind of progress present in many systems to reduce peak memory usage during state space exploration. We present a new sweep-line algorithm for a compositional setting where...

  13. Benchmarking monthly homogenization algorithms

    V. K. C. Venema


    Full Text Available The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets, modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real-world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added.

    Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve

  14. Exploring Consumer Behavior: Use of Association Rules

    Pavel Turčínek


    Full Text Available This paper focuses on the use of association rules in exploring consumer behavior and presents selected results of data analyses applied to data collected via a questionnaire survey of 1127 Czech respondents, with a structure close to a representative sample of the population of the Czech Republic. The questionnaire survey deals with shopping for meat products. The objective was to explore the possibilities of less frequently used data-mining techniques in the processing of customer preferences. For the data analyses, two methods for generating association rules are used: the Apriori algorithm and the FP-growth algorithm, both executed in the Weka software. The Apriori algorithm seemed to be the better tool because it provided finer data; the FP-growth algorithm required reduction of the preference scale to only two extreme values, since its input data must be binary. For consumer preferences we also calculated their means. This paper explores the different preferences and expectations of what customers' favorite outlet should provide and offer. Customers were divided into five segments based on the type of their outlet loyalty and further explored in more detail. Some of the best association rules found suggest similar patterns across the whole sample; e.g., the results suggest that respondents for whom quality of merchandise is a very important factor typically also base their outlet selection on freshness of products. This finding applies to all types of retail loyalty categories. Other rules seem to indicate behavior more specific to a particular segment of customers. The results suggest that application of association rules in customer research can provide more insight and can be a good supplementary analysis for consumer data exploration when Likert scales are used.
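
    The support/confidence machinery behind such rules is easy to sketch without any particular toolkit. The snippet below mines simple one-item-to-one-item rules with Apriori-style pruning; the basket data, thresholds, and attribute names are invented for illustration and have nothing to do with the survey data.

```python
from itertools import combinations

def association_rules(transactions, min_support=0.3, min_confidence=0.6):
    """Mine simple A -> B rules over single items (an Apriori-flavoured sketch)."""
    n = len(transactions)
    items = {i for t in transactions for i in t}
    support = {frozenset([i]): sum(i in t for t in transactions) / n for i in items}
    # Count pair supports only for pairs of frequent single items (Apriori pruning).
    frequent = [i for i in items if support[frozenset([i])] >= min_support]
    for a, b in combinations(sorted(frequent), 2):
        support[frozenset([a, b])] = sum(a in t and b in t for t in transactions) / n
    rules = []
    for a, b in combinations(sorted(frequent), 2):
        pair = frozenset([a, b])
        if support[pair] < min_support:
            continue
        for lhs, rhs in ((a, b), (b, a)):
            conf = support[pair] / support[frozenset([lhs])]
            if conf >= min_confidence:
                rules.append((lhs, rhs, support[pair], round(conf, 2)))
    return rules

baskets = [{"quality", "freshness"}, {"quality", "freshness", "price"},
           {"price"}, {"quality", "freshness"}, {"freshness"}]
print(association_rules(baskets))
```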

  15. The Deterministic Dendritic Cell Algorithm

    Greensmith, Julie


    The Dendritic Cell Algorithm is an immune-inspired algorithm originally based on the function of natural dendritic cells. The original instantiation of the algorithm is a highly stochastic algorithm. While the performance of the algorithm is good when applied to large real-time datasets, it is difficult to analyse due to the number of random-based elements. In this paper a deterministic version of the algorithm is proposed, implemented and tested using a port scan dataset to provide a controllable system. This version consists of a controllable amount of parameters, which are experimented with in this paper. In addition the effects are examined of the use of time windows and variation on the number of cells, both of which are shown to influence the algorithm. Finally a novel metric for the assessment of the algorithm's output is introduced and proves to be a more sensitive metric than the metric used with the original Dendritic Cell Algorithm.


    黄磊; 刘建业; 曾庆化


    Traditional coning algorithms are based on the first-order coning correction reference model. Usually they reduce the algorithm error of the coning axis (z) by increasing the number of samples in one iteration interval. But the increase of sample numbers requires faster output rates of the sensors. Therefore, the algorithms are often limited in practical use. Moreover, the noncommutativity error of rotation usually exists on all three axes, and the increase of sample numbers has little positive effect on reducing the algorithm errors of the orthogonal axes (x, y). Considering that the errors of the orthogonal axes cannot be neglected in high-precision applications, a coning algorithm with an additional second-order coning correction term is developed to further improve the performance of coning algorithms. Compared with the traditional algorithms, the new second-order coning algorithm can effectively reduce the algorithm error without increasing the sample numbers. Theoretical analyses validate that in a coning environment with low frequency, the new algorithm has better performance than the traditional time-series and frequency-series coning algorithms, while in a maneuver environment the new algorithm has the same order of accuracy as the traditional time-series and frequency-series algorithms. Finally, the practical feasibility of the new coning algorithm is demonstrated by digital simulations and practical turntable tests.

  17. An improved harmony search algorithm with dynamically varying bandwidth

    Kalivarapu, J.; Jain, S.; Bag, S.


    The present work demonstrates a new variant of the harmony search (HS) algorithm in which the bandwidth (BW) is one of the deciding factors for the time complexity and the performance of the algorithm. The BW needs to have both explorative and exploitative characteristics. The idea is to use a large BW to search the full domain and to adjust the BW dynamically closer to the optimal solution. After trying a series of approaches, a methodology inspired by the functioning of a low-pass filter showed satisfactory results. This approach was implemented in the self-adaptive improved harmony search (SIHS) algorithm and tested on several benchmark functions. Compared to the existing HS algorithm and its variants, SIHS showed better performance on most of the test functions. Thereafter, the algorithm was applied to geometric parameter optimization of a friction stir welding tool.
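
    The general mechanism of a shrinking bandwidth can be seen in a bare-bones harmony search loop. The sketch below uses a simple exponential decay schedule rather than the paper's low-pass-filter rule, and the memory size, rates, and test function are arbitrary.

```python
import random

def harmony_search(f, lower, upper, hms=10, iters=2000, hmcr=0.9, par=0.3):
    """Minimize f over a box using harmony search with an exponentially decaying bandwidth."""
    dim = len(lower)
    memory = [[random.uniform(lower[d], upper[d]) for d in range(dim)] for _ in range(hms)]
    costs = [f(h) for h in memory]
    for it in range(iters):
        # Bandwidth shrinks from half of the search range down to 1% of it.
        bw = [(upper[d] - lower[d]) * 0.5 * (0.01 / 0.5) ** (it / iters) for d in range(dim)]
        new = []
        for d in range(dim):
            if random.random() < hmcr:                       # harmony memory consideration
                x = random.choice(memory)[d]
                if random.random() < par:                    # pitch adjustment within the bandwidth
                    x += random.uniform(-bw[d], bw[d])
            else:                                            # random selection
                x = random.uniform(lower[d], upper[d])
            new.append(min(max(x, lower[d]), upper[d]))
        worst = max(range(hms), key=lambda i: costs[i])
        if f(new) < costs[worst]:
            memory[worst], costs[worst] = new, f(new)
    best = min(range(hms), key=lambda i: costs[i])
    return memory[best], costs[best]

sphere = lambda x: sum(v * v for v in x)
print(harmony_search(sphere, [-10, -10], [10, 10]))
```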

  18. Earth Observation Satellites Scheduling Based on Decomposition Optimization Algorithm

    Feng Yao


    Full Text Available A decomposition-based optimization algorithm was proposed for solving the Earth observation satellite scheduling problem. The problem was decomposed into a task assignment main problem and a single-satellite scheduling sub-problem. In the task assignment phase, the tasks were allocated to the satellites, and each satellite then scheduled its tasks in the single-satellite scheduling phase. We adopted an adaptive ant colony optimization algorithm to search for the optimal task assignment scheme. An adaptive parameter adjusting strategy and a pheromone trail smoothing strategy were introduced to balance the exploration and the exploitation of the search process. A heuristic algorithm and a very fast simulated annealing algorithm were proposed to solve the single-satellite scheduling problem. The task assignment scheme was evaluated by integrating the observation scheduling results of the multiple satellites, and the result was fed back to the ant colony optimization algorithm to guide its search process. Computational results showed that the approach was effective for the satellite observation scheduling problem.
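
    The pheromone-smoothing idea can be shown in isolation. The sketch below applies evaporation, best-solution reinforcement, and a mild pull of every trail toward the overall mean, on a made-up task-to-satellite pheromone matrix; the update rule is a generic illustration, not the paper's exact adaptive scheme.

```python
import numpy as np

def update_pheromone(tau, best_assignment, best_value, rho=0.1, smooth=0.05):
    """Evaporate, reinforce the best task-to-satellite assignment, then smooth the trails.

    `tau[t, s]` is the pheromone for assigning task t to satellite s;
    `best_assignment[t]` is the satellite chosen for task t in the best-so-far scheme.
    """
    tau = (1.0 - rho) * tau                        # evaporation
    for task, sat in enumerate(best_assignment):   # reinforcement proportional to solution quality
        tau[task, sat] += rho * best_value
    # Smoothing pulls every trail a little toward the overall mean to keep exploration alive.
    tau = (1.0 - smooth) * tau + smooth * tau.mean()
    return tau

tau = np.ones((4, 3))                              # 4 tasks, 3 satellites
tau = update_pheromone(tau, best_assignment=[0, 2, 1, 0], best_value=2.0)
print(tau.round(3))
```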

  19. Tree exploration for Bayesian RL exploration

    Dimitrakakis, C.; Mohammadian, M.


    Research in reinforcement learning has produced algorithms for optimal decision making under uncertainty that fall within two main types. The first employs a Bayesian framework, where optimality improves with increased computational time. This is because the resulting planning task takes the form of

  20. General Video Game Evaluation Using Relative Algorithm Performance Profiles

    Nielsen, Thorbjørn; Barros, Gabriella; Togelius, Julian;


    In order to generate complete games through evolution we need generic and reliable evaluation functions for games. It has been suggested that game quality could be characterised through playing a game with different controllers and comparing their performance. This paper explores that idea by investigating the relative performance of different general game-playing algorithms. Seven game-playing algorithms were used to play several hand-designed, mutated and randomly generated VGDL game descriptions. The results discussed appear to support the conjecture that well-designed games have, on average, a higher performance difference between better and worse game-playing algorithms.

  1. Improved Runtime Analysis of the Simple Genetic Algorithm

    Oliveto, Pietro S.; Witt, Carsten


    A runtime analysis of the Simple Genetic Algorithm (SGA) for the OneMax problem has recently been presented proving that the algorithm requires exponential time with overwhelming probability. This paper presents an improved analysis which overcomes some limitations of our previous one. Firstly...... improvement towards the reusability of the techniques in future systematic analyses of GAs. Finally, we consider the more natural SGA using selection with replacement rather than without replacement although the results hold for both algorithmic versions. Experiments are presented to explore the limits...

  2. Improved time complexity analysis of the Simple Genetic Algorithm

    Oliveto, Pietro S.; Witt, Carsten


    A runtime analysis of the Simple Genetic Algorithm (SGA) for the OneMax problem has recently been presented proving that the algorithm with population size μ≤n1/8−ε requires exponential time with overwhelming probability. This paper presents an improved analysis which overcomes some limitations...... this is a major improvement towards the reusability of the techniques in future systematic analyses of GAs. Finally, we consider the more natural SGA using selection with replacement rather than without replacement although the results hold for both algorithmic versions. Experiments are presented to explore...

  3. An efficient cuckoo search algorithm for numerical function optimization

    Ong, Pauline; Zainuddin, Zarita


    Cuckoo search, which reproduces the breeding strategy of the best known brood parasitic bird, the cuckoo, has demonstrated its superiority in obtaining global solutions for numerical optimization problems. However, the fixed-step approach involved in its exploration and exploitation behavior might slow down the search process considerably. In this regard, an improved cuckoo search algorithm with adaptive step size adjustment is introduced and its feasibility on a variety of benchmarks is validated. The obtained results show that the proposed scheme outperforms the standard cuckoo search algorithm in terms of convergence characteristics while preserving the fascinating features of the original method.
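
    A decaying step size is one common way to realise such adaptive behaviour. The fragment below moves a nest toward the current best solution with a step that shrinks exponentially over the iterations; the schedule, constants, and vectors are invented and are not the authors' adjustment rule.

```python
import random, math

def adaptive_cuckoo_step(x, best, lower, upper, it, max_it, alpha0=1.0):
    """Move a nest toward the best solution with a step size that shrinks over iterations."""
    alpha = alpha0 * math.exp(-4.0 * it / max_it)          # adaptive step-size schedule
    return [min(max(xi + alpha * random.gauss(0, 1) * (bi - xi), lo), hi)
            for xi, bi, lo, hi in zip(x, best, lower, upper)]

nest = [3.0, -2.0]
best = [0.5, 0.5]
print(adaptive_cuckoo_step(nest, best, [-5, -5], [5, 5], it=10, max_it=100))
```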

  4. Genetic Algorithms and Local Search

    Whitley, Darrell


    The first part of this presentation is a tutorial level introduction to the principles of genetic search and models of simple genetic algorithms. The second half covers the combination of genetic algorithms with local search methods to produce hybrid genetic algorithms. Hybrid algorithms can be modeled within the existing theoretical framework developed for simple genetic algorithms. An application of a hybrid to geometric model matching is given. The hybrid algorithm yields results that improve on the current state-of-the-art for this problem.


    S. V. Bibikov


    Full Text Available The paper deals with a detection algorithm for rail vibroacoustic waves caused by an approaching train against a background of increased noise. The urgency of developing a train detection algorithm in view of increased rail noise, when railway lines are close to roads or road intersections, is justified. The algorithm is based on a method for detecting weak signals in a noisy environment. The final expression of the information statistic is adjusted. We present the results of algorithm research and testing of the train approach alarm device that implements the proposed algorithm. The algorithm is prepared for upgrading the train approach alarm device “Signalizator-P”.

  6. Isosurfaces geometry, topology, and algorithms

    Wenger, Rephael


    Ever since Lorensen and Cline published their paper on the Marching Cubes algorithm, isosurfaces have been a standard technique for the visualization of 3D volumetric data. Yet there is no book exclusively devoted to isosurfaces. Isosurfaces: Geometry, Topology, and Algorithms represents the first book to focus on basic algorithms for isosurface construction. It also gives a rigorous mathematical perspective on some of the algorithms and results. In color throughout, the book covers the Marching Cubes algorithm and variants, dual contouring algorithms, multilinear interpolation, multiresolutio

  7. Algorithms for verbal autopsies: a validation study in Kenyan children.

    Quigley, M. A.; Armstrong Schellenberg, J. R.; Snow, R. W.


    The verbal autopsy (VA) questionnaire is a widely used method for collecting information on cause-specific mortality where the medical certification of deaths in childhood is incomplete. This paper discusses review by physicians and expert algorithms as approaches to ascribing cause of death from the VA questionnaire and proposes an alternative, data-derived approach. In this validation study, the relatives of 295 children who had died in hospital were interviewed using a VA questionnaire. The children were assigned causes of death using data-derived algorithms obtained under logistic regression and using expert algorithms. For most causes of death, the data-derived algorithms and expert algorithms yielded similar levels of diagnostic accuracy. However, a data-derived algorithm for malaria gave a sensitivity of 71% (95% CI: 58-84%), which was significantly higher than the sensitivity of 47% obtained under an expert algorithm. The need for exploring this and other ways in which the VA technique can be improved is discussed. The implications of less-than-perfect sensitivity and specificity are explored using numerical examples. Misclassification bias should be taken into consideration when planning and evaluating epidemiological studies. PMID:8706229

  8. Consensus algorithm in smart grid and communication networks

    Alfagee, Husain Abdulaziz

    On a daily basis, consensus theory attracts more and more research from different areas of interest, applying its techniques to solve technical problems in ways that are faster, more reliable, and even more precise than ever before. A power system network is one of the fields in which consensus theory is employed extensively. The use of the consensus algorithm to solve the Economic Dispatch and Load Restoration Problems is a good example. Instead of a conventional central controller, some researchers have explored an algorithm to solve the above-mentioned problems in a distributed manner using the consensus algorithm, which is based on calculation methods, i.e., non-estimation methods, for updating the information consensus matrix. Starting from this point of solving these types of problems in a distributed fashion using the consensus algorithm, we have implemented a new advanced consensus algorithm. It is based on adaptive estimation techniques, such as the Gradient Algorithm and the Recursive Least Square Algorithm, to solve the same problems. This advanced work was tested on different case studies that had formerly been explored, as seen in references 5, 7, and 18. Three and five generators, or agents, with different topologies, correspond to the Economic Dispatch Problem, and the IEEE 16-Bus power system corresponds to the Load Restoration Problem. In all the cases we have studied, the results met our expectations with extreme accuracy and completely matched the results of the previous researchers. There is little question that this research proves the capability and dependability of using the consensus algorithm, based on estimation methods such as the Gradient Algorithm and the Recursive Least Square Algorithm, to solve such power problems.
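
    The core consensus iteration that such schemes build on is short enough to sketch. Below is a plain average-consensus update on a three-agent line graph; the values, topology, and step size are made up, and the sketch omits the gradient and recursive-least-square estimation layers discussed above.

```python
import numpy as np

def average_consensus(values, neighbors, epsilon=0.2, iters=100):
    """Iteratively average each agent's value with its neighbors' values.

    `neighbors[i]` lists the agents that agent i can communicate with; on a
    connected graph with a small enough epsilon, all values converge to the mean.
    """
    x = np.asarray(values, dtype=float)
    for _ in range(iters):
        x = x + epsilon * np.array([sum(x[j] - x[i] for j in neighbors[i])
                                    for i in range(len(x))])
    return x

# Three generators exchanging, e.g., local cost estimates over a line graph.
print(average_consensus([10.0, 4.0, 1.0], neighbors={0: [1], 1: [0, 2], 2: [1]}))
```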

  9. A New Genetic Algorithm Methodology for Design Optimization of Truss Structures: Bipopulation-Based Genetic Algorithm with Enhanced Interval Search

    Tugrul Talaslioglu


    Full Text Available A new genetic algorithm (GA) methodology, Bipopulation-Based Genetic Algorithm with Enhanced Interval Search (BGAwEIS), is introduced and used to optimize the design of truss structures with various complexities. The results of BGAwEIS are compared with those obtained by a sequential genetic algorithm (SGA) utilizing a single population, a multipopulation-based genetic algorithm (MPGA) proposed for this study, and other existing approaches presented in the literature. This study has two goals: outlining BGAwEIS's fundamentals and evaluating the performances of BGAwEIS and MPGA. Consequently, it is demonstrated that MPGA shows a better performance than SGA by taking advantage of multiple populations, but BGAwEIS explores promising solution regions more efficiently than MPGA by exploiting the feasible solutions. The performance of BGAwEIS is confirmed by the better quality of its optimal designs compared to the algorithms proposed here and described in the literature.

  10. Enhanced Dynamic Algorithm of Genome Sequence Alignments

    Arabi E. keshk


    Full Text Available The merging of biology and computer science has created a new field, computational biology (bioinformatics), that explores the capacity of computers to gain knowledge from biological data. Computational biology is rooted in the life sciences as well as in computing and information sciences and technologies. A central problem in computational biology is sequence alignment, a way of arranging sequences of DNA, RNA or protein to identify regions of similarity and relationships between sequences. This paper introduces an enhancement of the dynamic-programming algorithm for genome sequence alignment, called EDAGSA. It fills only the three main diagonals of the alignment matrix rather than the entire matrix, avoiding the cells whose data are never used. It obtains the optimal solution while decreasing execution time, thereby increasing performance. To illustrate the performance gain, the proposed algorithm is compared with traditional methods such as the Needleman-Wunsch, Smith-Waterman and longest-common-subsequence algorithms. A database is also implemented so that the algorithm can be used in multi-sequence alignment to search for the sequence that best matches a given query.
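
    The idea of restricting the dynamic-programming matrix to a narrow band around the main diagonal can be sketched as follows; this is a generic banded global-alignment score (three diagonals when band=1) with illustrative match/mismatch/gap scores, not the exact EDAGSA procedure.

        def banded_alignment_score(s, t, band=1, match=1, mismatch=-1, gap=-2):
            """Global alignment score computed only on cells with |i - j| <= band."""
            n, m = len(s), len(t)
            NEG = float("-inf")
            score = [[NEG] * (m + 1) for _ in range(n + 1)]   # off-band cells stay -inf
            score[0][0] = 0
            for i in range(n + 1):
                for j in range(max(0, i - band), min(m, i + band) + 1):
                    if i == 0 and j == 0:
                        continue
                    best = NEG
                    if i > 0 and j > 0 and score[i-1][j-1] != NEG:      # diagonal move
                        best = max(best, score[i-1][j-1]
                                   + (match if s[i-1] == t[j-1] else mismatch))
                    if i > 0 and abs(i - 1 - j) <= band and score[i-1][j] != NEG:
                        best = max(best, score[i-1][j] + gap)           # gap in t
                    if j > 0 and abs(i - (j - 1)) <= band and score[i][j-1] != NEG:
                        best = max(best, score[i][j-1] + gap)           # gap in s
                    score[i][j] = best
            return score[n][m]

        print(banded_alignment_score("GATTACA", "GATTGCA"))   # 6 matches, 1 mismatch -> 5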

  11. Convex hull ranking algorithm for multi-objective evolutionary algorithms

    Davoodi Monfrared, M.; Mohades, A.; Rezaei, J.


    Due to many applications of multi-objective evolutionary algorithms in real world optimization problems, several studies have been done to improve these algorithms in recent years. Since most multi-objective evolutionary algorithms are based on the non-dominated principle, and their complexity depen

  12. Evolutionary algorithm based index assignment algorithm for noisy channel

    李天昊; 余松煜


    A globally optimal solution to vector quantization (VQ) index assignment on a noisy channel, the evolutionary algorithm based index assignment algorithm (EAIAA), is presented. The algorithm yields a significant reduction in average distortion due to channel errors, over conventional arbitrary index assignment, as confirmed by experimental results over the memoryless binary symmetric channel (BSC) for any bit error.

  13. Partitional clustering algorithms


    This book summarizes the state-of-the-art in partitional clustering. Clustering, the unsupervised classification of patterns into groups, is one of the most important tasks in exploratory data analysis. Primary goals of clustering include gaining insight into, classifying, and compressing data. Clustering has a long and rich history that spans a variety of scientific disciplines including anthropology, biology, medicine, psychology, statistics, mathematics, engineering, and computer science. As a result, numerous clustering algorithms have been proposed since the early 1950s. Among these algorithms, partitional (nonhierarchical) ones have found many applications, especially in engineering and computer science. This book provides coverage of consensus clustering, constrained clustering, large scale and/or high dimensional clustering, cluster validity, cluster visualization, and applications of clustering. Examines clustering as it applies to large and/or high-dimensional data sets commonly encountered in reali...

  14. Randomized robot navigation algorithms

    Berman, P. [Penn State Univ., University Park, PA (United States); Blum, A. [Carnegie Mellon Univ., Pittsburgh, PA (United States); Fiat, A. [Tel-Aviv Univ. (Israel)] [and others]


    We consider the problem faced by a mobile robot that has to reach a given target by traveling through an unmapped region in the plane containing oriented rectangular obstacles. We assume the robot has no prior knowledge about the positions or sizes of the obstacles, and acquires such knowledge only when obstacles are encountered. Our goal is to minimize the distance the robot must travel, using the competitive ratio as our measure. We give a new randomized algorithm for this problem whose competitive ratio is O(n^{4/9} log n), beating the deterministic Ω(√n) lower bound of [PY], and answering in the affirmative an open question of [BRS] (which presented an optimal deterministic algorithm). We believe the techniques introduced here may prove useful in other on-line situations in which information gathering is part of the on-line process.

  15. An Algorithmic Diversity Diet?

    Sørensen, Jannick Kirk; Schmidt, Jan-Hinrik


    With the growing influence of personalized algorithmic recommender systems on the exposure of media content to users, the relevance of discussing the diversity of recommendations increases, particularly as far as public service media (PSM) is concerned. An imagined implementation of a diversity diet system however triggers not only the classic discussion of the reach-distinctiveness balance for PSM, but also shows that 'diversity' is understood very differently in algorithmic recommender system communities than it is editorially and politically in the context of PSM. The design of a diversity diet system generates questions not just about editorial power, personal freedom and techno-paternalism, but also about the embedded politics of recommender systems as well as the human skills affiliated with PSM editorial work and the nature of PSM content.

  16. Genetic algorithm essentials

    Kramer, Oliver


    This book introduces readers to genetic algorithms (GAs) with an emphasis on making the concepts, algorithms, and applications discussed as easy to understand as possible. Further, it avoids a great deal of formalisms and thus opens the subject to a broader audience in comparison to manuscripts overloaded by notations and equations. The book is divided into three parts, the first of which provides an introduction to GAs, starting with basic concepts like evolutionary operators and continuing with an overview of strategies for tuning and controlling parameters. In turn, the second part focuses on solution space variants like multimodal, constrained, and multi-objective solution spaces. Lastly, the third part briefly introduces theoretical tools for GAs, the intersections and hybridizations with machine learning, and highlights selected promising applications.


    Li Hongbo


    In an inner-product space, an invertible vector generates a reflection with respect to a hyperplane, and the Clifford product of several invertible vectors, called a versor in Clifford algebra, generates the composition of the corresponding reflections, which is an orthogonal transformation. Given a versor in a Clifford algebra, finding another sequence of invertible vectors of strictly shorter length but whose Clifford product still equals the input versor is called versor compression. Geometrically, versor compression is equivalent to decomposing an orthogonal transformation into a shorter sequence of reflections. This paper proposes a simple algorithm for compressing versors of symbolic form in Clifford algebra. The algorithm is based on computing the intersections of lines with planes in the corresponding Grassmann-Cayley algebra, and is complete in the case of Euclidean or Minkowski inner-product space.

  18. An Improved Apriori Algorithm

    LIU Shan; LIAO Yongyi


    In this paper, we study the Apriori and FP-growth algorithms for mining association rules and give a method for computing all the frequent itemsets in a database. Its basic idea is to represent the transactions as boolean vectors and to compute a boolean product between all the transactions, from which all frequent 2-itemsets (min_sup = 2) can be obtained directly. Based on their inclusion relations, a set-tree of itemsets over the database transactions is then constructed, and traversing its paths yields all the frequent itemsets. In this way, the frequent itemsets of the database can be obtained without the repeated database scans and iterative candidate computation of the Apriori algorithm.
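
    A minimal sketch of the boolean-vector idea is given below: each item becomes a boolean column over the transactions, and the support of a 2-itemset is the count of rows where the AND of two columns is true. The transactions and min_sup value are illustrative assumptions, and the paper's set-tree construction is not reproduced.

        from itertools import combinations
        import numpy as np

        transactions = [{"bread", "milk"},
                        {"bread", "beer", "eggs"},
                        {"milk", "beer", "cola"},
                        {"bread", "milk", "beer"},
                        {"bread", "milk", "cola"}]
        items = sorted(set.union(*transactions))
        # boolean matrix: rows = transactions, columns = items
        B = np.array([[item in t for item in items] for t in transactions], dtype=bool)

        min_sup = 2
        frequent_1 = [it for k, it in enumerate(items) if B[:, k].sum() >= min_sup]
        frequent_2 = []
        for a, b in combinations(frequent_1, 2):
            # support of {a, b} = rows where both boolean columns are True
            support = int((B[:, items.index(a)] & B[:, items.index(b)]).sum())
            if support >= min_sup:
                frequent_2.append(({a, b}, support))
        print(frequent_2)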

  19. Fatigue Evaluation Algorithms: Review

    Passipoularidis, Vaggelis; Brøndsted, Povl

    A progressive damage fatigue simulator for variable amplitude loads named FADAS is discussed in this work. FADAS (Fatigue Damage Simulator) performs ply-by-ply stress analysis using classical lamination theory and implements adequate stiffness discount tactics based on the failure criterion of Puck, to model the degradation caused by failure events at ply level. Residual strength is incorporated as the fatigue damage accumulation metric. Once the typical fatigue and static properties of the constitutive ply are determined, the performance of an arbitrary lay-up under uniaxial and/or multiaxial load time ... blade construction. Two versions of the algorithm, one using single-step and the other using incremental application of each load cycle (in case of ply failure), are implemented and compared. Simulation results confirm the ability of the algorithm to take into account load sequence effects...

  20. Quantum Prediction Algorithms

    Kent, Adrian; McElwaine, Jim


    The consistent histories formulation of the quantum theory of a closed system with pure initial state defines an infinite number of incompatible consistent sets, each of which gives a possible description of the physics. We investigate the possibility of using the properties of the Schmidt decomposition to define an algorithm which selects a single, physically natural, consistent set. We explain the problems which arise, set out some possible algorithms, and explain their properties with the aid of simple models. Though the discussion is framed in the language of the consistent histories approach, it is intended to highlight the difficulty in making any interpretation of quantum theory based on decoherence into a mathematically precise theory.

  1. Evaluating Discourse Processing Algorithms

    Walker, M A


    In order to take steps towards establishing a methodology for evaluating Natural Language systems, we conducted a case study. We attempt to evaluate two different approaches to anaphoric processing in discourse by comparing the accuracy and coverage of two published algorithms for finding the co-specifiers of pronouns in naturally occurring texts and dialogues. We present the quantitative results of hand-simulating these algorithms, but this analysis naturally gives rise to both a qualitative evaluation and recommendations for performing such evaluations in general. We illustrate the general difficulties encountered with quantitative evaluation. These are problems with: (a) allowing for underlying assumptions, (b) determining how to handle underspecifications, and (c) evaluating the contribution of false positives and error chaining.

  2. DAL Algorithms and Python

    Aydemir, Bahar


    The Trigger and Data Acquisition (TDAQ) system of the ATLAS detector at the Large Hadron Collider (LHC) at CERN is composed of a large number of distributed hardware and software components. The TDAQ system consists of about 3000 computers and more than 25000 applications which, in a coordinated manner, provide the data-taking functionality of the overall system. There are a number of online services required to configure, monitor and control the ATLAS data taking. In particular, the configuration service is used to provide the configuration of the above components. The configuration of the ATLAS data acquisition system is stored in an XML-based object database named OKS. The DAL (Data Access Library) allows access to its information from C++, Java and Python clients in a distributed environment. Some information has quite a complicated structure, so its extraction requires writing special algorithms. The algorithms are available in the C++ programming language and have been partially reimplemented in the Java programming language. The goal of the projec...

  3. Fluid Genetic Algorithm (FGA)

    Ruholla Jafari-Marandi


    Full Text Available The Genetic Algorithm (GA) has been one of the most popular methods for many challenging optimization problems when exact approaches are too computationally expensive. A review of the literature shows extensive research attempting to adapt and develop the standard GA. Nevertheless, the essence of GA, which consists of concepts such as chromosomes, individuals, crossover, and mutation, has rarely been the focus of recent researchers. In the method presented in this paper, Fluid Genetic Algorithm (FGA), some of these concepts are changed, some are removed, and new concepts are introduced. The performance of GA and FGA are compared through seven benchmark functions. FGA not only shows a better success rate and better convergence control, but it can also be applied to a wider range of problems, including multi-objective and multi-level problems. The application of FGA to a real engineering problem, the Quadratic Assignment Problem (QAP), is also demonstrated.

  4. The cyclic reduction algorithm

    Bini, Dario; Meini, Beatrice


    Cyclic reduction is an algorithm invented by G.H. Golub and R. W. Hockney in the mid 1960s for solving linear systems related to the finite differences discretization of the Poisson equation over a rectangle. Among the algorithms of Gene Golub, it is one of the most versatile and powerful ever created. Recently, it has been applied to solve different problems from different applicative areas. In this paper we survey the main features of cyclic reduction, relate it to properties of analytic functions, recall its extension to solving more general finite and infinite linear systems, and different kinds of nonlinear matrix equations, including algebraic Riccati equations, with applications to Markov chains, queueing models and transport theory. Some new results concerning the convergence properties of cyclic reduction and its applicability are proved under very weak assumptions. New formulae for overcoming breakdown are provided.
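
    For readers unfamiliar with the method, here is a minimal sketch of cyclic reduction for a tridiagonal system (sub-, main and super-diagonals a, b, c with a[0] = c[n-1] = 0): odd-indexed unknowns are eliminated, the half-size system is solved recursively, and the odd unknowns are recovered by back-substitution. It is illustrative only and ignores the breakdown and convergence issues the survey discusses.

        def cyclic_reduction(a, b, c, d):
            """Solve a tridiagonal system a[i]x[i-1] + b[i]x[i] + c[i]x[i+1] = d[i]."""
            n = len(b)
            if n == 1:
                return [d[0] / b[0]]
            idx = list(range(0, n, 2))                  # even-indexed unknowns survive
            na, nb, nc, nd = [], [], [], []
            for i in idx:
                alpha = -a[i] / b[i - 1] if i - 1 >= 0 else 0.0
                gamma = -c[i] / b[i + 1] if i + 1 < n else 0.0
                na.append(alpha * a[i - 1] if i - 1 >= 0 else 0.0)
                nb.append(b[i]
                          + (alpha * c[i - 1] if i - 1 >= 0 else 0.0)
                          + (gamma * a[i + 1] if i + 1 < n else 0.0))
                nc.append(gamma * c[i + 1] if i + 1 < n else 0.0)
                nd.append(d[i]
                          + (alpha * d[i - 1] if i - 1 >= 0 else 0.0)
                          + (gamma * d[i + 1] if i + 1 < n else 0.0))
            x = [0.0] * n
            for i, xi in zip(idx, cyclic_reduction(na, nb, nc, nd)):
                x[i] = xi                               # solved half-size system
            for i in range(1, n, 2):                    # back-substitute odd unknowns
                right = c[i] * x[i + 1] if i + 1 < n else 0.0
                x[i] = (d[i] - a[i] * x[i - 1] - right) / b[i]
            return x

        # 1D Poisson-like system: -x[i-1] + 2 x[i] - x[i+1] = 1
        n = 7
        a = [0.0] + [-1.0] * (n - 1)
        b = [2.0] * n
        c = [-1.0] * (n - 1) + [0.0]
        d = [1.0] * n
        print(cyclic_reduction(a, b, c, d))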

  5. Anyone for an algorithm?

    Liz Wager, freelance writer, trainer and publications consultant


    I have a fondness for flowcharts. I also attempt to teach doctors to prefer short words when they are writing. So, when I found myself exchanging emails with an American doctor who insisted on referring to the COPE flowcharts as algorithms, I was determined to teach him the error of his ways. But first, I needed some evidence. And now I am in a dilemma, because I also love obscure and confused word origins.

  6. Parallel Algorithms Derivation



  7. Algorithmic information theory

    Grünwald, P.D.; Vitányi, P.M.B.; Adriaans, P.; van Benthem, J.


    We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain the main concepts of this quantitative approach to defining 'information'. We discuss the extent to which Kolmogorov's and Shannon's information theory have a common purpose, and where they are fundamentally different. We indicate how recent developments within the theory allow one to formally distinguish between 'structural' (meaningful) and 'random' information as measured by the Kolmo...

  8. Algorithmic information theory

    Grünwald, P.D.; Vitányi, P.M.B.


    We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain the main concepts of this quantitative approach to defining `information'. We discuss the extent to which Kolmogorov's and Shannon's information theory have a common purpose, and where they are fundamentally different. We indicate how recent developments within the theory allow one to formally distinguish between `structural' (meaningful) and `random' information as measured by the Kolmo...

  9. Boosting foundations and algorithms

    Schapire, Robert E


    Boosting is an approach to machine learning based on the idea of creating a highly accurate predictor by combining many weak and inaccurate "rules of thumb." A remarkably rich theory has evolved around boosting, with connections to a range of topics, including statistics, game theory, convex optimization, and information geometry. Boosting algorithms have also enjoyed practical success in such fields as biology, vision, and speech processing. At various times in its history, boosting has been perceived as mysterious, controversial, even paradoxical.

  10. Genetic Algorithm for Optimization: Preprocessor and Algorithm

    Sen, S. K.; Shaykhian, Gholam A.


    A genetic algorithm (GA), inspired by Darwin's theory of evolution and employed to solve optimization problems, unconstrained or constrained, uses an evolutionary process. A GA has several parameters such as the population size, search space, crossover and mutation probabilities, and fitness criterion. These parameters are not universally known or determined a priori for all problems. Depending on the problem at hand, these parameters need to be chosen such that the resulting GA performs best. We present here a preprocessor that achieves just that, i.e., it determines, for a specified problem, the foregoing parameters so that the consequent GA is best suited for the problem. We also stress the need for such a preprocessor, both for quality (error) and for cost (complexity) in producing the solution. The preprocessor includes, as its first step, making use of all available information, such as the nature and character of the function or system, the search space, physical or laboratory experimentation (if already done or available), and the physical environment. It also includes information that can be generated through any means - deterministic, nondeterministic, or graphical. Instead of attempting a solution of the problem straightaway through a GA without using knowledge of the character of the system, we can consciously do a much better job of producing a solution by using the information generated in this first step of the preprocessor. We therefore unstintingly advocate the use of a preprocessor to solve a real-world optimization problem, including NP-complete ones, before using the statistically most appropriate GA. We also include such a GA for unconstrained function optimization problems.

  11. Special Issue on Graph Algorithms


    This special issue of Algorithms is devoted to the design and analysis of algorithms for solving combinatorial problems of a theoretical or practical nature involving graphs, with a focus on computational complexity.

  12. Iterative Algorithms for Nonexpansive Mappings

    Yao Yonghong


    Full Text Available We suggest and analyze two new iterative algorithms for a nonexpansive mapping in Banach spaces. We prove that the proposed iterative algorithms converge strongly to some fixed point of the mapping.
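
    For orientation only, a classical iteration of this kind (not necessarily one of the two schemes proposed in the paper) is the Krasnoselskii-Mann scheme for a nonexpansive mapping T, written below; Halpern-type variants that anchor the iterate to a fixed vector u are the usual route to strong rather than weak convergence.

        \[
          x_{n+1} = (1 - \alpha_n)\, x_n + \alpha_n T x_n, \qquad \alpha_n \in (0,1),
        \]
        \[
          x_{n+1} = \alpha_n u + (1 - \alpha_n) T x_n
          \quad \text{(Halpern-type, with } \alpha_n \to 0,\ \textstyle\sum_n \alpha_n = \infty\text{)}.
        \]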

  13. Building Better Nurse Scheduling Algorithms

    Aickelin, Uwe


    The aim of this research is twofold: Firstly, to model and solve a complex nurse scheduling problem with an integer programming formulation and evolutionary algorithms. Secondly, to detail a novel statistical method of comparing and hence building better scheduling algorithms by identifying successful algorithm modifications. The comparison method captures the results of algorithms in a single figure that can then be compared using traditional statistical techniques. Thus, the proposed method of comparing algorithms is an objective procedure designed to assist in the process of improving an algorithm. This is achieved even when some results are non-numeric or missing due to infeasibility. The final algorithm outperforms all previous evolutionary algorithms, which relied on human expertise for modification.

  14. Algorithms for builder guidelines

    Balcomb, J D; Lekov, A B


    The Builder Guidelines are designed to make simple, appropriate guidelines available to builders for their specific localities. Builders may select from passive solar and conservation strategies with different performance potentials. They can then compare the calculated results for their particular house design with a typical house in the same location. Algorithms used to develop the Builder Guidelines are described. The main algorithms used are the monthly solar ratio (SLR) method for winter heating, the diurnal heat capacity (DHC) method for temperature swing, and a new simplified calculation method (McCool) for summer cooling. This paper applies the algorithms to estimate the performance potential of passive solar strategies, and the annual heating and cooling loads of various combinations of conservation and passive solar strategies. The basis of the McCool method is described. All three methods are implemented in a microcomputer program used to generate the guideline numbers. Guidelines for Denver, Colorado, are used to illustrate the results. The structure of the guidelines and worksheet booklets are also presented. 5 refs., 3 tabs.

  15. The Hip Restoration Algorithm

    Stubbs, Allston Julius; Atilla, Halis Atil


    Summary Background Despite the rapid advancement of imaging and arthroscopic techniques about the hip joint, missed diagnoses are still common. Because the hip is a deep joint, localization of hip symptoms is more difficult than for the shoulder and knee joints. Hip pathology is not easily isolated and is often related to intra- and extra-articular abnormalities. In light of these diagnostic challenges, we recommend an algorithmic approach to effectively diagnose and treat hip pain. Methods In this review, hip pain is evaluated from diagnosis to treatment in a clear decision model. First we discuss emergency hip situations, followed by the differentiation of intra- and extra-articular causes of hip pain. We classify intra-articular hip pain as arthritic or non-arthritic, and extra-articular pain as generated by surrounding or remote tissue. Further, extra-articular hip pain is evaluated according to pain location. Finally we summarize the surgical treatment approach with an algorithmic diagram. Conclusion Diagnosis of hip pathology is difficult because the etiologies of pain are varied. An algorithmic approach to hip restoration from diagnosis to rehabilitation is crucial to successfully identify and manage hip pathologies. Level of evidence: V. PMID:28066734

  16. Large scale tracking algorithms

    Hansen, Ross L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Joshua Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melgaard, David Kennett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Karelitz, David B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pitts, Todd Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zollweg, Joshua David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Dylan Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nandy, Prabal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitlow, Gary L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bender, Daniel A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond Harry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)


    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.

  17. Multipurpose audio watermarking algorithm

    Ning CHEN; Jie ZHU


    To make audio watermarking accomplish both copyright protection and content authentication with localization, a novel multipurpose audio watermarking scheme is proposed in this paper. The zero-watermarking idea is introduced into the design of robust watermarking algorithm to ensure the transparency and to avoid the interference between the robust watermark and the semi-fragile watermark. The property of natural audio that the VQ indices of DWT-DCT coefficients among neighboring frames tend to be very similar is utilized to extract essential feature from the host audio, which is then used for watermark extraction. And, the chaotic mapping based semi-fragile watermark is embedded in the detail wavelet coefficients based on the instantaneous mixing model of the independent component analysis (ICA) system. Both the robust and semi-fragile watermarks can be extracted blindly and the semi-fragile watermarking algorithm can localize the tampering accurately. Simulation results demonstrate the effectiveness of our algorithm in terms of transparency, security, robustness and tampering localization ability.

  18. RADFLO physics and algorithms

    Symbalisty, E.M.D.; Zinn, J.; Whitaker, R.W.


    This paper describes the history, physics, and algorithms of the computer code RADFLO and its extension HYCHEM. RADFLO is a one-dimensional, radiation-transport hydrodynamics code that is used to compute early-time fireball behavior for low-altitude nuclear bursts. The primary use of the code is the prediction of optical signals produced by nuclear explosions. It has also been used to predict thermal and hydrodynamic effects that are used for vulnerability and lethality applications. Another closely related code, HYCHEM, is an extension of RADFLO which includes the effects of nonequilibrium chemistry. Some examples of numerical results will be shown, along with scaling expressions derived from those results. We describe new computations of the structures and luminosities of steady-state shock waves and radiative thermal waves, which have been extended to cover a range of ambient air densities for high-altitude applications. We also describe recent modifications of the codes to use a one-dimensional analog of the CAVEAT fluid-dynamics algorithm in place of the former standard Richtmyer-von Neumann algorithm.

  19. Large scale tracking algorithms.

    Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett; Karelitz, David B.; Pitts, Todd Alan; Zollweg, Joshua David; Anderson, Dylan Z.; Nandy, Prabal; Whitlow, Gary L.; Bender, Daniel A.; Byrne, Raymond Harry


    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.

  20. A Novel Rule Induction Algorithm

    ZHENG Jianguo; LIU Fang; WANG Lei; JIAO Licheng


    Knowledge discovery in databases is concerned with extracting useful information from databases, and the immune algorithm is a biological-theory-based, globally searching algorithm. A specific immune algorithm is designed for discovering a few interesting, high-level prediction rules from databases, rather than discovering classification knowledge as is usual in the literature. Simulations show that this novel algorithm is able to improve the stability of the population, increase the holistic performance and extract rules with higher precision.

  1. Foundations of genetic algorithms 1991


    Foundations of Genetic Algorithms 1991 (FOGA 1) discusses the theoretical foundations of genetic algorithms (GA) and classifier systems.This book compiles research papers on selection and convergence, coding and representation, problem hardness, deception, classifier system design, variation and recombination, parallelization, and population divergence. Other topics include the non-uniform Walsh-schema transform; spurious correlations and premature convergence in genetic algorithms; and variable default hierarchy separation in a classifier system. The grammar-based genetic algorithm; condition

  2. Essential algorithms a practical approach to computer algorithms

    Stephens, Rod


    A friendly and accessible introduction to the most useful algorithms Computer algorithms are the basic recipes for programming. Professional programmers need to know how to use algorithms to solve difficult programming problems. Written in simple, intuitive English, this book describes how and when to use the most practical classic algorithms, and even how to create new algorithms to meet future needs. The book also includes a collection of questions that can help readers prepare for a programming job interview. Reveals methods for manipulating common data structures s

  3. Algorithms for Optimally Arranging Multicore Memory Structures

    Wei-Che Tseng


    Full Text Available As more processing cores are added to embedded systems processors, the relationships between cores and memories have more influence on the energy consumption of the processor. In this paper, we conduct fundamental research to explore the effects of memory sharing on energy in a multicore processor. We study the Memory Arrangement (MA) Problem. We prove that the general case of MA is NP-complete. We present an optimal algorithm for solving linear MA and optimal and heuristic algorithms for solving rectangular MA. On average, we can produce arrangements that consume 49% less energy than an all shared memory arrangement and 14% less energy than an all private memory arrangement for randomly generated instances. For DSP benchmarks, we can produce arrangements that, on average, consume 20% less energy than an all shared memory arrangement and 27% less energy than an all private memory arrangement.

  4. Research on Iris Region Localization Algorithms

    Pravin S.Patil


    Full Text Available Iris recognition is a biometric technique that offers premium performance. Iris localization is critical to the success of an iris recognition system, since data that is falsely represented as iris pattern data will corrupt the biometric templates generated, resulting in poor recognition rates. Different algorithms for iris localization have been proposed so far. This paper explores four efficient methods for iris localization: three that localize the iris in circular form and one that unwraps the iris onto a flat bed. Experimental results are reported to evaluate the performance of every implemented algorithm. Conclusions based on these comparisons provide significant information for further research. The CASIA and UPOL databases of iris images have been used for the implementation of iris localization. General terms: Biometrics, Iris Recognition, Iris Localization

  5. Trajectory metaheuristic algorithms to optimize combinatorial problems

    Natalia Alancay


    Full Text Available The application of metaheuristic algorithms to optimization problems has been very important during the last decades. The main advantage of these techniques is their flexibility and robustness, which allows them to be applied to a wide range of problems. In this work we concentrate on trajectory metaheuristics based on Simulated Annealing, Tabu Search and Variable Neighborhood Search, whose main characteristic is that they start from a single point and vary the current solution through exploration of its neighborhood, forming a trajectory. Using instances of the selected combinatorial problems, a computational experiment is carried out that illustrates the behavior of these algorithmic methods. The main objective of this work is to study and compare the results obtained by the selected trajectory metaheuristics when applied to a set of academic combinatorial optimization problems.
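
    As a concrete illustration of one such trajectory method, the sketch below runs a plain simulated-annealing loop on a small random travelling-salesman instance with a 2-opt neighbourhood; the instance, cooling schedule and parameters are illustrative assumptions, not those used in the paper's experiments.

        import math, random

        def tour_length(tour, pts):
            return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
                       for i in range(len(tour)))

        def simulated_annealing(pts, T0=10.0, cooling=0.995, steps=20000, seed=0):
            rng = random.Random(seed)
            tour = list(range(len(pts)))
            rng.shuffle(tour)
            cur_len = tour_length(tour, pts)
            best, best_len = tour[:], cur_len
            T = T0
            for _ in range(steps):
                i, j = sorted(rng.sample(range(len(pts)), 2))
                cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt move
                cand_len = tour_length(cand, pts)
                # accept improving moves always, worsening moves with Boltzmann probability
                if cand_len < cur_len or rng.random() < math.exp((cur_len - cand_len) / T):
                    tour, cur_len = cand, cand_len
                    if cur_len < best_len:
                        best, best_len = tour[:], cur_len
                T *= cooling                                           # geometric cooling
            return best, best_len

        pts = [(random.random(), random.random()) for _ in range(20)]
        print(simulated_annealing(pts)[1])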

  6. Tate's algorithm and F-theory

    Katz, Sheldon; Morrison, David R.; Schäfer-Nameki, Sakura; Sully, James


    The "Tate forms" for elliptically fibered Calabi-Yau manifolds are reconsidered in order to determine their general validity. We point out that there were some implicit assumptions made in the original derivation of these "Tate forms" from the Tate algorithm. By a careful analysis of the Tate algorithm itself, we deduce that the "Tateforms" (without any futher divisiblity assumptions) do not hold in some instances and have to be replaced by a new type of ansatz. Furthermore, we give examples in which the existence of a "Tate form" can be globally obstructed, i.e., the change of coordinates does not extend globally to sections of the entire base of the elliptic fibration. These results have implications both for model-building and for the exploration of the landscape of F-theory vacua.

  7. Reputation distribution algorithm in P2P network

    吴慧婷; 郭亚军; 王亮


    In reputation-based trust models, how a node obtains the reputation information it needs is a key problem. This paper proposes an ant-based algorithm for reputation distribution. By modeling the foraging behavior of ant colonies, the algorithm distributes reputation information in a P2P environment without centralized control, solving the problem of obtaining the reputation information needed to build trust relationships. Theoretical analysis and simulation experiments show that the algorithm enables nodes to select the best path for obtaining reputation information, reduces the system load, and ensures the reliability and security of reputation acquisition.
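
    To make the ant-based mechanism concrete, here is a minimal pheromone sketch: ants repeatedly choose between two candidate paths in proportion to pheromone, pheromone evaporates, and deposits are inversely proportional to path cost, so the better path is gradually reinforced. The two-path graph and constants are hypothetical, and the paper's reputation-specific protocol is not reproduced.

        import random

        paths = {"short": 2.0, "long": 5.0}           # path -> cost (e.g. hop count)
        pheromone = {p: 1.0 for p in paths}
        rho, Q = 0.1, 1.0                             # evaporation rate, deposit constant

        for _ in range(200):                          # 200 ants, one after another
            total = sum(pheromone.values())
            r, chosen = random.random() * total, None
            for p, tau in pheromone.items():          # roulette-wheel choice by pheromone
                r -= tau
                if r <= 0:
                    chosen = p
                    break
            chosen = chosen or "long"                 # floating-point fallback
            for p in pheromone:                       # evaporation on every path
                pheromone[p] *= (1 - rho)
            pheromone[chosen] += Q / paths[chosen]    # deposit inversely proportional to cost

        print(pheromone)                              # the short path accumulates more pheromone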

  8. Continuous Media Tasks Scheduling Algorithm

    Myungryun Yoo


    Full Text Available In this paper the authors propose a modified proportional share scheduling algorithm that takes into account the characteristics of continuous media, such as continuity and time dependency. The proposed scheduling algorithm shows graceful degradation of performance in overloaded situations and decreases the number of context switches. It is evaluated using several numerical tests under various conditions, especially overload.

  9. Grammar Rules as Computer Algorithms.

    Rieber, Lloyd


    One college writing teacher engaged his class in the revision of a computer program to check grammar, focusing on improvement of the algorithms for identifying inappropriate uses of the passive voice. Process and problems of constructing new algorithms, effects on student writing, and other algorithm applications are discussed. (MSE)

  10. Polyceptron: A Polyhedral Learning Algorithm

    Manwani, Naresh


    In this paper we propose a new algorithm for learning polyhedral classifiers, which we call Polyceptron. It is a Perceptron-like algorithm which updates the parameters only when the current classifier misclassifies any training data. We give both batch and online versions of the Polyceptron algorithm. Finally we give experimental results to show the effectiveness of our approach.
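
    The "update only on misclassification" rule is the same one used by the classical perceptron, sketched below for reference; the polyhedral (multi-hyperplane) machinery of Polyceptron itself is not reproduced, and the toy data are illustrative.

        import numpy as np

        def perceptron(X, y, epochs=50):
            """Plain perceptron: parameters change only when an example is misclassified."""
            w = np.zeros(X.shape[1])
            b = 0.0
            for _ in range(epochs):
                for xi, yi in zip(X, y):           # labels in {-1, +1}
                    if yi * (xi @ w + b) <= 0:     # misclassified (or on the boundary)
                        w += yi * xi
                        b += yi
            return w, b

        X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
        y = np.array([1, 1, -1, -1])
        print(perceptron(X, y))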

  11. Multisensor estimation: New distributed algorithms

    K. N. Plataniotis


    Full Text Available The multisensor estimation problem is considered in this paper. New distributed algorithms, which are able to locally process the information and which deliver identical results to those generated by their centralized counterparts are presented. The algorithms can be used to provide robust and computationally efficient solutions to the multisensor estimation problem. The proposed distributed algorithms are theoretically interesting and computationally attractive.

  12. Comparison of Text Categorization Algorithms

    SHI Yong-feng; ZHAO Yan-ping


    This paper summarizes several automatic text categorization algorithms in common use recently, and analyzes and compares their advantages and disadvantages. It provides clues for choosing appropriate automatic classification algorithms in different fields. Finally, some evaluations and summaries of these algorithms are given, and directions for further research are pointed out.

  13. Sandbox algorithm for multifractal analysis of complex networks

    Liu, Jin-Long; Anh, Vo


    Complex networks have attracted much attention in diverse areas of science and technology. Multifractal analysis (MFA) is a useful way to systematically describe the spatial heterogeneity of both theoretical and experimental fractal patterns. In this paper, we introduce a new algorithm --- the sandbox (SB) algorithm, for MFA of complex networks. First we compare the SB algorithm with two existing algorithms of MFA for complex networks: the compact-box-burning (CBB) algorithm proposed by Furuya and Yakubo ( Phys. Rev. E, 84 (2011) 036118), and the improved box-counting (BC) algorithm proposed by Li et al. ( J. Stat. Mech.: Theor. Exp., 2014 (2014) P02020) by calculating the mass exponents tau(q) of some deterministic model networks. We make a detailed comparison between the numerical and theoretical results of these model networks. The comparison results show that the SB algorithm is the most effective and feasible algorithm to calculate the mass exponents tau(q) and to explore the multifractal behavior of com...

  14. Opposite Degree Algorithm and Its Applications

    Xiao-Guang Yue


    The opposite (Opposite Degree, referred to as OD) algorithm is an intelligent algorithm proposed by Yue Xiaoguang et al. Opposite degree algorithm is mainly based on the concept of opposite degree, combined with the idea of design of neural network and genetic algorithm and clustering analysis algorithm. The OD algorithm is divided into two sub algorithms, namely: opposite degree - numerical computation (OD-NC) algorithm and opposite degree - Classification computation (OD-CC) algorithm.

  15. Opposite Degree Algorithm and Its Applications

    Xiao-Guang Yue


    Full Text Available The opposite (Opposite Degree, referred to as OD) algorithm is an intelligent algorithm proposed by Yue Xiaoguang et al. Opposite degree algorithm is mainly based on the concept of opposite degree, combined with the idea of design of neural network and genetic algorithm and clustering analysis algorithm. The OD algorithm is divided into two sub algorithms, namely: opposite degree - numerical computation (OD-NC) algorithm and opposite degree - Classification computation (OD-CC) algorithm.

  16. Exploring the Hamiltonian inversion landscape.

    Donovan, Ashley; Rabitz, Herschel


    The identification of quantum system Hamiltonians through the use of experimental data remains an important research goal. Seeking a Hamiltonian that is consistent with experimental measurements constitutes an excursion over a Hamiltonian inversion landscape, which is the quality of reproducing the data as a function of the Hamiltonian parameters. Recent theoretical work showed that with sufficient experimental data there should be local convexity about the true Hamiltonian on the landscape. The present paper builds on this result and performs simulations to test whether such convexity is observed. A gradient-based Hamiltonian search algorithm is incorporated into an inversion routine as a means to explore the local inversion landscape. The simulations consider idealized noise-free as well as noise-ridden experimental data. The results suggest that a sizable convex domain exists about the true Hamiltonian, even with a modest amount of experimental data and in the presence of a reasonable level of noise.

  17. Exploration of Periodically Varying Graphs

    Flocchini, Paola; Santoro, Nicola


    We study the computability and complexity of the exploration problem in a class of highly dynamic graphs: periodically varying (PV) graphs, where the edges exist only at some (unknown) times defined by the periodic movements of carriers. These graphs naturally model highly dynamic infrastructure-less networks such as public transports with fixed timetables, low earth orbiting (LEO) satellite systems, security guards' tours, etc. We establish necessary conditions for the problem to be solved. We also derive lower bounds on the amount of time required in general, as well as for the PV graphs defined by restricted classes of carriers movements: simple routes, and circular routes. We then prove that the limitations on computability and complexity we have established are indeed tight. In fact we prove that all necessary conditions are also sufficient and all lower bounds on costs are tight. We do so constructively presenting two worst case optimal solution algorithms, one for anonymous systems, and one for those w...

  18. Selfish Gene Algorithm Vs Genetic Algorithm: A Review

    Ariff, Norharyati Md; Khalid, Noor Elaiza Abdul; Hashim, Rathiah; Noor, Noorhayati Mohamed


    Evolutionary algorithms are algorithms inspired by nature. Within little more than a decade, hundreds of papers have reported successful applications of EAs. This paper considers the Selfish Gene Algorithm (SFGA), one of the more recent evolutionary algorithms (EAs), inspired by the Selfish Gene Theory, an interpretation of Darwinian ideas by the biologist Richard Dawkins in 1989. Following a brief introduction to the Selfish Gene Algorithm (SFGA), the chronology of its evolution is presented. The purpose of this paper is to present an overview of the concepts of the SFGA as well as its opportunities and challenges. Accordingly, the history and the steps involved in the algorithm are discussed, and its different applications, together with an analysis of these applications, are evaluated.

  19. Polygon Exploration with Discrete Vision

    Fekete, Sandor P


    With the advent of autonomous robots with two- and three-dimensional scanning capabilities, classical visibility-based exploration methods from computational geometry have gained in practical importance. However, real-life laser scanning of useful accuracy does not allow the robot to scan continuously while in motion; instead, it has to stop each time it surveys its environment. This requirement was studied by Fekete, Klein and Nuechter for the subproblem of looking around a corner, but until now has not been considered in an online setting for whole polygonal regions. We give the first algorithmic results for this important algorithmic problem that combines stationary art gallery-type aspects with watchman-type issues in an online scenario: We demonstrate that even for orthoconvex polygons, a competitive strategy can be achieved only for limited aspect ratio A (the ratio of the maximum and minimum edge length of the polygon), i.e., for a given lower bound on the size of an edge; we give a matching upper boun...

  20. 2D multi-objective placement algorithm for free-form components

    Jacquenot, Guillaume; Maisonneuve, Jean-Jacques; Wenger, Philippe


    This article presents a generic method to solve the 2D multi-objective placement problem for free-form components. The proposed method is a relaxed placement technique combined with a hybrid algorithm based on a genetic algorithm and a separation algorithm. The genetic algorithm is used as a global optimizer and is in charge of efficiently exploring the search space. The separation algorithm is used to legalize solutions proposed by the global optimizer, so that placement constraints are satisfied. A test case illustrates the application of the proposed method. Extensions for solving the 3D problem are given at the end of the article.


    A. Paulin Florence


    Full Text Available Cloud computing is a model that aims at streamlining the on-demand provisioning of software, hardware and data as services and providing end-users with flexible and scalable services accessible through the Internet. The main objective of the proposed approach is to maximize resource utilization and provide a well-balanced load among all the resources in cloud servers. Initially, a load model of every resource is derived based on several factors such as memory usage, processing time and access rate. Based on the newly derived load index, the current load is computed for all the resources shared in the virtual machines of the cloud servers. Once the load index is computed for all the resources, a load balancing operation is initiated to use the resources effectively and dynamically by assigning resources to the corresponding nodes so as to reduce the load value. Assigning resources to the proper nodes is an optimal distribution problem, so optimization algorithms such as the genetic algorithm and the modified genetic algorithm have been utilized for load balancing. These algorithms are not very effective at providing neighbour solutions, since they do not overcome the exploration-exploitation problem. Utilizing a more effective optimization procedure in place of the genetic algorithm, which is a traditional and old algorithm, can therefore lead to better load balancing. Accordingly, a recent optimization algorithm, called the firefly algorithm, is utilized to perform the load balancing operation in the proposed work (see the sketch below). At first, the index table is maintained by considering the availability of virtual servers and the sequence of requests. Then, the load index is computed based on the newly derived formulae. Based on the load index, the load balancing operation is carried out using the firefly algorithm. The performance analysis produced the expected results and thus proved that the proposed approach is efficient in optimizing schedules by balancing the
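
    A minimal continuous firefly loop is sketched below to show the mechanism (attractiveness decaying with distance, movement toward brighter fireflies, a shrinking random walk); the toy objective spreads a unit load over four hypothetical servers and is not the paper's index-table-based formulation.

        import numpy as np

        def firefly_minimize(obj, dim, n_fireflies=15, iters=100,
                             beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
            rng = np.random.default_rng(seed)
            X = rng.uniform(0.0, 1.0, size=(n_fireflies, dim))   # candidate solutions
            fitness = np.array([obj(x) for x in X])
            for _ in range(iters):
                for i in range(n_fireflies):
                    for j in range(n_fireflies):
                        if fitness[j] < fitness[i]:               # j is "brighter" (better)
                            r2 = np.sum((X[i] - X[j]) ** 2)
                            beta = beta0 * np.exp(-gamma * r2)    # attractiveness decays with distance
                            X[i] += beta * (X[j] - X[i]) + alpha * (rng.random(dim) - 0.5)
                            fitness[i] = obj(X[i])
                alpha *= 0.97                                     # gradually reduce the random walk
            best = np.argmin(fitness)
            return X[best], fitness[best]

        # toy objective: spread a total load of 1.0 across 4 servers as evenly as possible
        def load_imbalance(x):
            shares = np.abs(x) / (np.abs(x).sum() + 1e-12)
            return np.var(shares)

        print(firefly_minimize(load_imbalance, dim=4))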

  2. Exploration and Mining Roadmap



    This Exploration and Mining Technology Roadmap represents the third roadmap for the Mining Industry of the Future. It is based upon the results of the Exploration and Mining Roadmap Workshop held May 10-11, 2001.


    Narendran Rajagopalan


    Full Text Available The performance of a wireless LAN can be improved at each layer of the protocol stack with respect to energy efficiency. The Media Access Control layer is responsible for key functions like access control and flow control. During contention, a backoff algorithm is used to gain access to the medium with minimum probability of collision. After studying the different variations of backoff algorithms that have been proposed, a new variant called the History based Probabilistic Backoff algorithm is proposed. Through mathematical analysis and simulation results using NS-2, it is seen that the proposed History based Probabilistic Backoff algorithm performs better than the Binary Exponential Backoff algorithm.
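
    For reference, the Binary Exponential Backoff baseline mentioned above can be sketched in a few lines; the contention-window values are illustrative, and the proposed history-based probabilistic variant is not specified here, so it is not reproduced.

        import random

        def beb_slots(collisions, cw_min=16, cw_max=1024):
            """Pick a random backoff slot count after `collisions` successive collisions."""
            cw = min(cw_min * (2 ** collisions), cw_max)   # contention window doubles per collision
            return random.randrange(cw)

        # example: backoff slots drawn after 0, 1, 2, 3 successive collisions
        print([beb_slots(k) for k in range(4)])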

  4. Perceptual hashing algorithms benchmark suite

    Zhang Hui; Schmucker Martin; Niu Xiamu


    Numerous perceptual hashing algorithms have been developed for identification and verification of multimedia objects in recent years. Many application schemes have been adopted for various commercial objects. Developers and users are looking for a benchmark tool to compare and evaluate their current algorithms or technologies. In this paper, a novel benchmark platform is presented. PHABS provides an open framework and lets its users define their own test strategy, perform tests, collect and analyze test data. With PHABS, various performance parameters of algorithms can be tested, and different algorithms or algorithms with different parameters can be evaluated and compared easily.

  5. Fast Algorithms for Model-Based Diagnosis

    Fijany, Amir; Barrett, Anthony; Vatan, Farrokh; Mackey, Ryan


    Two improved new methods for automated diagnosis of complex engineering systems involve the use of novel algorithms that are more efficient than prior algorithms used for the same purpose. Both the recently developed algorithms and the prior algorithms in question are instances of model-based diagnosis, which is based on exploring the logical inconsistency between an observation and a description of a system to be diagnosed. As engineering systems grow more complex and increasingly autonomous in their functions, the need for automated diagnosis increases concomitantly. In model-based diagnosis, the function of each component and the interconnections among all the components of the system to be diagnosed (for example, see figure) are represented as a logical system, called the system description (SD). Hence, the expected behavior of the system is the set of logical consequences of the SD. Faulty components lead to inconsistency between the observed behaviors of the system and the SD. The task of finding the faulty components (diagnosis) reduces to finding the components, the abnormalities of which could explain all the inconsistencies. Of course, the meaningful solution should be a minimal set of faulty components (called a minimal diagnosis), because the trivial solution, in which all components are assumed to be faulty, always explains all inconsistencies. Although the prior algorithms in question implement powerful methods of diagnosis, they are not practical because they essentially require exhaustive searches among all possible combinations of faulty components and therefore entail the amounts of computation that grow exponentially with the number of components of the system.
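
    The notion of a minimal diagnosis can be illustrated with a deliberately naive sketch that enumerates candidate component sets by increasing size and keeps those that intersect every conflict set derived from the inconsistencies; this is the exhaustive baseline that the improved methods avoid, and the component and conflict names below are hypothetical.

        from itertools import combinations

        def minimal_diagnoses(components, conflicts):
            """Candidates of increasing size that 'hit' every conflict set,
            skipping supersets of diagnoses already found (hence minimal)."""
            diagnoses = []
            for size in range(len(components) + 1):
                for cand in combinations(components, size):
                    cand_set = set(cand)
                    if any(d <= cand_set for d in diagnoses):
                        continue
                    if all(cand_set & conflict for conflict in conflicts):
                        diagnoses.append(cand_set)
            return diagnoses

        # hypothetical conflict sets derived from observed inconsistencies
        conflicts = [{"A1", "A2", "X1"}, {"A1", "M2", "X2"}]
        print(minimal_diagnoses(["A1", "A2", "M2", "X1", "X2"], conflicts))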

  6. Exploration cost-cutting

    Huttrer, J.


    This presentation by Jerry Huttrer, President, Geothermal Management Company, discusses the general state of exploration in the geothermal industry today, and mentions some ways to economize and perhaps reduce the costs of geothermal exploration in the future. He suggests increased use of satellite imagery in the mapping of geothermal resources and the identification of hot spots. Also, by coordinating with oil and gas exploration efforts, the efficiency of the exploration task could be improved.

  7. Impulse denoising using Hybrid Algorithm

    Ms.Arumugham Rajamani


    Full Text Available Many real-time images suffer from salt and pepper noise contamination due to poor illumination and environmental factors. Many filters and algorithms are used to remove salt and pepper noise from an image, but they also remove image information. This paper proposes a new, effective algorithm for detecting and removing salt and pepper noise. Existing standard algorithms like the Median Filter (MF), Weighted Median Filter (WMF), Standard Median Filter (SMF) and so on yield poor performance, particularly at high noise density. The suggested algorithm is compared with the above standard algorithms using the metrics Mean Square Error (MSE) and Peak Signal to Noise Ratio (PSNR). The proposed algorithm exhibits more competitive performance results at all noise densities. The joint sorting and diagonal averaging algorithm has lower computational time, better quantitative results and an improved qualitative result, with a better visual appearance, at all noise densities.
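
    As a baseline illustration only, the sketch below applies a simple switching median filter that replaces just the pixels flagged as salt (255) or pepper (0) with the median of their 3x3 neighbourhood; the proposed joint-sorting and diagonal-averaging scheme is not reproduced, and the test image is synthetic.

        import numpy as np

        def switching_median_filter(img):
            out = img.copy()
            padded = np.pad(img, 1, mode="edge")
            noisy = (img == 0) | (img == 255)          # candidate impulse pixels only
            for i, j in zip(*np.nonzero(noisy)):
                window = padded[i:i + 3, j:j + 3]      # 3x3 window centred on (i, j)
                out[i, j] = np.median(window)
            return out

        img = np.full((5, 5), 100, dtype=np.uint8)
        img[1, 2], img[3, 3] = 255, 0                  # inject salt and pepper
        print(switching_median_filter(img))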

  8. Algorithm performance evaluation

    Smith, Richard N.; Greci, Anthony M.; Bradley, Philip A.


    Traditionally, the performance of adaptive antenna systems is measured using automated antenna array pattern measuring equipment. This measurement equipment produces a plot of the receive gain of the antenna array as a function of angle. However, communications system users more readily accept and understand bit error rate (BER) as a performance measure. The work reported on here was conducted to characterize adaptive antenna receiver performance in terms of overall communications system performance using BER as a performance measure. The adaptive antenna system selected for this work featured a linear array, least mean square (LMS) adaptive algorithm and a high speed phase shift keyed (PSK) communications modem.

  9. Analysis of Virus Algorithms

    Jyoti Kalyani


    Full Text Available Security of wired and wireless networks is among the most challenging issues in today's computer world. The aim of this study was to give a brief introduction to viruses and worms, their creators, and the characteristics of the algorithms used by viruses. Both wired and wireless network viruses are elaborated. Viruses are also compared with the human immune system, and on the basis of this comparison four guidelines are given for detecting viruses so that more secure systems can be built. In concluding this study, it is found that security remains most challenging; thus, more secure models are required which automatically detect viruses and protect the system from their effects.

  10. Big data algorithms, analytics, and applications

    Li, Kuan-Ching; Yang, Laurence T; Cuzzocrea, Alfredo


    Data are generated at an exponential rate all over the world. Through advanced algorithms and analytics techniques, organizations can harness this data, discover hidden patterns, and use the findings to make meaningful decisions. Containing contributions from leading experts in their respective fields, this book bridges the gap between the vastness of big data and the appropriate computational methods for scientific and social discovery. It also explores related applications in diverse sectors, covering technologies for media/data communication, elastic media/data storage, cross-network media/

  11. Contrast data mining concepts, algorithms, and applications

    Dong, Guozhu


    A Fruitful Field for Researching Data Mining Methodology and for Solving Real-Life Problems Contrast Data Mining: Concepts, Algorithms, and Applications collects recent results from this specialized area of data mining that have previously been scattered in the literature, making them more accessible to researchers and developers in data mining and other fields. The book not only presents concepts and techniques for contrast data mining, but also explores the use of contrast mining to solve challenging problems in various scientific, medical, and business domains. Learn from Real Case Studies

  12. Asteroid exploration and utilization: The Hawking explorer

    Carlson, Alan; Date, Medha; Duarte, Manny; Erian, Neil; Gafka, George; Kappler, Peter; Patano, Scott; Perez, Martin; Ponce, Edgar; Radovich, Brian


    The Earth is nearing depletion of its natural resources at a time when human beings are rapidly expanding the frontiers of space. The resources which may exist on asteroids could have enormous potential for aiding and enhancing human space exploration as well as life on Earth. With the possibly limitless opportunities that exist, it is clear that asteroids are the next step for human existence in space. This report comprises the efforts of NEW WORLDS, Inc. to develop a comprehensive design for an asteroid exploration/sample return mission. This mission is a precursor to proof-of-concept missions that will investigate the validity of mining and materials processing on an asteroid. Project STONER (Systematic Transfer of Near Earth Resources) is based on two utilization scenarios: (1) moving an asteroid to an advantageous location for use by Earth; and (2) mining an asteroid and transporting raw materials back to Earth. The asteroid explorer/sample return mission is designed in the context of both scenarios and is the first phase of a long-range plan for humans to utilize asteroid resources. The report concentrates specifically on the selection of the most promising asteroids for exploration and the development of an exploration scenario. Future utilization as well as subsystem requirements of an asteroid sample return probe are also addressed.

  13. Online Planning Algorithm

    Rabideau, Gregg R.; Chien, Steve A.


    AVA v2 software selects goals for execution from a set of goals that oversubscribe shared resources. The term goal refers to a science or engineering request to execute a possibly complex command sequence, such as image targets or ground-station downlinks. Developed as an extension to the Virtual Machine Language (VML) execution system, the software enables onboard and remote goal triggering through the use of an embedded, dynamic goal set that can oversubscribe resources. From the set of conflicting goals, a subset must be chosen that maximizes a given quality metric, which in this case is strict priority selection. A goal can never be pre-empted by a lower priority goal, and high-level goals can be added, removed, or updated at any time, and the "best" goals will be selected for execution. The software addresses the issue of re-planning that must be performed in a short time frame by the embedded system where computational resources are constrained. In particular, the algorithm addresses problems with well-defined goal requests without temporal flexibility that oversubscribe available resources. By using a fast, incremental algorithm, goal selection can be postponed in a "just-in-time" fashion, allowing requests to be changed or added at the last minute, thereby enabling shorter response times and greater autonomy for the system under control.
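
    As a concrete illustration of the strict-priority selection described above, the sketch below greedily admits goals in priority order while a single shared resource budget lasts. The Goal fields, the single-resource model, and the sample values are illustrative assumptions for this sketch, not details of the AVA v2 software.

    from dataclasses import dataclass

    @dataclass
    class Goal:
        name: str
        priority: int        # lower value = higher priority (assumption for this sketch)
        resource_cost: float

    def select_goals(goals, budget):
        """Admit goals in strict priority order while the shared resource lasts;
        a lower-priority goal never displaces an already admitted higher-priority one."""
        selected, remaining = [], budget
        for g in sorted(goals, key=lambda g: g.priority):
            if g.resource_cost <= remaining:
                selected.append(g)
                remaining -= g.resource_cost
        return selected

    goals = [Goal("downlink", 1, 40.0), Goal("image_A", 2, 35.0), Goal("image_B", 3, 30.0)]
    print([g.name for g in select_goals(goals, budget=80.0)])   # ['downlink', 'image_A']

    Because the selection is recomputed greedily from the current goal set, goals can be added, removed, or re-prioritized at any time and the chosen subset stays consistent with the strict-priority rule.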

  14. Online Pairwise Learning Algorithms.

    Ying, Yiming; Zhou, Ding-Xuan


    Pairwise learning usually refers to a learning task that involves a loss function depending on pairs of examples, among which the most notable ones are bipartite ranking, metric learning, and AUC maximization. In this letter we study an online algorithm for pairwise learning with a least-square loss function in an unconstrained setting of a reproducing kernel Hilbert space (RKHS) that we refer to as the Online Pairwise lEaRning Algorithm (OPERA). In contrast to existing works (Kar, Sriperumbudur, Jain, & Karnick, 2013; Wang, Khardon, Pechyony, & Jones, 2012), which require that the iterates are restricted to a bounded domain or the loss function is strongly convex, OPERA is associated with a non-strongly convex objective function and learns the target function in an unconstrained RKHS. Specifically, we establish a general theorem that guarantees the almost sure convergence for the last iterate of OPERA without any assumptions on the underlying distribution. Explicit convergence rates are derived under the condition of polynomially decaying step sizes. We also establish an interesting property for a family of widely used kernels in the setting of pairwise learning and illustrate the convergence results using such kernels. Our methodology mainly depends on the characterization of RKHSs using their associated integral operators and probability inequalities for random variables with values in a Hilbert space.

  15. Algorithmic Relative Complexity

    Daniele Cerra


    Full Text Available Information content and compression are tightly related concepts that can be addressed through both classical and algorithmic information theories, on the basis of Shannon entropy and Kolmogorov complexity, respectively. The definition of several entities in Kolmogorov’s framework relies upon ideas from classical information theory, and these two approaches share many common traits. In this work, we expand the relations between these two frameworks by introducing algorithmic cross-complexity and relative complexity, counterparts of the cross-entropy and relative entropy (or Kullback-Leibler divergence) found in Shannon’s framework. We define the cross-complexity of an object x with respect to another object y as the amount of computational resources needed to specify x in terms of y, and the complexity of x related to y as the compression power which is lost when adopting such a description for x, compared to the shortest representation of x. Properties of analogous quantities in classical information theory hold for these new concepts. As these notions are incomputable, a suitable approximation based upon data compression is derived to enable the application to real data, yielding a divergence measure applicable to any pair of strings. Example applications are outlined, involving authorship attribution and satellite image classification, as well as a comparison to similar established techniques.
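
    The notions above are incomputable, and the paper approximates them through data compression. The sketch below computes the closely related and well-established normalized compression distance (NCD) with zlib as the compressor; it is offered as a same-spirit illustration of compression-based divergences between strings, not as the paper's own cross-complexity estimator.

    import zlib

    def c(data: bytes) -> int:
        """Compressed length in bytes, a rough stand-in for Kolmogorov complexity."""
        return len(zlib.compress(data, 9))

    def ncd(x: bytes, y: bytes) -> float:
        """Normalized compression distance: near 0 for very similar strings,
        around 1 for unrelated ones."""
        cx, cy, cxy = c(x), c(y), c(x + y)
        return (cxy - min(cx, cy)) / max(cx, cy)

    a = b"the quick brown fox jumps over the lazy dog " * 20
    b = b"the quick brown fox jumps over the lazy cat " * 20
    r = bytes(range(256)) * 4                             # an unrelated reference string
    print(round(ncd(a, b), 3), round(ncd(a, r), 3))       # the similar pair scores much lower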

  16. Fatigue evaluation algorithms: Review

    Passipoularidis, V.A.; Broendsted, P.


    A progressive damage fatigue simulator for variable amplitude loads named FADAS is discussed in this work. FADAS (Fatigue Damage Simulator) performs ply by ply stress analysis using classical lamination theory and implements adequate stiffness discount tactics based on the failure criterion of Puck, to model the degradation caused by failure events at ply level. Residual strength is incorporated as the fatigue damage accumulation metric. Once the typical fatigue and static properties of the constitutive ply are determined, the performance of an arbitrary lay-up under uniaxial and/or multiaxial load time series can be simulated. The predictions are validated against fatigue life data both from repeated block tests at a single stress ratio as well as against spectral fatigue using the WISPER, WISPERX and NEW WISPER load sequences on a Glass/Epoxy multidirectional laminate typical of a wind turbine rotor blade construction. Two versions of the algorithm, one using single-step and the other using incremental application of each load cycle (in case of ply failure), are implemented and compared. Simulation results confirm the ability of the algorithm to take into account load sequence effects. In general, FADAS performs well in predicting life under both spectral and block loading fatigue. (author)

  17. The Watershed Algorithm for Image Segmentation

    OU Yan; LIN Nan


    This article introduces the watershed algorithm for image segmentation and illustrates the segmentation process by implementing the algorithm. By comparing it with three other related algorithms, the article reveals both the advantages and drawbacks of the watershed algorithm.
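
    For readers who want to see the method in action, the snippet below runs a generic marker-based watershed on two overlapping disks, using SciPy and scikit-image (both assumed available); it illustrates the standard distance-transform-plus-markers recipe rather than the specific implementation discussed in the article.

    import numpy as np
    from scipy import ndimage as ndi
    from skimage.segmentation import watershed

    # Toy image: two overlapping disks that a plain threshold cannot separate.
    yy, xx = np.mgrid[0:80, 0:80]
    image = ((xx - 28) ** 2 + (yy - 40) ** 2 < 15 ** 2) | ((xx - 52) ** 2 + (yy - 40) ** 2 < 15 ** 2)

    # Distance to the background peaks at the two disk centres.
    distance = ndi.distance_transform_edt(image)

    # Crude markers: one label per region where the distance is near its maximum.
    markers, _ = ndi.label(distance > 0.7 * distance.max())

    # Flood the negated distance map from the markers; boundaries form along watershed lines.
    labels = watershed(-distance, markers, mask=image)
    print(np.unique(labels))   # background 0 plus one label per separated disk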

  18. Improved Quantum-Inspired Evolutionary Algorithm for Engineering Design Optimization

    Jinn-Tsong Tsai


    Full Text Available An improved quantum-inspired evolutionary algorithm is proposed for solving mixed discrete-continuous nonlinear problems in engineering design. The proposed Latin square quantum-inspired evolutionary algorithm (LSQEA) combines Latin squares and the quantum-inspired genetic algorithm (QGA). The novel contribution of the proposed LSQEA is the use of a QGA to explore the optimal feasible region in macrospace and the use of a systematic reasoning mechanism of the Latin square to exploit the better solution in microspace. By combining the advantages of exploration and exploitation, the LSQEA provides higher computational efficiency and robustness compared to QGA and real-coded GA when solving global numerical optimization problems with continuous variables. Additionally, the proposed LSQEA approach effectively solves mixed discrete-continuous nonlinear design optimization problems in which the design variables are integers, discrete values, and continuous values. The computational experiments show that the proposed LSQEA approach obtains better results compared to existing methods reported in the literature.

  19. The Role of Vertex Consistency in Sampling-based Algorithms for Optimal Motion Planning

    Arslan, Oktay


    Motion planning problems have been studied by both the robotics and the controls research communities for a long time, and many algorithms have been developed for their solution. Among them, incremental sampling-based motion planning algorithms, such as the Rapidly-exploring Random Trees (RRTs), and the Probabilistic Road Maps (PRMs) have become very popular recently, owing to their implementation simplicity and their advantages in handling high-dimensional problems. Although these algorithms work very well in practice, the quality of the computed solution is often not good, i.e., the solution can be far from the optimal one. A recent variation of RRT, namely the RRT* algorithm, bypasses this drawback of the traditional RRT algorithm, by ensuring asymptotic optimality as the number of samples tends to infinity. Nonetheless, the convergence rate to the optimal solution may still be slow. This paper presents a new incremental sampling-based motion planning algorithm based on Rapidly-exploring Random Graphs (RRG...

  20. Secure Computation, I/O-Efficient Algorithms and Distributed Signatures

    Damgård, Ivan Bjerre; Kölker, Jonas; Toft, Tomas


    adversary corrupting a constant fraction of the players and servers. Using packed secret sharing, the data can be stored in a compact way but will only be accessible in a block-wise fashion. We explore the possibility of using I/O-efficient algorithms to nevertheless compute on the data as efficiently...

  1. A Paper-and-Pencil gcd Algorithm for Gaussian Integers

    Szabo, Sandor


    As with natural numbers, a greatest common divisor of two Gaussian (complex) integers "a" and "b" is a Gaussian integer "d" that is a common divisor of both "a" and "b". This article explores an algorithm for such gcds that is easy to do by hand.
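
    A minimal Euclidean-style gcd for Gaussian integers is sketched below: the complex quotient is rounded to the nearest Gaussian integer so that each remainder has strictly smaller norm. This is a generic illustration of such an algorithm (and uses floating-point complex division, adequate only for modest operands); it is not claimed to be the exact paper-and-pencil procedure of the article.

    def gdiv(a: complex, b: complex) -> complex:
        """Quotient of Gaussian integers with both parts rounded to the nearest integer."""
        q = a / b
        return complex(round(q.real), round(q.imag))

    def ggcd(a: complex, b: complex) -> complex:
        """Euclidean algorithm: replace (a, b) by (b, a - q*b) until the remainder vanishes.
        The result is a greatest common divisor, unique only up to the units 1, -1, i, -i."""
        while b != 0:
            q = gdiv(a, b)
            a, b = b, a - q * b
        return a

    print(ggcd(complex(4, 2), complex(2, 4)))   # (-2+0j), i.e. 2 up to a unit factor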

  2. Learning JavaScript data structures and algorithms

    Groner, Loiane


    If you are a JavaScript developer or someone who has basic knowledge of JavaScript, and want to explore its optimum ability, this fast-paced book is definitely for you. Programming logic is the only thing you need to know to start having fun with algorithms.

  3. Optimal Fungal Space Searching Algorithms.

    Asenova, Elitsa; Lin, Hsin-Yu; Fu, Eileen; Nicolau, Dan V; Nicolau, Dan V


    Previous experiments have shown that fungi use an efficient natural algorithm for searching the space available for their growth in micro-confined networks, e.g., mazes. This natural "master" algorithm, which comprises two "slave" sub-algorithms, i.e., collision-induced branching and directional memory, has been shown to be more efficient than alternatives, with one, or the other, or both sub-algorithms turned off. In contrast, the present contribution compares the performance of the fungal natural algorithm against several standard artificial homologues. It was found that the space-searching fungal algorithm consistently outperforms uninformed algorithms, such as Depth-First-Search (DFS). Furthermore, while the natural algorithm is inferior to informed ones, such as A*, this under-performance does not importantly increase with the increase of the size of the maze. These findings suggest that a systematic effort of harvesting the natural space searching algorithms used by microorganisms is warranted and possibly overdue. These natural algorithms, if efficient, can be reverse-engineered for graph and tree search strategies.

  4. Routing Diverse Evacuees with the Cognitive Packet Network Algorithm

    Huibo Bi


    Full Text Available Regarding mobility, health conditions and personal preferences, evacuees can be categorized into different classes in realistic environments. Previous emergency navigation algorithms that direct evacuees with a single decision rule cannot fulfil civilians’ distinct service requirements and increase the likelihood of inducing destructive crowd behaviours, such as clogging, pushing and trampling, due to diverse mobility. This paper explores a distributed emergency navigation algorithm that employs the cognitive packet network concept to tailor different quality of service needs to different categories of evacuees. In addition, a congestion-aware algorithm is presented to predict the future congestion degree of a path with respect to the observed population density, arrival rate and service rate of each route segment. Experiments are implemented in a simulated environment populated with autonomous agents. Results show that our algorithm can increase the number of survivors while providing improved quality of service.

  5. Mobile transporter path planning using a genetic algorithm approach

    Baffes, Paul; Wang, Lui


    The use of an optimization technique known as a genetic algorithm for solving the mobile transporter path planning problem is investigated. The mobile transporter is a traveling robotic vehicle proposed for the Space Station which must be able to reach any point of the structure autonomously. Specific elements of the genetic algorithm are explored in both a theoretical and experimental sense. Recent developments in genetic algorithm theory are shown to be particularly effective in a path planning problem domain, though problem areas can be cited which require more research. However, trajectory planning problems are common in space systems and the genetic algorithm provides an attractive alternative to the classical techniques used to solve these problems.

  6. Phase unwrapping using an extrapolation-projection algorithm

    Marendic, Boris; Yang, Yongyi; Stark, Henry


    We explore an approach to the unwrapping of two-dimensional phase functions using a robust extrapolation-projection algorithm. Phase unwrapping is essential for imaging systems that construct the image from phase information. Unlike some existing methods where unwrapping is performed locally on a pixel-by-pixel basis, this work approaches the unwrapping problem from a global point of view. The unwrapping is done iteratively by a modification of the Gerchberg-Papoulis extrapolation algorithm, and the solution is refined by projecting onto the available global data at each iteration. Robustness of the algorithm is demonstrated through its performance in a noisy environment, and in comparison with a least-squares algorithm well-known in the literature.
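
    As background for readers unfamiliar with the problem, the snippet below shows the simple local one-dimensional unwrapping baseline (integrating phase differences after removing 2*pi jumps); it only illustrates what "unwrapping" means and is not the global extrapolation-projection algorithm of this work.

    import numpy as np

    true_phase = np.linspace(0.0, 6 * np.pi, 200)        # smooth, monotonically increasing phase
    wrapped = np.angle(np.exp(1j * true_phase))          # values folded into (-pi, pi]

    diffs = np.diff(wrapped)
    diffs -= 2 * np.pi * np.round(diffs / (2 * np.pi))   # undo the 2*pi jumps
    unwrapped = np.concatenate(([wrapped[0]], wrapped[0] + np.cumsum(diffs)))

    print(np.allclose(unwrapped, true_phase))            # True: recovered exactly in this easy case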

  7. A New Generalized Similarity-Based Topic Distillation Algorithm

    ZHOU Hongfang; DANG Xiaohui


    The procedure of hypertext induced topic search based on a semantic relation model is analyzed, and the cause of the topic drift of the HITS algorithm is shown to be that Web pages are projected onto a wrong latent semantic basis. A new concept, generalized similarity, is introduced and, based on this, a new topic distillation algorithm GSTDA (generalized similarity based topic distillation algorithm) is presented to improve the quality of topic distillation. GSTDA is applied not only to avoid topic drift, but also to explore topics related to the user query. The experimental results on 10 queries show that GSTDA reduces the topic drift rate by 10% to 58% compared to that of the HITS (hypertext induced topic search) algorithm, and discovers several related topics for queries that have multiple meanings.

  8. The theory of variational hybrid quantum-classical algorithms

    McClean, Jarrod R; Babbush, Ryan; Aspuru-Guzik, Alán


    Many quantum algorithms have daunting resource requirements when compared to what is available today. To address this discrepancy, a quantum-classical hybrid optimization scheme known as "the quantum variational eigensolver" was developed with the philosophy that even minimal quantum resources could be made useful when used in conjunction with classical routines. In this work we extend the general theory of this algorithm and suggest algorithmic improvements for practical implementations. Specifically, we develop a variational adiabatic ansatz and explore unitary coupled cluster where we establish a connection from second order unitary coupled cluster to universal gate sets through relaxation of exponential splitting. We introduce the concept of quantum variational error suppression that allows some errors to be suppressed naturally in this algorithm on a pre-threshold quantum device. Additionally, we analyze truncation and correlated sampling in Hamiltonian averaging as ways to reduce the cost of this proced...

  9. A fast butterfly algorithm for generalized Radon transforms

    Hu, Jingwei


    Generalized Radon transforms, such as the hyperbolic Radon transform, cannot be implemented as efficiently in the frequency domain as convolutions, thus limiting their use in seismic data processing. We have devised a fast butterfly algorithm for the hyperbolic Radon transform. The basic idea is to reformulate the transform as an oscillatory integral operator and to construct a blockwise lowrank approximation of the kernel function. The overall structure follows the Fourier integral operator butterfly algorithm. For 2D data, the algorithm runs in complexity O(N^2 log N), where N depends on the maximum frequency and offset in the data set and the range of parameters (intercept time and slowness) in the model space. From a series of studies, we found that this algorithm can be significantly more efficient than the conventional time-domain integration. © 2013 Society of Exploration Geophysicists.

  10. A Hierachical Evolutionary Algorithm for Multiobjective Optimization in IMRT

    Holdsworth, Clay; Liao, Jay; Phillips, Mark H


    Purpose: Current inverse planning methods for IMRT are limited because they are not designed to explore the trade-offs between the competing objectives between the tumor and normal tissues. Our goal was to develop an efficient multiobjective optimization algorithm that was flexible enough to handle any form of objective function and that resulted in a set of Pareto optimal plans. Methods: We developed a hierarchical evolutionary multiobjective algorithm designed to quickly generate a diverse Pareto optimal set of IMRT plans that meet all clinical constraints and reflect the trade-offs in the plans. The top level of the hierarchical algorithm is a multiobjective evolutionary algorithm (MOEA). The genes of the individuals generated in the MOEA are the parameters that define the penalty function minimized during an accelerated deterministic IMRT optimization that represents the bottom level of the hierarchy. The MOEA incorporates clinical criteria to restrict the search space through protocol objectives and then...

  11. Improving fundamental factors among correlation matching algorithms in underwater TANS

    Lin, Yi; Yan, Lei; Tong, Qingxi


    The TERCOM, ICP and TIEM algorithms, which all mathematically apply a correlation matching mode, have been developed for positioning in underwater Terrain-Aided Navigation Systems (TANS), but how to substantially improve their performance remains an open research question. Analyzing the characteristics of the terrain reference data distribution and of vehicles travelling underwater, we find that grid spacing and accumulation sequence are two decisive factors of underwater TANS. A modified Maximum a Posteriori (MAP) estimation algorithm (M-MAP), adapted from super-resolution image reconstruction, is then explored to implement interpolation and enhance the accuracy of depth determination at non-surveyed points, and a basic error mechanism model (EMM) based on the Mean Absolute Difference (MAD) algorithm is derived to reflect the relationship among the inner factors of underwater TANS. Simulation experiments indicate that adopting appropriate fundamental factors can effectively boost the navigation competence of underwater TANS based on the algorithms listed above.

  12. Algorithm aversion: people erroneously avoid algorithms after seeing them err.

    Dietvorst, Berkeley J; Simmons, Joseph P; Massey, Cade


    Research shows that evidence-based algorithms more accurately predict the future than do human forecasters. Yet when forecasters are deciding whether to use a human forecaster or a statistical algorithm, they often choose the human forecaster. This phenomenon, which we call algorithm aversion, is costly, and it is important to understand its causes. We show that people are especially averse to algorithmic forecasters after seeing them perform, even when they see them outperform a human forecaster. This is because people more quickly lose confidence in algorithmic than human forecasters after seeing them make the same mistake. In 5 studies, participants either saw an algorithm make forecasts, a human make forecasts, both, or neither. They then decided whether to tie their incentives to the future predictions of the algorithm or the human. Participants who saw the algorithm perform were less confident in it, and less likely to choose it over an inferior human forecaster. This was true even among those who saw the algorithm outperform the human.

  13. Exploration Blueprint: Data Book

    Drake, Bret G. (Editor)


    The material contained in this report was compiled to capture the work performed by the National Aeronautics and Space Administration's (NASA's) Exploration study team in the late 2002 timeframe. The "Exploration Blueprint Data Book" documents the analyses and findings of the 90-day Agency-wide study conducted from September - November 2002. During the summer of 2002, the NASA Deputy Administrator requested that a study be performed with the following objectives: (1) Develop the rationale for exploration beyond low-Earth orbit; (2) Develop roadmaps for how to accomplish the first steps through humans to Mars; (3) Develop design reference missions as a basis for the roadmaps; and (4) Make recommendations on what can be done now to effect this future. This planning team, termed the Exploration Blueprint, performed architecture analyses to develop roadmaps for how to accomplish the first steps beyond LEO through the human exploration of Mars. The previous NASA Exploration Team activities laid the foundation and framework for development of NASA's Integrated Space Plan. The reference missions resulting from the analysis performed by the Exploration Blueprint team formed the basis for requirement definition, systems development, technology roadmapping, and risk assessments for future human exploration beyond low-Earth orbit. Emphasis was placed on developing recommendations on what could be done now to effect future exploration activities. The Exploration Blueprint team embraced the "Stepping Stone" approach to exploration where human and robotic activities are conducted through progressive expansion outward beyond low-Earth orbit. Results from this study produced a long-term strategy for exploration with near-term implementation plans, program recommendations, and technology investments. Specific results included the development of a common exploration crew vehicle concept, a unified space nuclear strategy, focused bioastronautics research objectives, and an integrated human

  14. An overview of smart grid routing algorithms

    Wang, Junsheng; OU, Qinghai; Shen, Haijuan


    This paper summarizes typical routing algorithms in the smart grid by analyzing the communication services and communication requirements of the intelligent grid. The analysis focuses mainly on two typical kinds of routing algorithms, including clustering routing algorithms, and discusses the advantages, disadvantages, and applicability of each.

  15. Gas-Kinetic Computational Algorithms for Hypersonic Flows in Continuum and Transitional Regimes Project

    National Aeronautics and Space Administration — This SBIR Phase I project explores two gas-kinetic computational algorithms for simulation of hypersonic flows in both continuum and transitional regimes. One is the...

  16. Artificial Bee Colony Algorithm with Time-Varying Strategy

    Quande Qin


    Full Text Available Artificial bee colony (ABC) is one of the newest additions to the class of swarm intelligence. The ABC algorithm has been shown to be competitive with some other population-based algorithms. However, there is still an insufficiency in that ABC is good at exploration but poor at exploitation. To achieve a proper balance between these two conflicting factors, this paper proposes a novel ABC variant with a time-varying strategy in which the ratio between the number of employed bees and the number of onlooker bees varies with time. The linear and nonlinear time-varying strategies can be incorporated into the basic ABC algorithm, yielding the ABC-LTVS and ABC-NTVS algorithms, respectively. The effects of the added parameters in the two new ABC algorithms are also studied by solving some representative benchmark functions. The proposed ABC algorithm is a simple and easy modification to the structure of the basic ABC algorithm. Moreover, the proposed approach is general and can be incorporated in other ABC variants. A set of 21 benchmark functions in 30 and 50 dimensions is utilized in the experimental studies. The experimental results show the effectiveness of the proposed time-varying strategy.
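
    The time-varying ratio itself is simple to write down; the sketch below shows a linear and a power-law schedule for the employed-bee fraction of the colony. The endpoint values and the power-law exponent are illustrative assumptions, not the settings studied in the paper.

    def employed_ratio(t, t_max, r_start=0.9, r_end=0.5, nonlinear=False, p=2.0):
        """Fraction of the colony acting as employed bees at iteration t.
        The linear variant interpolates from r_start to r_end; the nonlinear
        variant follows a power-law schedule in normalized time."""
        frac = t / t_max
        if nonlinear:
            frac = frac ** p
        return r_start + (r_end - r_start) * frac

    colony_size = 40
    for t in (0, 50, 100):
        n_employed = round(colony_size * employed_ratio(t, 100))
        print(t, n_employed, colony_size - n_employed)   # employed vs. onlooker bees over time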

  17. Analysing Threshold Value in Fire Detection Algorithm Using MODIS Data

    Bowo E. Cahyono


    Full Text Available MODIS instruments have been designed to include special channels for fire monitoring by adding more spectral thermal band detectors. A basic understanding of remote sensing fire detection should be kept in mind in order to improve the algorithm for regional-scale detection purposes, and there is still considerable room for further exploration. This paper describes the principle of fire investigation applied to MODIS data. The main algorithm used in this research is the contextual algorithm developed by the NASA scientist team. Varying the applied T4 threshold in the range of 320-360 K shows that the set of detected fires changes significantly, while significant differences in detected FHS from changing the ∆T threshold occur in the range of 15-35 K. Improvement and adjustment of the fire detection algorithm are needed to obtain the best accuracy for local or regional conditions. The MOD14 algorithm applies threshold values of 325 K for T4 and 20 K for ∆T. Validation has been carried out on the algorithm results for MODIS datasets over Indonesia and South Africa. The accuracy of MODIS fire detection with the MOD14 algorithm is 73.2% and 91.7% for MODIS data over Sumatra-Borneo and South Africa, respectively.
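
    The fixed-threshold part of the test is easy to state in code. The sketch below flags a pixel as fire when T4 exceeds 325 K and T4 - T11 exceeds 20 K, i.e. the threshold values quoted above; the operational contextual algorithm additionally compares each pixel against background-window statistics, which is omitted here.

    import numpy as np

    def flag_fire_pixels(t4, t11, t4_thresh=325.0, dt_thresh=20.0):
        """Simplified absolute-threshold fire test: T4 hot enough and well above T11."""
        t4 = np.asarray(t4, dtype=float)
        t11 = np.asarray(t11, dtype=float)
        return (t4 > t4_thresh) & ((t4 - t11) > dt_thresh)

    t4  = np.array([[310.0, 330.0], [340.0, 320.0]])   # ~4 um brightness temperatures (K)
    t11 = np.array([[300.0, 305.0], [335.0, 300.0]])   # ~11 um brightness temperatures (K)
    print(flag_fire_pixels(t4, t11))
    # [[False  True]
    #  [False False]]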

  18. Fast search algorithms for computational protein design.

    Traoré, Seydou; Roberts, Kyle E; Allouche, David; Donald, Bruce R; André, Isabelle; Schiex, Thomas; Barbe, Sophie


    One of the main challenges in computational protein design (CPD) is the huge size of the protein sequence and conformational space that has to be computationally explored. Recently, we showed that state-of-the-art combinatorial optimization technologies based on Cost Function Network (CFN) processing allow speeding up provable rigid backbone protein design methods by several orders of magnitude. Building on this, we improved and injected CFN technology into the well-established CPD package Osprey to allow all Osprey CPD algorithms to benefit from the associated speedups. Because Osprey fundamentally relies on the ability of A* to produce conformations in increasing order of energy, we defined new A* strategies combining CFN lower bounds with a new side-chain positioning-based branching scheme. Beyond the speedups obtained in the new A*-CFN combination, this novel branching scheme enables a much faster enumeration of suboptimal sequences, far beyond what is reachable without it. Together with the immediate and important speedups provided by CFN technology, these developments directly benefit all the algorithms that previously relied on the DEE/A* combination inside Osprey and make it possible to solve larger CPD problems with provable algorithms.

  19. A Program Complexity Metric Based on Variable Usage for Algorithmic Thinking Education of Novice Learners

    Fuwa, Minori; Kayama, Mizue; Kunimune, Hisayoshi; Hashimoto, Masami; Asano, David K.


    We have explored educational methods for algorithmic thinking for novices and implemented a block programming editor and a simple learning management system. In this paper, we propose a program/algorithm complexity metric specified for novice learners. This metric is based on the variable usage in arithmetic and relational formulas in learner's…

  20. Sounds unheard of evolutionary algorithms as creative tools for the contemporary composer

    Dahlstedt, Palle


    Evolutionary algorithms are studied as tools for generating novel musical material in the form of musical scores and synthesized sounds. The choice of genetic representation defines a space of potential music. This space is explored using evolutionary algorithms, in search of useful musical mater...... composed with the tools described in the thesis are presented....

  1. Evaluating Data Assimilation Algorithms

    Law, K J H


    Data assimilation refers to methodologies for the incorporation of noisy observations of a physical system into an underlying model in order to infer the properties of the state of the system (and/or parameters). The model itself is typically subject to uncertainties, in the input data and in the physical laws themselves. This leads naturally to a Bayesian formulation in which the posterior probability distribution of the system state, given the observations, plays a central conceptual role. The aim of this paper is to use this Bayesian posterior probability distribution as a gold standard against which to evaluate various commonly used data assimilation algorithms. A key aspect of geophysical data assimilation is the high dimensionality of the computational model. With this in mind, yet with the goal of allowing an explicit and accurate computation of the posterior distribution in order to facilitate our evaluation, we study the 2D Navier-Stokes equations in a periodic geometry. We compute the posterior prob...

  2. Efficient Kriging Algorithms

    Memarsadeghi, Nargess


    More efficient versions of an interpolation method, called kriging, have been introduced in order to reduce its traditionally high computational cost. Written in C++, these approaches were tested on both synthetic and real data. Kriging is a best unbiased linear estimator and suitable for interpolation of scattered data points. Kriging has long been used in the geostatistic and mining communities, but is now being researched for use in the image fusion of remotely sensed data. This allows a combination of data from various locations to be used to fill in any missing data from any single location. To arrive at the faster algorithms, sparse SYMMLQ iterative solver, covariance tapering, Fast Multipole Methods (FMM), and nearest neighbor searching techniques were used. These implementations were used when the coefficient matrix in the linear system is symmetric, but not necessarily positive-definite.
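
    To make the interpolation step concrete, the sketch below assembles a tiny ordinary kriging system (a covariance matrix bordered by ones, with a Lagrange multiplier forcing the weights to sum to one) and solves it directly with NumPy. The Gaussian covariance model and the sample points are arbitrary choices for this sketch; the accelerated machinery mentioned in the record (SYMMLQ, covariance tapering, FMM, nearest-neighbor search) is not shown.

    import numpy as np

    def gaussian_cov(h, sill=1.0, rng=2.0):
        """Gaussian covariance model as a function of separation distance h."""
        return sill * np.exp(-(h / rng) ** 2)

    def ordinary_kriging(xy, z, xy0):
        """Kriged estimate at a single target point xy0 from samples (xy, z)."""
        n = len(z)
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)   # pairwise distances
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = gaussian_cov(d)
        A[n, n] = 0.0                                                  # Lagrange-multiplier corner
        b = np.ones(n + 1)
        b[:n] = gaussian_cov(np.linalg.norm(xy - xy0, axis=-1))
        w = np.linalg.solve(A, b)                                      # weights plus multiplier
        return float(w[:n] @ z)

    xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
    z = np.array([1.0, 2.0, 3.0])
    print(ordinary_kriging(xy, z, np.array([0.5, 0.5])))   # kriged estimate at (0.5, 0.5)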

  3. Trial encoding algorithms ensemble.

    Cheng, Lipin Bill; Yeh, Ren Jye


    This paper proposes trial algorithms for some basic components in cryptography and lossless bit compression. The symmetric encryption is accomplished by mixing up randomizations and scrambling, with hashing of the key playing an essential role. The digital signature is adapted from the Hill cipher, with the verification key matrices incorporating un-invertible parts to hide the signature matrix. The hash is a straight running summation (addition chain) of data bytes plus some randomization; one simplified version can serve as a burst error correcting code. The lossless bit compressor is Shannon-Fano coding, which is less optimal than the later Huffman and arithmetic coding but can be conveniently implemented without the use of a tree structure and can be improved with byte concatenation.
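
    The compressor mentioned last is easy to illustrate. The sketch below is a standard recursive Shannon-Fano coder: symbols are sorted by frequency and repeatedly split where the two halves' totals are most nearly equal, with no explicit tree structure. It is a generic textbook construction, not the authors' implementation.

    from collections import Counter

    def shannon_fano(pairs, prefix=""):
        """Assign codewords to (symbol, frequency) pairs sorted by descending frequency:
        find the split that best balances the two halves, give one half prefix '0' and
        the other '1', and recurse until single symbols remain."""
        if len(pairs) == 1:
            return {pairs[0][0]: prefix or "0"}
        total = sum(f for _, f in pairs)
        running, best_split, best_diff = 0, 1, float("inf")
        for i in range(1, len(pairs)):
            running += pairs[i - 1][1]
            diff = abs(total - 2 * running)
            if diff < best_diff:
                best_diff, best_split = diff, i
        codes = shannon_fano(pairs[:best_split], prefix + "0")
        codes.update(shannon_fano(pairs[best_split:], prefix + "1"))
        return codes

    text = "shannon fano coding example"
    pairs = sorted(Counter(text).items(), key=lambda kv: kv[1], reverse=True)
    codes = shannon_fano(pairs)
    encoded = "".join(codes[ch] for ch in text)
    print(len(encoded), "bits versus", 8 * len(text), "bits at one byte per character")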

  4. External memory algorithms

    Abello, James M


    The AMS and DIMACS are pleased to present this 50th volume in the DIMACS series. This series contains volumes coming out of programs at the Center for Discrete Mathematics and Theoretical Computer Science (DIMACS), which is headquartered at Rutgers University in partnership with Princeton University, AT&T Labs-Research, Bell Labs (Lucent Technologies), NEC Research Institute, and Telcordia Technologies (formerly Bellcore). The series includes topics on research and education concerning areas such as discrete and computational geometry, discrete optimization, data structures and algorithms, computational intractability, massive data sets, networks, graph theory, combinatorics, computational number theory and cryptology, discrete probability, recursive function theory and mathematical logic, Boolean functions, computational and mathematical biology, and computational algebra.

  5. Tau reconstruction and identification algorithm

    Raman Khurana


    CMS has developed sophisticated tau identification algorithms for tau hadronic decay modes. Production of tau leptons decaying to hadrons is studied at 7 TeV centre-of-mass energy with 2011 collision data collected by the CMS detector and has been used to measure the performance of the tau identification algorithms by measuring the identification efficiency and the misidentification rates from electrons, muons and hadronic jets. These algorithms enable extended reach for searches for the MSSM Higgs and other exotic particles.

  6. Algorithms over partially ordered sets

    Baer, Robert M.; Østerby, Ole


    in partially ordered sets, answer the combinatorial question of how many maximal chains might exist in a partially ordered set with n elements, and we give an algorithm for enumerating all maximal chains. We give (in § 3) algorithms which decide whether a partially ordered set is a (lower or upper) semi-lattice, and whether a lattice has distributive, modular, and Boolean properties. Finally (in § 4) we give Algol realizations of the various algorithms.
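
    As an illustration of the kind of enumeration treated here, the sketch below lists all maximal chains of a small finite poset by walking covering relations upward from the minimal elements. Representing the poset by an element list plus a "leq" predicate, and using the divisors of 12 as the example, are choices made for this sketch; the article itself gives Algol realizations.

    def maximal_chains(elements, leq):
        """Enumerate all maximal chains of a finite poset given by the order predicate leq."""
        def covers(a):
            # y covers a when a < y and nothing sits strictly between them.
            return [y for y in elements if leq(a, y) and y != a
                    and not any(leq(a, z) and leq(z, y) and z not in (a, y) for z in elements)]

        minimal = [x for x in elements if not any(leq(y, x) and y != x for y in elements)]
        chains = []

        def extend(chain):
            ups = covers(chain[-1])
            if not ups:
                chains.append(chain)
            for y in ups:
                extend(chain + [y])

        for m in minimal:
            extend([m])
        return chains

    # Divisibility order on the divisors of 12.
    print(maximal_chains([1, 2, 3, 4, 6, 12], lambda a, b: b % a == 0))
    # [[1, 2, 4, 12], [1, 2, 6, 12], [1, 3, 6, 12]]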

  7. Marine Mineral Exploration

    The past 20 years have seen extensive marine exploration work by the major industrialized countries. Studies have, in part, been concentrated on Pacific manganese nodule occurrences and on massive sulfides on mid-oceanic ridges. An international jurisdictional framework of the sea-bed mineral...... in EEZ areas are fairly unknown; many areas need detailed mapping and mineral exploration, and the majority of coastal or island states with large EEZ areas have little experience in exploration for marine hard minerals. This book describes the systematic steps in marine mineral exploration....... Such exploration requires knowledge of mineral deposits and models of their formation, of geophysical and geochemical exploration methods, and of data evaluation and interpretation methods. These topics are described in detail by an international group of authors. A short description is also given of marine...

  8. Visual explorer facilitator's guide

    Palus, Charles J


    Grounded in research and practice, the Visual Explorer™ Facilitator's Guide provides a method for supporting collaborative, creative conversations about complex issues through the power of images. The guide is available as a component in the Visual Explorer Facilitator's Letter-sized Set, Visual Explorer Facilitator's Post card-sized Set, Visual Explorer Playing Card-sized Set, and is also available as a stand-alone title for purchase to assist multiple tool users in an organization.

  9. A novel mating approach for genetic algorithms.

    Galán, Severino F; Mengshoel, Ole J; Pinter, Rafael


    Genetic algorithms typically use crossover, which relies on mating a set of selected parents. As part of crossover, random mating is often carried out. A novel approach to parent mating is presented in this work. Our novel approach can be applied in combination with a traditional similarity-based criterion to measure distance between individuals or with a fitness-based criterion. We introduce a parameter called the mating index that allows different mating strategies to be developed within a uniform framework: an exploitative strategy called best-first, an explorative strategy called best-last, and an adaptive strategy called self-adaptive. Self-adaptive mating is defined in the context of the novel algorithm, and aims to achieve a balance between exploitation and exploration in a domain-independent manner. The present work formally defines the novel mating approach, analyzes its behavior, and conducts an extensive experimental study to quantitatively determine its benefits. In the domain of real function optimization, the experiments show that, as the degree of multimodality of the function at hand grows, increasing the mating index improves performance. In the case of the self-adaptive mating strategy, the experiments give strong results for several case studies.
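
    One plausible way to read the mating index is as a rank selector on dissimilarity between the first parent and its candidate mates, with 0 giving the exploitative best-first choice and 1 the explorative best-last choice. The sketch below implements that reading for bit-string individuals; it is an illustration under that assumption, not the paper's exact operator, and the self-adaptive variant is not shown.

    import random

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    def choose_mate(parent, pool, mating_index):
        """Rank candidates by dissimilarity to the parent and pick the one whose rank
        corresponds to the mating index (0 = most similar, 1 = most dissimilar)."""
        ranked = sorted(pool, key=lambda c: hamming(parent, c))
        k = round(mating_index * (len(ranked) - 1))
        return ranked[k]

    random.seed(0)
    pool = [[random.randint(0, 1) for _ in range(10)] for _ in range(6)]
    parent = [0] * 10
    print("best-first:", choose_mate(parent, pool, 0.0))
    print("best-last: ", choose_mate(parent, pool, 1.0))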

  10. Preconditioned quantum linear system algorithm.

    Clader, B D; Jacobs, B C; Sprouse, C R


    We describe a quantum algorithm that generalizes the quantum linear system algorithm [Harrow et al., Phys. Rev. Lett. 103, 150502 (2009)] to arbitrary problem specifications. We develop a state preparation routine that can initialize generic states, show how simple ancilla measurements can be used to calculate many quantities of interest, and integrate a quantum-compatible preconditioner that greatly expands the number of problems that can achieve exponential speedup over classical linear systems solvers. To demonstrate the algorithm's applicability, we show how it can be used to compute the electromagnetic scattering cross section of an arbitrary target exponentially faster than the best classical algorithm.

  11. Quantum algorithm for data fitting.

    Wiebe, Nathan; Braun, Daniel; Lloyd, Seth


    We provide a new quantum algorithm that efficiently determines the quality of a least-squares fit over an exponentially large data set by building upon an algorithm for solving systems of linear equations efficiently [Harrow et al., Phys. Rev. Lett. 103, 150502 (2009)]. In many cases, our algorithm can also efficiently find a concise function that approximates the data to be fitted and bound the approximation error. In cases where the input data are pure quantum states, the algorithm can be used to provide an efficient parametric estimation of the quantum state and therefore can be applied as an alternative to full quantum-state tomography given a fault tolerant quantum computer.

  12. A Fast Fractional Difference Algorithm

    Jensen, Andreas Noack; Nielsen, Morten Ørregaard

    We provide a fast algorithm for calculating the fractional difference of a time series. In standard implementations, the calculation speed (number of arithmetic operations) is of order T^2, where T is the length of the time series. Our algorithm allows calculation speed of order T log T. For mode...
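
    A common way to reach O(T log T) here is to evaluate the truncated expansion of (1 - L)^d as one FFT-based convolution of the series with its binomial coefficients. The sketch below does that and checks the result against the direct O(T^2) sum; it illustrates the general FFT-convolution idea and is not claimed to reproduce the authors' exact implementation.

    import numpy as np

    def frac_diff_fft(x, d):
        """Fractional difference (1 - L)^d x via zero-padded FFT convolution."""
        T = len(x)
        k = np.arange(1, T)
        coeff = np.concatenate(([1.0], np.cumprod((k - 1 - d) / k)))   # pi_0, ..., pi_{T-1}
        n = 2 ** int(np.ceil(np.log2(2 * T - 1)))                      # padding for linear convolution
        return np.fft.irfft(np.fft.rfft(coeff, n) * np.fft.rfft(x, n), n)[:T]

    rng = np.random.default_rng(0)
    x = rng.standard_normal(512)
    d = 0.4
    k = np.arange(1, len(x))
    coeff = np.concatenate(([1.0], np.cumprod((k - 1 - d) / k)))
    direct = np.array([coeff[:t + 1][::-1] @ x[:t + 1] for t in range(len(x))])   # O(T^2) check
    print(np.allclose(frac_diff_fft(x, d), direct))   # True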

  13. Bayesian exploration for intelligent identification of textures

    Jeremy A. Fishel


    Full Text Available In order to endow robots with humanlike abilities to characterize and identify objects, they must be provided with tactile sensors and intelligent algorithms to select, control and interpret data from useful exploratory movements. Humans make informed decisions on the sequence of exploratory movements that would yield the most information for the task, depending on what the object may be and prior knowledge of what to expect from possible exploratory movements. This study is focused on texture discrimination, a subset of a much larger group of exploratory movements and percepts that humans use to discriminate, characterize, and identify objects. Using a testbed equipped with a biologically inspired tactile sensor (the BioTac®), we produced sliding movements similar to those that humans make when exploring textures. Measurement of tactile vibrations and reaction forces when exploring textures were used to extract measures of textural properties inspired from psychophysical literature (traction, roughness, and fineness). Different combinations of normal force and velocity were identified to be useful for each of these three properties. A total of 117 textures were explored with these three movements to create a database of prior experience to use for identifying these same textures in future encounters. When exploring a texture, the discrimination algorithm adaptively selects the optimal movement to make and property to measure based on previous experience to differentiate the texture from a set of plausible candidates, a process we call Bayesian exploration. Performance of 99.6% in correctly discriminating pairs of similar textures was found to exceed human capabilities. Absolute classification from the entire set of 117 textures generally required a small number of well-chosen exploratory movements (median=5) and yielded a 95.4% success rate. The method of Bayesian exploration developed and tested in this paper may generalize well to other

  14. Bayesian exploration for intelligent identification of textures.

    Fishel, Jeremy A; Loeb, Gerald E


    In order to endow robots with human-like abilities to characterize and identify objects, they must be provided with tactile sensors and intelligent algorithms to select, control, and interpret data from useful exploratory movements. Humans make informed decisions on the sequence of exploratory movements that would yield the most information for the task, depending on what the object may be and prior knowledge of what to expect from possible exploratory movements. This study is focused on texture discrimination, a subset of a much larger group of exploratory movements and percepts that humans use to discriminate, characterize, and identify objects. Using a testbed equipped with a biologically inspired tactile sensor (the BioTac), we produced sliding movements similar to those that humans make when exploring textures. Measurement of tactile vibrations and reaction forces when exploring textures were used to extract measures of textural properties inspired from psychophysical literature (traction, roughness, and fineness). Different combinations of normal force and velocity were identified to be useful for each of these three properties. A total of 117 textures were explored with these three movements to create a database of prior experience to use for identifying these same textures in future encounters. When exploring a texture, the discrimination algorithm adaptively selects the optimal movement to make and property to measure based on previous experience to differentiate the texture from a set of plausible candidates, a process we call Bayesian exploration. Performance of 99.6% in correctly discriminating pairs of similar textures was found to exceed human capabilities. Absolute classification from the entire set of 117 textures generally required a small number of well-chosen exploratory movements (median = 5) and yielded a 95.4% success rate. The method of Bayesian exploration developed and tested in this paper may generalize well to other cognitive problems.
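
    The movement-selection idea can be illustrated with a toy Bayesian model: maintain a posterior over candidate textures and prefer the exploratory movement whose measurement is expected to leave the least posterior entropy. The Gaussian measurement model, the candidate means, and the entropy criterion below are assumptions made for this sketch, not the paper's actual models of traction, roughness, and fineness.

    import numpy as np

    def bayes_update(prior, likelihoods):
        """Posterior over candidate textures after one measurement."""
        post = prior * likelihoods
        return post / post.sum()

    def expected_entropy(prior, means, sigma, probes=500, seed=0):
        """Monte-Carlo estimate of the posterior entropy a movement leaves behind,
        assuming each candidate texture produces Gaussian measurements."""
        rng = np.random.default_rng(seed)
        total = 0.0
        for _ in range(probes):
            true = rng.choice(len(prior), p=prior)
            z = rng.normal(means[true], sigma)
            lik = np.exp(-0.5 * ((z - means) / sigma) ** 2)
            post = bayes_update(prior, lik)
            total += -np.sum(post * np.log(post + 1e-12))
        return total / probes

    prior = np.array([0.25, 0.25, 0.25, 0.25])            # four candidate textures
    movement_a = np.array([0.20, 0.80, 0.21, 0.79])       # barely separates two pairs
    movement_b = np.array([0.10, 0.30, 0.60, 0.90])       # separates all four candidates
    for name, means in [("movement A", movement_a), ("movement B", movement_b)]:
        print(name, round(expected_entropy(prior, means, sigma=0.1), 3))
    # The movement with the lower expected entropy is the more informative one to perform next.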

  15. Optimal exploration systems

    Klesh, Andrew T.

    This dissertation studies optimal exploration, defined as the collection of information about given objects of interest by a mobile agent (the explorer) using imperfect sensors. The key aspects of exploration are kinematics (which determine how the explorer moves in response to steering commands), energetics (which determine how much energy is consumed by motion and maneuvers), informatics (which determine the rate at which information is collected) and estimation (which determines the states of the objects). These aspects are coupled by the steering decisions of the explorer. We seek to improve exploration by finding trade-offs amongst these couplings and the components of exploration: the Mission, the Path and the Agent. A comprehensive model of exploration is presented that, on one hand, accounts for these couplings and on the other hand is simple enough to allow analysis. This model is utilized to pose and solve several exploration problems where an objective function is to be minimized. Specific functions to be considered are the mission duration and the total energy. These exploration problems are formulated as optimal control problems and necessary conditions for optimality are obtained in the form of two-point boundary value problems. An analysis of these problems reveals characteristics of optimal exploration paths. Several regimes are identified for the optimal paths including the Watchtower, Solar and Drag regime, and several non-dimensional parameters are derived that determine the appropriate regime of travel. The so-called Power Ratio is shown to predict the qualitative features of the optimal paths, provide a metric to evaluate an aircraft's design and determine an aircraft's capability for flying perpetually. Optimal exploration system drivers are identified that provide perspective as to the importance of these various regimes of flight. A bank-to-turn solar-powered aircraft flying at constant altitude on Mars is used as a specific platform for

  16. Hybrid fuzzy charged system search algorithm based state estimation in distribution networks

    Sachidananda Prasad


    Full Text Available This paper proposes a new hybrid charged system search (CSS) algorithm based state estimation in radial distribution networks in a fuzzy framework. The objective of the optimization problem is to minimize the weighted square of the difference between the measured and the estimated quantity. The proposed method of state estimation considers bus voltage magnitude and phase angle as state variables along with some equality and inequality constraints for state estimation in distribution networks. A rule based fuzzy inference system has been designed to control the parameters of the CSS algorithm to achieve a better balance between the exploration and exploitation capability of the algorithm. The efficiency of the proposed fuzzy adaptive charged system search (FACSS) algorithm has been tested on the standard IEEE 33-bus system and an Indian 85-bus practical radial distribution system. The obtained results have been compared with the conventional CSS algorithm, the weighted least square (WLS) algorithm and particle swarm optimization (PSO) for feasibility of the algorithm.
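
    The optimization target itself is compact: a weighted least squares criterion over the measurement residuals. The sketch below writes that objective for a toy linear measurement model; the state parameterization, the matrix H, and the weights are placeholders, and neither the charged system search nor the fuzzy parameter controller is reproduced here.

    import numpy as np

    def wls_objective(x, z, h, weights):
        """J(x) = sum_i w_i * (z_i - h_i(x))^2, the quantity the search minimizes."""
        r = z - h(x)
        return float(np.sum(weights * r ** 2))

    # Toy example: two measurements that are linear in a two-element state vector.
    H = np.array([[1.0, 0.5],
                  [0.2, 1.0]])
    h = lambda x: H @ x
    z = np.array([1.2, 0.9])            # "measured" values
    w = np.array([2.0, 1.0])            # measurement weights (e.g., inverse variances)
    print(wls_objective(np.array([1.0, 0.4]), z, h, w))   # 0.09 for this trial state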

  17. Spatial Coverage Planning for Exploration Robots

    Gaines, Daniel; Estlin, Tara; Chouinard, Caroline


    A report discusses an algorithm for an onboard planning and execution technology to support the exploration and characterization of geological features by autonomous rovers. A rover that is capable of deciding which observations are more important relieves the engineering team from much of the burden of attempting to make accurate predictions of what the available rover resources will be in the future. Instead, the science and engineering teams can uplink a set of observation requests that may potentially oversubscribe resources and let the rover use observation priorities and its current assessment of available resources to make decisions about which observations to perform and when to perform them. The algorithm gives the rover the ability to model spatial coverage quality based on data from different scientific instruments, to assess the impact of terrain on coverage quality, to incorporate user-defined priorities among subregions of the terrain to be covered, and to update coverage quality rankings of observations when terrain knowledge changes. When the rover is exploring large geographical features such as craters, channels, or boundaries between two different regions, an important factor in assessing the quality of a mission plan is how the set of chosen observations spatially cover the area of interest. The algorithm allows the rover to evaluate which observation to perform and to what extent the candidate observation will increase the spatial coverage of the plan.

  18. A New Metaheuristic Bat-Inspired Algorithm

    Yang, Xin-She


    Metaheuristic algorithms such as particle swarm optimization, firefly algorithm and harmony search are now becoming powerful methods for solving many tough optimization problems. In this paper, we propose a new metaheuristic method, the Bat Algorithm, based on the echolocation behaviour of bats. We also intend to combine the advantages of existing algorithms into the new bat algorithm. After a detailed formulation and explanation of its implementation, we will then compare the proposed algorithm with other existing algorithms, including genetic algorithms and particle swarm optimization. Simulations show that the proposed algorithm seems much superior to other algorithms, and further studies are also discussed.
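
    For readers who want the core update rules in code, the sketch below implements a minimal bat loop (frequency-tuned velocities toward the best bat, a pulse-rate-gated local walk, and loudness/pulse-rate schedules) on the sphere function. All parameter values and the simple acceptance rule are common textbook defaults, not settings taken from this paper.

    import numpy as np

    def bat_algorithm(obj, dim=2, n_bats=20, iters=200, fmin=0.0, fmax=2.0,
                      alpha=0.9, gamma=0.9, lb=-5.0, ub=5.0, seed=0):
        rng = np.random.default_rng(seed)
        x = rng.uniform(lb, ub, (n_bats, dim))
        v = np.zeros((n_bats, dim))
        loud = np.ones(n_bats)                    # loudness A_i
        pulse0 = rng.uniform(0.0, 1.0, n_bats)    # initial pulse emission rates r_i
        pulse = pulse0.copy()
        fit = np.array([obj(b) for b in x])
        best, best_val = x[fit.argmin()].copy(), float(fit.min())
        for t in range(1, iters + 1):
            for i in range(n_bats):
                f = fmin + (fmax - fmin) * rng.random()      # echolocation frequency
                v[i] += (x[i] - best) * f
                cand = np.clip(x[i] + v[i], lb, ub)
                if rng.random() > pulse[i]:                  # local walk around the best bat
                    cand = np.clip(best + 0.01 * loud.mean() * rng.standard_normal(dim), lb, ub)
                fc = obj(cand)
                if fc <= fit[i] and rng.random() < loud[i]:  # accept: get quieter, pulse faster
                    x[i], fit[i] = cand, fc
                    loud[i] *= alpha
                    pulse[i] = pulse0[i] * (1.0 - np.exp(-gamma * t))
                if fc < best_val:
                    best, best_val = cand.copy(), fc
        return best, best_val

    best, val = bat_algorithm(lambda z: float(np.sum(z ** 2)))
    print(best, val)   # should land near the origin of the sphere function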

  19. A hybrid cuckoo search algorithm with Nelder Mead method for solving global optimization problems.

    Ali, Ahmed F; Tawhid, Mohamed A


    Cuckoo search algorithm is a promising metaheuristic population based method. It has been applied to solve many real life problems. In this paper, we propose a new cuckoo search algorithm by combining the cuckoo search algorithm with the Nelder-Mead method in order to solve integer and minimax optimization problems. We call the proposed algorithm the hybrid cuckoo search and Nelder-Mead method (HCSNM). HCSNM starts the search by applying the standard cuckoo search for a number of iterations; the best obtained solution is then passed to the Nelder-Mead algorithm as an intensification process in order to accelerate the search and overcome the slow convergence of the standard cuckoo search algorithm. The proposed algorithm balances the global exploration of the cuckoo search algorithm and the deep exploitation of the Nelder-Mead method. We test the HCSNM algorithm on seven integer programming problems and ten minimax problems and compare it against eight algorithms for solving integer programming problems and seven algorithms for solving minimax problems. The experimental results show the efficiency of the proposed algorithm and its ability to solve integer and minimax optimization problems in reasonable time.
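
    The hybrid structure can be sketched in a few lines: a basic cuckoo search with Levy-flight steps runs for a fixed budget, and its best nest then seeds a Nelder-Mead refinement (here via scipy.optimize.minimize, assuming SciPy is available). The colony size, abandonment fraction, step scaling, and test function below are illustrative assumptions, not the experimental settings of the paper.

    import numpy as np
    from math import gamma, sin, pi
    from scipy.optimize import minimize

    def levy(dim, rng, beta=1.5):
        """Levy-distributed step via Mantegna's algorithm."""
        sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
                 (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
        u = rng.normal(0.0, sigma, dim)
        v = rng.normal(0.0, 1.0, dim)
        return u / np.abs(v) ** (1 / beta)

    def hcsnm_sketch(obj, dim=2, n_nests=15, iters=200, pa=0.25, lb=-5.0, ub=5.0, seed=0):
        rng = np.random.default_rng(seed)
        nests = rng.uniform(lb, ub, (n_nests, dim))
        fit = np.array([obj(n) for n in nests])
        for _ in range(iters):
            best = nests[fit.argmin()]
            i = rng.integers(n_nests)
            cand = np.clip(nests[i] + 0.01 * levy(dim, rng) * (nests[i] - best), lb, ub)
            fc, j = obj(cand), rng.integers(n_nests)
            if fc < fit[j]:                                   # replace a random nest if better
                nests[j], fit[j] = cand, fc
            worst = fit.argsort()[-int(pa * n_nests):]        # abandon a fraction of poor nests
            nests[worst] = rng.uniform(lb, ub, (len(worst), dim))
            fit[worst] = [obj(n) for n in nests[worst]]
        res = minimize(obj, nests[fit.argmin()], method="Nelder-Mead")   # intensification step
        return res.x, res.fun

    x, f = hcsnm_sketch(lambda z: float(np.sum((z - 1.0) ** 2)))
    print(x, f)   # close to [1, 1] and 0 for this quadratic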

  20. Comparison and analysis of nonlinear algorithms for compressed sensing in MRI.

    Yu, Yeyang; Hong, Mingjian; Liu, Feng; Wang, Hua; Crozier, Stuart


    Compressed sensing (CS) theory has been recently applied in Magnetic Resonance Imaging (MRI) to accelerate the overall imaging process. In the CS implementation, various algorithms have been used to solve the nonlinear equation system for better image quality and reconstruction speed. However, there are no explicit criteria for an optimal CS algorithm selection in the practical MRI application. A systematic and comparative study of those commonly used algorithms is therefore essential for the implementation of CS in MRI. In this work, three typical algorithms, namely, the Gradient Projection For Sparse Reconstruction (GPSR) algorithm, Interior-point algorithm (l(1)_ls), and the Stagewise Orthogonal Matching Pursuit (StOMP) algorithm are compared and investigated in three different imaging scenarios, brain, angiogram and phantom imaging. The algorithms' performances are characterized in terms of image quality and reconstruction speed. The theoretical results show that the performance of the CS algorithms is case sensitive; overall, the StOMP algorithm offers the best solution in imaging quality, while the GPSR algorithm is the most efficient one among the three methods. In the next step, the algorithm performances and characteristics will be experimentally explored. It is hoped that this research will further support the applications of CS in MRI.

  1. Performance Analysis of CPU Scheduling Algorithms with Novel OMDRRS Algorithm

    Neetu Goel


    Full Text Available CPU scheduling is one of the most fundamental and essential parts of any operating system. It prioritizes processes to efficiently execute user requests and helps in choosing the appropriate process for execution. Round Robin (RR) and Priority Scheduling (PS) are among the most widely used and accepted CPU scheduling algorithms, but their performance degrades with respect to turnaround time, waiting time and context switching with each recurrence. A new scheduling algorithm, OMDRRS, is developed to improve the performance of the RR and priority scheduling algorithms. The new algorithm performs better than the popular existing algorithms: drastic improvement is seen in waiting time, turnaround time, response time and context switching. A comparative analysis of turnaround time (TAT), waiting time (WT) and response time (RT) is shown with the help of ANOVA and the t-test.
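
    To make the compared quantities concrete, the sketch below simulates plain Round Robin for processes that all arrive at time zero and reports per-process waiting and turnaround times. The burst times and quantum are arbitrary example values, and the proposed OMDRRS algorithm itself is not reproduced here.

    from collections import deque

    def round_robin(burst_times, quantum):
        """Return (waiting times, turnaround times) for bursts that all arrive at time 0."""
        remaining = list(burst_times)
        finish = [0] * len(burst_times)
        queue = deque(range(len(burst_times)))
        clock = 0
        while queue:
            i = queue.popleft()
            run = min(quantum, remaining[i])
            clock += run
            remaining[i] -= run
            if remaining[i] > 0:
                queue.append(i)             # context switch: back to the end of the queue
            else:
                finish[i] = clock
        turnaround = finish                 # arrival time is 0 for every process
        waiting = [t - b for t, b in zip(turnaround, burst_times)]
        return waiting, turnaround

    wt, tat = round_robin([10, 5, 8], quantum=4)
    print("waiting:", wt, "turnaround:", tat)   # waiting: [13, 12, 13] turnaround: [23, 17, 21]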

  2. SIM_EXPLORE: Software for Directed Exploration of Complex Systems

    Burl, Michael; Wang, Esther; Enke, Brian; Merline, William J.


    Physics-based numerical simulation codes are widely used in science and engineering to model complex systems that would be infeasible to study otherwise. While such codes may provide the highest- fidelity representation of system behavior, they are often so slow to run that insight into the system is limited. Trying to understand the effects of inputs on outputs by conducting an exhaustive grid-based sweep over the input parameter space is simply too time-consuming. An alternative approach called "directed exploration" has been developed to harvest information from numerical simulators more efficiently. The basic idea is to employ active learning and supervised machine learning to choose cleverly at each step which simulation trials to run next based on the results of previous trials. SIM_EXPLORE is a new computer program that uses directed exploration to explore efficiently complex systems represented by numerical simulations. The software sequentially identifies and runs simulation trials that it believes will be most informative given the results of previous trials. The results of new trials are incorporated into the software's model of the system behavior. The updated model is then used to pick the next round of new trials. This process, implemented as a closed-loop system wrapped around existing simulation code, provides a means to improve the speed and efficiency with which a set of simulations can yield scientifically useful results. The software focuses on the case in which the feedback from the simulation trials is binary-valued, i.e., the learner is only informed of the success or failure of the simulation trial to produce a desired output. The software offers a number of choices for the supervised learning algorithm (the method used to model the system behavior given the results so far) and a number of choices for the active learning strategy (the method used to choose which new simulation trials to run given the current behavior model). The software

  3. The exploration metaphor

    Mcgreevy, Michael W.


    NASA's experience in planetary exploration has demonstrated that the desktop workstation is inadequate for many visualization situations. The primary mission displays for the unmanned Surveyor missions to the moon during the mid-1960's, for example, were environmental images assembled on the inside surfaces of spherical shells. Future exploration missions will greatly benefit from advances in digital computer and display technology, but there remain unmet user interface needs. Alternative user interfaces and metaphors are needed for planetary exploration and other interactions with complex spatial environments. These interfaces and metaphors would enable the user to directly explore environments and naturally manipulate objects in those environments. Personal simulators, virtual workstations, and telepresence user interfaces are systems capable of providing this integration of user space and task space. The Exploration Metaphor is a useful concept for guiding the design of user interfaces for virtual environments and telepresence. To apply the Exploration Metaphor is to assert that computing is like exploration, and to support objects, operations, and contexts comparable to those encountered in the exploration of natural environments. The Exploration Metaphor, under development for user interfaces in support of NASA's planetary exploration missions and goals, will also benefit other applications where complex spatial information must be visualized. Visualization methods and systems for planetary exploration are becoming increasingly integrated and interactive as computing technology improves. These advances will benefit from virtual environment and telepresence interface technology. A key development has been the processing of multiple images and other sensor data to create detailed digital models of the planets and moons. Data from images of the Earth, Mars, and Miranda, for example, have been converted into 3D models, and dynamic virtual fly-overs have been

  4. Exploring Generic Haskell

    Löh, A.


    This thesis is an exploration -- an exploration of a language extension of the functional programming language Haskell. The extension is called Generic Haskell, although the name has been used to refer to different objects over the last several years: many papers have described different proposals, fe

  5. Composite Technology for Exploration

    Fikes, John


    The CTE (Composite Technology for Exploration) Project will develop and demonstrate critical composites technologies with a focus on joints that utilize NASA expertise and capabilities. The project will advance composite technologies providing lightweight structures to support future NASA exploration missions. The CTE project will demonstrate weight-saving, performance-enhancing bonded joint technology for Space Launch System (SLS)-scale composite hardware.

  6. Effective Parallel Algorithm Animation


    a text file that is suitable for a plotting package such as gnuplot. This representation is shown in Figure 21. The horizontal axis in this diagram... Explorer (75), and Gnuplot (81). [Figure 2: System Structure Diagram for the Analysis Tool (82)] Since gnuplot is freely available on workstations at AFIT and the plotting requirements are limited, gnuplot was used as the plotting package. However, users are free to select their own if they

  7. Exploration Laboratory Analysis - ARC

    Krihak, Michael K.; Fung, Paul P.


    The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability (ExMC) risk, Risk of Inability to Adequately Treat an Ill or Injured Crew Member, and ExMC Gap 4.05: Lack of minimally invasive in-flight laboratory capabilities with limited consumables required for diagnosing identified Exploration Medical Conditions. To mitigate this risk, the availability of in-flight laboratory analysis instrumentation has been identified as an essential capability in future exploration missions. Mission architecture poses constraints on equipment and procedures that will be available to treat evidence-based medical conditions according to the Space Medicine Exploration Medical Conditions List (SMEMCL). The SMEMCL provides diagnoses and treatments for the evidence-based medical conditions and hence a basis for developing ELA functional requirements.

  8. Algorithms on ensemble quantum computers.

    Boykin, P Oscar; Mor, Tal; Roychowdhury, Vwani; Vatan, Farrokh


    In ensemble (or bulk) quantum computation, all computations are performed on an ensemble of computers rather than on a single computer. Measurements of qubits in an individual computer cannot be performed; instead, only expectation values (over the complete ensemble of computers) can be measured. As a result of this limitation on the model of computation, many algorithms cannot be processed directly on such computers, and must be modified, as the common strategy of delaying the measurements usually does not resolve this ensemble-measurement problem. Here we present several new strategies for resolving this problem. Based on these strategies we provide new versions of some of the most important quantum algorithms, versions that are suitable for implementing on ensemble quantum computers, e.g., on liquid NMR quantum computers. These algorithms are Shor's factorization algorithm, Grover's search algorithm (with several marked items), and an algorithm for quantum fault-tolerant computation. The first two algorithms are simply modified using randomizing and sorting strategies. For the last algorithm, we develop a classical-quantum hybrid strategy for removing measurements. We use it to present a novel quantum fault-tolerant scheme. More explicitly, we present schemes for fault-tolerant measurement-free implementation of Toffoli and σ_z^(1/4), as these operations cannot be implemented "bitwise", and their standard fault-tolerant implementations require measurement.

  9. Classification of Global Illumination Algorithms

    Lesev, Hristo


    This article describes and classifies various approaches for solving the global illumination problem. The classification aims to show the similarities between different types of algorithms. We introduce the concept of Light Manager, as a central element and mediator between illumination algorithms in a heterogeneous environment of a graphical system. We present results and analysis of the implementation of the described ideas.

  10. Algorithms in combinatorial design theory

    Colbourn, CJ


    The scope of the volume includes all algorithmic and computational aspects of research on combinatorial designs. Algorithmic aspects include generation, isomorphism and analysis techniques - both heuristic methods used in practice, and the computational complexity of these operations. The scope within design theory includes all aspects of block designs, Latin squares and their variants, pairwise balanced designs and projective planes and related geometries.

  11. Streaming Algorithms for Line Simplification

    Abam, Mohammad; de Berg, Mark; Hachenberger, Peter


    this problem in a streaming setting, where we only have a limited amount of storage, so that we cannot store all the points. We analyze the competitive ratio of our algorithms, allowing resource augmentation: we let our algorithm maintain a simplification with 2k (internal) points and compare the error of our...

  12. Search for New Quantum Algorithms


    Topological computing for beginners (slide presentation), Lecture Notes for Chapter 9 - Physics 219 - Quantum Computation (http...). II.A.8. A QHS algorithm for Feynman integrals. II.A.9. Non-abelian QHS algorithms -- A... The idea is that NOT all environmentally entangling transformations are equally likely. In particular, for spatially separated physical quantum

  13. An Improved Moving Mesh Algorithm


    We consider an iterative mesh-optimization algorithm for finite element solutions and give an improved moving mesh strategy that rapidly reduces the complexity and cost of solving variational problems. A numerical result is presented for a 2-dimensional problem solved with the improved algorithm.

  14. Penalty Algorithms in Hilbert Spaces

    Jean Pierre DUSSAULT; Hai SHEN; André BANDRAUK


    We analyze the classical penalty algorithm for nonlinear programming in Hilbert spaces and obtain global convergence results, as well as an asymptotic superlinear convergence order. These convergence results generalize similar results obtained for finite-dimensional problems. Moreover, the nature of the algorithms allows us to solve the unconstrained subproblems in finite-dimensional spaces.

  15. The politics of algorithmic finance

    Hansen, Kristian Bondo


    ... learning algorithm, gradually escapes from the company's and thus its creator's control. Although the trading algorithms used in today's financial markets are not sophisticated enough to threaten the global economy, Harris's novel does address some of the political, economic, and regulatory issues emanating from the growing automation of financia...

  16. Recovery Rate of Clustering Algorithms

    Li, Fajie; Klette, Reinhard; Wada, T; Huang, F; Lin, S


    This article provides a simple and general way for defining the recovery rate of clustering algorithms using a given family of old clusters for evaluating the performance of the algorithm when calculating a family of new clusters. Under the assumption of dealing with simulated data (i.e., known old

  17. Online co-regularized algorithms

    Ruijter, T. de; Tsivtsivadze, E.; Heskes, T.


    We propose an online co-regularized learning algorithm for classification and regression tasks. We demonstrate that by sequentially co-regularizing prediction functions on unlabeled data points, our algorithm provides improved performance in comparison to supervised methods on several UCI benchmarks

  18. Executable Pseudocode for Graph Algorithms

    B. Ó Nualláin (Breanndán)


    Algorithms are written in pseudocode. However, the implementation of an algorithm in a conventional, imperative programming language can often be scattered over hundreds of lines of code, thus obscuring its essence. This can lead to difficulties in understanding or verifying the

  19. Online co-regularized algorithms

    Ruijter, T. de; Tsivtsivadze, E.; Heskes, T.


    We propose an online co-regularized learning algorithm for classification and regression tasks. We demonstrate that by sequentially co-regularizing prediction functions on unlabeled data points, our algorithm provides improved performance in comparison to supervised methods on several UCI benchmarks

  20. View-Dependent Streamline Deformation and Exploration

    Tong, Xin; Edwards, John; Chen, Chun-Ming; Shen, Han-Wei; Johnson, Chris R.; Wong, Pak Chung


    Occlusion presents a major challenge in visualizing 3D flow and tensor fields using streamlines. Displaying too many streamlines creates a dense visualization filled with occluded structures, but displaying too few streams risks losing important features. We propose a new streamline exploration approach by visually manipulating the cluttered streamlines by pulling visible layers apart and revealing the hidden structures underneath. This paper presents a customized view-dependent deformation algorithm and an interactive visualization tool to minimize visual cluttering for visualizing 3D vector and tensor fields. The algorithm is able to maintain the overall integrity of the fields and expose previously hidden structures. Our system supports both mouse and direct-touch interactions to manipulate the viewing perspectives and visualize the streamlines in depth. By using a lens metaphor of different shapes to select the transition zone of the targeted area interactively, the users can move their focus and examine the vector or tensor field freely.

  1. Advanced software algorithms

    Berry, K.; Dayton, S.


    Citibank was using a data collection system to create a one-time-only mailing history on prospective credit card customers that was becoming dated in its time-to-market requirements and as such needed performance improvements. To compound problems with their existing system, assuring the quality of the data matching process was manpower intensive and needed to be automated. Analysis, design, and prototyping capabilities involving information technology were areas of expertise provided by the DOE-LMES Data Systems Research and Development (DSRD) program. The goal of this project was for Data Systems Research and Development (DSRD) to analyze the current Citibank credit card offering system and to suggest and prototype technology improvements that would result in faster processing with quality as good as the current system. Technologies investigated include: a high-speed network of reduced instruction set computing (RISC) processors for loosely coupled parallel processing; tightly coupled, high-performance parallel processing; higher-order computer languages such as `C`; fuzzy matching algorithms applied to very large data files; a relational database management system; and advanced programming techniques.

  2. Elementary functions algorithms and implementation

    Muller, Jean-Michel


    This textbook presents the concepts and tools necessary to understand, build, and implement algorithms for computing elementary functions (e.g., logarithms, exponentials, and the trigonometric functions). Both hardware- and software-oriented algorithms are included, along with issues related to accurate floating-point implementation. This third edition has been updated and expanded to incorporate the most recent advances in the field, new elementary function algorithms, and function software. After a preliminary chapter that briefly introduces some fundamental concepts of computer arithmetic, such as floating-point arithmetic and redundant number systems, the text is divided into three main parts. Part I considers the computation of elementary functions using algorithms based on polynomial or rational approximations and using table-based methods; the final chapter in this section deals with basic principles of multiple-precision arithmetic. Part II is devoted to a presentation of “shift-and-add” algorithm...
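
    As a concrete example of the "shift-and-add" family the book covers, the sketch below is a standard rotation-mode CORDIC for sine and cosine, emulated here in floating point; it is a generic textbook form, not code taken from the book.

        # Illustrative rotation-mode CORDIC: computes sin and cos with shifts and adds only
        # (here emulated in floating point).
        import math

        N = 32
        ANGLES = [math.atan(2.0 ** -i) for i in range(N)]        # precomputed arctan table
        K = 1.0
        for i in range(N):
            K *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))          # scaling constant

        def cordic_sin_cos(theta):
            """theta in [-pi/2, pi/2]; returns (sin(theta), cos(theta))."""
            x, y, z = K, 0.0, theta
            for i in range(N):
                d = 1.0 if z >= 0 else -1.0
                x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
                z -= d * ANGLES[i]
            return y, x

        s, c = cordic_sin_cos(0.5)
        print(s - math.sin(0.5), c - math.cos(0.5))   # both differences are tiny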

  3. A secured Cryptographic Hashing Algorithm

    Mohanty, Rakesh; Bishi, Sukant kumar


    Cryptographic hash functions for calculating the message digest of a message have been in practical use as an effective measure to maintain message integrity for a few decades. This message digest is unique, irreversible and avoids all types of collisions for any given input string. The message digest calculated from this algorithm is propagated in the communication medium along with the original message from the sender side, and on the receiver side the integrity of the message can be verified by recalculating the message digest of the received message and comparing the two digest values. In this paper we have designed and developed a new algorithm for calculating the message digest of any message and implemented it using a high-level programming language. An experimental analysis and comparison with the existing MD5 hashing algorithm, which is predominantly being used as a cryptographic hashing tool, shows this algorithm to provide more randomness and greater strength against intrusion attacks. In this algorithm th...
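
    The proposed hash function itself is not reproduced here; the sketch below only illustrates the sender/receiver digest-verification workflow the abstract describes, with SHA-256 from Python's standard hashlib standing in for the new algorithm.

        # Digest-based integrity check, using hashlib's SHA-256 as a stand-in for the
        # paper's proposed hash function.
        import hashlib

        def digest(message: bytes) -> str:
            return hashlib.sha256(message).hexdigest()

        # Sender side: transmit the message together with its digest.
        message = b"transfer 100 units to account 42"
        sent = (message, digest(message))

        # Receiver side: recompute the digest and compare.
        received_message, received_digest = sent
        print("integrity OK" if digest(received_message) == received_digest else "message altered")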


    V. E. Marlei


    Full Text Available The problem of reducing the barrier between the user and the computer arose as soon as computers appeared and remains relevant today. It is commonly formulated as the task of developing a user-friendly interface. One way of addressing it is the use of graphical representations, which can genuinely reduce the barrier between human and computer. Among such representations is the formalism of algorithmic networks, intended to describe algorithmic models and proposed by V.V. Iaanishchev about 30 years ago. An algorithmic model is a formalized description of a subject specialist's scenario for the simulated process, whose structure mirrors the causal and temporal relationships between the phenomena of that process, together with all the information necessary for its software implementation. This article defines isomorphic embedding of algorithmic networks and describes the transformations needed to implement it, using the principle of dividing vertices into classes. It also presents and describes in detail an algorithm for identifying isomorphism of algorithmic networks and its use for a base of algorithmic models.

  5. A New Page Ranking Algorithm Based On WPRVOL Algorithm

    Roja Javadian Kootenae


    Full Text Available The amount of information on the web is always growing, so powerful search tools are needed to search such a large collection. Search engines help users find their desired information among this massive volume of information more easily. But what matters in search engines, and distinguishes them from one another, is the page ranking algorithm they use. In this paper a new page ranking algorithm for search engines, based on the Weighted Page Ranking based on Visits of Links (WPRVOL) algorithm and called WPR'VOL for short, is proposed. The proposed algorithm considers the number of visits of first- and second-level in-links. The original WPRVOL algorithm takes into account the number of visits of first-level in-links of the pages and distributes rank scores based on the popularity of the pages, whereas the proposed algorithm considers both the in-links of the page (first-level in-links) and the in-links of the pages that point to it (second-level in-links) when calculating the rank of the page, so more related pages are displayed at the top of the search result list. In summary, the proposed algorithm assigns a higher rank to pages when both they and the pages that point to them are important.
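
    The paper's exact WPR'VOL formula is not given in the abstract; the sketch below is a hedged illustration of the stated idea, scoring a page from the visit counts of its first-level in-links plus (at an assumed lower weight) those of its second-level in-links, on a made-up graph.

        # Hedged sketch: score a page from the visit counts of its first-level in-links and
        # of its second-level in-links. The weighting (second level counted at half) is an
        # illustrative choice, not the paper's exact formula.
        links = {                      # hypothetical web graph: page -> pages it links to
            "A": ["B", "C"],
            "B": ["C"],
            "C": ["A"],
            "D": ["C"],
        }
        visits = {"A": 10, "B": 40, "C": 5, "D": 25}   # hypothetical visit counts per source page

        def in_links(page):
            return [p for p, outs in links.items() if page in outs]

        def score(page, second_level_weight=0.5):
            first = in_links(page)
            total = sum(visits[p] for p in first)
            for p in first:                                   # pages pointing at the in-links
                total += second_level_weight * sum(visits[q] for q in in_links(p))
            return total

        ranking = sorted(links, key=score, reverse=True)
        print(ranking)      # pages with heavily visited (and well-pointed-to) in-links rank first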

  6. Multi-robot task allocation for exploration

    GAO Ping-an; CAI Zi-xing


    The problem of allocating a number of exploration tasks to a team of mobile robots in dynamic environments was studied. The team mission is to visit several distributed targets. The path cost of a target is proportional to the distance that a robot has to move to visit it. The team objective is to minimize the average path cost over all targets. Finding an optimal allocation is strongly NP-hard; the proposed algorithm can produce a near-optimal solution. The allocation is cast as a multi-round single-item auction in which robots bid on targets. In each auction round, one target is assigned to the robot that produces the lowest path cost for that target. The allocated targets form a forest in which each tree corresponds to one robot's set of exploration targets. Each robot constructs an exploring path through depth-first search in its target tree. The time complexity of the proposed algorithm is polynomial. Simulation experiments show that the allocation method is valid.
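
    A minimal sketch of such a multi-round single-item auction is given below; it bids with straight Euclidean distance from each robot's most recently won target, which is a simplification of the tree-based path costs used in the paper, and the robot and target coordinates are made up.

        # Simplified multi-round single-item auction for allocating exploration targets.
        # Bid = distance from the robot's last assigned position (an illustrative
        # approximation of the paper's path costs in the robot's target tree).
        import math

        robots = {"r1": (0.0, 0.0), "r2": (10.0, 0.0)}
        targets = [(1.0, 1.0), (2.0, 3.0), (9.0, 1.0), (11.0, 4.0), (5.0, 5.0)]

        position = dict(robots)            # each robot's most recent location
        allocation = {r: [] for r in robots}

        unassigned = list(targets)
        while unassigned:
            # Every robot bids on every remaining target; the single cheapest
            # (robot, target) pair wins this round.
            robot, target = min(
                ((r, t) for r in robots for t in unassigned),
                key=lambda rt: math.dist(position[rt[0]], rt[1]),
            )
            allocation[robot].append(target)
            position[robot] = target       # later bids start from the newly won target
            unassigned.remove(target)

        print(allocation)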

  7. Double evolutional artificial bee colony algorithm for multiple traveling salesman problem

    Xue Ming Hao


    Full Text Available The double evolutional artificial bee colony algorithm (DEABC) is proposed for solving the single-depot multiple traveling salesman problem (MTSP). The proposed DEABC algorithm, which takes advantage of the strength of the upgraded operators, is characterized by its guidance in exploitation search and its diversity in exploration search. The double evolutional process for exploitation search is composed of two phases of half-stochastic optimal search, and the diversity-generating operator for exploration search is applied to solutions that cannot be improved after a limited number of attempts. The computational results demonstrate the superiority of our algorithm over previous state-of-the-art methods.

  8. A Vigorous Explorer Program

    Elvis, Martin; Brissenden, Roger; Chakrabarti, Supriya; Cherry, Michael; Devlin, Mark; Edelstein, Jerry; Eisenhardt, Peter; Feldman, Paul; Ford, Holland; Gehrels, Neil; Golub, Leon; Marshall, Herman; Martin, Christopher; Mather, John; McCandliss, Stephan; McConnell, Mark; McDowell, Jonathan; Meier, David; Millan, Robyn; Mitchell, John; Moos, Warren; Murray, Steven S; Nousek, John; Oegerle, William; Ramsey, Brian; Green, James; Grindlay, Jonathan; Kaaret, Philip; Kaiser, Mary Elizabeth; Kaltenegger, Lisa; Kasper, Justin; Krolik, Julian; Kruk, Jeffrey W; Latham, David; MacKenty, John; Mainzer, Amanda; Ricker, George; Rinehart, Stephen; Romaine, Suzanne; Scowen, Paul; Silver, Eric; Sonneborn, George; Stern, Daniel; Swain, Mark; Swank, Jean; Traub, Wesley; Weisskopf, Martin; Werner, Michael; Wright, Edward


    Explorers have made breakthroughs in many fields of astrophysics. The science from these missions contributed to three Nobel Prizes - Giacconi (2002), and Mather and Smoot (2006). Explorers have marked the definitive beginning of precision cosmology, discovered that short gamma-ray bursts are caused by compact star mergers, and measured metallicity to redshifts z>6. NASA Explorers do cutting-edge science that cannot be done by facility-class instruments. The Explorer program provides a rapid response to changing science and technology, enabling cutting-edge science at moderate cost. Explorers also enable innovation, and engage and train scientists, managers and engineers, adding human capital to NASA and the nation. The astrophysics Explorer launch rate now being achieved is 1 per 3 years, and budget projections are in the $150M/year range for the next five years. A newly Vigorous Explorer Program should be created to: 1. Reach the long-stated goal of annual astrophysics launches; 2. Find additional ...

  9. A functional clustering algorithm for the analysis of neural relationships

    Feldt, S; Hetrick, V L; Berke, J D; Zochowski, M


    We formulate a novel technique for the detection of functional clusters in neural data. In contrast to prior network clustering algorithms, our procedure progressively combines spike trains and derives the optimal clustering cutoff in a simple and intuitive manner. To demonstrate the power of this algorithm to detect changes in network dynamics and connectivity, we apply it to both simulated data and real neural data obtained from the mouse hippocampus during exploration and slow-wave sleep. We observe state-dependent clustering patterns consistent with known neurophysiological processes involved in memory consolidation.

  10. Fundamentals of grid computing theory, algorithms and technologies


    This volume discusses how the novel technologies of semantic web and workflow have been integrated into the grid and grid services. It focuses on sharing resources, data replication, data management, fault tolerance, scheduling, broadcasting, and load balancing algorithms. The book discusses emerging developments in grid computing, including cloud computing, and explores large-scale computing in high energy physics, weather forecasting, and more. The contributors often use simulations to evaluate the performance of models and algorithms. In the appendices, they present two types of easy-to-use open source software written in Java

  11. Cloud-based Evolutionary Algorithms: An algorithmic study

    Merelo, Juan-J; Mora, Antonio M; Castillo, Pedro; Romero, Gustavo; Laredo, JLJ


    After a proof of concept using Dropbox(tm), a free storage and synchronization service, showed that an evolutionary algorithm using several dissimilar computers connected via WiFi or Ethernet had a good scaling behavior in terms of evaluations per second, it remains to be proved whether that effect also translates to the algorithmic performance of the algorithm. In this paper we will check several different, and difficult, problems, and see what effects the automatic load-balancing and asynchrony have on the speed of resolution of problems.

  12. Explorer I Architects


    The three men responsible for the success of Explorer 1, America's first Earth satellite, which was launched on January 31, 1958. At left is Dr. William H. Pickering, former director of JPL, which built and operated the satellite. Dr. James A. van Allen, center, of the State University of Iowa, designed and built the instrument on Explorer that discovered the radiation belts which circle the Earth. At right is Dr. Wernher von Braun, leader of the Army's Redstone Arsenal team which built the first stage Redstone rocket that launched Explorer 1.

  13. Exploring Grid Polygons Online

    Icking, Christian; Kamphans, Tom; Klein, Rolf; Langetepe, Elmar


    We investigate the exploration problem of a short-sighted mobile robot moving in an unknown cellular room. To explore a cell, the robot must enter it. Once inside, the robot knows which of the 4 adjacent cells exist and which are boundary edges. The robot starts from a specified cell adjacent to the room's outer wall; it visits each cell and returns to the start. Our interest is in a short exploration tour, that is, in keeping the number of multiple cell visits small. For arbitrary environments...
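
    For illustration only, the sketch below runs a plain online depth-first exploration of a small cellular room, where the robot senses its free neighbors only upon entering a cell and returns to the start; the paper's strategy achieves shorter tours than this baseline.

        # Online depth-first exploration of an unknown grid of cells. The robot learns which
        # of the 4 neighbors are free only when it enters a cell; plain DFS visits every cell
        # and returns to the start.
        FREE = {(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1)}   # example room

        def neighbors(cell):
            """What the robot senses upon entering `cell`: the adjacent free cells."""
            x, y = cell
            return [c for c in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)) if c in FREE]

        def explore(start):
            visited, tour = {start}, [start]

            def dfs(cell):
                for nxt in neighbors(cell):
                    if nxt not in visited:
                        visited.add(nxt)
                        tour.append(nxt)        # move into the unexplored neighbor
                        dfs(nxt)
                        tour.append(cell)       # backtrack so the tour returns to the start
            dfs(start)
            return tour

        tour = explore((0, 0))
        print(len(tour) - 1, "moves to visit", len(FREE), "cells and return:", tour)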

  14. Neighborhood-following algorithms for linear programming

    AI Wenbao


    In this paper, we present neighborhood-following algorithms for linear programming. When the neighborhood is a wide neighborhood, our algorithms are wide-neighborhood primal-dual interior point algorithms. If the neighborhood degenerates into the central path, our algorithms also degenerate into path-following algorithms. We prove that our algorithms still maintain O(√nL)-iteration complexity, while the classical wide-neighborhood primal-dual interior point algorithms have only O(nL)-iteration complexity. We also prove that the algorithms are quadratically convergent if the optimal vertex is nondegenerate. Finally, we show some computational results for our algorithms.

  15. An improved form of the ELMS algorithm

    Gao Ying; Xie Shengli


    The ELMS algorithm is the first two-channel adaptive filtering algorithm that takes into account the cross-correlation between the two input signals. The algorithm does not preprocess the input signals, so it does not degrade the quality of the speech. However, many computer simulation results show that the ELMS algorithm performs poorly. In this paper, the ELMS algorithm is first analyzed, and then a new algorithm is presented by modifying the block matrix used in the ELMS algorithm to approximate the input signals' autocorrelation matrix. The computer simulation results indicate that the improved algorithm behaves better than the ELMS algorithm.

  16. A generic algorithm for layout of biological networks

    Dwyer Tim


    Full Text Available Abstract Background Biological networks are widely used to represent processes in biological systems and to capture interactions and dependencies between biological entities. Their size and complexity are steadily increasing due to the ongoing growth of knowledge in the life sciences. To aid understanding of biological networks, several algorithms for laying out and graphically representing networks and network analysis results have been developed. However, current algorithms are specialized for particular layout styles, and therefore different algorithms are required for each kind of network and/or style of layout. This increases implementation effort and means that new algorithms must be developed for new layout styles. Furthermore, additional effort is necessary to compose different layout conventions in the same diagram. Also, the user cannot usually customize the placement of nodes to tailor the layout to their particular need or task, and there is little support for interactive network exploration. Results We present a novel algorithm to visualize different biological networks and network analysis results in meaningful ways depending on network types and analysis outcome. Our method is based on constrained graph layout and we demonstrate how it can handle the drawing conventions used in biological networks. Conclusion The presented algorithm offers the ability to produce many of the fundamental popular drawing styles while allowing the flexibility of constraints to further tailor these layouts.

  17. Sampling-based Algorithms for Optimal Motion Planning

    Karaman, Sertac


    During the last decade, sampling-based path planning algorithms, such as Probabilistic RoadMaps (PRM) and Rapidly-exploring Random Trees (RRT), have been shown to work well in practice and possess theoretical guarantees such as probabilistic completeness. However, little effort has been devoted to the formal analysis of the quality of the solution returned by such algorithms, e.g., as a function of the number of samples. The purpose of this paper is to fill this gap, by rigorously analyzing the asymptotic behavior of the cost of the solution returned by stochastic sampling-based algorithms as the number of samples increases. A number of negative results are provided, characterizing existing algorithms, e.g., showing that, under mild technical conditions, the cost of the solution returned by broadly used sampling-based algorithms converges almost surely to a non-optimal value. The main contribution of the paper is the introduction of new algorithms, namely, PRM* and RRT*, which are provably asymptotically opti...
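
    As a baseline illustration of sampling-based planning (plain RRT, not the asymptotically optimal RRT* introduced in the paper), the sketch below grows a tree on the unit square without obstacles using made-up step and tolerance parameters.

        # Minimal RRT on the unit square with no obstacles: repeatedly sample a point, find
        # the nearest tree node, and extend toward the sample by a fixed step.
        import math
        import random

        random.seed(1)
        START, GOAL, STEP, GOAL_TOL = (0.05, 0.05), (0.95, 0.95), 0.05, 0.05

        nodes = [START]
        parent = {START: None}

        def steer(frm, to, step=STEP):
            d = math.dist(frm, to)
            if d <= step:
                return to
            return (frm[0] + step * (to[0] - frm[0]) / d,
                    frm[1] + step * (to[1] - frm[1]) / d)

        for _ in range(5000):
            sample = (random.random(), random.random())
            nearest = min(nodes, key=lambda n: math.dist(n, sample))
            new = steer(nearest, sample)
            nodes.append(new)
            parent[new] = nearest
            if math.dist(new, GOAL) < GOAL_TOL:
                break

        # Recover the path by walking parents back from the last node added.
        path, node = [], nodes[-1]
        while node is not None:
            path.append(node)
            node = parent[node]
        print("tree size:", len(nodes), "path length (edges):", len(path) - 1)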

  18. Efficient On-the-fly Algorithms for the Analysis of Timed Games

    Cassez, Franck; David, Alexandre; Fleury, Emmanuel;


    -checking of finite-state systems. Being on-the-fly, the symbolic algorithm may terminate long before having explored the entire state-space. Also the individual steps of the algorithm are carried out efficiently by the use of so-called zones as the underlying data structure. Various optimizations of the basic...... symbolic algorithm are proposed as well as methods for obtaining time-optimal winning strategies (for reachability games). Extensive evaluation of an experimental implementation of the algorithm yields very encouraging performance results....

  19. Application of detecting algorithm based on network

    张凤斌; 杨永田; 江子扬; 孙冰心


    Because current intrusion detection systems cannot detect undefined intrusion behavior effectively, and drawing on the robustness and adaptability of genetic algorithms, this paper integrates genetic algorithms into an intrusion detection system, and a detection algorithm based on network traffic is proposed. This algorithm is a real-time, self-learning algorithm and can detect undefined intrusion behaviors effectively.
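
    The paper's traffic-feature encoding and fitness function are not specified in the abstract; the sketch below only shows the generic genetic-algorithm loop (selection, crossover, mutation) evolving a bit-string rule toward a purely hypothetical target signature.

        # Generic genetic-algorithm loop of the kind the paper builds on: evolve a bit-string
        # "detection rule" toward a hypothetical target signature. The fitness is illustrative.
        import random

        random.seed(0)
        TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]   # hypothetical signature
        POP, GENS, MUT = 40, 60, 0.02

        def fitness(rule):
            return sum(r == t for r, t in zip(rule, TARGET))

        def crossover(a, b):
            cut = random.randrange(1, len(a))
            return a[:cut] + b[cut:]

        def mutate(rule):
            return [1 - g if random.random() < MUT else g for g in rule]

        population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP)]
        for _ in range(GENS):
            population.sort(key=fitness, reverse=True)
            parents = population[: POP // 2]                      # truncation selection
            children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                        for _ in range(POP - len(parents))]
            population = parents + children

        best = max(population, key=fitness)
        print("best rule matches", fitness(best), "of", len(TARGET), "signature bits")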

  20. Dynamics Explorer guest investigator

    Sojka, J. J.


    Four objectives were accomplished during this reporting period. The visible auroral image conversion algorithms were compared with algorithms developed by Dr. M. H. Rees for data at different wavelengths. In the study, 630 and 557 nm images were used to deduce the auroral energy flux and characteristic energy of the precipitating auroral electrons. The data for southward IMF, B sub y negative conditions were collected and put into global format. A total of 55 sets of auroral images were obtained and then converted to energy flux and characteristic energy data sets. The shortcoming of representing the high-latitude convection pattern as a smooth function was written up and submitted to the Journal of Geophysical Research. A series of midlatitude corotational model runs was performed to show quantitatively how the F region varied as a function of electric field, topside number flux, and a topside heat source.

  1. Real-time recursive hyperspectral sample and band processing algorithm architecture and implementation

    Chang, Chein-I


    This book explores recursive architectures in designing progressive hyperspectral imaging algorithms. In particular, it makes progressive imaging algorithms recursive by introducing the concept of Kalman filtering into algorithm design, so that hyperspectral imagery can be processed not only progressively, sample by sample or band by band, but also recursively via recursive equations. This book can be considered a companion to the author's book Real-Time Progressive Hyperspectral Image Processing, published by Springer in 2016. It explores recursive structures in algorithm architecture; implements algorithmic recursive architecture in conjunction with progressive sample and band processing; derives Recursive Hyperspectral Sample Processing (RHSP) techniques according to the Band-Interleaved Sample/Pixel (BIS/BIP) acquisition format; and develops Recursive Hyperspectral Band Processing (RHBP) techniques according to the Band SeQuential (BSQ) acquisition format for hyperspectral data.
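
    The book's RHSP/RHBP equations are not reproduced here; the sketch below is a bare scalar Kalman filter meant only to show the recursive pattern the book builds on, in which each new sample updates the estimate without reprocessing earlier samples.

        # Bare scalar Kalman filter illustrating the recursive pattern: each new measurement
        # updates the running estimate without reprocessing earlier samples.
        import random

        random.seed(0)
        true_value, meas_noise_var = 5.0, 0.5 ** 2
        estimate, estimate_var = 0.0, 1.0e3          # vague initial state

        for k in range(20):
            z = true_value + random.gauss(0.0, 0.5)                 # new incoming sample
            gain = estimate_var / (estimate_var + meas_noise_var)   # Kalman gain
            estimate = estimate + gain * (z - estimate)             # recursive update
            estimate_var = (1.0 - gain) * estimate_var

        print(round(estimate, 3), "estimated after 20 recursive updates (true value 5.0)")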

  2. Multi-Robot Searching Algorithm Using Levy Flight and Artificial Potential Field

    Sutantyo, Donny K; Nepomnyashchikh, Valentin A; Levi, Paul


    An efficient search algorithm is crucial in robotics, especially for exploration missions where target availability is unknown and the condition of the environment is highly unpredictable. In a very large environment, it is not sufficient to scan an area or volume with a single robot; multiple robots should be involved to perform a collective exploration. In this paper, we propose to combine a bio-inspired search strategy called Levy flight with the artificial potential field method to obtain an efficient search algorithm for multi-robot applications. The main focus of this work is not only to prove the concept and to measure the efficiency of the algorithm by experiments, but also to develop an appropriate generic framework that can be implemented both in simulation and on real robotic platforms. Several experiments comparing different search algorithms are also performed.
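
    As a hedged illustration of the combination described above, the sketch below draws heavy-tailed Levy-flight steps (via a Pareto draw) and adds a simple inverse-square repulsive potential between robots so they spread out; the step scale, gain, and exponent are invented parameters, not the paper's.

        # Illustrative combination of Levy-flight steps with a repulsive artificial potential
        # between robots so the swarm spreads out while searching.
        import math
        import random

        random.seed(2)
        ALPHA, SCALE, REPULSE_GAIN, RANGE = 1.5, 0.2, 0.5, 2.0
        robots = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(5)]

        def levy_step():
            length = SCALE * random.paretovariate(ALPHA)          # heavy-tailed step length
            angle = random.uniform(0.0, 2.0 * math.pi)
            return length * math.cos(angle), length * math.sin(angle)

        def repulsion(i):
            """Sum of inverse-square pushes away from nearby robots (potential field)."""
            fx = fy = 0.0
            xi, yi = robots[i]
            for j, (xj, yj) in enumerate(robots):
                d = math.hypot(xi - xj, yi - yj)
                if j != i and 1e-9 < d < RANGE:
                    fx += REPULSE_GAIN * (xi - xj) / d ** 3
                    fy += REPULSE_GAIN * (yi - yj) / d ** 3
            return fx, fy

        for step in range(100):
            for i in range(len(robots)):
                dx, dy = levy_step()
                rx, ry = repulsion(i)
                robots[i][0] += dx + rx
                robots[i][1] += dy + ry

        print([tuple(round(c, 2) for c in r) for r in robots])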

  3. Automated Design Space Exploration with Aspen

    Kyle L. Spafford


    Full Text Available Architects and applications scientists often use performance models to explore a multidimensional design space of architectural characteristics, algorithm designs, and application parameters. With traditional performance modeling tools, these explorations forced users to first develop a performance model and then repeatedly evaluate and analyze the model manually. These manual investigations proved laborious and error prone. More importantly, the complexity of this traditional process often forced users to simplify their investigations. To address this challenge of design space exploration, we extend our Aspen (Abstract Scalable Performance Engineering Notation language with three new language constructs: user-defined resources, parameter ranges, and a collection of costs in the abstract machine model. Then, we use these constructs to enable automated design space exploration via a nonlinear optimization solver. We show how four interesting classes of design space exploration scenarios can be derived from Aspen models and formulated as pure nonlinear programs. The analysis tools are demonstrated using examples based on Aspen models for a three-dimensional Fast Fourier Transform, the CoMD molecular dynamics proxy application, and the DARPA Streaming Sensor Challenge Problem. Our results show that this approach can compose and solve arbitrary performance modeling questions quickly and rigorously when compared to the traditional manual approach.

  4. Advanced Exploration Systems Program

    National Aeronautics and Space Administration — AES consists of more than 35 projects that target high-priority capabilities needed for human exploration such as crew mobility, deep-space habitation, vehicle...

  5. Neurodynamics of mental exploration.

    Hopfield, John J


    Thinking allows an animal to take an effective action in a novel situation based on a mental exploration of possibilities and previous knowledge. We describe a model animal, with a neural system based loosely on the rodent hippocampus, which performs mental exploration to find a useful route in a spatial world it has previously learned. It then mentally recapitulates the chosen route, and this intent is converted to motor acts that move the animal physically along the route. The modeling is based on spiking neurons with spike-frequency adaptation. Adaptation causes the continuing evolution in the pattern of neural activity that is essential to mental exploration. A successful mental exploration is remembered through spike-timing-dependent synaptic plasticity. The system is also an episodic memory for an animal chiefly concerned with locations.

  6. Arts of urban exploration

    Pinder, David


    This paper addresses ways in which artists and cultural practitioners have recently been using forms of urban exploration as a means of engaging with, and intervening in, cities. It takes its cues from recent events on the streets of New York that involved exploring urban spaces through artistic...... that experimental modes of exploration can play a vital role in the development of critical approaches to the cultural geographies of cities. In particular, discussion centres on the political significance of these spatial practices, drawing out what they have to say about two interconnected themes: ‘rights...... to the city’ and ‘writing the city’. Through addressing recent cases of psychogeographical experimentation in terms of these themes, the paper raises broad questions about artistic practices and urban exploration to introduce this theme issue on ‘Arts of urban exploration’ and to lead into the specific...

  7. Academics explore humidity's benefits.

    Mortimer, Dave


    The effects of humidification on hospital superbugs are being explored by some of the UK's top academics, in what Dave Mortimer, national sales manager for Vapac Humidity Control, explains are the UK's first such studies.

  8. Foreign Aid Explorer)

    US Agency for International Development — The Foreign Aid Explorer shows the multi-dimensional picture of U.S. foreign assistance through a highly visual and interactive website. The website makes it easy...

  9. Abdominal exploration - slideshow

    Abdominal exploration - series: Normal anatomy. Overview: The abdomen contains many vital organs: the stomach, the small intestine (jejunum and ileum), the large ...

  10. Exploring ambiguous realms

    Clemensen, Nana


    In Hang'ombe Village in rural Zambia, the relative lack of physical boundaries between the activities of family members allows children to observe the actions and discussions of adults at close hand, exposing them to the ambiguities of daily life. Children explore these ambiguities...... in their interactions, testing social roles and conventions. This article explores the vigilance and creative agency displayed by Hang'ombe children, in an environment spurring their acquisition of distinct social and discursive skills....

  11. Neurodynamics of mental exploration

    Hopfield, John J


    Thinking allows an animal to take an effective action in a novel situation based on a mental exploration of possibilities and previous knowledge. We describe a model animal, with a neural system based loosely on the rodent hippocampus, which performs mental exploration to find a useful route in a spatial world it has previously learned. It then mentally recapitulates the chosen route, and this intent is converted to motor acts that move the animal physically along the route. The modeling is b...

  12. Routing Algorithm Exploits Spatial Relations

    Okino, Clayton; Jennings, Esther


    A recently developed routing algorithm for broadcasting in an ad hoc wireless communication network takes account of, and exploits, the spatial relationships among the locations of nodes, in addition to transmission power levels and distances between the nodes. In contrast, most prior algorithms for discovering routes through ad hoc networks rely heavily on transmission power levels and utilize limited graph-topology techniques that do not involve consideration of the aforesaid spatial relationships. The present algorithm extracts the relevant spatial-relationship information by use of a construct denoted the relative-neighborhood graph (RNG).
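
    The RNG construct can be built by brute force as in the sketch below: nodes u and v are connected unless some third node w is closer to both of them than they are to each other; the coordinates are made-up examples and transmission power is ignored.

        # Brute-force construction of the relative-neighborhood graph (RNG): nodes u and v
        # are connected unless some third node w is closer to both of them than they are to
        # each other.
        import math
        from itertools import combinations

        nodes = {"a": (0, 0), "b": (2, 0), "c": (1, 2), "d": (4, 1), "e": (3, 3)}

        def rng_edges(nodes):
            edges = []
            for u, v in combinations(nodes, 2):
                d_uv = math.dist(nodes[u], nodes[v])
                blocked = any(
                    max(math.dist(nodes[u], nodes[w]), math.dist(nodes[v], nodes[w])) < d_uv
                    for w in nodes if w not in (u, v)
                )
                if not blocked:
                    edges.append((u, v))
            return edges

        print(rng_edges(nodes))   # the sparse edge set used for broadcasting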

  13. Subcubic Control Flow Analysis Algorithms

    Midtgaard, Jan; Van Horn, David

    We give the first direct subcubic algorithm for performing control flow analysis of higher-order functional programs. Despite the long-held belief that inclusion-based flow analysis could not surpass the "cubic bottleneck," we apply known set compression techniques to obtain an algorithm...... that runs in time O(n^3/log n) on a unit-cost random-access memory model machine. Moreover, we refine the initial flow analysis into two more precise analyses incorporating notions of reachability. We give subcubic algorithms for these more precise analyses and relate them to an existing analysis from...



    This paper presents a new arc flow model for the one-dimensional bin covering problem and an algorithm to solve the problem exactly through a branch-and-bound procedure and the technique of column generation. The subproblems occurring in the branch-and-bound procedure have the same structure and can therefore be solved by the same algorithm. In order to solve these generally large-scale subproblems effectively, a column generation algorithm is employed. Many rules found in this paper can improve the performance of the methods.

  15. CPU Scheduling Algorithms: A Survey

    Imran Qureshi


    Full Text Available Scheduling is a fundamental function of the operating system. For scheduling, the resources of the system are shared among the processes that are to be executed. CPU scheduling is a technique by which processes are allocated to the CPU for a specific time quantum. In this paper, different scheduling algorithms are reviewed with respect to parameters such as running time, burst time, and waiting time. The reviewed algorithms are First Come First Serve, Shortest Job First, Round Robin, and Priority scheduling.
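
    As a worked example of two of the surveyed policies, the sketch below computes average waiting time under FCFS and non-preemptive SJF for a hypothetical set of jobs that all arrive at time zero.

        # Average waiting time under FCFS and (non-preemptive) SJF for a hypothetical set of
        # jobs that all arrive at time 0. Burst times are in arbitrary time units.
        bursts = {"P1": 6, "P2": 8, "P3": 7, "P4": 3}

        def average_waiting_time(order):
            waiting, elapsed = {}, 0
            for job in order:
                waiting[job] = elapsed          # time spent waiting before the job starts
                elapsed += bursts[job]
            return sum(waiting.values()) / len(waiting)

        fcfs_order = list(bursts)                                # arrival order as listed
        sjf_order = sorted(bursts, key=bursts.get)               # shortest burst first

        print("FCFS average wait:", average_waiting_time(fcfs_order))   # 10.25
        print("SJF  average wait:", average_waiting_time(sjf_order))    # 7.0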

  16. Variables Bounding Based Retiming Algorithm

    宫宗伟; 林争辉; 陈后鹏


    Retiming is a technique for optimizing sequential circuits. In this paper, we discuss this problem and propose an improved retiming algorithm based on variable bounding. Through the computation of lower and upper bounds on the variables, the algorithm can significantly reduce the number of constraints and speed up the execution of retiming. Furthermore, the elements of the matrices D and W are computed in a demand-driven way, which can reduce the memory required. Experimental results on the ISCAS89 benchmarks show that our algorithm is very effective for large-scale sequential circuits.

  17. Instance-specific algorithm configuration

    Malitsky, Yuri


    This book presents a modular and expandable technique in the rapidly emerging research area of automatic configuration and selection of the best algorithm for the instance at hand. The author presents the basic model behind ISAC and then details a number of modifications and practical applications. In particular, he addresses automated feature generation, offline algorithm configuration for portfolio generation, algorithm selection, adaptive solvers, online tuning, and parallelization.    The author's related thesis was honorably mentioned (runner-up) for the ACP Dissertation Award in 2014,



    Tornado codes have been used for error control in data transmission over IP networks. The efficiency of these erasure codes is critically affected by the short cycles in their bipartite graph. To remove this effect, two algorithms are introduced: (1) while generating the graph, a cycle-eliminating algorithm is used to reduce the number of short cycles in it; (2) in the decoding algorithm, cycles that are inevitably present in the graph are used to counter the degradation of decoding efficiency. The simulation results show that these codes perform better than general tornado codes.

  19. A filtered backprojection algorithm with characteristics of the iterative landweber algorithm

    L. Zeng, Gengsheng


    Purpose: In order to eventually develop an analytical algorithm with the noise characteristics of an iterative algorithm, this technical note develops a window function for the filtered backprojection (FBP) algorithm in tomography that behaves like an iterative Landweber algorithm.

  20. Parallelizing flow-accumulation calculations on graphics processing units—From iterative DEM preprocessing algorithm to recursive multiple-flow-direction algorithm

    Qin, Cheng-Zhi; Zhan, Lijun


    As one of the important tasks in digital terrain analysis, the calculation of flow accumulations from gridded digital elevation models (DEMs) usually involves two steps in a real application: (1) using an iterative DEM preprocessing algorithm to remove the depressions and flat areas commonly contained in real DEMs, and (2) using a recursive flow-direction algorithm to calculate the flow accumulation for every cell in the DEM. Because both algorithms are computationally intensive, quick calculation of the flow accumulations from a DEM (especially for a large area) presents a practical challenge to personal computer (PC) users. In recent years, rapid increases in hardware capacity of the graphics processing units (GPUs) provided in modern PCs have made it possible to meet this challenge in a PC environment. Parallel computing on GPUs using a compute-unified-device-architecture (CUDA) programming model has been explored to speed up the execution of the single-flow-direction algorithm (SFD). However, the parallel implementation on a GPU of the multiple-flow-direction (MFD) algorithm, which generally performs better than the SFD algorithm, has not been reported. Moreover, GPU-based parallelization of the DEM preprocessing step in the flow-accumulation calculations has not been addressed. This paper proposes a parallel approach to calculate flow accumulations (including both iterative DEM preprocessing and a recursive MFD algorithm) on a CUDA-compatible GPU. For the parallelization of an MFD algorithm (MFD-md), two different parallelization strategies using a GPU are explored. The first parallelization strategy, which has been used in the existing parallel SFD algorithm on GPU, has the problem of computing redundancy. Therefore, we designed a parallelization strategy based on graph theory. The application results show that the proposed parallel approach to calculate flow accumulations on a GPU performs much faster than either sequential algorithms or other parallel GPU