WorldWideScience

Sample records for intelligence search method

  1. Search for extraterrestrial intelligence (SETI)

    International Nuclear Information System (INIS)

    Morrison, P.; Billingham, J.; Wolfe, J.

    1977-01-01

    Findings are presented of a series of workshops on the existence of extraterrestrial intelligent life and ways in which extraterrestrial intelligence might be detected. The coverage includes the cosmic and cultural evolutions, search strategies, detection of other planetary systems, alternate methods of communication, and radio frequency interference. 17 references

  2. Competing intelligent search agents in global optimization

    Energy Technology Data Exchange (ETDEWEB)

    Streltsov, S.; Vakili, P. [Boston Univ., MA (United States); Muchnik, I. [Rutgers Univ., Piscataway, NJ (United States)

    1996-12-31

    In this paper we present a new search methodology that we view as a development of the intelligent agent approach to the analysis of complex systems. The main idea is to consider the search process as a competition mechanism between concurrent adaptive intelligent agents. Agents cooperate in achieving a common search goal and at the same time compete with each other for computational resources. We propose a statistical selection approach to resource allocation between agents that leads to simple and efficient on-average index allocation policies. We use global optimization as the most general setting that encompasses many types of search problems, and show how the proposed selection policies can be used to improve and combine various global optimization methods.
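
    The record above does not reproduce the authors' allocation policy; the following is only a minimal sketch of the general idea, under assumed details: several independent random-search agents attack the same toy objective, and a simple index (mean improvement per evaluation) decides which agent receives the next function evaluation. All names and parameter values are illustrative.

      import random

      def sphere(x):
          # Toy objective: global minimum 0 at the origin.
          return sum(v * v for v in x)

      class RandomSearchAgent:
          def __init__(self, dim, step):
              self.best_x = [random.uniform(-5, 5) for _ in range(dim)]
              self.best_f = sphere(self.best_x)
              self.step = step
              self.improvements = [1.0]          # optimistic prior

          def index(self):
              # Index policy: average improvement per evaluation so far.
              return sum(self.improvements) / len(self.improvements)

          def evaluate_once(self):
              cand = [v + random.gauss(0, self.step) for v in self.best_x]
              f = sphere(cand)
              self.improvements.append(max(0.0, self.best_f - f))
              if f < self.best_f:
                  self.best_x, self.best_f = cand, f

      def competing_search(n_agents=4, budget=2000, dim=5):
          agents = [RandomSearchAgent(dim, step=0.5 * (i + 1)) for i in range(n_agents)]
          for _ in range(budget):
              # Allocate the next evaluation to the agent with the highest index.
              max(agents, key=lambda a: a.index()).evaluate_once()
          return min(agents, key=lambda a: a.best_f).best_f

      if __name__ == "__main__":
          print("best objective value found:", competing_search())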

  3. Intelligent Search Method Based ACO Techniques for a Multistage Decision Problem EDP/LFP

    Directory of Open Access Journals (Sweden)

    Mostefa RAHLI

    2006-07-01

    Full Text Available The implementation of a numerical library for calculation and optimization in the area of electrical supply networks is at the centre of current research orientations; thus, our project is centred on the development of the NMSS platform. It is a software environment that will save considerable effort in load calculations, curve smoothing, loss calculation and economic planning of the generated powers [23]. Operational research [17] on the one hand, and industrial practice on the other, prove that the means and processes of simulation have reached a very appreciable level of reliability and mathematical confidence [4, 5, 14]; it is from this expert observation that many processes place confidence in the results of simulation. The handicap of this approach or methodology is that it bases its judgments and handling on simplified assumptions and constraints whose influence was deliberately neglected rather than added to the cost to be spent [14]. By juxtaposing simulation methods with artificial intelligence techniques, the assembled set of numerical methods acquires an optimal reliability whose assurance leaves no doubt. The NMSS software environment [23] can be a rallying point for simulation techniques and electric network calculation via a graphic interface; the same software integrates an AI capability via an expert system module. Our problem is a multistage case in which the stages are completely dependent and cannot be performed separately. For a multistage problem [21, 22], the results obtained from a credible (large size) problem calculation raise the following question: could a choice of the set of numerical methods make the calculation of a complete problem, using more than two treatment levels, yield the weakest possible total error? It is well known, according to algorithmic policy, that each treatment can be characterized by a function called its mathematical complexity; this complexity is in fact a cost (a weight overloading …

  4. Intelligent search in Big Data

    Science.gov (United States)

    Birialtsev, E.; Bukharaev, N.; Gusenkov, A.

    2017-10-01

    An approach to data integration, aimed at ontology-based intelligent search in Big Data, is considered for the case when information objects are represented in the form of relational databases (RDB), structurally marked by their schemes. The source of information for constructing an ontology and, later on, for organizing the search are texts in natural language, treated as semi-structured data; for the RDBs, these are the comments on the names of tables and their attributes. A formal definition of the RDB integration model in terms of ontologies is given. Within the framework of the model, a universal RDB representation ontology, an oil production subject domain ontology and a linguistic thesaurus of the subject domain language are built. A technique for automatic generation of SQL queries for subject domain specialists is proposed. On the basis of it, an information system for the TATNEFT oil-producing company RDBs was implemented. Exploitation of the system showed good relevance for the majority of queries.
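
    The paper's ontology and query generator are not reproduced in the record; purely as an illustration of the general idea, a minimal sketch might map natural-language terms, via a thesaurus built from table and column comments, onto schema elements and assemble a SQL query. The thesaurus entries, table names and join key below are hypothetical, not taken from the cited system.

      # Hypothetical thesaurus: natural-language terms -> (table, column), as would
      # be derived from RDB comments and a domain ontology in the cited system.
      THESAURUS = {
          "well": ("wells", "well_id"),
          "oil rate": ("production", "oil_rate"),
          "water cut": ("production", "water_cut"),
          "date": ("production", "measured_on"),
      }

      def build_query(select_terms, filter_term=None, filter_value=None):
          """Assemble a simple single-join SQL query from thesaurus terms."""
          columns = [THESAURUS[t] for t in select_terms]
          tables = sorted({tbl for tbl, _ in columns})
          select = ", ".join(f"{tbl}.{col}" for tbl, col in columns)
          sql = f"SELECT {select} FROM {' JOIN '.join(tables)}"
          if len(tables) == 2:
              # Assumed join key shared by both hypothetical tables.
              sql += f" ON {tables[0]}.well_id = {tables[1]}.well_id"
          if filter_term:
              tbl, col = THESAURUS[filter_term]
              sql += f" WHERE {tbl}.{col} = %s"
              return sql, (filter_value,)
          return sql, ()

      if __name__ == "__main__":
          print(build_query(["well", "oil rate"], "date", "2017-01-01"))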

  5. Model of intelligent information searching system

    International Nuclear Information System (INIS)

    Yastrebkov, D.I.

    2004-01-01

    A brief description of the technique for searching electronic documents in large archives, as well as its drawbacks, is presented. A solution close to intelligent information searching systems is proposed. (author)

  6. The Search for Extraterrestrial Intelligence (SETI)

    Science.gov (United States)

    Tarter, Jill

    The search for evidence of extraterrestrial intelligence is placed in the broader astronomical context of the search for extrasolar planets and biomarkers of primitive life elsewhere in the universe. A decision tree of possible search strategies is presented as well as a brief history of the search for extraterrestrial intelligence (SETI) projects since 1960. The characteristics of 14 SETI projects currently operating on telescopes are discussed and compared using one of many possible figures of merit. Plans for SETI searches in the immediate and more distant future are outlined. Plans for success, the significance of null results, and some opinions on deliberate transmission of signals (as well as listening) are also included. SETI results to date are negative, but in reality, not much searching has yet been done.

  7. Intelligent Search on XML Data

    NARCIS (Netherlands)

    Blanken, Henk; Grabs, T.; Schek, H-J.; Schenkel, R.; Weikum, G.; Unknown, [Unknown

    2003-01-01

    Recently, we have seen a steep increase in the popularity and adoption of XML, in areas such as traditional databases, e-business, the scientific environment, and on the web. Querying XML documents and data efficiently is a challenging issue; this book approaches search on XML data by combining

  8. Effect of Undergraduates’ Emotional Intelligence on Information Search Behavior

    Directory of Open Access Journals (Sweden)

    Wang Haocheng

    2017-06-01

    Full Text Available [Purpose/significance] Information search capability is the focus of information literacy education. This paper explores the relationship between emotional intelligence and information search behavior. [Method/process] Based on the data from questionnaires completed by 250 undergraduates, this paper used IBM SPSS Statistics 19.0 for statistical data analysis. [Result/conclusion] The correlation between emotional intelligence and information search capability is clearly positive. Among the variables in the regression equation, information search behavior is mainly affected by the regulation and utilization dimensions of emotion. Utilization of emotion mainly affects retrieval strategies, information evaluation, behavior adjustment and total score; regulation of emotions mainly affects the information reference.

  9. The internet and intelligent machines: search engines, agents and robots

    International Nuclear Information System (INIS)

    Achenbach, S.; Alfke, H.

    2000-01-01

    The internet plays an important role in a growing number of medical applications. Finding relevant information is not always easy, as the amount of information available on the Web is rising quickly. Even the best search engines can only collect links to a fraction of all existing Web pages, and many of these indexed documents have been changed or deleted. The vast majority of information on the Web is not searchable with conventional methods. New search strategies, technologies and standards are combined in Intelligent Search Agents (ISAs) and Robots, which can retrieve desired information in a specific approach. Conclusion: The article describes differences between ISAs and conventional search engines and how communication between agents improves their ability to find information. Examples of existing ISAs are given and the possible influences on current and future work in radiology are discussed. (orig.) [de

  10. Recent advances in intelligent image search and video retrieval

    CERN Document Server

    2017-01-01

    This book initially reviews the major feature representation and extraction methods and effective learning and recognition approaches, which have broad applications in the context of intelligent image search and video retrieval. It subsequently presents novel methods, such as improved soft assignment coding, Inheritable Color Space (InCS) and the Generalized InCS framework, the sparse kernel manifold learner method, the efficient Support Vector Machine (eSVM), and the Scale-Invariant Feature Transform (SIFT) features in multiple color spaces. Lastly, the book presents clothing analysis for subject identification and retrieval, and performance evaluation methods of video analytics for traffic monitoring. Digital images and videos are proliferating at an amazing speed in the fields of science, engineering and technology, media and entertainment. With the huge accumulation of such data, keyword searches and manual annotation schemes may no longer be able to meet the practical demand for retrieving relevant conte...

  11. The Breakthrough Listen Search for Intelligent Life

    Science.gov (United States)

    Croft, Steve; Siemion, Andrew; De Boer, David; Enriquez, J. Emilio; Foster, Griffin; Gajjar, Vishal; Hellbourg, Greg; Hickish, Jack; Isaacson, Howard; Lebofsky, Matt; MacMahon, David; Price, Daniel; Werthimer, Dan

    2018-01-01

    The $100M, 10-year philanthropic "Breakthrough Listen" project is driving an unprecedented expansion of the search for intelligent life beyond Earth. Modern instruments allow ever larger regions of parameter space (luminosity function, duty cycle, beaming fraction, frequency coverage) to be explored, which is enabling us to place meaningful physical limits on the prevalence of transmitting civilizations. Data volumes are huge, and preclude long-term storage of the raw data products, so real-time and machine learning processing techniques must be employed to identify candidate signals as well as simultaneously classifying interfering sources. However, the Galaxy is now known to be a target-rich environment, teeming with habitable planets. Data from Breakthrough Listen can also be used by researchers in other areas of astronomy to study pulsars, fast radio bursts, and a range of other science targets. Breakthrough Listen is already underway in the optical and radio bands, and is also engaging with facilities across the world, including Square Kilometer Array precursors and pathfinders. I will give an overview of the technology, science goals, data products, and roadmap of Breakthrough Listen, as we attempt to answer one of humanity's oldest questions: Are we alone?

  12. Intelligent methods for data retrieval in fusion databases

    International Nuclear Information System (INIS)

    Vega, J.

    2008-01-01

    The plasma behaviour is identified through the recognition of patterns inside signals. The search for patterns is usually a manual and tedious procedure in which signals need to be examined individually. A breakthrough in data retrieval for fusion databases is the development of intelligent methods to search for patterns. A pattern (in the broadest sense) could be a single segment of a waveform, a set of pixels within an image or even a heterogeneous set of features made up of waveforms, images and any kind of experimental data. Intelligent methods will allow searching for data according to technical, scientific and structural criteria instead of an identifiable time interval or pulse number. Such search algorithms should be intelligent enough to avoid passing over the entire database. Benefits of such access methods are discussed and several available techniques are reviewed. In addition, the applicability of the methods from general purpose searching systems to ad hoc developments is covered
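
    The review above does not give a concrete algorithm; as an illustration of one common building block of such pattern search, a minimal sketch could locate, in an archive of stored waveforms, the segments most similar to a query pattern by sliding-window, shape-based (z-normalized) Euclidean distance. Signal names, shapes and the synthetic data are illustrative only.

      import numpy as np

      def znorm(x):
          # Z-normalize so the match is based on shape rather than amplitude/offset.
          s = x.std()
          return (x - x.mean()) / s if s > 0 else x - x.mean()

      def best_matches(archive, pattern, top_k=3):
          """Return (signal index, offset, distance) of the closest stored segments."""
          q = znorm(np.asarray(pattern, dtype=float))
          m = len(q)
          hits = []
          for sig_idx, signal in enumerate(archive):
              signal = np.asarray(signal, dtype=float)
              for off in range(len(signal) - m + 1):
                  d = np.linalg.norm(znorm(signal[off:off + m]) - q)
                  hits.append((d, sig_idx, off))
          hits.sort()
          return [(i, off, d) for d, i, off in hits[:top_k]]

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          waveforms = [rng.normal(size=500) for _ in range(4)]
          waveforms[2][200:232] += np.hanning(32) * 5      # hide a "pattern" in one signal
          query = np.hanning(32) * 5
          print(best_matches(waveforms, query))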

  13. Constructing an Intelligent Patent Network Analysis Method

    Directory of Open Access Journals (Sweden)

    Chao-Chan Wu

    2012-11-01

    Full Text Available Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables the analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages different from traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks of the current method, this study proposes a novel patent analysis method, called the intelligent patent network analysis method, to make a visual network with great precision. Based on artificial intelligence techniques, the proposed method provides an automated procedure for searching patent documents, extracting patent keywords, and determining the weight of each patent keyword in order to generate a sophisticated visualization of the patent network. This study proposes a detailed procedure for generating an intelligent patent network that is helpful for improving the efficiency and quality of patent analysis. Furthermore, patents in the field of Carbon Nanotube Backlight Unit (CNT-BLU) were analyzed to verify the utility of the proposed method.

  14. An introduction to harmony search optimization method

    CERN Document Server

    Wang, Xiaolei; Zenger, Kai

    2014-01-01

    This brief provides a detailed introduction, discussion and bibliographic review of the nature-inspired optimization algorithm called Harmony Search. It uses a large number of simulation results to demonstrate the advantages of Harmony Search and its variants and also their drawbacks. The authors show how weaknesses can be amended by hybridization with other optimization methods. The Harmony Search Method with Applications will be of value to researchers in computational intelligence in demonstrating the state of the art of research on an algorithm of current interest. It also helps researche
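
    The book's own material is not reproduced here; as an illustration only, a minimal sketch of the basic Harmony Search loop (harmony memory, memory consideration rate, pitch adjustment, random selection) on a toy objective might look like the following. Parameter values are illustrative, not taken from the book.

      import random

      def objective(x):
          # Toy function to minimize: sum of squares.
          return sum(v * v for v in x)

      def harmony_search(dim=5, hms=20, hmcr=0.9, par=0.3, bw=0.05,
                         bounds=(-5.0, 5.0), iters=5000):
          lo, hi = bounds
          # Harmony memory: a small set of candidate solutions.
          memory = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
          memory.sort(key=objective)
          for _ in range(iters):
              new = []
              for d in range(dim):
                  if random.random() < hmcr:                 # memory consideration
                      v = random.choice(memory)[d]
                      if random.random() < par:              # pitch adjustment
                          v += random.uniform(-bw, bw)
                  else:                                      # random selection
                      v = random.uniform(lo, hi)
                  new.append(min(hi, max(lo, v)))
              # Replace the worst harmony if the new one is better.
              if objective(new) < objective(memory[-1]):
                  memory[-1] = new
                  memory.sort(key=objective)
          return memory[0], objective(memory[0])

      if __name__ == "__main__":
          best, value = harmony_search()
          print(value)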

  15. Intelligent Search Optimization using Artificial Fuzzy Logics

    OpenAIRE

    Manral, Jai

    2015-01-01

    Information on the web is prodigious; searching relevant information is difficult making web users to rely on search engines for finding relevant information on the web. Search engines index and categorize web pages according to their contents using crawlers and rank them accordingly. For given user query they retrieve millions of webpages and display them to users according to web-page rank. Every search engine has their own algorithms based on certain parameters for ranking web-pages. Searc...

  16. Knowledge in Artificial Intelligence Systems: Searching the Strategies for Application

    OpenAIRE

    Kornienko, Alla A.; Kornienko, Anatoly V.; Fofanov, Oleg B.; Chubik, Maxim P.

    2015-01-01

    The studies based on auto-epistemic logic are pointed out as an advanced direction for development of artificial intelligence (AI). Artificial intelligence is taken as a system that imitates the solution of complicated problems by humans during the course of life. The structure of symbols and operations, by which intellectual solution is performed, as well as searching the strategic reference points for those solutions, which are caused by certain structures of symbols and operations, – are co...

  17. Funding the Search for Extraterrestrial Intelligence with a Lottery Bond

    OpenAIRE

    Haqq-Misra, Jacob

    2013-01-01

    I propose the establishment of a SETI Lottery Bond to provide a continued source of funding for the search for extraterrestrial intelligence (SETI). The SETI Lottery Bond is a fixed rate perpetual bond with a lottery at maturity, where maturity occurs only upon discovery and confirmation of extraterrestrial intelligent life. Investors in the SETI Lottery Bond purchase shares that yield a fixed rate of interest that continues indefinitely until SETI succeeds---at which point a random subset of...

  18. Intelligent methods for cyber warfare

    CERN Document Server

    Reformat, Marek; Alajlan, Naif

    2015-01-01

    Cyberwarfare has become an important concern for governmental agencies as well as businesses of various types.  This timely volume, with contributions from some of the internationally recognized leaders in the field, gives readers a glimpse of the new and emerging ways that Computational Intelligence and Machine Learning methods can be applied to address problems related to cyberwarfare. The book includes a number of chapters that can be conceptually divided into three topics: chapters describing different data analysis methodologies with their applications to cyberwarfare, chapters presenting a number of intrusion detection approaches, and chapters dedicated to analysis of possible cyber attacks and their impact. The book provides the readers with a variety of methods and techniques, based on computational intelligence, which can be applied to the broad domain of cyberwarfare.

  19. Searching for exoplanets using artificial intelligence

    Science.gov (United States)

    Pearson, Kyle A.; Palafox, Leon; Griffith, Caitlin A.

    2018-02-01

    In the last decade, over a million stars were monitored to detect transiting planets. Manual interpretation of potential exoplanet candidates is labor intensive and subject to human error, the results of which are difficult to quantify. Here we present a new method of detecting exoplanet candidates in large planetary search projects which, unlike current methods, uses a neural network. Neural networks, also called "deep learning" or "deep nets", are designed to give a computer perception into a specific problem by training it to recognize patterns. Unlike past transit detection algorithms, deep nets learn to recognize planet features instead of relying on hand-coded metrics that humans perceive as the most representative. Our convolutional neural network is capable of detecting Earth-like exoplanets in noisy time-series data with a greater accuracy than a least-squares method. Deep nets are highly generalizable, allowing data to be evaluated from different time series after interpolation without compromising performance. As validated by our deep net analysis of Kepler light curves, we detect periodic transits consistent with the true period without any model fitting. Our study indicates that machine learning will facilitate the characterization of exoplanets in future analysis of large astronomy data sets.
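
    The published network's architecture is not given in the record; purely as an illustration of the kind of 1-D convolutional classifier described, a minimal PyTorch sketch with illustrative layer sizes and random stand-in data might look like this. It is not the authors' model.

      import torch
      import torch.nn as nn

      class TransitCNN(nn.Module):
          """Tiny 1-D CNN that classifies a fixed-length light-curve window as
          'transit' vs 'no transit'. Sizes are illustrative only."""
          def __init__(self, window=256):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv1d(1, 8, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(4),
                  nn.Conv1d(8, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(4),
              )
              self.classifier = nn.Linear(16 * (window // 16), 2)

          def forward(self, x):            # x: (batch, 1, window)
              h = self.features(x)
              return self.classifier(h.flatten(1))

      if __name__ == "__main__":
          model = TransitCNN()
          flux = torch.randn(32, 1, 256)   # stand-in for normalized flux windows
          logits = model(flux)
          loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 2, (32,)))
          loss.backward()                  # one illustrative backward pass (no optimizer)
          print(logits.shape)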

  20. Psycholinguistics and the Search for Extraterrestrial Intelligence

    Directory of Open Access Journals (Sweden)

    Lidija Krotenko

    2017-09-01

    Full Text Available The author of the article reveals the possibilities of psycholinguistics in the identification and interpretation of languages and texts of Alien Civilizations. The author combines modern interdisciplinary research in psycholinguistics with the theory "Evolving Matter" proposed by Oleg Bazaluk and concludes that the identification of languages and texts of Alien Civilizations, as well as the communication of terrestrial civilization with Extraterrestrial Intelligence, is in principle possible. To that end, it is necessary to achieve the required level of the modeling of neurophilosophy and to include the achievements of modern psycholinguistic studies in: (a) language acquisition; (b) language comprehension; (c) language production; (d) second language acquisition. On the one hand, neurophilosophy is able to accumulate and model advanced neuroscience research; on the other hand, highly specialized psycholinguistic studies in language evolution are able to provide the communication of terrestrial civilization with Extraterrestrial Intelligence.

  1. COMPETITIVE INTELLIGENCE ANALYSIS - SCENARIOS METHOD

    Directory of Open Access Journals (Sweden)

    Ivan Valeriu

    2014-07-01

    Full Text Available Keeping a company among the top performing players in the relevant market depends not only on its ability to develop continually, sustainably and in a balanced manner, to the standards set by the customers and the competition, but also on its ability to protect its strategic information and to know in advance the strategic information of the competition. In addition, given that economic markets, regardless of their profile, enable interconnection not only among domestic companies, but also between domestic and foreign companies, the issue of economic competition moves from national economies to the field of interest of regional and international economic organizations. The stake for each economic player is to keep ahead of the competition and to be always prepared to face market challenges. Therefore, it needs to know as early as possible how to react to others' strategy in terms of research, production and sales. If a competitor is planning to produce more and more cheaply, then it must be prepared to counteract this move quickly. Competitive intelligence helps to evaluate the capabilities of competitors in the market, legally and ethically, and to develop response strategies. One of the main goals of competitive intelligence is to acknowledge the role of early warning and prevention of surprises that could have a major impact on the market share, reputation, turnover and profitability of a company in the medium and long term. This paper presents some aspects of competitive intelligence, mainly in terms of information analysis and intelligence generation. The presentation is theoretical and addresses a structured method of information analysis - the scenarios method - in a version that combines several types of analysis in order to reveal some interconnecting aspects of the factors governing the activity of a company.

  2. Searching for Exoplanets using Artificial Intelligence

    Science.gov (United States)

    Pearson, Kyle Alexander; Palafox, Leon; Griffith, Caitlin Ann

    2017-10-01

    In the last decade, over a million stars were monitored to detect transiting planets. The large volume of data obtained from current and future missions (e.g. Kepler, K2, TESS and LSST) requires automated methods to detect the signature of a planet. Manual interpretation of potential exoplanet candidates is labor intensive and subject to human error, the results of which are difficult to quantify. Here we present a new method of detecting exoplanet candidates in large planetary search projects which, unlike current methods, uses a neural network. Neural networks, also called ``deep learning'' or ``deep nets'', are a state of the art machine learning technique designed to give a computer perception into a specific problem by training it to recognize patterns. Unlike past transit detection algorithms, the deep net learns to characterize the data instead of relying on hand-coded metrics that humans perceive as the most representative. Exoplanet transits have different shapes, as a result of, e.g., the planet's and stellar atmosphere and the transit geometry. Thus, a simple template does not suffice to capture the subtle details, especially if the signal is below the noise or strong systematics are present. Current false-positive rates from the Kepler data are estimated around 12.3% for Earth-like planets and there has been no study of the false-negative rates. It is therefore important to ask how the properties of current algorithms exactly affect the results of the Kepler mission and future missions such as TESS, which flies next year. These uncertainties affect the fundamental research derived from missions, such as the discovery of habitable planets, estimates of their occurrence rates and our understanding of the nature and evolution of planetary systems.

  3. Intelligent Chatter Bot for Regulation Search

    Science.gov (United States)

    De Luise, María Daniela López; Pascal, Andrés; Saad, Ben; Álvarez, Claudia; Pescio, Pablo; Carrilero, Patricio; Malgor, Rafael; Díaz, Joaquín

    2016-01-01

    This communication presents a functional prototype, named PTAH, implementing a linguistic model focused on regulations in Spanish. Its global architecture, the reasoning model and short statistics for the prototype are provided. It is mainly a conversational robot linked to an Expert System by a module with many intelligent linguistic filters, implementing the reasoning model of an expert. It is focused on bylaws, regulations, jurisprudence and a customized background representing the entity's mission, vision and profile. This structure and model are generic enough to self-adapt to any regulatory environment, but as a first step the prototype was limited to an academic field; in this way it is possible to limit the slang and the amount of data. The foundations of the linguistic model are also outlined, along with the way the architecture implements the key features of the behavior.

  4. Intelligent Information Systems for Web Product Search

    NARCIS (Netherlands)

    D. Vandic (Damir)

    2017-01-01

    markdownabstractOver the last few years, we have experienced an increase in online shopping. Consequently, there is a need for efficient and effective product search engines. The rapid growth of e-commerce, however, has also introduced some challenges. Studies show that users can get overwhelmed by

  5. Search techniques in intelligent classification systems

    CERN Document Server

    Savchenko, Andrey V

    2016-01-01

    A unified methodology for categorizing various complex objects is presented in this book. Through probability theory, novel asymptotically minimax criteria suitable for practical applications in imaging and data analysis are examined, including special cases such as the Jensen-Shannon divergence and the probabilistic neural network. An optimal approximate nearest neighbor search algorithm, which allows faster classification of databases, is featured. Rough set theory, sequential analysis and granular computing are used to improve the performance of the hierarchical classifiers. Practical examples in face identification (including deep neural networks), isolated command recognition in a voice control system and classification of visemes captured by the Kinect depth camera are included. This approach creates fast and accurate search procedures by using exact probability densities of applied dissimilarity measures. This book can be used as a guide for independent study and as supplementary material for a technicall...

  6. Development of intelligent semantic search system for rubber research data in Thailand

    Science.gov (United States)

    Kaewboonma, Nattapong; Panawong, Jirapong; Pianhanuruk, Ekkawit; Buranarach, Marut

    2017-10-01

    Rubber production in Thailand increased not only through strong demand from the world market, but was also stimulated strongly by the replanting program of the Thai Government from 1961 onwards. With the continuous growth of rubber research data volume on the Web, the search for information has become a challenging task. Ontologies are used to improve the accuracy of information retrieval from the web by incorporating a degree of semantic analysis during the search. In this context, we propose an intelligent semantic search system for rubber research data in Thailand. The research methods included 1) analyzing domain knowledge, 2) ontology development, and 3) development of an intelligent semantic search system, so that research data curated in trusted digital repositories may be shared among the wider Thailand rubber research community.

  7. SETI pioneers scientists talk about their search for extraterrestrial intelligence

    CERN Document Server

    Swift, David W.

    1990-01-01

    Why did some scientists decide to conduct a search for extraterrestrial intelligence (SETI)? What factors in their personal development predisposed them to such a quest? What obstacles did they encounter along the way? David Swift interviewed the first scientists involved in the search & offers a fascinating overview of the emergence of this modern scientific endeavor. He allows some of the most imaginative scientific thinkers of our time to hold forth on their views regarding SETI & extraterrestrial life & on how the field has developed. Readers will react with a range of opinions as broad as those concerning the likelihood of success in SETI itself. ''A goldmine of original information.''

  8. EIIS: An Educational Information Intelligent Search Engine Supported by Semantic Services

    Science.gov (United States)

    Huang, Chang-Qin; Duan, Ru-Lin; Tang, Yong; Zhu, Zhi-Ting; Yan, Yong-Jian; Guo, Yu-Qing

    2011-01-01

    The semantic web brings a new opportunity for efficient information organization and search. To meet the special requirements of the educational field, this paper proposes an intelligent search engine enabled by educational semantic support service, where three kinds of searches are integrated into Educational Information Intelligent Search (EIIS)…

  9. Intelligent structural optimization: Concept, Model and Methods

    International Nuclear Information System (INIS)

    Lu, Dagang; Wang, Guangyuan; Peng, Zhang

    2002-01-01

    Structural optimization has many characteristics of Soft Design, and so it is necessary to apply the experience of human experts to solving the uncertain and multidisciplinary optimization problems in large-scale and complex engineering systems. With the development of artificial intelligence (AI) and computational intelligence (CI), the theory of structural optimization is now developing in the direction of intelligent optimization. In this paper, a concept of Intelligent Structural Optimization (ISO) is proposed. Then, a design process model of ISO is put forward in which each design sub-process model is discussed. Finally, the design methods of ISO are presented

  10. Science, religion, and the search for extraterrestrial intelligence

    CERN Document Server

    Wilkinson, David

    2013-01-01

    If the discovery of life elsewhere in the universe is just around the corner, what would be the consequences for religion? Would it represent another major conflict between science and religion, even leading to the death of faith? Some would suggest that the discovery of any suggestion of extraterrestrial life would have a greater impact than even the Copernican and Darwinian revolutions. It is now over 50 years since the first modern scientific papers were published on the search for extraterrestrial intelligence (SETI). Yet the religious implications of this search and possible discovery have never been systematically addressed in the scientific or theological arena. SETI is now entering its most important era of scientific development. New observation techniques are leading to the discovery of extra-solar planets daily, and the Kepler mission has already collected over 1000 planetary candidates. This deluge of data is transforming the scientific and popular view of the existence of extraterrestrial intel...

  11. Artificial intelligence methods for diagnostic

    International Nuclear Information System (INIS)

    Dourgnon-Hanoune, A.; Porcheron, M.; Ricard, B.

    1996-01-01

    To assist in the diagnosis of its nuclear power plants, the Research and Development Division of Electricite de France has been developing skills in Artificial Intelligence for about a decade. Different diagnostic expert systems have been designed, among them SILEX for control rod cabinet troubleshooting, DIVA for turbine generator diagnosis, and DIAPO for reactor coolant pump diagnosis. This know-how in expert knowledge modeling and acquisition is a direct result of the experience gained during these developments and of a more general reflection on knowledge-based system development. We have been able to reuse these results for other developments, such as a guide for auxiliary rotating machine diagnosis. (authors)

  12. WANDERER IN THE MIST: THE SEARCH FOR INTELLIGENCE, SURVEILLANCE, AND RECONNAISSANCE (ISR) STRATEGY

    Science.gov (United States)

    2017-06-01

    … the production of over 383,000 photographic prints to support various intelligence, mapping, and … WANDERER IN THE MIST: THE SEARCH FOR INTELLIGENCE, SURVEILLANCE, AND RECONNAISSANCE (ISR) STRATEGY, BY MAJOR RYAN D. SKAGGS, USAF … program from the University of California at Los Angeles (UCLA) in 2004. He is a career intelligence officer with over 13 years of experience across a …

  13. Hybrid intelligent optimization methods for engineering problems

    Science.gov (United States)

    Pehlivanoglu, Yasin Volkan

    The purpose of optimization is to obtain the best solution under certain conditions. There are numerous optimization methods because different problems need different solution methodologies; therefore, it is difficult to construct patterns. Mathematical modeling of a natural phenomenon is also largely based on differentials. Differential equations are constructed with relative increments among the factors related to yield. Therefore, the gradients of these increments are essential to search the yield space. However, the landscape of the yield is not a simple one and is mostly multi-modal. Another issue is differentiability. Engineering design problems are usually nonlinear and they sometimes exhibit discontinuous derivatives for the objective and constraint functions. Due to these difficulties, non-gradient-based algorithms have become more popular in recent decades. Genetic algorithms (GA) and particle swarm optimization (PSO) algorithms are popular non-gradient-based algorithms. Both are population-based search algorithms and have multiple points for initiation. A significant difference from a gradient-based method is the nature of the search methodology; for example, randomness is essential for the search in GA or PSO. Hence, they are also called stochastic optimization methods. These algorithms are simple and robust and have high fidelity. However, they suffer from similar defects, such as premature convergence, lower accuracy, or large computational time. The premature convergence is sometimes inevitable due to the lack of diversity. As the generations of particles or individuals in the population evolve, they may lose their diversity and become similar to each other. To overcome this issue, we studied the diversity concept in GA and PSO algorithms. Diversity is essential for a healthy search, and mutations are the basic operators that provide the necessary variety within a population. After a close scrutiny of the diversity concept based on qualification and
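
    The thesis itself is not reproduced here; as a minimal sketch of the diversity idea it discusses, one can measure population spread each generation and raise the mutation rate when diversity collapses. The GA operators, thresholds and toy objective below are illustrative assumptions, not the author's algorithm.

      import random
      import statistics

      def diversity(population):
          # Mean per-gene standard deviation as a simple diversity measure.
          return statistics.mean(
              statistics.pstdev(ind[g] for ind in population)
              for g in range(len(population[0]))
          )

      def evolve(fitness, dim=5, pop_size=30, gens=200,
                 base_mut=0.05, boosted_mut=0.4, div_threshold=0.1):
          pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
          for _ in range(gens):
              # Boost mutation when the population has become too similar.
              mut = boosted_mut if diversity(pop) < div_threshold else base_mut
              pop.sort(key=fitness)
              parents = pop[:pop_size // 2]                      # truncation selection
              children = []
              while len(children) < pop_size - len(parents):
                  a, b = random.sample(parents, 2)
                  child = [random.choice(pair) for pair in zip(a, b)]   # uniform crossover
                  child = [g + random.gauss(0, 1) if random.random() < mut else g
                           for g in child]
                  children.append(child)
              pop = parents + children
          return min(pop, key=fitness)

      if __name__ == "__main__":
          sphere = lambda x: sum(v * v for v in x)
          print(sphere(evolve(sphere)))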

  14. Searching for Extraterrestrial Intelligence SETI Past, Present, and Future

    CERN Document Server

    Shuch, H Paul

    2011-01-01

    This book is a collection of essays written by the very scientists and engineers who have led, and continue to lead, the scientific quest known as SETI, the search for extraterrestrial intelligence. Divided into three parts, the first section, ‘The Spirit of SETI Past’, written by the surviving pioneers of this then emerging discipline, reviews the major projects undertaken during the first 50 years of SETI science and the results of that research. In the second section, ‘The Spirit of SETI Present’, the present-day science and technology is discussed in detail, providing the technical background to contemporary SETI instruments, experiments, and analytical techniques, including the processing of the received signals to extract potential alien communications. In the third and final section, ‘The Spirit of SETI Future’, the book looks ahead to the possible directions that SETI will take in the next 50 years, addressing such important topics as interstellar message construction, the risks and assump...

  15. A framework for intelligent data acquisition and real-time database searching for shotgun proteomics.

    Science.gov (United States)

    Graumann, Johannes; Scheltema, Richard A; Zhang, Yong; Cox, Jürgen; Mann, Matthias

    2012-03-01

    In the analysis of complex peptide mixtures by MS-based proteomics, many more peptides elute at any given time than can be identified and quantified by the mass spectrometer. This makes it desirable to optimally allocate peptide sequencing and narrow mass range quantification events. In computer science, intelligent agents are frequently used to make autonomous decisions in complex environments. Here we develop and describe a framework for intelligent data acquisition and real-time database searching and showcase selected examples. The intelligent agent is implemented in the MaxQuant computational proteomics environment, termed MaxQuant Real-Time. It analyzes data as it is acquired on the mass spectrometer, constructs isotope patterns and SILAC pair information as well as controls MS and tandem MS events based on real-time and prior MS data or external knowledge. Re-implementing a top10 method in the intelligent agent yields similar performance to the data dependent methods running on the mass spectrometer itself. We demonstrate the capabilities of MaxQuant Real-Time by creating a real-time search engine capable of identifying peptides "on-the-fly" within 30 ms, well within the time constraints of a shotgun fragmentation "topN" method. The agent can focus sequencing events onto peptides of specific interest, such as those originating from a specific gene ontology (GO) term, or peptides that are likely modified versions of already identified peptides. Finally, we demonstrate enhanced quantification of SILAC pairs whose ratios were poorly defined in survey spectra. MaxQuant Real-Time is flexible and can be applied to a large number of scenarios that would benefit from intelligent, directed data acquisition. Our framework should be especially useful for new instrument types, such as the quadrupole-Orbitrap, that are currently becoming available.
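
    MaxQuant Real-Time itself is not shown in the record; purely as an illustration of the "topN" decision such an acquisition agent makes, a minimal sketch could pick the N most intense precursors from a survey scan while skipping masses on a dynamic exclusion list. The class, tolerances and peak values below are hypothetical.

      import time

      class TopNAgent:
          """Choose up to N precursor m/z values per survey scan for fragmentation,
          excluding any m/z sequenced within the last `exclusion_s` seconds."""
          def __init__(self, n=10, exclusion_s=30.0, tol=0.01):
              self.n, self.exclusion_s, self.tol = n, exclusion_s, tol
              self.excluded = {}                      # m/z -> time it was sequenced

          def _is_excluded(self, mz, now):
              return any(abs(mz - m) <= self.tol and now - t < self.exclusion_s
                         for m, t in self.excluded.items())

          def select(self, survey_peaks, now=None):
              """survey_peaks: list of (mz, intensity) from the last MS1 scan."""
              now = time.monotonic() if now is None else now
              picks = []
              for mz, inten in sorted(survey_peaks, key=lambda p: -p[1]):
                  if len(picks) == self.n:
                      break
                  if not self._is_excluded(mz, now):
                      picks.append(mz)
                      self.excluded[mz] = now
              return picks

      if __name__ == "__main__":
          agent = TopNAgent(n=3)
          scan = [(445.12, 9e5), (512.30, 7e5), (611.84, 5e5), (702.41, 2e5)]
          print(agent.select(scan))        # three most intense peaks, none excluded yet
          print(agent.select(scan))        # those are now excluded, so the remaining peak is picked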

  16. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, in computing Nash equilibria of finite strategic games as global minima of a real-valued, nonnegative function. An issue of particular interest is detecting more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
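
    The paper's exact formulation is not reproduced in the record; a minimal sketch of the general approach, under assumed details, minimizes a nonnegative "regret" function that vanishes exactly at a Nash equilibrium, here for a 2x2 bimatrix game using SciPy's differential evolution. The game and parameterization are illustrative.

      import numpy as np
      from scipy.optimize import differential_evolution

      # Matching pennies: the unique Nash equilibrium is (1/2, 1/2) for both players.
      A = np.array([[1.0, -1.0], [-1.0, 1.0]])   # row player's payoffs
      B = -A                                      # column player's payoffs

      def to_simplex(v):
          v = np.maximum(v, 1e-12)
          return v / v.sum()

      def nash_gap(z):
          """Nonnegative function that is zero exactly at a Nash equilibrium."""
          x, y = to_simplex(z[:2]), to_simplex(z[2:])
          ux, uy = x @ A @ y, x @ B @ y
          gain_x = np.maximum(A @ y - ux, 0.0)     # row player's pure-strategy regrets
          gain_y = np.maximum(B.T @ x - uy, 0.0)   # column player's pure-strategy regrets
          return float(np.sum(gain_x**2) + np.sum(gain_y**2))

      if __name__ == "__main__":
          res = differential_evolution(nash_gap, bounds=[(0.0, 1.0)] * 4, seed=1, tol=1e-10)
          print(to_simplex(res.x[:2]), to_simplex(res.x[2:]), res.fun)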

  17. Efficient searching in meshfree methods

    Science.gov (United States)

    Olliff, James; Alford, Brad; Simkins, Daniel C.

    2018-04-01

    Meshfree methods such as the Reproducing Kernel Particle Method and the Element Free Galerkin method have proven to be excellent choices for problems involving complex geometry, evolving topology, and large deformation, owing to their ability to model the problem domain without the constraints imposed on Finite Element Method (FEM) meshes. However, meshfree methods have an added computational cost over FEM that comes from at least two sources: the increased cost of shape function evaluation and the determination of adjacency or connectivity. The focus of this paper is to formally address the types of adjacency information that arise in various uses of meshfree methods; to discuss the available techniques for computing the various adjacency graphs; to propose a new search algorithm and data structure; and finally to compare the memory and run-time performance of the methods.
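
    The paper's own data structure is not reproduced here; as an illustration of a standard baseline, a minimal cell-list (uniform grid) neighbor search finds all particles within a support radius without an all-pairs scan. Point counts and the support radius are illustrative.

      import numpy as np
      from collections import defaultdict
      from itertools import product

      def build_cells(points, h):
          """Hash each point into a uniform grid cell of edge length h."""
          cells = defaultdict(list)
          for idx, p in enumerate(points):
              cells[tuple((p // h).astype(int))].append(idx)
          return cells

      def neighbors_within(points, h):
          """Return, for each point, the indices of points within support radius h."""
          cells = build_cells(points, h)
          result = []
          for p in points:
              base = (p // h).astype(int)
              cand = []
              # Only the block of surrounding cells needs to be inspected.
              for off in product((-1, 0, 1), repeat=points.shape[1]):
                  cand.extend(cells.get(tuple(base + np.array(off)), []))
              dists = np.linalg.norm(points[cand] - p, axis=1)
              result.append([j for j, d in zip(cand, dists) if d <= h])
          return result

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          pts = rng.random((500, 2))
          adj = neighbors_within(pts, h=0.05)
          print(sum(len(a) for a in adj) / len(adj), "average neighbors per particle")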

  18. Fuzzy Search Method for Hi Education Information Security

    Directory of Open Access Journals (Sweden)

    Grigory Grigorevich Novikov

    2016-03-01

    Full Text Available The main aim of the research is how to use a fuzzy search method for the information security of higher education or similar purposes. Many sensitive information leaks occur through the legal publishing of non-classified documents, which is why many intelligence services favour the «mosaic» information collection method. This article is about how to prevent it.

  19. Geometrical Fuzzy Search Method for the Business Information Security Systems

    Directory of Open Access Journals (Sweden)

    Grigory Grigorievich Novikov

    2014-12-01

    Full Text Available The main aim of the article is how to use a new geometrical fuzzy search method for the information security of business or other purposes. Many sensitive information leaks occur through the legal publishing of non-classified documents, which is why many intelligence services like to use the "mosaic" information collection method so much. This article is about how to prevent it.

  20. Automatic figure ranking and user interfacing for intelligent figure search.

    Directory of Open Access Journals (Sweden)

    Hong Yu

    2010-10-01

    Full Text Available Figures are important experimental results that are typically reported in full-text bioscience articles. Bioscience researchers need to access figures to validate research facts and to formulate or test novel research hypotheses. On the other hand, the sheer volume of bioscience literature has made it difficult to access figures. Therefore, we are developing an intelligent figure search engine (http://figuresearch.askhermes.org). Existing research in figure search treats each figure equally, but we introduce a novel concept of "figure ranking": figures appearing in a full-text biomedical article can be ranked by their contribution to the knowledge discovery. We empirically validated the hypothesis of figure ranking with over 100 bioscience researchers, and then developed unsupervised natural language processing (NLP) approaches to automatically rank figures. Evaluating on a collection of 202 full-text articles in which authors had ranked the figures based on importance, our best system achieved a weighted error rate of 0.2, which is significantly better than several other baseline systems we explored. We further explored a user interfacing application in which we built novel user interfaces (UIs) incorporating figure ranking, allowing bioscience researchers to efficiently access important figures. Our evaluation results show that 92% of the bioscience researchers prefer, as their top two choices, the user interfaces in which the most important figures are enlarged. Bioscience researchers preferred the UIs in which the most important figures were predicted by our automatic NLP system over the UIs in which the most important figures were randomly assigned. In addition, our results show that there was no statistical difference in bioscience researchers' preference between the UIs generated by automatic figure ranking and the UIs based on human ranking annotation. The evaluation results conclude that automatic figure ranking and user

  1. SOLVING ENGINEERING OPTIMIZATION PROBLEMS WITH THE SWARM INTELLIGENCE METHODS

    Directory of Open Access Journals (Sweden)

    V. Panteleev Andrei

    2017-01-01

    Full Text Available An important stage in the problem-solving process for designing aerospace systems and structures is the optimization of their main characteristics. The results of four constrained optimization problems related to the design of various technical systems, namely determining the best parameters of welded beams, a pressure vessel, a gear train and a spring, are presented. The purpose of each task is to minimize the cost and weight of the construction. The objective functions in these practical optimization problems are nonlinear functions of many variables with complex, heavily indented level surfaces. That is why using a classical approach for extremum seeking is not efficient; hence the necessity of optimization methods that allow a near-optimal solution to be found in an acceptable amount of time with minimal waste of computing power. Such methods include the methods of Swarm Intelligence: the spiral dynamics algorithm, stochastic diffusion search, and the hybrid seeker optimization algorithm. The Swarm Intelligence methods are designed so that a swarm consisting of agents carries out the search for the extremum. In searching for the point of the extremum, the particles exchange information and consider their own experience as well as the experience of the population leader and of neighbors in some area. To solve the listed problems, a program complex has been designed; its efficiency is illustrated by the solutions of four applied problems. Each of the considered applied optimization problems is solved with all three chosen methods. The obtained numerical results can be compared with those found by the particle swarm method. The author gives recommendations on how to choose the method parameters and the penalty function values that account for inequality constraints.
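
    The paper's program complex is not shown in the record; as a minimal sketch of how inequality constraints are commonly folded into the objective via a penalty function before handing it to any swarm optimizer, the following uses an illustrative toy problem and penalty weight, not the paper's values.

      def penalized(objective, constraints, weight=1e6):
          """Wrap an objective so that violated inequality constraints g(x) <= 0
          add a quadratic penalty; feasible points are left unchanged."""
          def f(x):
              violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
              return objective(x) + weight * violation
          return f

      # Toy example: minimize x0 + x1 subject to x0*x1 >= 1 and x0, x1 >= 0.
      cost = lambda x: x[0] + x[1]
      gs = [lambda x: 1.0 - x[0] * x[1],     # rewritten in the form g(x) <= 0
            lambda x: -x[0],
            lambda x: -x[1]]

      if __name__ == "__main__":
          f = penalized(cost, gs)
          print(f([1.0, 1.0]), f([0.1, 0.1]))   # feasible point vs heavily penalized point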

  2. A Privacy-Preserving Intelligent Medical Diagnosis System Based on Oblivious Keyword Search

    Directory of Open Access Journals (Sweden)

    Zhaowen Lin

    2017-01-01

    Full Text Available One of the concerns people have is how to get a diagnosis online without privacy being jeopardized. In this paper, we propose a privacy-preserving intelligent medical diagnosis system (IMDS), which can efficiently solve this problem. In IMDS, users submit their health examination parameters to the server in a protected form; this submission process is based on the Paillier cryptosystem and does not reveal any information about their data. The server then retrieves the most likely disease (or multiple diseases) from the database and returns it to the users. In the above search process, we use oblivious keyword search (OKS) as the basic framework, which lets the server retain its computational ability while learning no personal information from the users' data. Besides, this paper also provides a preprocessing method for data stored on the server, to make our protocol more efficient.
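
    The IMDS protocol itself is not reproduced here; purely as an illustration of the Paillier property such systems rely on (the server can combine encrypted health parameters without decrypting them), a minimal textbook sketch with toy key sizes follows. The tiny primes are for demonstration only and are insecure by design; this is not the paper's protocol.

      import math
      import random

      # Toy Paillier keys (tiny primes for illustration only; never use in practice).
      p, q = 293, 433
      n = p * q
      n2 = n * n
      g = n + 1
      lam = math.lcm(p - 1, q - 1)
      L = lambda u: (u - 1) // n
      mu = pow(L(pow(g, lam, n2)), -1, n)        # modular inverse of L(g^lam mod n^2)

      def encrypt(m):
          r = random.randrange(1, n)
          while math.gcd(r, n) != 1:
              r = random.randrange(1, n)
          return (pow(g, m, n2) * pow(r, n, n2)) % n2

      def decrypt(c):
          return (L(pow(c, lam, n2)) * mu) % n

      if __name__ == "__main__":
          blood_pressure, heart_rate = 120, 72   # example health parameters
          c1, c2 = encrypt(blood_pressure), encrypt(heart_rate)
          # Additive homomorphism: multiplying ciphertexts adds the plaintexts.
          print(decrypt((c1 * c2) % n2))         # 192, obtained without decrypting the inputs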

  3. 3rd Workshop on "Combinations of Intelligent Methods and Applications"

    CERN Document Server

    Palade, Vasile

    2013-01-01

    The combination of different intelligent methods is a very active research area in Artificial Intelligence (AI). The aim is to create integrated or hybrid methods that benefit from each of their components. The 3rd Workshop on “Combinations of Intelligent Methods and Applications” (CIMA 2012) was intended to become a forum for exchanging experience and ideas among researchers and practitioners who are dealing with combining intelligent methods either based on first principles or in the context of specific applications. CIMA 2012 was held in conjunction with the 22nd European Conference on Artificial Intelligence (ECAI 2012). This volume includes revised versions of the papers presented at CIMA 2012.

  4. Classification of Children Intelligence with Fuzzy Logic Method

    Science.gov (United States)

    Syahminan; ika Hidayati, Permata

    2018-04-01

    A child's type of intelligence is an important thing for parents to know early on. Typing can be done by grouping the dominant characteristics of each type of intelligence. To make it easier for parents to determine the type of a child's intelligence and how to respond to it, a classification system for children's intelligence was created using the fuzzy logic method to determine the degree of a child's intelligence type. From the analysis we concluded that, with a classification system for children's intelligence based on the fuzzy logic method, determining the type of a child's intelligence can be done in a way that is easier and gives more accurate conclusions than manual tests.

  5. Optimization of Transformation Coefficients Using Direct Search and Swarm Intelligence

    Directory of Open Access Journals (Sweden)

    Manusov V.Z.

    2017-04-01

    Full Text Available This research considers optimization of the tap positions of transformers in power systems to reduce power losses. At present, methods based on heuristic rules and fuzzy logic, or methods that optimize parts of the whole system separately, are applied to this problem. The first approach requires expert knowledge about processes in the network. The second class of methods is not able to consider all the interrelations of the system's parts, while changes in one segment affect the entire system. Both approaches are hard to implement and require adjustment to the tasks solved. It is therefore advisable to use algorithms that can take into account the complex interrelations of the optimized variables and adapt themselves to the optimization task. Such algorithms include Swarm Intelligence algorithms. Their main features are self-organization, which allows them to automatically adapt to the conditions of tasks, and the ability to efficiently escape from local extremes. Thus, they do not require specialized knowledge of the system, in contrast to fuzzy logic. In addition, they can efficiently find quasi-optimal solutions converging to the global optimum. This research applies the Particle Swarm Optimization (PSO) algorithm. A model of the Tajik power system was used in the experiments. It was found that PSO is much more efficient than greedy heuristics and more flexible and easier to use than fuzzy logic. PSO allows active power losses to be reduced from 48.01 to 45.83 MW (4.5%); the effect of using greedy heuristics or fuzzy logic is two times smaller (2.3%).

  6. Representation Methods in AI. Searching by Graphs

    Directory of Open Access Journals (Sweden)

    Angel GARRIDO

    2012-12-01

    Full Text Available The historical origin of Artificial Intelligence (AI) is usually placed at the Dartmouth Conference of 1956, but we can find many more arcane origins [1]. In more recent times we can also consider very great thinkers such as János Neumann (John von Neumann after arriving in the USA), Norbert Wiener, Alan Mathison Turing, or Lotfi Zadeh, for instance [6, 7]. Frequently AI requires logic, but its classical version shows too many insufficiencies, so it was necessary to introduce more sophisticated tools, such as fuzzy logic, modal logic, non-monotonic logic and so on [2]. Among the things that AI needs to represent are: categories, objects, properties, relations between objects, situations, states, time, events, causes and effects, knowledge about knowledge, and so on. The problems in AI can be classified into two general types [3, 4]: search problems and representation problems. On this last "mountain", there exist different ways to reach the summit. So, we have [3]: logics, rules, frames, associative nets, scripts and so on, many times connected among them. We attempt, in this paper, a panoramic vision of the scope of application of such representation methods in AI. The two most disputable questions of both modern philosophy of mind and AI are the Turing Test and the Chinese Room Argument. To elucidate these very difficult questions, see the two final Appendices.

  7. A Secured Cognitive Agent based Multi-strategic Intelligent Search System

    Directory of Open Access Journals (Sweden)

    Neha Gulati

    2018-04-01

    Full Text Available The Search Engine (SE) is the most preferred and ubiquitously used information retrieval tool. In spite of the vast-scale involvement of users in SEs, their limited capability to understand the user/searcher context and emotions places a high cognitive, perceptual and learning load on the user to maintain the search momentum. In this regard, the present work discusses a Cognitive Agent (CA) based approach to support the user in the Web-based search process. The work suggests a framework called the Secured Cognitive Agent based Multi-strategic Intelligent Search System (CAbMsISS) to assist the user in the search process. It helps to reduce the contextual and emotional mismatch between the SE and the user. After implementation of the proposed framework, performance analysis shows that the CAbMsISS framework improves Query Retrieval Time (QRT) and effectiveness for retrieving relevant results as compared to the Present Search Engine (PSE). Supplementary to this, it also provides search suggestions when a user accesses a resource previously tagged with negative emotions. Overall, the goal of the system is to enhance the search experience and keep the user motivated. The framework provides suggestions through a search log that tracks the queries searched, resources accessed and emotions experienced during the search. The implemented framework also considers user security. Keywords: BDI model, Cognitive Agent, Emotion, Information retrieval, Intelligent search, Search Engine

  8. Information theory, animal communication, and the search for extraterrestrial intelligence

    Science.gov (United States)

    Doyle, Laurance R.; McCowan, Brenda; Johnston, Simon; Hanser, Sean F.

    2011-02-01

    We present ongoing research in the application of information theory to animal communication systems with the goal of developing additional detectors and estimators for possible extraterrestrial intelligent signals. Regardless of the species, for intelligence (i.e., complex knowledge) to be transmitted certain rules of information theory must still be obeyed. We demonstrate some preliminary results of applying information theory to socially complex marine mammal species (bottlenose dolphins and humpback whales) as well as arboreal squirrel monkeys, because they almost exclusively rely on vocal signals for their communications, producing signals which can be readily characterized by signal analysis. Metrics such as Zipf's Law and higher-order information-entropic structure are emerging as indicators of the communicative complexity characteristic of an "intelligent message" content within these animals' signals, perhaps not surprising given these species' social complexity. In addition to human languages, for comparison we also apply these metrics to pulsar signals—perhaps (arguably) the most "organized" of stellar systems—as an example of astrophysical systems that would have to be distinguished from an extraterrestrial intelligence message by such information theoretic filters. We also look at a message transmitted from Earth (Arecibo Observatory) that contains a lot of meaning but little information in the mathematical sense we define it here. We conclude that the study of non-human communication systems on our own planet can make a valuable contribution to the detection of extraterrestrial intelligence by providing quantitative general measures of communicative complexity. Studying the complex communication systems of other intelligent species on our own planet may also be one of the best ways to deprovincialize our thinking about extraterrestrial communication systems in general.
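
    The recordings analyzed in the paper are naturally not included in this record; purely as an illustration of the two metrics it mentions, a minimal sketch computes the Zipf rank-frequency slope and the first-order Shannon entropy of a categorized symbol sequence, using an arbitrary stand-in word sequence rather than real animal-signal data.

      import numpy as np
      from collections import Counter

      def zipf_slope(sequence):
          """Slope of log(frequency) vs log(rank); roughly -1 for many natural languages."""
          counts = np.array(sorted(Counter(sequence).values(), reverse=True), float)
          ranks = np.arange(1, len(counts) + 1)
          slope, _ = np.polyfit(np.log(ranks), np.log(counts), 1)
          return slope

      def shannon_entropy(sequence):
          """First-order entropy in bits per symbol."""
          counts = np.array(list(Counter(sequence).values()), float)
          p = counts / counts.sum()
          return float(-(p * np.log2(p)).sum())

      if __name__ == "__main__":
          # Stand-in for a categorized signal sequence (e.g. classified whistle types).
          text = "the quick brown fox jumps over the lazy dog and the dog sleeps"
          words = text.split()
          print("Zipf slope:", round(zipf_slope(words), 2))
          print("entropy (bits/symbol):", round(shannon_entropy(words), 2))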

  9. Statistic methods for searching inundated radioactive entities

    International Nuclear Information System (INIS)

    Dubasov, Yu.V.; Krivokhatskij, A.S.; Khramov, N.N.

    1993-01-01

    The problem of searching for a flooded radioactive object in a given area is considered. Various models for plotting the search route are discussed. It is shown that a spiral route through random points from the centre of the examined area is the most efficient one. The conclusion is made that, when searching for flooded radioactive objects, it is advisable to use multidimensional statistical methods of classification.

  10. Analytical Methods in Search Theory

    Science.gov (United States)

    1979-11-01


  11. Intelligible Artificial Intelligence

    OpenAIRE

    Weld, Daniel S.; Bansal, Gagan

    2018-01-01

    Since Artificial Intelligence (AI) software uses techniques like deep lookahead search and stochastic optimization of huge neural networks to fit mammoth datasets, it often results in complex behavior that is difficult for people to understand. Yet organizations are deploying AI algorithms in many mission-critical settings. In order to trust their behavior, we must make it intelligible --- either by using inherently interpretable models or by developing methods for explaining otherwise overwh...

  12. Intelligent bioinformatics : the application of artificial intelligence techniques to bioinformatics problems

    National Research Council Canada - National Science Library

    Keedwell, Edward

    2005-01-01

    ... Intelligence and Computer Science 3.1 Introduction to search 3.2 Search algorithms 3.3 Heuristic search methods 3.4 Optimal search strategies 3.5 Problems with search techniques 3.6 Complexity of...

  13. The Multiple Intelligences Teaching Method and Mathematics ...

    African Journals Online (AJOL)

    The Multiple Intelligences teaching approach has evolved and been embraced widely, especially in the United States. The approach has been found to be very effective in changing situations for the better in the teaching and learning of any subject, especially mathematics. The Multiple Intelligences teaching approach proposes ...

  14. Cybervetting internet searches for vetting, investigations, and open-source intelligence

    CERN Document Server

    Appel, Edward J

    2014-01-01

    Section I: Behavior and Technology; The Internet's Potential for Investigators and Intelligence Officers; Introduction; Growth of Internet Use; A Practitioner's Perspective; The Search; Internet Posts and the People They Profile; Finding the Needles; The Need for Speed; Sufficiency of Searches; Notes; Behavior Online; Internet Use Growth; Evolution of Internet Uses; Physical World, Virtual Activities; Connections and Disconnecting; Notes; Use and Abuse: Crime and Mis

  15. Intelligent Robot-assisted Humanitarian Search and Rescue System

    Directory of Open Access Journals (Sweden)

    Henry Y. K. Lau

    2009-11-01

    Full Text Available The unprecedented scale and number of natural and man-made disasters in the past decade have urged international emergency search and rescue communities to seek novel technologies to enhance operation efficiency. Tele-operated search and rescue robots that can navigate deep into rubble to search for victims and transfer critical field data back to the control console have gained much interest among emergency response institutions. In response to this need, a low-cost autonomous mini robot equipped with a thermal sensor, accelerometer, sonar, pin-hole camera, microphone, ultra-bright LED and wireless communication module was developed to study the control of a group of decentralized mini search and rescue robots. The robot can navigate autonomously between voids to look for living body heat and can send back audio and video information to allow the operator to determine whether the found object is a living human. This paper introduces the design and control of a low-cost robotic search and rescue system based on an immuno-control framework developed for controlling decentralized systems. Design and development of the physical prototype and the immunity-based control system are described in this paper.

  16. Intelligent Robot-Assisted Humanitarian Search and Rescue System

    Directory of Open Access Journals (Sweden)

    Albert W. Y. Ko

    2009-06-01

    Full Text Available The unprecedented scale and number of natural and man-made disasters in the past decade have urged international emergency search and rescue communities to seek novel technologies to enhance operation efficiency. Tele-operated search and rescue robots that can navigate deep into rubble to search for victims and transfer critical field data back to the control console have gained much interest among emergency response institutions. In response to this need, a low-cost autonomous mini robot equipped with a thermal sensor, accelerometer, sonar, pin-hole camera, microphone, ultra-bright LED and wireless communication module was developed to study the control of a group of decentralized mini search and rescue robots. The robot can navigate autonomously between voids to look for living body heat and can send back audio and video information to allow the operator to determine whether the found object is a living human. This paper introduces the design and control of a low-cost robotic search and rescue system based on an immuno-control framework developed for controlling decentralized systems. Design and development of the physical prototype and the immunity-based control system are described in this paper.

  17. Delamination detection using methods of computational intelligence

    Science.gov (United States)

    Ihesiulor, Obinna K.; Shankar, Krishna; Zhang, Zhifang; Ray, Tapabrata

    2012-11-01

    A reliable delamination prediction scheme is indispensable in order to prevent potential risks of catastrophic failures in composite structures. The existence of delaminations changes the vibration characteristics of composite laminates, and hence such indicators can be used to quantify the health characteristics of laminates. An approach for online health monitoring of in-service composite laminates that relies on methods based on computational intelligence is presented in this paper. Typical changes in the observed vibration characteristics (i.e. changes in natural frequencies) are considered as inputs to identify the existence, location and magnitude of delaminations. The performance of the proposed approach is demonstrated using numerical models of composite laminates. Since this identification problem essentially involves the solution of an optimization problem, the use of finite element (FE) methods as the underlying tool for analysis turns out to be computationally expensive. A surrogate-assisted optimization approach is hence introduced to contain the computational time within affordable limits. An artificial neural network (ANN) model with Bayesian regularization is used as the underlying approximation scheme, while an improved rate of convergence is achieved using a memetic algorithm. However, building ANN surrogate models usually requires large training datasets. K-means clustering is effectively employed to reduce the size of the datasets. The ANN is also used via inverse modeling to determine the position, size and location of delaminations using changes in measured natural frequencies. The results clearly highlight the efficiency and the robustness of the approach.
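
    A minimal sketch of the surrogate idea described above, assuming scikit-learn is available: k-means picks representative frequency-shift samples and a small neural network is fit as the surrogate. Plain L2 regularization stands in for the Bayesian regularization and memetic search used in the paper, and all data below are synthetic.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        # Synthetic training set: inputs are changes in the first 4 natural
        # frequencies, target is a (made-up) delamination size parameter.
        X = rng.uniform(-0.05, 0.0, size=(2000, 4))
        y = 10.0 * np.abs(X).sum(axis=1) + rng.normal(0, 0.01, 2000)

        # Reduce the training set: keep the sample closest to each k-means centre.
        km = KMeans(n_clusters=200, n_init=10, random_state=0).fit(X)
        keep = [np.argmin(np.linalg.norm(X - c, axis=1)) for c in km.cluster_centers_]

        surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), alpha=1e-3,
                                 max_iter=5000, random_state=0)
        surrogate.fit(X[keep], y[keep])
        print("surrogate R^2 on full set:", surrogate.score(X, y))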

  18. 4th Workshop on Combinations of Intelligent Methods and Applications

    CERN Document Server

    Palade, Vasile; Prentzas, Jim

    2016-01-01

    This volume includes extended and revised versions of the papers presented at the 4th Workshop on “Combinations of Intelligent Methods and Applications” (CIMA 2014) which was intended to become a forum for exchanging experience and ideas among researchers and practitioners dealing with combinations of different intelligent methods in Artificial Intelligence. The aim is to create integrated or hybrid methods that benefit from each of their components. Some of the existing presented efforts combine soft computing methods (fuzzy logic, neural networks and genetic algorithms). Another stream of efforts integrates case-based reasoning or machine learning with soft-computing methods. Some of the combinations have been more widely explored, like neuro-symbolic methods, neuro-fuzzy methods and methods combining rule-based and case-based reasoning. CIMA 2014 was held in conjunction with the 26th IEEE International Conference on Tools with Artificial Intelligence (ICTAI 2014).

  19. Optimizing Vector-Quantization Processor Architecture for Intelligent Query-Search Applications

    Science.gov (United States)

    Xu, Huaiyu; Mita, Yoshio; Shibata, Tadashi

    2002-04-01

    The architecture of a very large scale integration (VLSI) vector-quantization processor (VQP) has been optimized to develop a general-purpose intelligent query-search agent. The agent performs a similarity-based search in a large-volume database. Although similarity-based search processing is computationally very expensive, latency-free searches have become possible due to the highly parallel maximum-likelihood search architecture of the VQP chip. Three architectures of the VQP chip have been studied and their performances are compared. In order to give reasonable searching results according to the different policies, the concept of penalty function has been introduced into the VQP. An E-commerce real-estate agency system has been developed using the VQP chip implemented in a field-programmable gate array (FPGA) and the effectiveness of such an agency system has been demonstrated.

  20. Phonetic search methods for large speech databases

    CERN Document Server

    Moyal, Ami; Tetariy, Ella; Gishri, Michal

    2013-01-01

    “Phonetic Search Methods for Large Databases” focuses on Keyword Spotting (KWS) within large speech databases. The brief will begin by outlining the challenges associated with Keyword Spotting within large speech databases using dynamic keyword vocabularies. It will then continue by highlighting the various market segments in need of KWS solutions, as well as, the specific requirements of each market segment. The work also includes a detailed description of the complexity of the task and the different methods that are used, including the advantages and disadvantages of each method and an in-depth comparison. The main focus will be on the Phonetic Search method and its efficient implementation. This will include a literature review of the various methods used for the efficient implementation of Phonetic Search Keyword Spotting, with an emphasis on the authors’ own research which entails a comparative analysis of the Phonetic Search method which includes algorithmic details. This brief is useful for resea...

  1. What Friends Are For: Collaborative Intelligence Analysis and Search

    Science.gov (United States)

    2014-06-01

    preferences, then the similarity measure could be some type of vector angularity measurement. Regardless of how similarity is computed... In addition to implementing the model, the software supports analysis of search performance. The program is written in Java and Python... Profiles within the profile database are encoded in XML format. Profiler is written in both Java and Python and is dependent upon
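
    The "vector angularity" similarity mentioned in the fragment above is commonly realised as cosine similarity; a small illustrative sketch follows. It is not taken from the thesis itself (which is described as Java/Python with XML profiles), and the preference vectors are hypothetical.

        import numpy as np

        def cosine_similarity(a, b):
            """Angle-based similarity between two analyst preference vectors."""
            a, b = np.asarray(a, float), np.asarray(b, float)
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

        # Hypothetical term-weight profiles for two analysts
        analyst_1 = [3, 0, 1, 2, 0]
        analyst_2 = [2, 1, 0, 2, 1]
        print(cosine_similarity(analyst_1, analyst_2))   # 1.0 would mean identical direction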

  2. L factor: hope and fear in the search for extraterrestrial intelligence

    Science.gov (United States)

    Rubin, Charles T.

    2001-08-01

    The L factor in the Drake equation is widely understood to account for most of the variance in estimates of the number of extraterrestrial intelligences that might be contacted by the search for extraterrestrial intelligence (SETI). It is also among the hardest to quantify. An examination of discussions of the L factor in the popular and technical SETI literature suggests that attempts to estimate L involve a variety of potentially conflicting assumptions about civilizational lifespan that reflect hopes and fears about the human future.

  3. Graphics-based intelligent search and abstracting using Data Modeling

    Science.gov (United States)

    Jaenisch, Holger M.; Handley, James W.; Case, Carl T.; Songy, Claude G.

    2002-11-01

    This paper presents an autonomous text and context-mining algorithm that converts text documents into point clouds for visual search cues. This algorithm is applied to the task of data-mining a scriptural database comprised of the Old and New Testaments from the Bible and the Book of Mormon, Doctrine and Covenants, and the Pearl of Great Price. Results are generated which graphically show the scripture that represents the average concept of the database and the mining of the documents down to the verse level.

  4. Intelligent energy allocation strategy for PHEV charging station using gravitational search algorithm

    Science.gov (United States)

    Rahman, Imran; Vasant, Pandian M.; Singh, Balbir Singh Mahinder; Abdullah-Al-Wadud, M.

    2014-10-01

    Recent research towards the use of green technologies to reduce pollution and increase the penetration of renewable energy sources in the transportation sector is gaining popularity. The development of a smart grid environment focusing on PHEVs may also heal some of the prevailing grid problems by enabling the implementation of the Vehicle-to-Grid (V2G) concept. Intelligent energy management is an important issue which has already drawn much attention from researchers. Most of these works require the formulation of mathematical models which extensively use computational intelligence-based optimization techniques to solve many technical problems. Higher penetration of PHEVs requires adequate charging infrastructure as well as smart charging strategies. We used the Gravitational Search Algorithm (GSA) to intelligently allocate energy to the PHEVs considering constraints such as energy price, remaining battery capacity, and remaining charging time.
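
    A compact sketch of the Gravitational Search Algorithm loop on a toy allocation objective follows. The fitness function, bounds and parameter values are illustrative assumptions, not the formulation used in the paper, and the elite-agent (Kbest) refinement of full GSA is omitted for brevity.

        import numpy as np

        rng = np.random.default_rng(1)
        n_agents, dim, iters, G0, alpha = 20, 5, 200, 100.0, 20.0
        budget = 50.0                                    # total energy available (toy value)
        demand = np.array([8.0, 12.0, 10.0, 6.0, 14.0])  # per-vehicle demand (toy values)

        def cost(x):
            # Toy objective: stay close to each vehicle's demand and
            # penalise exceeding the total available energy budget.
            return float(np.sum((x - demand) ** 2) + 10.0 * max(0.0, x.sum() - budget))

        X = rng.uniform(0, 15, (n_agents, dim))          # candidate allocations (kWh)
        V = np.zeros_like(X)
        for t in range(iters):
            fit = np.array([cost(x) for x in X])
            best, worst = fit.min(), fit.max()
            m = (worst - fit) / (worst - best + 1e-12)   # heavier mass = lower cost
            M = m / (m.sum() + 1e-12)
            G = G0 * np.exp(-alpha * t / iters)          # decaying gravitational constant
            A = np.zeros_like(X)
            for i in range(n_agents):
                for j in range(n_agents):
                    if i != j:
                        R = np.linalg.norm(X[i] - X[j])
                        # acceleration of i towards j (the agent's own mass cancels out)
                        A[i] += rng.random() * G * M[j] * (X[j] - X[i]) / (R + 1e-12)
            V = rng.random(X.shape) * V + A
            X = np.clip(X + V, 0, 15)

        print("best allocation found:", X[np.argmin([cost(x) for x in X])])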

  5. Anthropomorphism in the search for extra-terrestrial intelligence - The limits of cognition?

    Science.gov (United States)

    Bohlmann, Ulrike M.; Bürger, Moritz J. F.

    2018-02-01

    The question "Are we alone?" lingers in the human mind since ancient times. Early human civilisations populated the heavens above with a multitude of Gods endowed with some all too human characteristics - from their outer appearance to their innermost motivations. En passant they created thereby their own cultural founding myths on which they built their understanding of the world and its phenomena and deduced as well rules for the functioning of their own society. Advancing technology has enabled us to conduct this human quest for knowledge with more scientific means: optical and radio-wavelengths are being monitored for messages by an extra-terrestrial intelligence and active messaging attempts have also been undertaken. Scenarios have been developed for a possible detection of extra-terrestrial intelligence and post-detection guidelines and protocols have been elaborated. The human responses to the whole array of questions concerning the potential existence, discovery of and communication/interaction with an extra-terrestrial intelligence share as one clear thread a profound anthropomorphism, which ascribes classical human behavioural patterns also to an extra-terrestrial intelligence in much the same way as our ancestors attributed comparable conducts to mythological figures. This paper aims at pinpointing this thread in a number of classical reactions to basic questions related to the search for extra-terrestrial intelligence. Many of these reactions are based on human motives such as curiosity and fear, rationalised by experience and historical analogy and modelled in the Science Fiction Culture by literature and movies. Scrutinising the classical hypothetical explanations of the Fermi paradox under the angle of a potentially undue anthropomorphism, this paper intends to assist in understanding our human epistemological limitations in the search for extra-terrestrial intelligence. This attempt is structured into a series of questions: I. Can we be alone? II

  6. Search for design intelligence: A field study on the role of emotional intelligence in architectural design studios

    OpenAIRE

    Nazidizaji, Sajjad; Tomé, Ana; Regateiro, Francisco

    2017-01-01

    The design studio is the core of the architecture curriculum. Interpersonal interactions have a key role during the processes of design and critique. The influence of emotional intelligence (EQ) on interpersonal communication skills has been widely proven. This study examines the correlation between EQ and architectural design competence. To achieve this, 78 architecture students were selected via a simple random sampling method and tested using an EQ test questionnaire developed by Bradbury ...

  7. Applying intelligent statistical methods on biometric systems

    OpenAIRE

    Betschart, Willie

    2005-01-01

    This master’s thesis work was performed at Optimum Biometric Labs (OBL), located in Karlskrona, Sweden. Optimum Biometric Labs performs independent scenario evaluations for companies that develop biometric devices. The company has a product, Optimum preConTM, which is a surveillance and diagnosis tool for biometric systems. The objective of this thesis work was to develop a conceptual model and implement it as an additional layer above the biometric layer with intelligence about the biometric users. The l...

  8. The Breakthrough Listen Initiative and the Future of the Search for Intelligent Life

    Science.gov (United States)

    Enriquez, J. Emilio; Siemion, Andrew; Croft, Steve; Hellbourg, Greg; Lebofsky, Matt; MacMahon, David; Price, Danny; DeBoer, David; Werthimer, Dan

    2017-05-01

    Unprecedented recent results in the fields of exoplanets and astrobiology have dramatically increased the interest in the potential existence of intelligent life elsewhere in the galaxy. Additionally, the capabilities of modern Searches for Extraterrestrial Intelligence (SETI) have increased tremendously. Much of this improvement is due to the ongoing development of wide bandwidth radio instruments and the Moore's Law increase in computing power over the previous decades. Together, these instrumentation improvements allow for narrow band signal searches of billions of frequency channels at once. The Breakthrough Listen Initiative (BL) was launched on July 20, 2015 at the Royal Society in London, UK with the goal to conduct the most comprehensive and sensitive search for advanced life in humanity's history. Here we detail important milestones achieved during the first year and a half of the program. We describe the key BL SETI surveys and briefly describe current facilities, including the Green Bank Telescope, the Automated Planet Finder and the Parkes Observatory. We also mention the ongoing and potential collaborations focused on complementary sciences; these include pulse searches of pulsars and FRBs, as well as astrophysically powered radio emission from stars targeted by our program. We conclude with a brief view towards future SETI searches with upcoming next-generation radio facilities such as SKA and ngVLA.

  9. A review of the scientific rationale and methods used in the search for other planetary systems

    Science.gov (United States)

    Black, D. C.

    1985-01-01

    Planetary systems appear to be one of the crucial links in the chain leading from simple molecules to living systems, particularly complex (intelligent?) living systems. Although there is currently no observational proof of the existence of any planetary system other than our own, techniques are now being developed which will permit a comprehensive search for other planetary systems. The scientific rationale for and methods used in such a search effort are reviewed here.

  10. The internet and intelligent machines: search engines, agents and robots; Radiologische Informationssuche im Internet: Datenbanken, Suchmaschinen und intelligente Agenten

    Energy Technology Data Exchange (ETDEWEB)

    Achenbach, S; Alfke, H [Marburg Univ. (Germany). Abt. fuer Strahlendiagnostik

    2000-04-01

    The internet plays an important role in a growing number of medical applications. Finding relevant information is not always easy as the amount of available information on the Web is rising quickly. Even the best Search Engines can only collect links to a fraction of all existing Web pages. In addition, many of these indexed documents have been changed or deleted. The vast majority of information on the Web is not searchable with conventional methods. New search strategies, technologies and standards are combined in Intelligent Search Agents (ISAs) and Robots, which can retrieve the desired information in a more targeted way. Conclusion: The article describes differences between ISAs and conventional Search Engines and how communication between Agents improves their ability to find information. Examples of existing ISAs are given and the possible influences on current and future work in radiology are discussed. (orig.) [Translated from German] The internet is becoming increasingly widespread in medical applications, but finding relevant information is not always easy. The number of documents available on the World Wide Web is growing so quickly that searching causes increasing problems: even good search engines capture only a few percent of the existing pages in their databases. In addition, constant changes mean that only a fraction of these searchable documents still exists at all. The majority of the internet therefore cannot be accessed with conventional methods. New standards, search strategies and technologies come together in search agents and robots, which can locate content in a more targeted and intelligent way. Conclusion: The article shows how an Intelligent Search Agent (ISA) differs from a search engine and how, through cooperation with other agents, it can better fulfil users' requirements. In addition to the fundamentals, exemplary applications that exist on the net today are shown, and an outlook

  11. A swarm intelligence framework for reconstructing gene networks: searching for biologically plausible architectures.

    Science.gov (United States)

    Kentzoglanakis, Kyriakos; Poole, Matthew

    2012-01-01

    In this paper, we investigate the problem of reverse engineering the topology of gene regulatory networks from temporal gene expression data. We adopt a computational intelligence approach comprising swarm intelligence techniques, namely particle swarm optimization (PSO) and ant colony optimization (ACO). In addition, the recurrent neural network (RNN) formalism is employed for modeling the dynamical behavior of gene regulatory systems. More specifically, ACO is used for searching the discrete space of network architectures and PSO for searching the corresponding continuous space of RNN model parameters. We propose a novel solution construction process in the context of ACO for generating biologically plausible candidate architectures. The objective is to concentrate the search effort into areas of the structure space that contain architectures which are feasible in terms of their topological resemblance to real-world networks. The proposed framework is initially applied to the reconstruction of a small artificial network that has previously been studied in the context of gene network reverse engineering. Subsequently, we consider an artificial data set with added noise for reconstructing a subnetwork of the genetic interaction network of S. cerevisiae (yeast). Finally, the framework is applied to a real-world data set for reverse engineering the SOS response system of the bacterium Escherichia coli. Results demonstrate the relative advantage of utilizing problem-specific knowledge regarding biologically plausible structural properties of gene networks over conducting a problem-agnostic search in the vast space of network architectures.

  12. Cyclotron operating mode determination based on intelligent methods

    International Nuclear Information System (INIS)

    Ouda, M.M.E.M.

    2011-01-01

    adjust the parameters of the operating mode, from acceleration, extraction, focusing and steering until the end of the experiment. This process is tedious and time consuming, and these were the main reasons to search for a better, faster and more efficient method to determine the parameters of a new operating mode. As a result, artificial neural networks, as a basis for an intelligent system, have been used to determine new operating modes for the MGC-20 cyclotron. In this thesis, an intelligent system has been designed and developed to determine new operating modes for the MGC-20 cyclotron, Nuclear Research Center, Atomic Energy Authority. This system is based on Feed Forward Back Propagation Neural Networks (FFBPNN). The system consists of five neural networks working in parallel. Every neural network consists of three layers: input, hidden, and output layers. The outputs of the five neural networks represent the normalized values (from 0 to 1 and from -1 to 0) of the 19 parameters of the new operating mode. The inputs for every neural network are the normalized values (from 0 to 1) of the particle name, the particle energy, the beam current intensity, and the duty factor. The outputs of the five neural networks must be calibrated to obtain the real values of the parameters of the new operating mode. These output elements are the magnetic lenses, the magnetic correctors, the concentric coils, and the harmonic coils. The FFBPNNs are trained using the feed-forward back-propagation training algorithm. The training has been done with different values of the learning factor, the momentum factor and also the number of hidden layers. The best structure, which needs the shortest time to learn and achieves the allowed maximum error, has been used.

  13. Artificial intelligence methods applied in the controlled synthesis of polydimethilsiloxane - poly (methacrylic acid) copolymer networks with imposed properties

    Science.gov (United States)

    Rusu, Teodora; Gogan, Oana Marilena

    2016-05-01

    This paper describes the use of artificial intelligence methods in copolymer network design. In the present study, we pursue a hybrid algorithm composed of two research themes in the genetic design framework: a Kohonen neural network (KNN) path (forward problem) combined with a genetic algorithm path (backward problem). The Tabu Search Method is used to improve the performance of the genetic algorithm path.

  14. Automated search method for AFM and profilers

    Science.gov (United States)

    Ray, Michael; Martin, Yves C.

    2001-08-01

    New automation software creates a search model as an initial setup and searches for a user-defined target in atomic force microscopes or stylus profilometers used in semiconductor manufacturing. The need for such automation has become critical in manufacturing lines. The new method starts with a survey map of a small area of a chip obtained from a chip-design database or an image of the area. The user interface requires a user to point to and define a precise location to be measured, and to select a macro function for an application such as line width or contact hole. The search algorithm automatically constructs a range of possible scan sequences within the survey, and provides increased speed and functionality compared to the methods used in instruments to date. Each sequence consists of a starting point relative to the target, a scan direction, and a scan length. The search algorithm stops when the location of a target is found and the criteria for certainty in positioning are met. With today's capability in high speed processing and signal control, the tool can simultaneously scan and search for a target in a robotic and continuous manner. Examples are given that illustrate the key concepts.

  15. Employed and unemployed job search methods: Australian evidence on search duration, wages and job stability

    OpenAIRE

    Colin Green

    2012-01-01

    This paper examines the use and impact of job search methods of both unemployed and employed job seekers. Informal job search methods are associated with relatively high levels of job exit and shorter search duration. Job exits through the public employment agency (PEA) display positive duration dependence for the unemployed. This may suggest that the PEA is used as a job search method of last resort. Informal job search methods have lower associated duration in search and higher wages than th...

  16. Artificial intelligence search techniques for optimization of the cold source geometry

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1988-01-01

    Most optimization studies of cold neutron sources have concentrated on the numerical prediction or experimental measurement of the cold moderator optimum thickness which produces the largest cold neutron leakage for a given thermal neutron source. Optimizing the geometrical shape of the cold source, however, is a more difficult problem because the optimized quantity, the cold neutron leakage, is an implicit function of the shape which is the unknown in such a study. We draw an analogy between this problem and a state space search, then we use a simple Artificial Intelligence (AI) search technique to determine the optimum cold source shape based on a two-group, r-z diffusion model. We implemented this AI design concept in the computer program AID which consists of two modules, a physical model module and a search module, which can be independently modified, improved, or made more sophisticated. 7 refs., 1 fig

  17. Artificial intelligence search techniques for the optimization of cold source geometry

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1988-01-01

    Most optimization studies of cold neutron sources have concentrated on the numerical prediction or experimental measurement of the cold moderator optimum thickness that produces the largest cold neutron leakage for a given thermal neutron source. Optimizing the geometric shape of the cold source, however, is a more difficult problem because the optimized quantity, the cold neutron leakage, is an implicit function of the shape, which is the unknown in such a study. An analogy is drawn between this problem and a state space search, then a simple artificial intelligence (AI) search technique is used to determine the optimum cold source shape based on a two-group, r-z diffusion model. This AI design concept was implemented in the computer program AID, which consists of two modules, a physical model module, and a search module, which can be independently modified, improved, or made more sophisticated

  18. The method of search of tendencies

    International Nuclear Information System (INIS)

    Reuss, Paul.

    1981-08-01

    The search for tendencies is an application of the least-squares method. Its objective is the best possible evaluation of the basic data used in the calculations, from the comparison between measurements of integral characteristics and the corresponding theoretical results. This report presents the minimization which allows the estimation of the basic data and, above all, the methods which are necessary for the critical analysis of the obtained results [fr

  19. The Search for Extraterrestrial Intelligence in the 1960s: Science in Popular Culture

    Science.gov (United States)

    Smith, Sierra

    2012-01-01

    Building upon the advancement of technology during the Second World War and the important scientific discoveries which have been made about the structure and components of the universe, scientists, especially in radio astronomy and physics, began seriously addressing the possibility of extraterrestrial intelligence in the 1960s. The Search for Extraterrestrial Intelligence (SETI) quickly became one of the most controversial scientific issues in the post Second World War period. The controversy played out, not only in scientific and technical journals, but in newspapers and in popular literature. Proponents for SETI, including Frank Drake, Carl Sagan, and Philip Morrison, actively used a strategy of engagement with the public by using popular media to lobby for exposure and funding. This paper will examine the use of popular media by scientists interested in SETI to popularize and heighten public awareness and also to examine the effects of popularization on SETI's early development. My research has been generously supported by the National Radio Astronomy Observatory.

  20. ORTHO IMAGE AND DTM GENERATION WITH INTELLIGENT METHODS

    Directory of Open Access Journals (Sweden)

    H. Bagheri

    2013-10-01

    Finally, the artificial intelligence methods, like genetic algorithms as well as neural networks, were examined on sample data for optimizing interpolation and for generating Digital Terrain Models. The results were then compared with existing conventional methods, and it appeared that these methods have a high capacity for height interpolation and that using these networks for interpolating and optimizing the inverse-distance weighting methods leads to a highly accurate estimation of heights.

  1. Search for design intelligence: A field study on the role of emotional intelligence in architectural design studios

    Directory of Open Access Journals (Sweden)

    Sajjad Nazidizaji

    2014-12-01

    Full Text Available The design studio is the core of the architecture curriculum. Interpersonal interactions have a key role during the processes of design and critique. The influence of emotional intelligence (EQ) on interpersonal communication skills has been widely proven. This study examines the correlation between EQ and architectural design competence. To achieve this, 78 architecture students were selected via a simple random sampling method and tested using an EQ test questionnaire developed by Bradbury and Greaves (2006). The scores of five architectural design studio courses (ADS-1, ADS-2, ADS-3, ADS-4, and ADS-5) were used as indicators of the students' progress in design. Descriptive and inferential statistics methods were both employed to analyze the research data. The methods included correlation analysis, mean comparison t-test for independent samples, and single sample t-test. Findings showed no significant relationship between EQ and any of the indicators.

  2. A new hybrid optimization method inspired from swarm intelligence: Fuzzy adaptive swallow swarm optimization algorithm (FASSO

    Directory of Open Access Journals (Sweden)

    Mehdi Neshat

    2015-11-01

    Full Text Available In this article, the objective was to present effective and optimal strategies aimed at improving the Swallow Swarm Optimization (SSO) method. The SSO is one of the best optimization methods based on swarm intelligence and is inspired by the intelligent behaviors of swallows. It has been able to offer a relatively strong method for solving optimization problems. However, despite its many advantages, the SSO suffers from two shortcomings. Firstly, the particles' movement speed is not controlled satisfactorily during the search due to the lack of an inertia weight. Secondly, the acceleration coefficient variables are not able to strike a balance between the local and the global searches because they are not sufficiently flexible in complex environments. Therefore, the SSO algorithm does not provide adequate results when it searches functions such as the Step or Quadric function. Hence, the fuzzy adaptive Swallow Swarm Optimization (FASSO) method was introduced to deal with these problems. Highly accurate results are obtained by using an adaptive inertia weight and by combining two fuzzy logic systems to accurately calculate the acceleration coefficients. A high speed of convergence, avoidance of falling into local extrema, and a high level of error tolerance are the advantages of the proposed method. The FASSO was compared with eleven of the best PSO methods and with SSO on 18 benchmark functions. Finally, significant results were obtained.
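
    The core idea of an adaptive inertia weight can be illustrated in a few lines. The sketch below is a generic particle-swarm-style velocity update for illustration only; the paper's actual scheme applies two fuzzy logic systems to the swallow swarm update rules, and all values here are toy data.

        import numpy as np

        def adaptive_inertia(fitness, w_min=0.4, w_max=0.9):
            """Give well-performing particles a small inertia weight (exploitation)
            and poorly performing ones a large weight (exploration)."""
            fitness = np.asarray(fitness, float)
            rank = (fitness - fitness.min()) / (fitness.max() - fitness.min() + 1e-12)
            return w_min + (w_max - w_min) * rank        # 0 = best particle

        # One velocity update step with per-particle inertia (toy data)
        rng = np.random.default_rng(0)
        pos, vel = rng.random((5, 2)), rng.random((5, 2))
        pbest, gbest = pos.copy(), pos[0]
        w = adaptive_inertia(rng.random(5))              # stand-in fitness values
        vel = (w[:, None] * vel
               + 2.0 * rng.random((5, 2)) * (pbest - pos)
               + 2.0 * rng.random((5, 2)) * (gbest - pos))
        pos = pos + vel
        print(pos)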

  3. Ortho Image and DTM Generation with Intelligent Methods

    Science.gov (United States)

    Bagheri, H.; Sadeghian, S.

    2013-10-01

    Nowadays, artificial intelligence algorithms are being considered in GIS and remote sensing. Genetic algorithms and artificial neural networks are two intelligent methods that are used for optimizing image processing tasks such as edge extraction; these algorithms are very useful for solving complex problems. In this paper, the ability and application of genetic algorithms and artificial neural networks in geospatial production processes, like geometric modelling of satellite images for ortho photo generation and height interpolation in the raster Digital Terrain Model production process, are discussed. First, the geometric potential of Ikonos-2 and Worldview-2 with rational functions and 2D & 3D polynomials was tested. Also, comprehensive experiments have been carried out to evaluate the viability of the genetic algorithm for the optimization of rational functions and 2D & 3D polynomials. Considering the quality of Ground Control Points, the accuracy (RMSE) with the genetic algorithm and 3D polynomial method for the Ikonos-2 Geo image was 0.508 pixel sizes, and the accuracy (RMSE) with the GA algorithm and rational function method for the Worldview-2 image was 0.930 pixel sizes. As another artificial intelligence optimization method, neural networks were used. With the use of a perceptron network on the Worldview-2 image, a result of 0.84 pixel sizes with 4 neurons in the middle layer was obtained. The final conclusion was that with artificial intelligence algorithms it is possible to optimize the existing models and obtain better results than the usual ones. Finally, the artificial intelligence methods, like genetic algorithms as well as neural networks, were examined on sample data for optimizing interpolation and for generating Digital Terrain Models. The results were then compared with existing conventional methods and it appeared that these methods have a high capacity for height interpolation and that using these networks for interpolating and optimizing the weighting methods based on inverse
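
    As a small illustration of the inverse-distance weighting that the study optimizes, the sketch below interpolates heights with a tunable exponent p and picks p by leave-one-out error. A plain grid search stands in for the genetic algorithm / neural network optimization described above, and the sample points are made up.

        import numpy as np

        def idw(xy_known, z_known, xy_query, p=2.0):
            """Inverse-distance-weighted height interpolation with exponent p."""
            d = np.linalg.norm(xy_known - xy_query, axis=1)
            if np.any(d < 1e-12):
                return float(z_known[np.argmin(d)])
            w = 1.0 / d ** p
            return float(np.sum(w * z_known) / np.sum(w))

        rng = np.random.default_rng(0)
        pts = rng.uniform(0, 100, (50, 2))               # hypothetical control points (m)
        z = 300 + 0.5 * pts[:, 0] + 0.2 * pts[:, 1] + rng.normal(0, 0.3, 50)

        def loo_rmse(p):
            errs = [idw(np.delete(pts, i, 0), np.delete(z, i), pts[i], p) - z[i]
                    for i in range(len(z))]
            return float(np.sqrt(np.mean(np.square(errs))))

        best_p = min(np.arange(0.5, 4.1, 0.5), key=loo_rmse)
        print("best exponent:", best_p, "leave-one-out RMSE:", loo_rmse(best_p))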

  4. Harmony Search Method: Theory and Applications

    Directory of Open Access Journals (Sweden)

    X. Z. Gao

    2015-01-01

    Full Text Available The Harmony Search (HS) method is an emerging metaheuristic optimization algorithm, which has been employed to cope with numerous challenging tasks during the past decade. In this paper, the essential theory and applications of the HS algorithm are first described and reviewed. Several typical variants of the original HS are next briefly explained. As an example of case study, a modified HS method inspired by the idea of Pareto-dominance-based ranking is also presented. It is further applied to handle a practical wind generator optimal design problem.
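
    A minimal sketch of the basic HS loop described above (harmony memory, memory-considering rate and pitch adjustment), shown here on a generic test function rather than the wind-generator design problem; all parameter values are illustrative.

        import numpy as np

        def harmony_search(f, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=2000):
            rng = np.random.default_rng(0)
            lo, hi = np.array(bounds).T
            hm = rng.uniform(lo, hi, (hms, len(lo)))     # harmony memory
            cost = np.array([f(x) for x in hm])
            for _ in range(iters):
                new = np.empty(len(lo))
                for d in range(len(lo)):
                    if rng.random() < hmcr:              # memory consideration
                        new[d] = hm[rng.integers(hms), d]
                        if rng.random() < par:           # pitch adjustment
                            new[d] += bw * (hi[d] - lo[d]) * rng.uniform(-1, 1)
                    else:                                # random selection
                        new[d] = rng.uniform(lo[d], hi[d])
                new = np.clip(new, lo, hi)
                worst = np.argmax(cost)
                if f(new) < cost[worst]:                 # replace the worst harmony
                    hm[worst], cost[worst] = new, f(new)
            return hm[np.argmin(cost)], cost.min()

        sphere = lambda x: float(np.sum(x ** 2))
        print(harmony_search(sphere, [(-5, 5)] * 4))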

  5. Short-term electric load forecasting using computational intelligence methods

    OpenAIRE

    Jurado, Sergio; Peralta, J.; Nebot, Àngela; Mugica, Francisco; Cortez, Paulo

    2013-01-01

    Accurate time series forecasting is a key issue to support individual and organizational decision making. In this paper, we introduce several methods for short-term electric load forecasting. All the presented methods stem from computational intelligence techniques: Random Forest, Nonlinear Autoregressive Neural Networks, Evolutionary Support Vector Machines and Fuzzy Inductive Reasoning. The performance of the suggested methods is experimentally justified with several experiments carried out...

  6. Optimization process planning using hybrid genetic algorithm and intelligent search for job shop machining.

    Science.gov (United States)

    Salehi, Mojtaba; Bahreininejad, Ardeshir

    2011-08-01

    Optimization of process planning is considered as the key technology for computer-aided process planning which is a rather complex and difficult procedure. A good process plan of a part is built up based on two elements: (1) the optimized sequence of the operations of the part; and (2) the optimized selection of the machine, cutting tool and Tool Access Direction (TAD) for each operation. In the present work, the process planning is divided into preliminary planning, and secondary/detailed planning. In the preliminary stage, based on the analysis of order and clustering constraints as a compulsive constraint aggregation in operation sequencing and using an intelligent searching strategy, the feasible sequences are generated. Then, in the detailed planning stage, using the genetic algorithm which prunes the initial feasible sequences, the optimized operation sequence and the optimized selection of the machine, cutting tool and TAD for each operation based on optimization constraints as an additive constraint aggregation are obtained. The main contribution of this work is the optimization of sequence of the operations of the part, and optimization of machine selection, cutting tool and TAD for each operation using the intelligent search and genetic algorithm simultaneously.

  7. Heuristic method for searching global maximum of multimodal unknown function

    Energy Technology Data Exchange (ETDEWEB)

    Kamei, K; Araki, Y; Inoue, K

    1983-06-01

    The method is composed of three kinds of searches. They are called g (grasping)-mode search, f (finding)-mode search and c (confirming)-mode search. In the g-mode search and the c-mode search, a heuristic method is used which was extracted from search behaviors of human subjects. In f-mode search, the simplex method is used which is well known as a search method for unimodal unknown function. Each mode search and its transitions are shown in the form of flowchart. The numerical results for one-dimensional through six-dimensional multimodal functions prove the proposed search method to be an effective one. 11 references.
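
    The f-mode step above relies on the classical simplex (Nelder-Mead) method. A generic sketch of the overall idea, restarting a local simplex search from several starting points and keeping the best result, might look as follows; SciPy's Nelder-Mead and random restarts stand in for the original heuristic g-mode/c-mode rules, and the test function is made up.

        import numpy as np
        from scipy.optimize import minimize

        def multimodal(x):
            # Toy multimodal test function; we search for the global maximum,
            # so the local solver minimizes its negative.
            x = np.asarray(x)
            return float(np.sum(np.sin(3 * x) * np.exp(-0.1 * x ** 2)))

        rng = np.random.default_rng(0)
        best_x, best_val = None, -np.inf
        for _ in range(30):                      # crude stand-in for g-mode "grasping"
            x0 = rng.uniform(-5, 5, size=2)
            res = minimize(lambda x: -multimodal(x), x0, method="Nelder-Mead")
            if -res.fun > best_val:              # keep the best f-mode refinement
                best_x, best_val = res.x, -res.fun
        print("estimated global maximum:", best_val, "at", best_x)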

  8. Design and economic investigation of shell and tube heat exchangers using Improved Intelligent Tuned Harmony Search algorithm

    Directory of Open Access Journals (Sweden)

    Oguz Emrah Turgut

    2014-12-01

    Full Text Available This study explores the thermal design of shell and tube heat exchangers by using the Improved Intelligent Tuned Harmony Search (I-ITHS) algorithm. Intelligent Tuned Harmony Search (ITHS) is an upgraded version of the harmony search algorithm which has the advantage of deciding between intensification and diversification processes by applying a proper pitch adjusting strategy. In this study, we aim to improve the search capacity of the ITHS algorithm by utilizing chaotic sequences instead of uniformly distributed random numbers and applying alternative search strategies inspired by the Artificial Bee Colony algorithm and Opposition Based Learning on promising areas (best solutions). Design variables including baffle spacing, shell diameter, tube outer diameter and number of tube passes are used to minimize the total cost of the heat exchanger, which incorporates capital investment and the sum of discounted annual energy expenditures related to pumping and heat exchanger area. Results show that I-ITHS can be utilized in optimizing shell and tube heat exchangers.

  9. Durham Zoo: Powering a Search-&-Innovation Engine with Collective Intelligence

    Directory of Open Access Journals (Sweden)

    Richard Absalom

    2015-02-01

    Full Text Available Purpose – Durham Zoo (hereinafter – DZ) is a project to design and operate a concept search engine for science and technology. In DZ, a concept includes a solution to a problem in a particular context. Design – Concept searching is rendered complex by the fuzzy nature of a concept, the many possible implementations of the same concept, and the many more ways that the many implementations can be expressed in natural language. An additional complexity is the diversity of languages and formats in which the concepts can be disclosed. Humans understand language, inference, implication and abstraction and, hence, concepts much better than computers, which in turn are much better at storing and processing vast amounts of data. We are 7 billion on the planet and we have the Internet as the backbone for Collective Intelligence. So, our concept search engine uses humans to store concepts via a shorthand that can be stored, processed and searched by computers: so, humans IN and computers OUT. The shorthand is classification: metadata in a structure that can define the content of a disclosure. The classification is designed to be powerful in terms of defining and searching concepts, whilst suited to a crowdsourcing effort. It is simple and intuitive to use. Most importantly, it is adapted to restrict ambiguity, which is the poison of classification, without imposing a restrictive centralised management. In the classification scheme, each entity is shown together in a graphical representation with related entities. The entities are arranged on a sliding scale of similarity. This sliding scale is effectively fuzzy classification. Findings – The authors of the paper have been developing a first classification scheme for the technology of traffic cones, this in preparation for a trial of a working system. The process has enabled the authors to further explore the practicalities of concept classification. The CmapTools knowledge modelling kit to develop the

  10. Artificial neural network intelligent method for prediction

    Science.gov (United States)

    Trifonov, Roumen; Yoshinov, Radoslav; Pavlova, Galya; Tsochev, Georgi

    2017-09-01

    Accounting and financial classification and prediction problems are highly challenging, and researchers use different methods to solve them. Methods and instruments for short-term prediction of financial operations using artificial neural networks are considered. The methods used for the prediction of financial data, as well as the developed forecasting system with a neural network, are described in the paper. The architecture of a neural network that uses four different technical indicators, which are based on the raw data, and the current day of the week is presented. The developed network is used for forecasting the movement of stock prices one day ahead and consists of an input layer, one hidden layer and an output layer. The training method is the error back-propagation algorithm. The main advantage of the developed system is self-determination of the optimal topology of the neural network, due to which it becomes flexible and more precise. The proposed system with a neural network is universal and can be applied to various financial instruments using only basic technical indicators as input data.
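
    A toy version of the forecasting setup described above, assuming scikit-learn: four simple technical indicators plus a day-of-week feature feed a small feed-forward network trained by back-propagation. The indicators, synthetic price series and topology are illustrative assumptions; the paper's system additionally determines its own optimal topology.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        prices = 100 + np.cumsum(rng.normal(0, 1, 400))           # synthetic daily closes

        def features(p, t):
            ma5 = p[t-5:t].mean()                                  # short moving average
            ma20 = p[t-20:t].mean()                                # long moving average
            momentum = p[t-1] - p[t-10]
            volatility = p[t-20:t].std()
            day_of_week = t % 5                                    # crude day-of-week stand-in
            return [ma5, ma20, momentum, volatility, day_of_week]

        X = np.array([features(prices, t) for t in range(20, len(prices) - 1)])
        y = prices[21:]                                            # next-day close

        model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
        model.fit(X[:-50], y[:-50])                                # hold out the last 50 days
        print("hold-out R^2:", model.score(X[-50:], y[-50:]))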

  11. When is Information Sufficient for Action Search with Unreliable Yet Informative Intelligence

    Science.gov (United States)

    2016-03-30

    specificity. Math. Methods Oper. Res. 68(3): 539–549. Lange R-J (2012) Brownian motion and multidimensional decision making. Unpublished doctoral... intelligence assets, political ramifications, etc. We describe the problem in §2 and formulate the mathematical model in §3. The cases of n = 2 and n... boundary problem in n dimensions. When n > 2 cells, our problem relates to the family of multinomial selection problems (Kim and Nelson 2006) in which an

  12. Intelligent numerical methods applications to fractional calculus

    CERN Document Server

    Anastassiou, George A

    2016-01-01

    In this monograph the authors present Newton-type, Newton-like and other numerical methods, which involve fractional derivatives and fractional integral operators, studied for the first time in the literature. All of this serves the purpose of numerically solving equations whose associated functions may also be non-differentiable in the ordinary sense, thereby extending, among other things, the classical Newton method theory, which requires the usual differentiability of the function. Chapters are self-contained and can be read independently, and several advanced courses can be taught out of this book. An extensive list of references is given per chapter. The book’s results are expected to find applications in many areas of applied mathematics, stochastics, computer science and engineering. As such, this monograph is suitable for researchers, graduate students, and seminars on the above subjects, and also belongs in all science and engineering libraries.

  13. Comparison of methods for estimating premorbid intelligence

    OpenAIRE

    Bright, Peter; van der Linde, Ian

    2018-01-01

    To evaluate impact of neurological injury on cognitive performance it is typically necessary to derive a baseline (or ‘premorbid’) estimate of a patient’s general cognitive ability prior to the onset of impairment. In this paper, we consider a range of common methods for producing this estimate, including those based on current best performance, embedded ‘hold/no hold’ tests, demographic information, and word reading ability. Ninety-two neurologically healthy adult participants were assessed ...

  14. Are We Alone? GAVRT Search for Extra Terrestrial Intelligence (SETI) Project

    Science.gov (United States)

    Bensel, Holly; Cool, Ian; St. Mary's High School Astronomy Club; St. Mary's Middle School Astronomy Club

    2017-01-01

    The Goldstone Apple Valley Radio Telescope Program (GAVRT) is a partnership between NASA’s Jet Propulsion Laboratory and the Lewis Center for Educational Research. The program is an authentic science investigation program for students in grades K through 12 and offers them the ability to learn how to be a part of a science team while they are making a real contribution to scientific knowledge. Using the internet from their classroom, students take control of a 34-meter decommissioned NASA radio telescope located at the Goldstone Deep Space Network complex in California. Students collect data on strong radio sources and work in collaboration with professional radio astronomers to analyze the data. Throughout history man has wondered if we were alone in the Universe. SETI - or the Search for Extra Terrestrial Intelligence - is one of the programs offered through GAVRT that is designed to help answer that question. By participating in SETI, students learn about science by doing real science and maybe, if they get very lucky, they might make the most important discovery of our lifetime: Intelligent life beyond Earth! At St. Mary’s School, students in grades 6-12 have participated in the project since its inception. The St. Mary’s Middle School Astronomy Club is leading the way in their relentless search for ET and radio telescope studies. Students use the radio telescope to select a very small portion of the Milky Way Galaxy - or galactic plane - and scan across it over and over in the hopes of finding a signal that is not coming from humans or radio interference. The possibility of being the first to discover an alien signal has kept some students searching for the past three years. For them to discover something of this magnitude is like winning the lottery: small chance of winning - big payoff. To that end, the club is focusing on several portions of the Milky Way where they have detected a strong candidate in the past. The hope is to pick it up a second and

  15. Method of dynamic fuzzy symptom vector in intelligent diagnosis

    International Nuclear Information System (INIS)

    Sun Hongyan; Jiang Xuefeng

    2010-01-01

    Aiming at the requirement for real-time updating of diagnostic symptoms, brought about by the accumulation of diagnostic knowledge and by the great gaps in the units and values of diagnostic symptoms in multi-parameter intelligent diagnosis, the method of the dynamic fuzzy symptom vector is proposed. The concept of the dynamic fuzzy symptom vector is defined. Ontology is used to specify the vector elements, and a vector transmission method based on ontology is built. The changing law of symptom values is analyzed and a fuzzy normalization method based on fuzzy membership functions is built. An example proves that the method of the dynamic fuzzy symptom vector is efficient in solving the problems of symptom updating and of unifying symptom values and units. (authors)
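
    The fuzzy normalization idea can be illustrated with a simple membership function that maps symptoms with different units and ranges onto [0, 1]. The membership shape and symptom values below are illustrative assumptions, not the functions defined in the paper.

        def trapezoid_membership(x, a, b, c, d):
            """Degree to which a raw symptom value x is 'abnormal':
            0 below a, rising from a to b, 1 between b and c, falling from c to d, 0 above d."""
            if x <= a or x >= d:
                return 0.0
            if b <= x <= c:
                return 1.0
            return (x - a) / (b - a) if x < b else (d - x) / (d - c)

        # Symptoms in different units are mapped to comparable [0, 1] degrees.
        vibration_mm_s = 7.2        # hypothetical vibration velocity
        oil_temp_c = 68.0           # hypothetical oil temperature
        symptom_vector = [
            trapezoid_membership(vibration_mm_s, 2.0, 4.5, 11.0, 18.0),
            trapezoid_membership(oil_temp_c, 55.0, 65.0, 90.0, 105.0),
        ]
        print(symptom_vector)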

  16. Polyphase-discrete Fourier transform spectrum analysis for the Search for Extraterrestrial Intelligence sky survey

    Science.gov (United States)

    Zimmerman, G. A.; Gulkis, S.

    1991-01-01

    The sensitivity of a matched filter-detection system to a finite-duration continuous wave (CW) tone is compared with the sensitivities of a windowed discrete Fourier transform (DFT) system and an ideal bandpass filter-bank system. These comparisons are made in the context of the NASA Search for Extraterrestrial Intelligence (SETI) microwave observing project (MOP) sky survey. A review of the theory of polyphase-DFT filter banks and its relationship to the well-known windowed-DFT process is presented. The polyphase-DFT system approximates the ideal bandpass filter bank by using as few as eight filter taps per polyphase branch. An improvement in sensitivity of approx. 3 dB over a windowed-DFT system can be obtained by using the polyphase-DFT approach. Sidelobe rejection of the polyphase-DFT system is vastly superior to the windowed-DFT system, thereby improving its performance in the presence of radio frequency interference (RFI).
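
    A minimal numpy sketch of the polyphase filter-bank idea compared above: a windowed block of M*P samples is folded into P taps per branch and summed before an M-point FFT, approximating an ideal bandpass filter bank. The prototype window, channel count and test tone are illustrative assumptions; this is not the MOP sky-survey code.

        import numpy as np

        M, P = 64, 8                               # channels, taps per polyphase branch
        prototype = np.hanning(M * P)              # stand-in for a designed prototype filter

        def polyphase_dft(block):
            """One output frame of an M-channel polyphase-DFT filter bank."""
            assert block.size == M * P
            folded = (block * prototype).reshape(P, M).sum(axis=0)
            return np.fft.fft(folded)

        # CW tone centred on channel 5, plus a little noise
        n = np.arange(M * P)
        x = np.exp(2j * np.pi * 5 / M * n) + 0.1 * np.random.default_rng(0).normal(size=M * P)
        spectrum = polyphase_dft(x)
        print("strongest channel:", int(np.argmax(np.abs(spectrum))))   # expect 5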

  17. Intelligent systems for urban search and rescue: challenges and lessons learned

    Science.gov (United States)

    Jacoff, Adam; Messina, Elena; Weiss, Brian A.

    2003-09-01

    Urban search and rescue (USAR) is one of the most dangerous and time-critical non-wartime activities. Researchers have been developing hardware and software to enable robots to perform some search and rescue functions so as to minimize the exposure of human rescue personnel to danger and maximize the survival of victims. Significant progress has been achieved, but much work remains. USAR demands a blending of numerous specialized technologies. An effective USAR robot must be endowed with key competencies, such as being able to negotiate collapsed structures, find victims and assess their condition, identify potential hazards, generate maps of the structure and victim locations, and communicate with rescue personnel. These competencies bring to bear work in numerous sub-disciplines of intelligent systems (or artificial intelligence) such as sensory processing, world modeling, behavior generation, path planning, and human-robot interaction, in addition to work in communications, mechanism design and advanced sensors. In an attempt to stimulate progress in the field, reference USAR challenges are being developed and propagated worldwide. In order to make efficient use of finite research resources, the robotic USAR community must share a common understanding of what is required, technologically, to attain each competency, and have a rigorous measure of the current level of effectiveness of various technologies. NIST is working with partner organizations to measure the performance of robotic USAR competencies and technologies. In this paper, we describe the reference test arenas for USAR robots, assess the current challenges within the field, and discuss experiences thus far in the testing effort.

  18. Intelligence

    Science.gov (United States)

    Sternberg, Robert J.

    2012-01-01

    Intelligence is the ability to learn from experience and to adapt to, shape, and select environments. Intelligence as measured by (raw scores on) conventional standardized tests varies across the lifespan, and also across generations. Intelligence can be understood in part in terms of the biology of the brain—especially with regard to the functioning in the prefrontal cortex—and also correlates with brain size, at least within humans. Studies of the effects of genes and environment suggest that the heritability coefficient (ratio of genetic to phenotypic variation) is between .4 and .8, although heritability varies as a function of socioeconomic status and other factors. Racial differences in measured intelligence have been observed, but race is a socially constructed rather than biological variable, so such differences are difficult to interpret. PMID:22577301

  19. Intelligence.

    Science.gov (United States)

    Sternberg, Robert J

    2012-03-01

    Intelligence is the ability to learn from experience and to adapt to, shape, and select environments. Intelligence as measured by (raw scores on) conventional standardized tests varies across the lifespan, and also across generations. Intelligence can be understood in part in terms of the biology of the brain-especially with regard to the functioning in the prefrontal cortex-and also correlates with brain size, at least within humans. Studies of the effects of genes and environment suggest that the heritability coefficient (ratio of genetic to phenotypic variation) is between .4 and .8, although heritability varies as a function of socioeconomic status and other factors. Racial differences in measured intelligence have been observed, but race is a socially constructed rather than biological variable, so such differences are difficult to interpret.

  20. The ontology supported intelligent system for experiment search in the scientific Research center

    Directory of Open Access Journals (Sweden)

    Cvjetković Vladimir

    2014-01-01

    Full Text Available Ontologies and corresponding knowledge bases can be used successfully for many tasks that rely on domain knowledge and semantic structures, provided these are available for machine processing and sharing. Using SPARQL queries to retrieve the required elements from ontologies and knowledge bases can significantly simplify both the modeling of arbitrary structures of concepts and data and the implementation of the required functionality. This paper describes an ontology developed to support a research centre for the testing of active substances that conducts scientific experiments. A corresponding knowledge base was built from this ontology and populated with real experimental data. The ontology and knowledge base directly support an intelligent experiment-search system based on multiple criteria drawn from the ontology. The proposed system returns the desired search result, which is an experiment in the form of a written report. The presented solution and implementation are flexible and adaptable, and can serve as a template for similar information systems dealing with biological or other complex systems.
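
    To illustrate the kind of SPARQL-based retrieval the abstract relies on, here is a minimal Python sketch using rdflib; the ontology file, namespace and property names are hypothetical placeholders rather than the paper's actual schema.

```python
# Minimal sketch: query a (hypothetical) experiment ontology with SPARQL via rdflib.
from rdflib import Graph

g = Graph()
g.parse("experiments.owl", format="xml")   # hypothetical knowledge-base file

# Find experiments that tested a given active substance (property names are assumed).
query = """
PREFIX ex: <http://example.org/experiments#>
SELECT ?experiment ?report
WHERE {
    ?experiment ex:testsSubstance ?substance ;
                ex:hasReport      ?report .
    ?substance  ex:name "caffeine" .
}
"""

for experiment, report in g.query(query):
    print(experiment, report)
```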

  1. Continuous surveillance of transformers using artificial intelligence methods; Surveillance continue des transformateurs: application des methodes d'intelligence artificielle

    Energy Technology Data Exchange (ETDEWEB)

    Schenk, A.; Germond, A. [Ecole Polytechnique Federale de Lausanne, Lausanne (Switzerland); Boss, P.; Lorin, P. [ABB Secheron SA, Geneve (Switzerland)

    2000-07-01

    The article describes a new method for the continuous surveillance of power transformers based on the application of artificial intelligence (AI) techniques. An experimental pilot project on a specially equipped, strategically important power transformer is described. Traditional surveillance methods and the use of mathematical models for the prediction of faults are described. The article describes the monitoring equipment used in the pilot project and the AI principles such as self-organising maps that are applied. The results obtained from the pilot project and methods for their graphical representation are discussed.
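
    For background on the self-organising maps mentioned above, the following numpy sketch shows the basic SOM update rule on synthetic data; it illustrates the general technique only and is not the monitoring system described in the article.

```python
# Minimal self-organising map (SOM) training loop on synthetic measurement vectors.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(500, 3))            # synthetic 3-dimensional measurements
grid_w, grid_h = 8, 8                       # 8x8 map of neurons
weights = rng.normal(size=(grid_w * grid_h, 3))
coords = np.array([(i, j) for i in range(grid_w) for j in range(grid_h)], dtype=float)

n_iter, lr0, sigma0 = 2000, 0.5, 3.0
for t in range(n_iter):
    x = data[rng.integers(len(data))]
    bmu = np.argmin(((weights - x) ** 2).sum(axis=1))      # best-matching unit
    dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)      # grid distance to the BMU
    frac = t / n_iter
    lr = lr0 * (1.0 - frac)                                 # decaying learning rate
    sigma = sigma0 * (1.0 - frac) + 0.5                     # shrinking neighbourhood
    h = np.exp(-dist2 / (2 * sigma ** 2))
    weights += lr * h[:, None] * (x - weights)              # pull neighbours toward x

# After training, unusually large quantisation errors can flag anomalous states.
bmu_all = np.argmin(((data[:, None, :] - weights[None, :, :]) ** 2).sum(axis=2), axis=1)
q_err = np.linalg.norm(data - weights[bmu_all], axis=1)
print("mean quantisation error:", float(q_err.mean()))
```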

  2. Dual-mode nested search method for categorical uncertain multi-objective optimization

    Science.gov (United States)

    Tang, Long; Wang, Hu

    2016-10-01

    Categorical multi-objective optimization is an important issue involved in many matching design problems. Non-numerical variables and their uncertainty are the major challenges of such optimizations. Therefore, this article proposes a dual-mode nested search (DMNS) method. In the outer layer, kriging metamodels are established using standard regular simplex mapping (SRSM) from categorical candidates to numerical values. Assisted by the metamodels, a k-cluster-based intelligent sampling strategy is developed to search Pareto frontier points. The inner layer uses an interval number method to model the uncertainty of categorical candidates. To improve the efficiency, a multi-feature convergent optimization via most-promising-area stochastic search (MFCOMPASS) is proposed to determine the bounds of objectives. Finally, typical numerical examples are employed to demonstrate the effectiveness of the proposed DMNS method.

  3. Comparison tomography relocation hypocenter grid search and guided grid search method in Java island

    International Nuclear Information System (INIS)

    Nurdian, S. W.; Adu, N.; Palupi, I. R.; Raharjo, W.

    2016-01-01

    The main data in this research are earthquakes recorded from 1952 to 2012, comprising 9162 P-wave arrivals from 2426 events recorded by 30 stations located around Java island. Hypocenters were relocated using the grid search and guided grid search methods. The relocated hypocenters then became the input for a pseudo-bending tomographic inversion, which can be used to identify the velocity distribution in the subsurface. The relocation results obtained with the two methods are compared after the tomography step, both locally and globally. Locally, the grid search method gives results that agree better with the geology of the research area, whereas globally the guided grid search method performs better over a broad area, because the velocity variation it recovers is more diverse and consistent with local geological conditions. (paper)
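
    As a rough illustration of the grid search relocation idea (not the authors' implementation), the sketch below scans candidate hypocentres on a coarse grid and keeps the one that minimises the P-wave travel-time residuals, assuming a constant velocity and arrival times already referenced to the origin time.

```python
# Toy grid search for a hypocentre: minimise RMS travel-time residuals
# in a constant-velocity half-space (illustrative assumptions only).
import numpy as np

vp = 6.0  # assumed P-wave speed, km/s
stations = np.array([[0, 0, 0], [50, 10, 0], [20, 60, 0], [80, 70, 0]], float)  # x, y, z in km
obs_times = np.array([4.1, 6.8, 7.9, 12.3])  # arrival times relative to origin time, s

best, best_rms = None, np.inf
for x in np.arange(0.0, 100.0, 2.0):
    for y in np.arange(0.0, 100.0, 2.0):
        for z in np.arange(0.0, 40.0, 2.0):
            pred = np.linalg.norm(stations - np.array([x, y, z]), axis=1) / vp
            rms = np.sqrt(np.mean((obs_times - pred) ** 2))
            if rms < best_rms:
                best, best_rms = (x, y, z), rms

print("best hypocentre (km):", best, "| RMS residual (s):", round(best_rms, 3))
```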

  4. Reasoning methods in medical consultation systems: artificial intelligence approaches.

    Science.gov (United States)

    Shortliffe, E H

    1984-01-01

    It has been argued that the problem of medical diagnosis is fundamentally ill-structured, particularly during the early stages when the number of possible explanations for presenting complaints can be immense. This paper discusses the process of clinical hypothesis evocation, contrasts it with the structured decision making approaches used in traditional computer-based diagnostic systems, and briefly surveys the more open-ended reasoning methods that have been used in medical artificial intelligence (AI) programs. The additional complexity introduced when an advice system is designed to suggest management instead of (or in addition to) diagnosis is also emphasized. Example systems are discussed to illustrate the key concepts.

  5. Subspace methods for pattern recognition in intelligent environment

    CERN Document Server

    Jain, Lakhmi

    2014-01-01

    This research book provides a comprehensive overview of the state-of-the-art subspace learning methods for pattern recognition in intelligent environment. With the fast development of internet and computer technologies, the amount of available data is rapidly increasing in our daily life. How to extract core information or useful features is an important issue. Subspace methods are widely used for dimension reduction and feature extraction in pattern recognition. They transform high-dimensional data into a lower-dimensional space (subspace), where most of the information is retained. The book covers a broad spectrum of subspace methods including linear, nonlinear and multilinear subspace learning methods and applications. The applications include face alignment, face recognition, medical image analysis, remote sensing image classification, traffic sign recognition, image clustering, super resolution, edge detection, and multi-view facial image synthesis.
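
    As a concrete instance of the linear subspace methods the book surveys, the snippet below projects data onto its leading principal components with a plain SVD; it is a generic PCA sketch, not code from the book.

```python
# PCA via SVD: project high-dimensional data onto a low-dimensional subspace.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 50))            # 200 samples, 50-dimensional features

Xc = X - X.mean(axis=0)                   # centre the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 5
Z = Xc @ Vt[:k].T                         # coordinates in the 5-dimensional subspace
X_hat = Z @ Vt[:k] + X.mean(axis=0)       # reconstruction from the subspace

explained = (S[:k] ** 2).sum() / (S ** 2).sum()
print("variance retained by the subspace:", round(float(explained), 3))
```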

  6. The Breakthrough Listen Search for Intelligent Life: the first SETI results and other future science.

    Science.gov (United States)

    Enriquez, J. Emilio; Breakthrough Listen Team

    2018-01-01

    The Breakthrough Listen (BL) Initiative is the largest campaign in human history in the search for extraterrestrial intelligence. The work presented here is the first BL search for engineered signals, comprising a sample of 692 nearby stars within 50 pc. We used the Green Bank Telescope (GBT) to conduct observations over 1.1-1.9 GHz (L-band). Our observing strategy allows us to reject most of the detected signals as terrestrial interference. During the analysis, eleven stars show events that passed our thresholding algorithm, but detailed analysis of their properties indicates they are consistent with known examples of anthropogenic radio frequency interference. This small number of false positives and their understood properties give confidence in the techniques used for this search. We conclude that, at the time of our observations, none of the observed systems host high-duty-cycle radio transmitters emitting at the observed frequencies with an EIRP of 10^13 W, readily achievable by our own civilization. We can place limits on the presence of engineered signals from putative extraterrestrial civilizations inhabiting the environs of the target stars. Our results suggest that fewer than ~0.1% of the stellar systems within 50 pc possess the type of transmitters searched for in this survey. This work provides the most stringent limit on the number of low-power radio transmitters around nearby stars to date. We explored several metrics to compare our results to previous SETI efforts, and developed a new figure-of-merit that can encompass a wider set of parameters and can be used in future SETI experiments for meaningful comparison. We note that the current BL state-of-the-art digital backend installed at the Green Bank Observatory is the fastest ever used for a SETI experiment by a factor of a few. Here we will describe the potential use of the BL backend by other groups for complementary science, as well as mention the ongoing and potential collaborations focused in

  7. AVID Students' Perceptions of Intelligence: A Mixed Methods Study

    Science.gov (United States)

    Becker, John Darrell

    2012-01-01

    Students' perceptions of intelligence have been shown to have an effect on learning. Students who see intelligence as something that can be developed, those with a growth mindset, often experience academic success, while those who perceive intelligence to be a fixed entity are typically less likely to take on challenging learning experiences and…

  8. AN OPPORTUNISTIC SEARCH FOR EXTRATERRESTRIAL INTELLIGENCE (SETI) WITH THE MURCHISON WIDEFIELD ARRAY

    Energy Technology Data Exchange (ETDEWEB)

    Tingay, S. J.; Tremblay, C.; Walsh, A.; Urquhart, R. [International Centre for Radio Astronomy Research (ICRAR), Curtin University, Bentley, WA 6102 (Australia)

    2016-08-20

    A spectral line image cube generated from 115 minutes of MWA data that covers a field of view of 400 sq. deg. around the Galactic Center is used to perform the first Search for ExtraTerrestrial Intelligence (SETI) with the Murchison Widefield Array (MWA). Our work constitutes the first modern SETI experiment at low radio frequencies, here between 103 and 133 MHz, paving the way for large-scale searches with the MWA and, in the future, the low-frequency Square Kilometre Array. Limits of a few hundred mJy beam^-1 for narrowband emission (10 kHz) are derived from our data, across our 400 sq. deg. field of view. Within this field, 45 exoplanets in 38 planetary systems are known. We extract spectra at the locations of these systems from our image cube to place limits on the presence of narrow line emission from these systems. We then derive minimum isotropic transmitter powers for these exoplanets; a small handful of the closest objects (10s of pc) yield our best limits of order 10^14 W (Equivalent Isotropic Radiated Power). These limits lie above the highest power directional transmitters near these frequencies currently operational on Earth. A SETI experiment with the MWA covering the full accessible sky and its full frequency range would require approximately one month of observing time. The MWA frequency range, its southern hemisphere location on an extraordinarily radio quiet site, its very large field of view, and its high sensitivity make it a unique facility for SETI.

  9. Efficient protein structure search using indexing methods.

    Science.gov (United States)

    Kim, Sungchul; Sael, Lee; Yu, Hwanjo

    2013-01-01

    Understanding the functions of proteins is one of the most important challenges in many studies of biological processes. The function of a protein can be predicted by analyzing the functions of structurally similar proteins, thus finding structurally similar proteins accurately and efficiently from a large set of proteins is crucial. A protein structure can be represented as a vector by the 3D-Zernike Descriptor (3DZD), which compactly represents the surface shape of the protein tertiary structure. This simplified representation accelerates the searching process. However, computing the similarity of two protein structures is still computationally expensive, thus it is hard to efficiently process many simultaneous requests for structurally similar protein search. This paper proposes indexing techniques which substantially reduce the search time to find structurally similar proteins. In particular, we first exploit two indexing techniques, i.e., iDistance and iKernel, on the 3DZDs. After that, we extend the techniques to further improve the search speed for protein structures. The extended indexing techniques build and utilize a reduced index constructed from the first few attributes of the 3DZDs of protein structures. To retrieve the top-k similar structures, the top-10 × k similar structures are first found using the reduced index, and the top-k structures are selected among them. We also modify the indexing techniques to support θ-based nearest neighbor search, which returns data points whose distance to the query point is less than θ. The results show that both iDistance and iKernel significantly enhance the search speed. In top-k nearest neighbor search, the search time is reduced by 69.6%, 77%, 77.4% and 87.9%, respectively, using iDistance, iKernel, the extended iDistance, and the extended iKernel. In θ-based nearest neighbor search, the search time is reduced by 80%, 81%, 95.6% and 95.6% using iDistance, iKernel, the extended iDistance, and the extended iKernel, respectively.
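
    The two-stage strategy described above (coarse filtering on the first few descriptor attributes, then re-ranking with the full descriptor) can be sketched as follows; the descriptor values are synthetic, and a simple KD-tree stands in for iDistance/iKernel.

```python
# Sketch of two-stage similar-structure search: fetch top-10*k candidates using only
# the first few descriptor attributes, then re-rank them with the full descriptor.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
descriptors = rng.normal(size=(10000, 121))   # stand-in for 3D-Zernike descriptors
reduced = descriptors[:, :10]                 # "reduced index" attributes

tree = cKDTree(reduced)                       # index built on the reduced descriptors

def search_top_k(query, k=10):
    _, cand = tree.query(query[:10], k=10 * k)                 # stage 1: coarse candidates
    full_dist = np.linalg.norm(descriptors[cand] - query, axis=1)
    order = np.argsort(full_dist)[:k]                          # stage 2: exact re-ranking
    return cand[order], full_dist[order]

ids, dists = search_top_k(descriptors[42])
print(ids[:3], dists[:3])
```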

  10. Estimation of mechanical properties of nanomaterials using artificial intelligence methods

    Science.gov (United States)

    Vijayaraghavan, V.; Garg, A.; Wong, C. H.; Tai, K.

    2014-09-01

    Computational modeling tools such as molecular dynamics (MD), ab initio, finite element modeling or continuum mechanics models have been extensively applied to study the properties of carbon nanotubes (CNTs) based on given input variables such as temperature, geometry and defects. Artificial intelligence techniques can be used to further complement the application of numerical methods in characterizing the properties of CNTs. In this paper, we introduce the application of multi-gene genetic programming (MGGP) and support vector regression to formulate the mathematical relationship between the compressive strength of CNTs and input variables such as temperature and diameter. The predictions of compressive strength of CNTs made by these models are compared to those generated using MD simulations. The results indicate that the MGGP method can be deployed as a powerful method for predicting the compressive strength of carbon nanotubes.
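
    A minimal scikit-learn sketch of the support vector regression component mentioned above, fitted to synthetic temperature/diameter data (the study itself used MD-simulation results, which are not reproduced here):

```python
# Support vector regression of a strength-like quantity on temperature and diameter
# (synthetic data for illustration only).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(3)
temperature = rng.uniform(100, 900, 200)          # K (synthetic)
diameter = rng.uniform(0.5, 3.0, 200)             # nm (synthetic)
X = np.column_stack([temperature, diameter])
y = 120 - 0.05 * temperature + 15 * diameter + rng.normal(0, 2, 200)   # toy target

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
model.fit(X, y)
print("predicted strength at 300 K, 1.2 nm:", float(model.predict([[300.0, 1.2]])[0]))
```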

  11. Teaching Planetary Science as Part of the Search for Extraterrestrial Intelligence (SETI)

    Science.gov (United States)

    Margot, Jean-Luc; Greenberg, Adam H.

    2017-10-01

    In Spring 2016 and 2017, UCLA offered a course titled "EPSS C179/279 - Search for Extraterrestrial Intelligence: Theory and Applications". The course is designed for advanced undergraduate students and graduate students in the science, technical, engineering, and mathematical fields. Each year, students designed an observing sequence for the Green Bank telescope, observed known planetary systems remotely, wrote a sophisticated and modular data processing pipeline, analyzed the data, and presented their results. In 2016, 15 students participated in the course (9U, 5G; 11M, 3F) and observed 14 planetary systems in the Kepler field. In 2017, 17 students participated (15U, 2G; 10M, 7F) and observed 10 planetary systems in the Kepler field, TRAPPIST-1, and LHS 1140. In order to select suitable targets, students learned about planetary systems, planetary habitability, and planetary dynamics. In addition to planetary science fundamentals, students learned radio astronomy fundamentals, collaborative software development, signal processing techniques, and statistics. Evaluations indicate that the course is challenging but that students are eager to learn because of the engrossing nature of SETI. Students particularly value the teamwork approach, the observing experience, and working with their own data. The next offering of the course will be in Spring 2018. Additional information about our SETI work is available at seti.ucla.edu.

  12. 5th International Workshop on Combinations of Intelligent Methods and Applications

    CERN Document Server

    Palade, Vasile; Prentzas, Jim

    2017-01-01

    Complex problems usually cannot be solved by individual methods or techniques and require the synergism of more than one of them to be solved. This book presents a number of current efforts that use combinations of methods or techniques to solve complex problems in the areas of sentiment analysis, search in GIS, graph-based social networking, intelligent e-learning systems, data mining and recommendation systems. Most of them are connected with specific applications, whereas the rest are combinations based on principles. Most of the chapters are extended versions of the corresponding papers presented in CIMA-15 Workshop, which took place in conjunction with IEEE ICTAI-15, in November 2015. The rest are invited papers that responded to special call for papers for the book. The book is addressed to researchers and practitioners from academia or industry, who are interested in using combined methods in solving complex problems in the above areas.

  13. Hybrid Intelligent Control Method to Improve the Frequency Support Capability of Wind Energy Conversion Systems

    Directory of Open Access Journals (Sweden)

    Shin Young Heo

    2015-10-01

    Full Text Available This paper presents a hybrid intelligent control method that enables frequency support control for permanent magnet synchronous generator (PMSG) wind turbines. The proposed method for a wind energy conversion system (WECS) is designed around a PMSG model and full-scale back-to-back insulated-gate bipolar transistor (IGBT) converters comprising the machine and grid sides. The controllers of the machine side converter (MSC) and the grid side converter (GSC) are designed to achieve maximum power point tracking (MPPT) based on an improved hill climb searching (IHCS) control algorithm and de-loaded (DL) operation to obtain a power margin. Along with this comprehensive control of the maximum power tracking mode based on the IHCS, a kinetic energy (KE) discharge control method supporting the primary frequency control scheme with DL operation is developed to regulate the short-term frequency response and maintain reliable operation of the power system. The effectiveness of the hybrid intelligent control method is verified by numerical simulation in PSCAD/EMTDC. Simulation results show that the proposed approach can improve the frequency regulation capability of the power system.
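
    The improved hill climb searching (IHCS) controller is not specified in the abstract; the sketch below shows the generic perturb-and-observe hill-climb search that such MPPT schemes build on, applied to a toy power curve.

```python
# Generic hill-climb (perturb-and-observe) MPPT loop on a toy power curve.
def power(speed):
    # Toy power-vs-rotor-speed curve with a single maximum (illustrative only).
    return -(speed - 7.0) ** 2 + 50.0

speed, step = 4.0, 0.2
prev_power = power(speed)
for _ in range(100):
    speed += step
    p = power(speed)
    if p < prev_power:          # overshot the peak: reverse and shrink the perturbation
        step = -0.5 * step
    prev_power = p

print("operating speed near the maximum power point:", round(speed, 2))
```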

  14. Based on Short Motion Paths and Artificial Intelligence Method for Chinese Chess Game

    Directory of Open Access Journals (Sweden)

    Chien-Ming Hung

    2017-08-01

    Full Text Available The article develops decision rules to win each set of the Chinese chess game using an evaluation algorithm and an artificial intelligence method, uses mobile robots in place of the chess pieces, and presents movement scenarios using the shortest motion paths for the mobile robots. A player can play the Chinese chess game according to the game rules against the supervising computer. The supervising computer decides the optimal motion path to win the set using the artificial intelligence method, and controls the mobile robots according to the programmed motion paths of the assigned pieces moving on the platform via a wireless RF interface. We use an enhanced A* search algorithm to solve the shortest-path problem for an assigned piece, and solve the collision problems of the motion paths for two mobile robots moving on the platform simultaneously. We implement a famous set called "wild horses run in farm" using the proposed method. First we use simulation to display the motion paths of the assigned pieces for the player and the supervising computer. Then the supervising computer executes the simulation results on the chessboard platform using the mobile robots. The mobile robots move on the chessboard platform according to the programmed motion paths, are guided to move along the centre line of the corridor, avoid obstacles (other pieces), and detect the cross points of the platform using three reflective IR modules.
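
    As a sketch of the shortest-path component, here is a standard A* search on a small grid with obstacles and a Manhattan-distance heuristic; this is the textbook algorithm, not the enhanced variant or the robot-guidance code from the article.

```python
# Standard A* on a grid: 0 = free cell, 1 = occupied (e.g., by another chess piece).
import heapq

def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])    # Manhattan heuristic
    open_heap = [(h(start), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_heap:
        _, g, node, parent = heapq.heappop(open_heap)
        if node in came_from:
            continue
        came_from[node] = parent
        if node == goal:                                        # reconstruct the path
            path = [node]
            while came_from[path[-1]] is not None:
                path.append(came_from[path[-1]])
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc), node))
    return None

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
```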

  15. Job Search Methods: Consequences for Gender-based Earnings Inequality.

    Science.gov (United States)

    Huffman, Matt L.; Torres, Lisa

    2001-01-01

    Data from adults in Atlanta, Boston, and Los Angeles (n=1,942) who searched for work using formal (ads, agencies) or informal (networks) methods indicated that type of method used did not contribute to the gender gap in earnings. Results do not support formal job search as a way to reduce gender inequality. (Contains 55 references.) (SK)

  16. The Breakthrough Listen Search for Intelligent Life: Radio Frequency Interference in the Green Bank Telescope

    Science.gov (United States)

    Dana, Ryan

    2018-01-01

    In the search for extraterrestrial intelligence, the vast majority of our "signals of interest" are simply satellite radio frequency interference. The goal of my research, therefore, was to accurately predict the locations of satellites in our sky, to analyze the specific satellites causing the interference, and potentially to predict when satellites will cross our beams so that we can further optimize our scripts and get more usable data. I have built an algorithm that plots the location in altitude and azimuth of any grouping of satellites in the sky, as seen from any position on Earth specified by latitude, longitude, and elevation. From there, one can input the right ascension and declination of the location being tracked in the sky with a telescope. Using these inputs, we can calculate the angular and positional distance of individual satellites from our beam to further analyze satellite radio frequency interference. The process begins by importing a list of two-line element (TLE) sets that the algorithm reads in. TLEs describe satellite orbits and are updated frequently; they give a variety of information ranging from the satellite ID to its mean motion and anomaly. From there, the code uses the information in these elements to predict each satellite's location. The algorithm can also plot the satellites in 3D coordinates around an Earth-like sphere to visualize the route each satellite has taken. The code has been used in a variety of ways, most notably to identify satellites interfering with the beam for Arecibo's Ross 128 candidate signal. From here, the code will be the backbone for calculating drift rates, Doppler shifts and intensities of certain satellites, and for understanding why our team consistently receives stray satellite signals of interest. Furthermore, in the case of a serious candidate signal in the near future, it will be important to analyze satellites interfering with the beam.
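
    A minimal sketch of the satellite-position step described above, using the skyfield library to compute a satellite's altitude and azimuth for a ground site; the TLE source URL and site coordinates are placeholders, not values from the project.

```python
# Predict a satellite's altitude/azimuth for a ground site from current TLEs (skyfield).
from skyfield.api import load, wgs84

ts = load.timescale()
# Placeholder TLE source; real work would read whichever satellite groups are relevant.
satellites = load.tle_file(
    "https://celestrak.org/NORAD/elements/gp.php?GROUP=stations&FORMAT=tle")
sat = satellites[0]

site = wgs84.latlon(38.4330, -79.8398, elevation_m=807)   # roughly the Green Bank site
t = ts.now()
alt, az, distance = (sat - site).at(t).altaz()
print(sat.name, f"| alt {alt.degrees:.1f} deg, az {az.degrees:.1f} deg")
```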

  17. An automated full-symmetry Patterson search method

    International Nuclear Information System (INIS)

    Rius, J.; Miravitlles, C.

    1987-01-01

    A full-symmetry Patterson search method is presented that performs a molecular coarse rotation search in vector space and orientation refinement using the σ function. The oriented molecule is positioned using the fast translation function τ0, which is based on the automated interpretation of τ projections using the sum function. This strategy reduces the number of Patterson-function values to be stored in the rotation search, and the use of the τ0 function minimizes the required time for the development of all probable rotation search solutions. The application of this method to five representative test examples is shown. (orig.)

  18. Developing energy forecasting model using hybrid artificial intelligence method

    Institute of Scientific and Technical Information of China (English)

    Shahram Mollaiy-Berneti

    2015-01-01

    An important problem in demand planning for energy consumption is developing an accurate energy forecasting model. In fact, it is not possible to allocate energy resources in an optimal manner without an accurate demand value. A new energy forecasting model was proposed based on a back-propagation (BP) neural network and the imperialist competitive algorithm. The proposed method offers the advantage of the local search ability of the BP technique and the global search ability of the imperialist competitive algorithm. Two sets of empirical data, regarding energy demand (gross domestic product (GDP), population, import, export and energy demand) in Turkey from 1979 to 2005 and electricity demand (population, GDP, total revenue from exporting industrial products and electricity consumption) in Thailand from 1986 to 2010, were investigated to demonstrate the applicability and merits of the present method. The performance of the proposed model is found to be better than that of a conventional back-propagation neural network, with low mean absolute error.

  19. Application of artificial intelligence to search ground-state geometry of clusters

    International Nuclear Information System (INIS)

    Lemes, Mauricio Ruv; Marim, L.R.; Dal Pino, A. Jr.

    2002-01-01

    We introduce a global optimization procedure, the neural-assisted genetic algorithm (NAGA). It combines the power of an artificial neural network (ANN) with the versatility of the genetic algorithm. This method is suitable to solve optimization problems that depend on some kind of heuristics to limit the search space. If a reasonable amount of data is available, the ANN can 'understand' the problem and provide the genetic algorithm with a selected population of elements that will speed up the search for the optimum solution. We tested the method in a search for the ground-state geometry of silicon clusters. We trained the ANN with information about the geometry and energetics of small silicon clusters. Next, the ANN learned how to restrict the configurational space for larger silicon clusters. For Si10 and Si20, we noticed that the NAGA is at least three times faster than the 'pure' genetic algorithm. As the size of the cluster increases, it is expected that the gain in terms of time will increase as well.
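
    The abstract's core idea (let a trained network pre-screen candidate structures so the genetic algorithm starts from a promising population) can be sketched generically as below; the descriptors, energies and surrogate model are synthetic stand-ins, and the subsequent GA itself is omitted.

```python
# Sketch: use a fitted surrogate network to select a promising initial GA population.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
# Training set: descriptors and energies of (synthetic) small clusters.
train_x = rng.normal(size=(300, 12))
train_e = (train_x ** 2).sum(axis=1) + rng.normal(0, 0.1, 300)   # toy energy surface

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
surrogate.fit(train_x, train_e)

# Generate many random candidate geometries (descriptors) for a larger cluster and
# keep the ones the surrogate predicts to be lowest in energy as the GA seed population.
candidates = rng.normal(size=(5000, 12))
predicted_e = surrogate.predict(candidates)
seed_population = candidates[np.argsort(predicted_e)[:50]]
print("GA seed population shape:", seed_population.shape)
```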

  20. Fast radio burst search: cross spectrum vs. auto spectrum method

    Science.gov (United States)

    Liu, Lei; Zheng, Weimin; Yan, Zhen; Zhang, Juan

    2018-06-01

    The search for fast radio bursts (FRBs) is a hot topic in current radio astronomy studies. In this work, we carry out a single pulse search with a very long baseline interferometry (VLBI) pulsar observation data set using both auto spectrum and cross spectrum search methods. The cross spectrum method, first proposed in Liu et al., maximizes the signal power by fully utilizing the fringe phase information of the baseline cross spectrum. The auto spectrum search method is based on the popular pulsar software package PRESTO, which extracts single pulses from the auto spectrum of each station. According to our comparison, the cross spectrum method is able to enhance the signal power and therefore extract single pulses from data contaminated by high levels of radio frequency interference (RFI), which makes it possible to carry out a search for FRBs in regular VLBI observations when RFI is present.
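
    To make the distinction concrete, the toy numpy example below compares the auto spectrum of a single noisy station with the cross spectrum of two stations observing the same weak signal; the signal and noise levels are arbitrary.

```python
# Toy comparison of auto spectrum vs. cross spectrum for a weak common signal.
import numpy as np

rng = np.random.default_rng(5)
n = 4096
t = np.arange(n)
signal = 0.3 * np.sin(2 * np.pi * 0.05 * t)        # weak signal common to both stations
x1 = signal + rng.normal(0, 1, n)                  # station 1: signal + independent noise
x2 = signal + rng.normal(0, 1, n)                  # station 2: signal + independent noise

F1, F2 = np.fft.rfft(x1), np.fft.rfft(x2)
auto = np.abs(F1) ** 2                             # auto spectrum of one station
cross = F1 * np.conj(F2)                           # cross spectrum: independent noise decorrelates

k = int(np.argmax(np.abs(cross[1:]))) + 1          # strongest non-DC bin
print("peak bin:", k,
      "| auto contrast:", float(auto[k] / np.median(auto)),
      "| cross contrast:", float(np.abs(cross[k]) / np.median(np.abs(cross))))
```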

  1. The Search for Extraterrestrial Intelligence (SETI) and Whether to send 'Messages' (METI): A Case for Conversation, Patience and Due Diligence

    Science.gov (United States)

    Brin, D.

    Understanding the controversy over "Messages to Extra Terrestrial Intelligence" or METI requires a grounding in the history and rationale of SETI (Search for ETI). Insights since the turn of the century have changed SETI's scientific basis. Continued null results from the radio search do not invalidate continuing effort, but they do raise questions about long-held assumptions. Modified search strategies are discussed. The Great Silence or Fermi Paradox is appraised, along with the disruptive plausibility of interstellar travel. Psychological motivations for METI are considered. With this underpinning, we consider why a small cadre of SETI-ist radio astronomers have resisted the notion of international consultations before humanity takes a brash and irreversible step into METI, shouting our presence into the cosmos.

  2. Real-time earthquake monitoring using a search engine method.

    Science.gov (United States)

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-12-04

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data.
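
    The abstract's analogy with a web search engine amounts to pre-indexing a large waveform database and, at query time, retrieving the closest stored templates rather than scanning everything; a generic sketch of that idea with a KD-tree index (not the authors' fast-search implementation) follows.

```python
# Sketch: index a template-waveform database and retrieve the best matches for a new
# record, whose known source parameters then label the incoming event.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(6)
n_templates, n_samples = 5000, 256
database = rng.normal(size=(n_templates, n_samples))      # stand-in seismogram templates
params = rng.uniform(size=(n_templates, 3))               # e.g. strike, dip, rake per template

def norm(w):
    w = w - w.mean()
    return w / (np.linalg.norm(w) + 1e-12)

index = cKDTree(np.array([norm(w) for w in database]))    # the "search engine" index

query = database[1234] + 0.2 * rng.normal(size=n_samples)     # incoming noisy record
_, hits = index.query(norm(query), k=5)                       # nearest stored waveforms
print("best-matching templates:", hits, "-> estimated parameters:", params[hits[0]])
```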

  3. Search for microorganisms on Europa and Mars in relation with the evolution of intelligent behavior on other worlds

    International Nuclear Information System (INIS)

    Chela-Flores, Julian

    2001-11-01

    Within the context of how to search for life in the Solar System, we discuss the need to consider universal evolutionary biomarkers, in addition to those of biochemical nature that have already been selected for in the biology experiments of the old Viking and future Beagle-2 landers. For the wider problem of the evolution of intelligent behavior on other worlds (the SETI program), the type of experiments suggested below aim at establishing a direct connection between Solar System exploration and the first steps along the pathway toward the evolution of intelligent behavior. The two leading sites for the implementation of the proposed first whole-cell experiments would be, firstly, Europa after the Europa-Orbiter mission, either on the ice-crust, or in the ocean itself by means of a submersible; secondly, such experiments could be implemented once isolated liquid water oases are identified in the Martian substratum. (author)

  4. Building maps to search the web: the method Sewcom

    Directory of Open Access Journals (Sweden)

    Corrado Petrucco

    2002-01-01

    Full Text Available Seeking information on the Internet is becoming a necessity at school, at work and in every social sphere. Unfortunately, the difficulties inherent in the use of search engines and the unconscious use of inefficient cognitive approaches limit their effectiveness. In this respect, a method called SEWCOM is presented that lets users create conceptual maps through interaction with search engines.

  5. Job Search as Goal-Directed Behavior: Objectives and Methods

    Science.gov (United States)

    Van Hoye, Greet; Saks, Alan M.

    2008-01-01

    This study investigated the relationship between job search objectives (finding a new job/turnover, staying aware of job alternatives, developing a professional network, and obtaining leverage against an employer) and job search methods (looking at job ads, visiting job sites, networking, contacting employment agencies, contacting employers, and…

  6. Sampling the Radio Transient Universe: Studies of Pulsars and the Search for Extraterrestrial Intelligence

    Science.gov (United States)

    Chennamangalam, Jayanth

    The transient radio universe is a relatively unexplored area of astronomy, offering a variety of phenomena, from solar and Jovian bursts, to flare stars, pulsars, and bursts of Galactic and potentially even cosmological origin. Among these, perhaps the most widely studied radio transients, pulsars are fast-spinning neutron stars that emit radio beams from their magnetic poles. In spite of over 40 years of research on pulsars, we have more questions than answers on these exotic compact objects, chief among them the nature of their emission mechanism. Nevertheless, the wealth of phenomena exhibited by pulsars make them one of the most useful astrophysical tools. With their high densities, pulsars are probes of the nature of ultra-dense matter. Characterized by their high timing stability, pulsars can be used to verify the predictions of general relativity, discover planets around them, study bodies in the solar system, and even serve as an interplanetary (and possibly some day, interstellar) navigation aid. Pulsars are also used to study the nature of the interstellar medium, much like a flashlight illuminating airborne dust in a dark room. Studies of pulsars in the Galactic center can help answer questions about the massive black hole in the region and the star formation history in its vicinity. Millisecond pulsars in globular clusters are long-lived tracers of their progenitors, low-mass X-ray binaries, and can be used to study the dynamical history of those clusters. Another source of interest in radio transient astronomy is the hitherto undetected engineered signal from extraterrestrial intelligence. The Search for Extraterrestrial Intelligence (SETI) is an ongoing attempt at discovering the presence of technological life elsewhere in the Galaxy. In this work, I present my forays into two aspects of the study of the radio transient universe---pulsars and SETI. Firstly, I describe my work on the luminosity function and population size of pulsars in the globular

  7. Creativity Styles and Emotional Intelligence of Filipino Student Teachers: A Search for Congruity

    Directory of Open Access Journals (Sweden)

    Gilbert C. Magulod Jr.

    2017-02-01

    Full Text Available The purpose of this study is to determine the congruity between the creativity styles and emotional intelligence of Filipino student teachers. Descriptive correlational research design was employed. The participants of the study were the 76 fourth year students of Bachelor in Elementary Education (BEED and Bachelor in Secondary Education (BSED in one state university in the Philippines. Data of the study were obtained using two standardized instruments relating to creativity styles and emotional intelligence. Findings of the study revealed that the student teachers espoused themselves to have high creative capacity while they assessed themselves to have high creativity styles along belief in unconscious processes, use of techniques, use of other people and final product orientation. With regards to their emotional intelligence, they assessed themselves to have high attributes on self-awareness, management of emotions, self-motivation, empathy and social skills. Significantly, this study also revealed that gender, birth order, course and scholastic standing in high school spelled differences on the creativity styles of Filipino student teachers. Moreover, test of difference also showed that scholastic standing in high school and family income defined differences along emotional intelligence. Finally, it was also revealed in the study that there is a significant relationship between creativity styles and emotional intelligence of Filipino student teachers. Implications of the congruity between emotional intelligence and creativity styles would help Teacher Education Institutions (TEIs to implement curriculum enhancement which is vital to the preparation of twenty-first century teachers.

  8. Space Environment Modelling with the Use of Artificial Intelligence Methods

    Science.gov (United States)

    Lundstedt, H.; Wintoft, P.; Wu, J.-G.; Gleisner, H.; Dovheden, V.

    1996-12-01

    Space based technological systems are affected by the space weather in many ways. Several severe failures of satellites have been reported at times of space storms. Our society also increasingly depends on satellites for communication, navigation, exploration, and research. Predictions of the conditions in the satellite environment have therefore become very important. We will here present predictions made with the use of artificial intelligence (AI) techniques, such as artificial neural networks (ANN) and hybrids of AI methods. We are developing a space weather model based on intelligent hybrid systems (IHS). The model consists of different forecast modules; each module predicts the space weather on a specific time-scale. The time-scales range from minutes to months, with fundamental time-scales of 1-5 minutes, 1-3 hours, 1-3 days, and 27 days. Solar and solar wind data are used as input data. From solar magnetic field measurements, either made on the ground at the Wilcox Solar Observatory (WSO) at Stanford, or made from space by the satellite SOHO, solar wind parameters can be predicted and modelled with ANN and MHD models. Magnetograms from WSO are available on a daily basis. However, from SOHO magnetograms will be available every 90 minutes. SOHO magnetograms as input to ANNs will therefore make it possible to even predict solar transient events. Geomagnetic storm activity can today be predicted with very high accuracy by means of ANN methods using solar wind input data. However, at present real-time solar wind data are only available during part of the day from the satellite WIND. With the launch of ACE in 1997, solar wind data will on the other hand be available 24 hours per day. The conditions of the satellite environment are not only disturbed at times of geomagnetic storms but also at times of intense solar radiation and highly energetic particles. These events are associated with increased solar activity. Predictions of these events are therefore

  9. Artificial Intelligence: Bayesian versus Heuristic Method for Diagnostic Decision Support.

    Science.gov (United States)

    Elkin, Peter L; Schlegel, Daniel R; Anderson, Michael; Komm, Jordan; Ficheur, Gregoire; Bisson, Leslie

    2018-04-01

    Evoking strength is one of the important contributions of the field of Biomedical Informatics to the discipline of Artificial Intelligence. The University at Buffalo's Orthopedics Department wanted to create an expert system to assist patients with self-diagnosis of knee problems and thereby facilitate referral to the right orthopedic subspecialist. They had two independent sports medicine physicians review 469 cases. A board-certified orthopedic sports medicine practitioner, L.B., reviewed any disagreements until a gold-standard diagnosis was reached. For each case, the patients entered 126 potential answers to 26 questions into a Web interface. These were modeled by an expert sports medicine physician and the answers were reviewed by L.B. For each finding, the clinician specified the sensitivity (term frequency) and both the specificity (Sp) and the heuristic evoking strength (ES). Heuristics are methods of reasoning with only partial evidence. An expert system was constructed that produced a post-test-odds-of-disease ranked list for each case. We compare the accuracy of using Sp to that of using ES (original model, p < 0.0008; term importance × disease importance [DI×TI] model, p < 0.0001; Wilcoxon rank-sum test). For patient referral assignment, Sp in the DI×TI model was superior to the use of ES. By the fifth diagnosis, the advantage was lost, so there is no difference between the techniques when serving as a reminder system. Schattauer GmbH Stuttgart.
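
    For reference, the post-test-odds ranking mentioned above follows the standard Bayes-odds update; a generic statement (not the authors' exact scoring, which also weighs the heuristic evoking strength) is:

```latex
% Post-test odds from pre-test odds and the likelihood ratio of a positive finding
\mathrm{LR}^{+} = \frac{\text{sensitivity}}{1 - \text{specificity}}, \qquad
\text{odds}_{\text{post}} = \text{odds}_{\text{pre}} \times \mathrm{LR}^{+}
% Worked example (made-up numbers): sensitivity 0.9, specificity 0.8, pre-test odds 0.25
% give LR+ = 4.5 and post-test odds = 1.125 (probability about 0.53).
```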

  10. Using artificial intelligence methods to design new conducting polymers

    Directory of Open Access Journals (Sweden)

    Ronaldo Giro

    2003-12-01

    Full Text Available In the last years the possibility of creating new conducting polymers by exploring the concept of copolymerization (different structural monomeric units) has attracted much attention from experimental and theoretical points of view. Due to the rich reactivity of carbon an almost infinite number of new structures is possible, and the procedure of trial and error has been the rule. In this work we have used a methodology capable of generating new structures with pre-specified properties. It combines the use of the negative factor counting (NFC) technique with artificial intelligence methods (genetic algorithms, GAs). We present the results of a case study for poly(phenylenesulfide phenyleneamine) (PPSA), a copolymer formed by the combination of the homopolymers polyaniline (PANI) and polyphenylenesulfide (PPS). The methodology was successfully applied to the problem of obtaining binary up to quinternary disordered polymeric alloys with a pre-specified gap value or exhibiting metallic properties. It is completely general and can in principle be adapted to the design of new classes of materials with pre-specified properties.

  11. Artificial intelligence in medicine.

    OpenAIRE

    Ramesh, A. N.; Kambhampati, C.; Monson, J. R. T.; Drew, P. J.

    2004-01-01

    INTRODUCTION: Artificial intelligence is a branch of computer science capable of analysing complex medical data. Its potential to exploit meaningful relationships within a data set can be used in diagnosis, treatment and predicting outcome in many clinical scenarios. METHODS: Medline and internet searches were carried out using the keywords 'artificial intelligence' and 'neural networks (computer)'. Further references were obtained by cross-referencing from key articles. An overview of ...

  12. iPixel: a visual content-based and semantic search engine for retrieving digitized mammograms by using collective intelligence.

    Science.gov (United States)

    Alor-Hernández, Giner; Pérez-Gallardo, Yuliana; Posada-Gómez, Rubén; Cortes-Robles, Guillermo; Rodríguez-González, Alejandro; Aguilar-Laserre, Alberto A

    2012-09-01

    Nowadays, traditional search engines such as Google, Yahoo and Bing facilitate the retrieval of information in the format of images, but the results are not always useful for the users. This is mainly due to two problems: (1) the semantic keywords are not taken into consideration and (2) it is not always possible to establish a query using the image features. This issue has been covered in different domains in order to develop content-based image retrieval (CBIR) systems. The expert community has focussed their attention on the healthcare domain, where a lot of visual information for medical analysis is available. This paper provides a solution called iPixel Visual Search Engine, which involves semantics and content issues in order to search for digitized mammograms. iPixel offers the possibility of retrieving mammogram features using collective intelligence and implementing a CBIR algorithm. Our proposal compares not only features with similar semantic meaning, but also visual features. In this sense, the comparisons are made in different ways: by the number of regions per image, by maximum and minimum size of regions per image and by average intensity level of each region. iPixel Visual Search Engine supports the medical community in differential diagnoses related to the diseases of the breast. The iPixel Visual Search Engine has been validated by experts in the healthcare domain, such as radiologists, in addition to experts in digital image analysis.

  13. A literature search tool for intelligent extraction of disease-associated genes.

    Science.gov (United States)

    Jung, Jae-Yoon; DeLuca, Todd F; Nelson, Tristan H; Wall, Dennis P

    2014-01-01

    To extract disorder-associated genes from the scientific literature in PubMed with greater sensitivity for literature-based support than existing methods. We developed a PubMed query to retrieve disorder-related, original research articles. Then we applied a rule-based text-mining algorithm with keyword matching to extract target disorders, genes with significant results, and the type of study described by the article. We compared our resulting candidate disorder genes and supporting references with existing databases. We demonstrated that our candidate gene set covers nearly all genes in manually curated databases, and that the references supporting the disorder-gene link are more extensive and accurate than other general purpose gene-to-disorder association databases. We implemented a novel publication search tool to find target articles, specifically focused on links between disorders and genotypes. Through comparison against gold-standard manually updated gene-disorder databases and comparison with automated databases of similar functionality we show that our tool can search through the entirety of PubMed to extract the main gene findings for human diseases rapidly and accurately.
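
    A minimal sketch of the retrieval step described above, using Biopython's Entrez interface to pull PubMed IDs for a disorder-focused query; the query string and e-mail address are placeholders, and the rule-based extraction stage is not reproduced.

```python
# Retrieve PubMed IDs for a disorder-gene query via NCBI Entrez (placeholder query/e-mail).
from Bio import Entrez

Entrez.email = "you@example.org"          # required by NCBI; placeholder address
query = '"autism spectrum disorder"[Title/Abstract] AND gene[Title/Abstract]'

handle = Entrez.esearch(db="pubmed", term=query, retmax=20)
record = Entrez.read(handle)
handle.close()

pmids = record["IdList"]
print(len(pmids), "articles, e.g.:", pmids[:5])
# Each abstract would then be fetched (Entrez.efetch) and passed to the rule-based
# keyword-matching step that extracts target disorders, genes and study type.
```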

  14. Method of Improving Personal Name Search in Academic Information Service

    Directory of Open Access Journals (Sweden)

    Heejun Han

    2012-12-01

    Full Text Available All academic information on the web or elsewhere has a creator, that is, a subject who has created the information. The subject can be an individual, a group, or an institution, and can even be a nation, depending on the nature of the relevant information. Most information is composed of a title, an author, and contents. An essay in the academic information category has metadata including a title, an author, keywords, an abstract, publication data, place of publication, ISSN, and the like. A patent has metadata including the title, an applicant, an inventor, an attorney, the IPC, the application number, and the claims of the invention. Most web-based academic information services enable users to search the information by processing this meta-information. An important element is searching information by the author field, which corresponds to a personal name. This study suggests a method of efficient indexing and an adjacency-operation result-ranking algorithm to which phrase-search-based boosting elements are applied, thereby improving the accuracy of personal-name search results. It also describes a method for providing co-authors and related researchers as results when searching personal names. This method can be effectively applied to provide accurate and additional search results in academic information services.

  15. Searching for South Asian intelligence: psychometry in British India, 1919-1940.

    Science.gov (United States)

    Setlur, Shivrang

    2014-01-01

    This paper describes the introduction and development of intelligence testing in British India. Between 1919 and 1940 experimenters such as C. Herbert Rice, Prasanta Chandra Mahalanobis, and Venkatrao Vithal Kamat imported a number of intelligence tests, adapting them to suit a variety of South Asian languages and contexts. Charting South Asian psychometry's gradual move from American missionary efforts toward the state, this paper argues that political reforms in the 1920s and 1930s affected how psychometry was "indigenized" in South Asia. Describing how approaches to race and caste shifted across instruments and over time, this paper charts the gradual recession, within South Asian psychometry, of a "race" theory of caste. Describing some of the ways in which this "late colonial" period affected the postcolonial landscape, the paper concludes by suggesting potential lines for further inquiry into the later career of intelligence testing in India and Pakistan. © 2014 Wiley Periodicals, Inc.

  16. A Survey of Formal Methods for Intelligent Swarms

    Science.gov (United States)

    Truszkowski, Walt; Rash, James; Hinchey, Mike; Rouff, Chrustopher A.

    2004-01-01

    Swarms of intelligent autonomous spacecraft, involving complex behaviors and interactions, are being proposed for future space exploration missions. Such missions provide greater flexibility and offer the possibility of gathering more science data than traditional single spacecraft missions. The emergent properties of swarms make these missions powerful, but simultaneously far more difficult to design, and to assure that the proper behaviors will emerge. These missions are also considerably more complex than previous types of missions, and NASA, like other organizations, has little experience in developing or in verifying and validating these types of missions. A significant challenge when verifying and validating swarms of intelligent interacting agents is how to determine that the possible exponential interactions and emergent behaviors are producing the desired results. Assuring correct behavior and interactions of swarms will be critical to mission success. The Autonomous Nano Technology Swarm (ANTS) mission is an example of one of the swarm types of missions NASA is considering. The ANTS mission will use a swarm of picospacecraft that will fly from Earth orbit to the Asteroid Belt. Using an insect colony analogy, ANTS will be composed of specialized workers for asteroid exploration. Exploration would consist of cataloguing the mass, density, morphology, and chemical composition of the asteroids, including any anomalous concentrations of specific minerals. To perform this task, ANTS would carry miniaturized instruments, such as imagers, spectrometers, and detectors. Since ANTS and other similar missions are going to consist of autonomous spacecraft that may be out of contact with the earth for extended periods of time, and have low bandwidths due to weight constraints, it will be difficult to observe improper behavior and to correct any errors after launch. Providing V&V (verification and validation) for this type of mission is new to NASA, and represents the

  17. Remarks on search methods for stable, massive, elementary particles

    International Nuclear Information System (INIS)

    Perl, Martin L.

    2001-01-01

    This paper was presented at the 69th birthday celebration of Professor Eugene Commins, honoring his research achievements. These remarks are about the experimental techniques used in the search for new stable, massive particles, particles at least as massive as the electron. A variety of experimental methods such as accelerator experiments, cosmic ray studies, searches for halo particles in the galaxy and searches for exotic particles in bulk matter are described. A summary is presented of the measured limits on the existence of new stable, massive particles.

  18. ARSTEC, Nonlinear Optimization Program Using Random Search Method

    International Nuclear Information System (INIS)

    Rasmuson, D. M.; Marshall, N. H.

    1979-01-01

    1 - Description of problem or function: The ARSTEC program was written to solve nonlinear, mixed integer, optimization problems. An example of such a problem in the nuclear industry is the allocation of redundant parts in the design of a nuclear power plant to minimize plant unavailability. 2 - Method of solution: The technique used in ARSTEC is the adaptive random search method. The search is started from an arbitrary point in the search region and every time a point that improves the objective function is found, the search region is centered at that new point. 3 - Restrictions on the complexity of the problem: Presently, the maximum number of independent variables allowed is 10. This can be changed by increasing the dimension of the arrays
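
    A minimal sketch of the adaptive random search idea described in the method-of-solution paragraph (sample around the current centre and recentre whenever the objective improves), applied to a toy continuous objective; ARSTEC itself also handles mixed-integer variables, and the step-size adaptation here is only illustrative.

```python
# Toy adaptive random search: recentre the search region on every improving point.
import numpy as np

def objective(x):
    return float(((x - np.array([1.0, -2.0, 0.5])) ** 2).sum())   # toy function

rng = np.random.default_rng(7)
centre = rng.uniform(-5, 5, 3)          # arbitrary starting point in the search region
radius = 2.0
best = objective(centre)

for _ in range(5000):
    candidate = centre + rng.uniform(-radius, radius, 3)
    value = objective(candidate)
    if value < best:                    # improvement: recentre the search region here
        centre, best = candidate, value
        radius *= 1.05                  # mild expansion after a success (illustrative)
    else:
        radius = max(0.05, radius * 0.999)   # slow contraction otherwise (illustrative)

print("best point:", np.round(centre, 3), "| objective:", round(best, 6))
```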

  19. Interface Design Concepts in the Development of ELSA, an Intelligent Electronic Library Search Assistant.

    Science.gov (United States)

    Denning, Rebecca; Smith, Philip J.

    1994-01-01

    Describes issues and advances in the design of appropriate inference engines and knowledge structures needed by commercially feasible intelligent intermediary systems for information retrieval. Issues associated with the design of interfaces to such functions are discussed in detail. Design principles for guiding implementation of these interfaces…

  20. Budget constraints and optimization in sponsored search auctions

    CERN Document Server

    Yang, Yanwu

    2013-01-01

    The Intelligent Systems Series publishes reference works and handbooks in three core sub-topic areas: Intelligent Automation, Intelligent Transportation Systems, and Intelligent Computing. They include theoretical studies, design methods, and real-world implementations and applications. The series' readership is broad, but focuses on engineering, electronics, and computer science. Budget constraints and optimization in sponsored search auctions takes into consideration the entire life cycle of campaigns for researchers and developers working on search systems and ROI maximization

  1. Intelligent Learning System using cognitive science theory and artificial intelligence methods

    Energy Technology Data Exchange (ETDEWEB)

    Cristensen, D.L.

    1986-01-01

    This dissertation presents a theoretical model of an Intelligent Learning System (ILS). The approach views intelligent computer-based instruction at a curricular level and on an educational-theory base, instead of the conventional instruction-only level. The ILS is divided into two components: (1) a macro level, curricular; and (2) a micro level (MAIS), instructional. The primary purpose of the ILS macro level is to establish the initial conditions of learning by considering individual-difference variables within the specification of the curriculum content domain. Second, the ILS macro level iteratively updates the conditions of learning as the individual student progresses through the given curriculum. The term dynamic is used to describe the expert tutor that establishes and monitors the conditions of instruction between the ILS macro level and the micro level. As the student progresses through the instruction, appropriate information is sent back continuously to the macro level to constantly improve decision making for succeeding conditions of instruction.

  2. Intelligent methods for the process parameter determination of plastic injection molding

    Science.gov (United States)

    Gao, Huang; Zhang, Yun; Zhou, Xundao; Li, Dequn

    2018-03-01

    Injection molding is one of the most widely used material processing methods for producing plastic products with complex geometries and high precision. The determination of process parameters is important in obtaining qualified products and maintaining product quality. This article reviews recent studies and developments of the intelligent methods applied to the process parameter determination of injection molding. These intelligent methods are classified into three categories: case-based reasoning methods, expert system-based methods, and data fitting and optimization methods. A framework for process parameter determination is proposed after comprehensive discussions. Finally, conclusions and future research topics are discussed.

  3. Classical methods for interpreting objective function minimization as intelligent inference

    Energy Technology Data Exchange (ETDEWEB)

    Golden, R.M. [Univ. of Texas, Dallas, TX (United States)

    1996-12-31

    Most recognition algorithms and neural networks can be formally viewed as seeking a minimum value of an appropriate objective function during either classification or learning phases. The goal of this paper is to argue that in order to show a recognition algorithm is making intelligent inferences, it is not sufficient to show that the recognition algorithm is computing (or trying to compute) the global minimum of some objective function. One must explicitly define a "relational system" for the recognition algorithm or neural network which identifies: (i) the sample space, (ii) the relevant sigma-field of events generated by the sample space, and (iii) the "relation" for that relational system. Only when such a "relational system" is properly defined is it possible to formally establish the sense in which computing the global minimum of an objective function is an intelligent inference.

  4. The Breakthrough Listen Search for Intelligent Life: 1.1-1.9 GHz Observations of 692 Nearby Stars

    Science.gov (United States)

    Enriquez, J. Emilio; Siemion, Andrew; Foster, Griffin; Gajjar, Vishal; Hellbourg, Greg; Hickish, Jack; Isaacson, Howard; Price, Danny C.; Croft, Steve; DeBoer, David; Lebofsky, Matt; MacMahon, David H. E.; Werthimer, Dan

    2017-11-01

    We report on a search for engineered signals from a sample of 692 nearby stars using the Robert C. Byrd Green Bank Telescope, undertaken as part of the Breakthrough Listen Initiative search for extraterrestrial intelligence. Observations were made over 1.1-1.9 GHz (L band), with three sets of five-minute observations of the 692 primary targets, interspersed with five-minute observations of secondary targets. By comparing the “ON” and “OFF” observations, we are able to identify terrestrial interference and place limits on the presence of engineered signals from putative extraterrestrial civilizations inhabiting the environs of the target stars. During the analysis, 11 events passed our thresholding algorithm, but a detailed analysis of their properties indicates that they are consistent with known examples of anthropogenic radio-frequency interference. We conclude that, at the time of our observations, none of the observed systems host high-duty-cycle radio transmitters emitting between 1.1 and 1.9 GHz with an Equivalent Isotropic Radiated Power of ~10^13 W, which is readily achievable by our own civilization. Our results suggest that fewer than ~0.1% of the stellar systems within 50 pc possess the type of transmitters searched in this survey.
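
    For context, EIRP limits of the kind quoted above follow from the survey's minimum detectable flux and the distance to each target; a standard form of the relation for a narrowband signal (not this paper's exact sensitivity calculation) is:

```latex
% Minimum detectable equivalent isotropic radiated power for a target at distance d,
% given the minimum detectable flux density S_min over the signal bandwidth \delta\nu
\mathrm{EIRP}_{\min} = 4\pi d^{2}\, S_{\min}\, \delta\nu
```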

  5. Extended-Search, Bézier Curve-Based Lane Detection and Reconstruction System for an Intelligent Vehicle

    Directory of Open Access Journals (Sweden)

    Xiaoyun Huang

    2015-09-01

    Full Text Available To improve the real-time performance and detection rate of a Lane Detection and Reconstruction (LDR) system, an extended-search-based lane detection method and a Bézier curve-based lane reconstruction algorithm are proposed in this paper. The extended-search-based lane detection method is designed to search boundary blocks from the initial position, in an upwards direction and along the lane, with small search areas including continuous search, discontinuous search and bending search in order to detect different lane boundaries. The Bézier curve-based lane reconstruction algorithm is employed to describe a wide range of lane boundary forms with comparatively simple expressions. In addition, two Bézier curves are adopted to reconstruct the lanes' outer boundaries with large curvature variation. The lane detection and reconstruction algorithm, including initial block determination, extended search, binarization processing and lane boundary fitting in different scenarios, is verified in road tests. The results show that this algorithm is robust against different shadows and illumination variations; the average processing time per frame is 13 ms. Significantly, it achieves a high detection rate of 88.6% on curved lanes with large or variable curvatures, where the accident rate is higher than that of straight lanes.
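
    As a rough illustration of the curve model named above (not the paper's own fitting procedure), the following minimal Python sketch evaluates a cubic Bézier curve from four control points; the example control points are hypothetical image coordinates.

        import numpy as np

        def cubic_bezier(p0, p1, p2, p3, n=50):
            # Sample n points on the cubic Bezier curve defined by four control points.
            # In a lane-reconstruction setting the control points would be fitted to
            # detected boundary blocks; here they are arbitrary.
            t = np.linspace(0.0, 1.0, n)[:, None]
            p0, p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p0, p1, p2, p3))
            return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
                    + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

        # Hypothetical control points for a gently curving lane boundary, in pixels.
        curve = cubic_bezier((400, 480), (390, 360), (370, 240), (340, 120))
        print(curve[:3])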

  6. Research on intelligent machine self-perception method based on LSTM

    Science.gov (United States)

    Wang, Qiang; Cheng, Tao

    2018-05-01

    In this paper, we use the advantages of LSTM in feature extraction and in processing high-dimensional, complex nonlinear data, and apply it to the autonomous perception of intelligent machines. Compared with the traditional multi-layer neural network, this model has memory and can handle time-series information of any length. Since the multi-physical-domain signals of processing machines have a certain timing relationship, and there is a contextual relationship between successive states, using this deep learning method to realize the self-perception of intelligent processing machines gives strong versatility and adaptability. The experimental results show that the method proposed in this paper can obviously improve the sensing accuracy under various working conditions of the intelligent machine, and also show that the algorithm can well support the intelligent processing machine in realizing self-perception.

  7. Implementation Of Haversine Formula And Best First Search Method In Searching Of Tsunami Evacuation Route

    Science.gov (United States)

    Anisya; Yoga Swara, Ganda

    2017-12-01

    Padang is one of the cities prone to earthquake and tsunami disasters due to its position at the meeting of two active plates, which is a source of potentially powerful earthquakes and tsunamis. The central government and most offices are located in the red zone (vulnerable area), which will also affect the evacuation of the population during an earthquake and tsunami disaster. In this study, the researchers produced a system for searching for the nearest shelter using the best-first-search method. This method uses a heuristic function that combines the cost incurred so far with an estimate based on travel time, path length and population density. To calculate the length of the path, the researchers used the haversine formula. The values obtained from the calculation process are implemented in a web-based system. Some alternative paths and some of the closest shelters will be displayed in the system.
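
    The haversine formula cited above is standard; the short Python sketch below computes the great-circle distance between two coordinates. The example points use approximate Padang coordinates for illustration only and are not data from the study.

        from math import radians, sin, cos, asin, sqrt

        def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
            # Great-circle distance in kilometres between two (latitude, longitude) points.
            phi1, phi2 = radians(lat1), radians(lat2)
            dphi = radians(lat2 - lat1)
            dlam = radians(lon2 - lon1)
            a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
            return 2 * radius_km * asin(sqrt(a))

        # Approximate coordinates of two points in Padang (illustrative only).
        print(haversine_km(-0.95, 100.35, -0.93, 100.37))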

  8. Database in Artificial Intelligence.

    Science.gov (United States)

    Wilkinson, Julia

    1986-01-01

    Describes a specialist bibliographic database of literature in the field of artificial intelligence created by the Turing Institute (Glasgow, Scotland) using the BRS/Search information retrieval software. The subscription method for end-users--i.e., an annual fee entitles the user to unlimited access to the database, document provision, and printed awareness…

  9. The commission errors search and assessment (CESA) method

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B.; Dang, V. N.

    2007-05-15

    Errors of Commission (EOCs) refer to the performance of inappropriate actions that aggravate a situation. In Probabilistic Safety Assessment (PSA) terms, they are human failure events that result from the performance of an action. This report presents the Commission Errors Search and Assessment (CESA) method and describes the method in the form of user guidance. The purpose of the method is to identify risk-significant situations with a potential for EOCs in a predictive analysis. The main idea underlying the CESA method is to catalog the key actions that are required in the procedural response to plant events and to identify specific scenarios in which these candidate actions could erroneously appear to be required. The catalog of required actions provides a basis for a systematic search of context-action combinations. To focus the search towards risk-significant scenarios, the actions that are examined in the CESA search are prioritized according to the importance of the systems and functions that are affected by these actions. The existing PSA provides this importance information; the Risk Achievement Worth or Risk Increase Factor values indicate the systems/functions for which an EOC contribution would be more significant. In addition, the contexts, i.e. PSA scenarios, for which the EOC opportunities are reviewed are also prioritized according to their importance (top sequences or cut sets). The search through these context-action combinations results in a set of EOC situations to be examined in detail. CESA has been applied in a plant-specific pilot study, which showed the method to be feasible and effective in identifying plausible EOC opportunities. This experience, as well as the experience with other EOC analyses, showed that the quantification of EOCs remains an issue. The quantification difficulties and the outlook for their resolution conclude the report. (author)

  10. The commission errors search and assessment (CESA) method

    International Nuclear Information System (INIS)

    Reer, B.; Dang, V. N.

    2007-05-01

    Errors of Commission (EOCs) refer to the performance of inappropriate actions that aggravate a situation. In Probabilistic Safety Assessment (PSA) terms, they are human failure events that result from the performance of an action. This report presents the Commission Errors Search and Assessment (CESA) method and describes the method in the form of user guidance. The purpose of the method is to identify risk-significant situations with a potential for EOCs in a predictive analysis. The main idea underlying the CESA method is to catalog the key actions that are required in the procedural response to plant events and to identify specific scenarios in which these candidate actions could erroneously appear to be required. The catalog of required actions provides a basis for a systematic search of context-action combinations. To focus the search towards risk-significant scenarios, the actions that are examined in the CESA search are prioritized according to the importance of the systems and functions that are affected by these actions. The existing PSA provides this importance information; the Risk Achievement Worth or Risk Increase Factor values indicate the systems/functions for which an EOC contribution would be more significant. In addition, the contexts, i.e. PSA scenarios, for which the EOC opportunities are reviewed are also prioritized according to their importance (top sequences or cut sets). The search through these context-action combinations results in a set of EOC situations to be examined in detail. CESA has been applied in a plant-specific pilot study, which showed the method to be feasible and effective in identifying plausible EOC opportunities. This experience, as well as the experience with other EOC analyses, showed that the quantification of EOCs remains an issue. The quantification difficulties and the outlook for their resolution conclude the report. (author)

  11. Intelligence Ethics:

    DEFF Research Database (Denmark)

    Rønn, Kira Vrist

    2016-01-01

    Questions concerning what constitutes a morally justified conduct of intelligence activities have received increased attention in recent decades. However, intelligence ethics is not yet homogeneous or embedded as a solid research field. The aim of this article is to sketch the state of the art of intelligence ethics and point out subjects for further scrutiny in future research. The review clusters the literature on intelligence ethics into two groups: respectively, contributions on external topics (i.e., the accountability of and the public trust in intelligence agencies) and internal topics (i.e., the search for an ideal ethical framework for intelligence actions). The article concludes that there are many holes to fill for future studies on intelligence ethics, both in external and internal discussions. Thus, the article is an invitation – especially to moral philosophers and political theorists…

  12. Study on boundary search method for DFM mesh generation

    Directory of Open Access Journals (Sweden)

    Li Ri

    2012-08-01

    Full Text Available The boundary mesh of the casting model was determined by direct calculation on the triangular facets extracted from the STL file of the 3D model. Then the inner and outer grids of the model were identified by an algorithm which we named the Inner Seed Grid Method. Finally, a program to automatically generate a 3D FDM mesh was compiled. In the paper, a method named the Triangle Contraction Search Method (TCSM) was put forward to ensure that no boundary grids are lost, and an algorithm to search inner seed grids to identify the inner/outer grids of the casting model was also brought forward. Our algorithm is simple, clear and easy to implement as a program. Three casting mesh generation examples verified the validity of the program.

  13. Artificial Intelligence Mechanisms on Interactive Modified Simplex Method with Desirability Function for Optimising Surface Lapping Process

    Directory of Open Access Journals (Sweden)

    Pongchanun Luangpaiboon

    2014-01-01

    Full Text Available A study has been made to optimise the influential parameters of the surface lapping process. Lapping time, lapping speed, downward pressure, and charging pressure were chosen from the preliminary studies as parameters to determine process performance in terms of material removal, lap width, and clamp force. Desirability functions of the nominal-the-best type were used to compromise multiple responses into the overall desirability function level, or D response. The conventional modified simplex or Nelder-Mead simplex method and the interactive desirability function are performed to optimise the parameter levels online in order to maximise the D response. In order to determine the lapping process parameters effectively, this research then applies two powerful artificial intelligence optimisation mechanisms from the harmony search and firefly algorithms. The recommended condition of (lapping time, lapping speed, downward pressure, and charging pressure) at (33, 35, 6.0, and 5.0) has been verified by performing confirmation experiments. It showed that the D response level increased to 0.96. When compared with the current operating condition, there is a decrease in material removal and lap width, with improved process performance indices of 2.01 and 1.14, respectively. Similarly, there is an increase in clamp force, with an improved process performance index of 1.58.

  14. The Use of Resistivity Methods in Terrestrial Forensic Searches

    Science.gov (United States)

    Wolf, R. C.; Raisuddin, I.; Bank, C.

    2013-12-01

    The increasing use of near-surface geophysical methods in forensic searches has demonstrated the need for further studies to identify the ideal physical, environmental and temporal settings for each geophysical method. Previous studies using resistivity methods have shown promising results, but additional work is required to more accurately interpret and analyze survey findings. The Ontario Provincial Police's UCRT (Urban Search and Rescue; Chemical, Biological, Radiological, Nuclear and Explosives; Response Team) is collaborating with the University of Toronto and two additional universities in a multi-year study investigating the applications of near-surface geophysical methods to terrestrial forensic searches. In the summer of 2012, on a test site near Bolton, Ontario, the OPP buried weapons, drums and pigs (naked, tarped, and clothed) to simulate clandestine graves and caches. Our study aims to conduct repeat surveys using an IRIS Syscal Junior resistivity meter with a 48-electrode switching system. These surveys will monitor changes in resistivity reflecting decomposition of the object since burial, and identify the strengths and weaknesses of resistivity when used in a rural, clandestine burial setting. Our initial findings indicate the usefulness of this method, as prominent resistivity changes have been observed. We anticipate our results will help to assist law enforcement agencies in determining the type of resistivity results to expect based on time since burial, depth of burial and state of dress of the body.

  15. Modeling an Optical and Infrared Search for Extraterrestrial Intelligence Survey with Exoplanet Direct Imaging

    Science.gov (United States)

    Vides, Christina; Macintosh, Bruce; Ruffio, Jean-Baptiste; Nielsen, Eric; Povich, Matthew Samuel

    2018-01-01

    The Gemini Planet Imager (GPI) is a direct high-contrast imaging instrument coupled to the Gemini South Telescope. Its purpose is to image extrasolar planets around young stars. As part of an optical and infrared SETI (Search for Extraterrestrial Intelligence) survey study, we modeled GPI's capability to detect an extraterrestrial continuous wave (CW) laser broadcast within the H band. By using the sensitivity evaluated for actual GPI observations of young target stars, we produced models of the CW laser power, as a function of distance from the star, that could be detected if GPI were to observe nearby (~3-5 pc) planet-hosting G-type stars. We took a variety of transmitters into consideration in producing these modeled values. GPI is known to be sensitive to both pulsed and CW coherent electromagnetic radiation. The results were compared to similar studies, and it was found that these values are competitive with other optical and infrared observations.

  16. AM: An Artificial Intelligence Approach to Discovery in Mathematics as Heuristic Search

    Science.gov (United States)

    1976-07-01

    …deficiency. The idea of "Intuitions" facets was a flop. Intuitions were meant to model reality, at least little pieces of it, so that AM could… Check examples of Single-ADD, because many examples have recently been found, but not yet…

  17. Exploration of Stellarator Configuration Space with Global Search Methods

    International Nuclear Information System (INIS)

    Mynick, H.E.; Pomphrey, N.; Ethier, S.

    2001-01-01

    An exploration of stellarator configuration space z for quasi-axisymmetric stellarator (QAS) designs is discussed, using methods which provide a more global view of that space. To this end, we have implemented a "differential evolution" (DE) search algorithm in an existing stellarator optimizer, which is much less prone to become trapped in local, suboptimal minima of the cost function chi than the local search methods used previously. This search algorithm is complemented by mapping studies of chi over z aimed at gaining insight into the results of the automated searches. We find that a wide range of the attractive QAS configurations previously found fall into a small number of classes, with each class corresponding to a basin of chi(z). We develop maps on which these earlier stellarators can be placed, the relations among them seen, and understanding gained into the physics differences between them. It is also found that, while still large, the region of z space containing practically realizable QAS configurations is much smaller than earlier supposed.

  18. Comparison of two solution ways of district heating control: Using analysis methods, using artificial intelligence methods

    Energy Technology Data Exchange (ETDEWEB)

    Balate, J.; Sysala, T. [Technical Univ., Zlin (Czech Republic). Dept. of Automation and Control Technology

    1997-12-31

    District Heating Systems - DHS (Centralized Heat Supply Systems - CHSS) are being developed in large cities in accordance with their growth. The systems are formed by enlarging the networks that distribute heat to consumers and, at the same time, by gradually interconnecting the heat sources that are built. Heat is distributed to the consumers through circular networks supplied by several cooperating heat sources, that is, by combined power and heating plants and by heating plants. The complicated process of heat production technology and supply requires a systems approach when designing the concept of automated control. The paper deals with a comparison of the solution using analysis methods and the solution using artificial intelligence methods. (orig.)

  19. A novel optimization method, Gravitational Search Algorithm (GSA), for PWR core optimization

    International Nuclear Information System (INIS)

    Mahmoudi, S.M.; Aghaie, M.; Bahonar, M.; Poursalehi, N.

    2016-01-01

    Highlights: • The Gravitational Search Algorithm (GSA) is introduced. • The advantage of GSA is verified on Shekel's Foxholes. • Reload optimization for WWER-1000 and WWER-440 cases is performed. • Maximizing K_eff, minimizing PPFs and flattening the power density are considered. - Abstract: In-core fuel management optimization (ICFMO) is one of the most challenging concepts of nuclear engineering. In recent decades several meta-heuristic algorithms or computational intelligence methods have been applied to optimize the reactor core loading pattern. This paper presents a new method of using the Gravitational Search Algorithm (GSA) for in-core fuel management optimization. The GSA is constructed based on the law of gravity and the notion of mass interactions. It uses the theory of Newtonian physics, and the searcher agents are a collection of masses. In this work, at the first step, the GSA method is compared with other meta-heuristic algorithms on Shekel's Foxholes problem. In the second step, for finding the best core, the GSA algorithm has been performed for three PWR test cases including WWER-1000 and WWER-440 reactors. In these cases, multi-objective optimizations with the following goals are considered: increasing the multiplication factor (K_eff), decreasing the power peaking factor (PPF) and flattening the power density. It is notable that for the neutronic calculation, the PARCS (Purdue Advanced Reactor Core Simulator) code is used. The results demonstrate that the GSA algorithm has promising performance and could be proposed for other optimization problems in the nuclear engineering field.
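
    For readers unfamiliar with GSA, the following simplified Python sketch shows the gravity-and-mass update loop on a generic continuous test function; it omits refinements such as the Kbest schedule and does not reproduce the reactor loading-pattern encoding used in the paper.

        import numpy as np

        def gsa(objective, bounds, n_agents=30, iters=200, g0=100.0, alpha=20.0, seed=0):
            # Simplified Gravitational Search Algorithm for minimisation.
            rng = np.random.default_rng(seed)
            lo, hi = np.array(bounds, dtype=float).T
            dim = len(bounds)
            x = rng.uniform(lo, hi, size=(n_agents, dim))   # agent positions (masses)
            v = np.zeros_like(x)                            # agent velocities
            for t in range(iters):
                fit = np.array([objective(a) for a in x])
                best, worst = fit.min(), fit.max()
                m = (worst - fit) / (worst - best + 1e-12)  # better fitness -> heavier mass
                m = m / (m.sum() + 1e-12)
                g = g0 * np.exp(-alpha * t / iters)         # decaying gravitational constant
                acc = np.zeros_like(x)
                for i in range(n_agents):
                    for j in range(n_agents):
                        if i != j:
                            diff = x[j] - x[i]
                            dist = np.linalg.norm(diff) + 1e-12
                            acc[i] += rng.random() * g * m[j] * diff / dist
                v = rng.random(x.shape) * v + acc           # stochastic velocity update
                x = np.clip(x + v, lo, hi)
            best_idx = int(np.argmin([objective(a) for a in x]))
            return x[best_idx]

        # Example: minimise the sphere function in 5 dimensions.
        print(gsa(lambda z: float(np.sum(z ** 2)), [(-5.0, 5.0)] * 5))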

  20. Assessment of the effectiveness of uranium deposit searching methods

    International Nuclear Information System (INIS)

    Suran, J.

    1998-01-01

    The following groups of uranium deposit searching methods are described: radiometric review of foreign work; aerial radiometric survey; automobile radiometric survey; emanation survey up to 1 m; emanation survey up to 2 m; ground radiometric survey; radiometric survey in pits; deep radiometric survey; combination of the above methods; and other methods (drilling survey). For vein-type deposits, the majority of Czech deposits were discovered in 1945-1965 by radiometric review of foreign work, automobile radiometric survey, and emanation survey up to 1 m. The first significant indications of sandstone-type uranium deposits were observed in the mid-1960s by aerial radiometric survey and confirmed later by drilling. (P.A.)

  1. Bayesian methods in the search for MH370

    CERN Document Server

    Davey, Sam; Holland, Ian; Rutten, Mark; Williams, Jason

    2016-01-01

    This book demonstrates how nonlinear/non-Gaussian Bayesian time series estimation methods were used to produce a probability distribution of potential MH370 flight paths. It provides details of how the probabilistic models of aircraft flight dynamics, satellite communication system measurements, environmental effects and radar data were constructed and calibrated. The probability distribution was used to define the search zone in the southern Indian Ocean. The book describes particle-filter based numerical calculation of the aircraft flight-path probability distribution and validates the method using data from several of the involved aircraft’s previous flights. Finally it is shown how the Reunion Island flaperon debris find affects the search probability distribution.
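
    As a toy illustration of the particle-filter machinery described here (not the book's flight-dynamics or satellite-measurement models), the following Python sketch runs a bootstrap particle filter on a scalar random-walk state observed in Gaussian noise.

        import numpy as np

        def bootstrap_filter(observations, n_particles=1000, proc_std=1.0, obs_std=2.0, seed=0):
            # Bootstrap particle filter for a scalar random-walk state with Gaussian noise.
            rng = np.random.default_rng(seed)
            particles = rng.normal(0.0, 10.0, n_particles)                       # diffuse prior
            estimates = []
            for y in observations:
                particles = particles + rng.normal(0.0, proc_std, n_particles)   # propagate
                w = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)              # likelihood weights
                w = w / w.sum()
                idx = rng.choice(n_particles, size=n_particles, p=w)             # resample
                particles = particles[idx]
                estimates.append(particles.mean())                               # posterior mean
            return np.array(estimates)

        # Synthetic data: a hidden random walk observed in noise.
        rng = np.random.default_rng(1)
        truth = np.cumsum(rng.normal(0.0, 1.0, 50))
        obs = truth + rng.normal(0.0, 2.0, 50)
        print(bootstrap_filter(obs)[-5:])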

  2. [Development and effects of emotional intelligence program for undergraduate nursing students: mixed methods research].

    Science.gov (United States)

    Lee, Oi Sun; Gu, Mee Ock

    2014-12-01

    This study was conducted to develop and test the effects of an emotional intelligence program for undergraduate nursing students. The study used a mixed methods design. Participants were 36 nursing students (intervention group: 17, control group: 19). The emotional intelligence program was provided for 4 weeks (8 sessions, 20 hours). Data were collected between August 6 and October 4, 2013. Quantitative data were analyzed using chi-square, Fisher's exact test, t-test, repeated measures ANOVA, and paired t-test with SPSS/WIN 18.0. Qualitative data were analyzed using content analysis. Quantitative results showed that emotional intelligence, communication skills, resilience, stress coping strategies, and clinical competence were significantly better in the experimental group compared to the control group. According to the qualitative results, the nursing students experienced improvement in emotional intelligence, interpersonal relationships, and empowerment, as well as a reduction in clinical practice stress, after participating in the emotional intelligence program. The study findings indicate that the emotional intelligence program for undergraduate nursing students is effective and can be recommended as an intervention for improving the clinical competence of undergraduate students in a nursing curriculum.

  3. A method of searching LDAP directories using XQuery

    International Nuclear Information System (INIS)

    Hesselroth, Ted

    2011-01-01

    A method by which an LDAP directory can be searched using XQuery is described. The strategy behind the tool consists of four steps. First the XQuery script is examined and relevant XPath expressions are extracted, determined to be sufficient to define all information needed to perform the query. Then the XPath expressions are converted into their equivalent LDAP search filters by use of the published LDAP schema of the service, and search requests are made to the LDAP host. The search results are then merged and converted to an XML document that conforms to the hierarchy of the LDAP schema. Finally, the XQuery script is executed on the working XML document by conventional means. Examples are given of application of the tool in the Open Science Grid, which for discovery purposes operates an LDAP server that contains Glue schema-based information on site configuration and authorization policies. The XQuery scripts compactly replace hundreds of lines of custom python code that relied on the unix ldapsearch utility. Installation of the tool is available through the Virtual Data Toolkit.
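
    The middle steps of the method (issuing the LDAP search and wrapping the results in XML for an XQuery engine) might be sketched in Python as below, here using the ldap3 library. The host, base DN, filter and attribute names are placeholders rather than the actual Open Science Grid configuration, the output XML is flat rather than following the LDAP schema hierarchy as the original tool does, and the XPath-to-filter translation step is not reproduced.

        import xml.etree.ElementTree as ET
        from ldap3 import Server, Connection, ALL

        def ldap_results_to_xml(host, base_dn, ldap_filter, attributes):
            # Run an LDAP search and wrap the entries in a flat XML document
            # that an XQuery engine could then process.
            conn = Connection(Server(host, get_info=ALL), auto_bind=True)  # anonymous bind
            conn.search(base_dn, ldap_filter, attributes=attributes)
            root = ET.Element("results")
            for entry in conn.entries:
                node = ET.SubElement(root, "entry", dn=entry.entry_dn)
                for name, values in entry.entry_attributes_as_dict.items():
                    for value in values:
                        ET.SubElement(node, name).text = str(value)
            return ET.tostring(root, encoding="unicode")

        # Hypothetical usage against a Glue-schema information service:
        # print(ldap_results_to_xml("ldap.example.org", "o=grid",
        #                           "(objectClass=GlueSite)", ["GlueSiteName"]))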

  4. Cumulative query method for influenza surveillance using search engine data.

    Science.gov (United States)

    Seo, Dong-Woo; Jo, Min-Woo; Sohn, Chang Hwan; Shin, Soo-Yong; Lee, JaeHo; Yu, Maengsoo; Kim, Won Young; Lim, Kyoung Soo; Lee, Sang-Il

    2014-12-16

    Internet search queries have become an important data source in syndromic surveillance systems. However, there is currently no syndromic surveillance system using Internet search query data in South Korea. The objective of this study was to examine correlations between our cumulative query method and national influenza surveillance data. Our study was based on the local search engine, Daum (approximately 25% market share), and influenza-like illness (ILI) data from the Korea Centers for Disease Control and Prevention. A quota sampling survey was conducted with 200 participants to obtain popular queries. We divided the study period into two sets: Set 1 (the 2009/10 epidemiological year for development set 1 and 2010/11 for validation set 1) and Set 2 (2010/11 for development set 2 and 2011/12 for validation set 2). Pearson's correlation coefficients were calculated between the Daum data and the ILI data for the development set. We selected the combined queries for which the correlation coefficients were .7 or higher and listed them in descending order. Then, we created cumulative query methods, with n representing the number of cumulative combined queries in descending order of the correlation coefficient. In validation set 1, 13 cumulative query methods were applied, and 8 had higher correlation coefficients (min=.916, max=.943) than that of the highest single combined query. Further, 11 of 13 cumulative query methods had an r value of ≥.7, whereas 4 of 13 combined queries had an r value of ≥.7. In validation set 2, 8 of 15 cumulative query methods showed higher correlation coefficients (min=.975, max=.987) than that of the highest single combined query. All 15 cumulative query methods had an r value of ≥.7, whereas 6 of 15 combined queries had an r value of ≥.7. The cumulative query method showed relatively higher correlations with national influenza surveillance data than the combined queries in both the development and validation sets.
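
    A rough sketch of the cumulative-query idea follows: candidate query series are ranked by Pearson correlation with the ILI series, the top-n series are combined (here by simple averaging, which is an assumption about the exact combination rule), and the correlation of each cumulative series is reported. All data in the example are synthetic.

        import numpy as np

        def pearson(a, b):
            # Pearson correlation coefficient between two equal-length series.
            return float(np.corrcoef(np.asarray(a, float), np.asarray(b, float))[0, 1])

        def cumulative_query_correlations(query_series, ili):
            # query_series: dict mapping query name -> weekly counts aligned with ili.
            ranked = sorted(query_series, key=lambda q: pearson(query_series[q], ili), reverse=True)
            results = []
            for n in range(1, len(ranked) + 1):
                combined = np.mean([query_series[q] for q in ranked[:n]], axis=0)  # cumulative series
                results.append((n, pearson(combined, ili)))
            return results

        # Synthetic example: three candidate query series and a mock ILI series.
        rng = np.random.default_rng(0)
        ili = np.sin(np.linspace(0, 6, 52)) + 1.5
        queries = {"q%d" % i: ili + rng.normal(0, 0.3 + 0.2 * i, 52) for i in range(3)}
        for n, r in cumulative_query_correlations(queries, ili):
            print(n, round(r, 3))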

  5. Searching for a neurologic injury's Wechsler Adult Intelligence Scale-Third Edition profile.

    Science.gov (United States)

    Gonçalves, Marta A; Moura, Octávio; Castro-Caldas, Alexandre; Simões, Mário R

    2017-01-01

    This study aimed to investigate the presence of a Wechsler Adult Intelligence Scale-Third Edition (WAIS-III) cognitive profile in a Portuguese neurologically injured sample. The Portuguese WAIS-III was administered to 81 mixed neurologic patients and 81 healthy matched controls selected from the Portuguese standardization sample. Although the mixed neurologic injury group performed significantly lower than the healthy controls on the majority of the WAIS-III scores (i.e., composite measures, discrepancies, and subtests), the mean scores were within the normal range and, therefore, at risk of being unobserved in a clinical evaluation. ROC curve analysis showed poor to acceptable diagnostic accuracy for the WAIS-III composite measures and subtests (the Working Memory Index and Digit Span revealed the highest accuracy for discriminating between participants, respectively). Multiple regression analysis showed that both literacy and the presence of brain injury were significant predictors for all of the composite measures. In addition, multiple regression analysis also showed that literacy, age of injury onset, and years of survival predicted all seven composite measures for the mixed neurologic injury group. Despite the failure to find a WAIS-III cognitive profile for mixed neurologic patients, the results showed a significant influence of brain lesion and literacy on WAIS-III performance.

  6. Methods and models for quantitative assessment of speech intelligibility in cross-language communication

    NARCIS (Netherlands)

    Wijngaarden, S.J. van; Steeneken, H.J.M.; Houtgast, T.

    2001-01-01

    To deal with the effects of nonnative speech communication on speech intelligibility, one must know the magnitude of these effects. To measure this magnitude, suitable test methods must be available. Many of the methods used in cross-language speech communication research are not very suitable for

  7. Artificial Intelligence in Civil Engineering

    Directory of Open Access Journals (Sweden)

    Pengzhen Lu

    2012-01-01

    Full Text Available Artificial intelligence is a branch of computer science involved in the research, design, and application of intelligent computer systems. Traditional methods for modeling and optimizing complex structural systems require huge amounts of computing resources, and artificial-intelligence-based solutions can often provide valuable alternatives for efficiently solving problems in civil engineering. This paper summarizes recently developed methods and theories in the development of artificial intelligence applications in civil engineering, including evolutionary computation, neural networks, fuzzy systems, expert systems, reasoning, classification, and learning, as well as others like chaos theory, cuckoo search, firefly algorithm, knowledge-based engineering, and simulated annealing. The main research trends are also pointed out at the end. The paper provides an overview of the advances of artificial intelligence applied in civil engineering.

  8. On construction method of shipborne and airborne radar intelligence and related equipment knowledge graph

    Science.gov (United States)

    Hao, Ruizhe; Huang, Jian

    2017-08-01

    Knowledge graph construction in the military intelligence domain is sprouting but technically immature. This paper presents a method to construct a heterogeneous knowledge graph in the field of shipborne and airborne radar intelligence and related equipment. Based on expert knowledge and up-to-date Internet open-source information, we construct the knowledge graphs of radar characteristic information and of the related equipment, respectively, and establish relationships between the two graphs, providing a pipeline and method for intelligence organization and management in the context of crowded battlefield big data.

  9. An Intelligent Optical Dissolved Oxygen Measurement Method Based on a Fluorescent Quenching Mechanism.

    Science.gov (United States)

    Li, Fengmei; Wei, Yaoguang; Chen, Yingyi; Li, Daoliang; Zhang, Xu

    2015-12-09

    Dissolved oxygen (DO) is a key factor that influences the healthy growth of fishes in aquaculture. The DO content changes with the aquatic environment and should therefore be monitored online. However, traditional measurement methods, such as iodometry and other chemical analysis methods, are not suitable for online monitoring, and the Clark method is not stable enough for extended periods of monitoring. To solve these problems, this paper proposes an intelligent DO measurement method based on the fluorescence quenching mechanism. The measurement system is composed of fluorescent quenching detection, signal conditioning, intelligent processing, and power supply modules. The optical probe adopts the fluorescent quenching mechanism to detect the DO content and avoids the problem that traditional chemical methods are easily influenced by the environment. The optical probe contains a thermistor and dual excitation sources to isolate visible parasitic light and execute a compensation strategy. The intelligent processing module adopts the IEEE 1451.2 standard and realizes intelligent compensation. Experimental results show that the optical measurement method is stable, accurate, and suitable for online DO monitoring in aquaculture applications.

  10. An Intelligent Optical Dissolved Oxygen Measurement Method Based on a Fluorescent Quenching Mechanism

    Directory of Open Access Journals (Sweden)

    Fengmei Li

    2015-12-01

    Full Text Available Dissolved oxygen (DO) is a key factor that influences the healthy growth of fishes in aquaculture. The DO content changes with the aquatic environment and should therefore be monitored online. However, traditional measurement methods, such as iodometry and other chemical analysis methods, are not suitable for online monitoring, and the Clark method is not stable enough for extended periods of monitoring. To solve these problems, this paper proposes an intelligent DO measurement method based on the fluorescence quenching mechanism. The measurement system is composed of fluorescent quenching detection, signal conditioning, intelligent processing, and power supply modules. The optical probe adopts the fluorescent quenching mechanism to detect the DO content and avoids the problem that traditional chemical methods are easily influenced by the environment. The optical probe contains a thermistor and dual excitation sources to isolate visible parasitic light and execute a compensation strategy. The intelligent processing module adopts the IEEE 1451.2 standard and realizes intelligent compensation. Experimental results show that the optical measurement method is stable, accurate, and suitable for online DO monitoring in aquaculture applications.

  11. SIGNAL - Search for Intelligence in the Galactic Nucleus with the Array of the Lowlands

    International Nuclear Information System (INIS)

    Shostak, G.S.; Tarter, J.

    1982-01-01

    In August 1981, the Westerbork Synthesis Radio Telescope was used for 4 hours to search for narrow-band radio beacons in the direction of the Galactic Center. By using both the spatial discrimination and the temporal stability available to an interferometric measurement, weak intermittent signals can be detected even in the face of the strong, naturally occurring radiation from this region. A radio beacon within the bandwidth used here, centered on the 21-cm neutral hydrogen line, would be recognizable if it had a repetition period between 40 sec and several hours. The rms sensitivity to point sources is approximately 20 mJy, many orders of magnitude better than typical sensitivities achieved by scanning single-dish instruments.

  12. A Fast Radio Burst Search Method for VLBI Observation

    Science.gov (United States)

    Liu, Lei; Tong, Fengxian; Zheng, Weimin; Zhang, Juan; Tong, Li

    2018-02-01

    We introduce a cross-spectrum-based fast radio burst (FRB) search method for Very Long Baseline Interferometry (VLBI) observations. This method optimizes the fringe fitting scheme in geodetic VLBI data post-processing, fully utilizing the cross-spectrum fringe phase information and therefore maximizing the power of single-pulse signals. Working with the cross-spectrum greatly reduces the effect of radio frequency interference compared with using the auto-power spectrum. Single-pulse detection confidence increases by cross-identifying detections from multiple baselines. By combining the power of multiple baselines, we may improve the detection sensitivity. Our method is similar to coherent beam forming, but without the computational expense of forming a great number of beams to cover the whole field of view of our telescopes. The data processing pipeline designed for this method is easy to implement and parallelize, and can be deployed in various kinds of VLBI observations. In particular, we point out that VGOS observations are very suitable for FRB searches.

  13. Intelligent Evaluation Method of Tank Bottom Corrosion Status Based on Improved BP Artificial Neural Network

    Science.gov (United States)

    Qiu, Feng; Dai, Guang; Zhang, Ying

    According to the acoustic emission information and the appearance inspection information from tank bottom online testing, the external factors associated with tank bottom corrosion status are confirmed. Applying an artificial neural network intelligent evaluation method, three tank bottom corrosion status evaluation models based on appearance inspection information, acoustic emission information, and online testing information are established. Compared with the result of acoustic emission online testing, the evaluation of the test sample shows that the accuracy of the evaluation model based on online testing information is 94%. The evaluation model can evaluate tank bottom corrosion accurately and realize intelligent evaluation of acoustic emission online testing of the tank bottom.

  14. Survey of artificial intelligence methods for detection and identification of component faults in nuclear power plants

    International Nuclear Information System (INIS)

    Reifman, J.

    1997-01-01

    A comprehensive survey of computer-based systems that apply artificial intelligence methods to detect and identify component faults in nuclear power plants is presented. Classification criteria are established that categorize artificial intelligence diagnostic systems according to the types of computing approaches used (e.g., computing tools, computer languages, and shell and simulation programs), the types of methodologies employed (e.g., types of knowledge, reasoning and inference mechanisms, and diagnostic approach), and the scope of the system. The major issues of process diagnostics and computer-based diagnostic systems are identified and cross-correlated with the various categories used for classification. Ninety-five publications are reviewed

  15. IMPROVING NEAREST NEIGHBOUR SEARCH IN 3D SPATIAL ACCESS METHOD

    Directory of Open Access Journals (Sweden)

    A. Suhaibaha

    2016-10-01

    Full Text Available Nearest Neighbour (NN) search is one of the important queries and analyses for spatial applications. In normal practice, a spatial access method structure is used during Nearest Neighbour query execution to retrieve information from the database. However, most spatial access method structures still face unresolved issues such as overlapping among nodes and repetitive data entries. This situation leads to excessive Input/Output (IO) operations, which is inefficient for data retrieval. The situation becomes more critical when dealing with 3D data. The size of 3D data is usually large due to its detailed geometry and other attached information. In this research, a clustered 3D hierarchical structure is introduced as a 3D spatial access method structure. The structure is expected to improve the retrieval of Nearest Neighbour information for 3D objects. Several tests were performed in answering the single Nearest Neighbour search and the k Nearest Neighbour (kNN) search. The tests indicate that the clustered hierarchical structure is efficient in handling Nearest Neighbour queries compared to its competitor. From the results, the clustered hierarchical structure reduced the repetitive data entries and the number of accessed pages. The proposed structure also produced minimal Input/Output operations. The query response time also outperformed that of the competitor. As a future outlook of this research, several possible applications are discussed and summarized.
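
    For context, the query types being optimised (single NN and kNN over 3D points) can be illustrated with a generic kd-tree from SciPy; this is not the clustered hierarchical structure proposed in the paper, and the point data below are synthetic.

        import numpy as np
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(0)
        points = rng.uniform(0.0, 100.0, size=(10000, 3))   # synthetic 3D object centroids
        tree = cKDTree(points)                              # generic spatial access structure

        query = np.array([50.0, 50.0, 50.0])
        dist1, idx1 = tree.query(query, k=1)                # single nearest neighbour
        distk, idxk = tree.query(query, k=5)                # k nearest neighbours (kNN)
        print(idx1, idxk)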

  16. New procedure for criticality search using coarse mesh nodal methods

    International Nuclear Information System (INIS)

    Pereira, Wanderson F.; Silva, Fernando C. da; Martinez, Aquilino S.

    2011-01-01

    The coarse mesh nodal methods have as their primary goal the calculation of the neutron flux inside the reactor core. Many computer systems use a specific form of calculation, which is called a nodal method. In classical computing systems, the criticality search is made after the complete convergence of the iterative process of calculating the neutron flux. In this paper, we propose a new method for the criticality calculation, in which the search is carried out during the iterative process of calculating the neutron flux. Thus, the processing time for calculating the neutron flux was reduced by half compared with the procedure developed by the Nuclear Engineering Program of COPPE/UFRJ (PEN/COPPE/UFRJ). (author)

  17. New procedure for criticality search using coarse mesh nodal methods

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Wanderson F.; Silva, Fernando C. da; Martinez, Aquilino S., E-mail: wneto@con.ufrj.b, E-mail: fernando@con.ufrj.b, E-mail: Aquilino@lmp.ufrj.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear

    2011-07-01

    The coarse mesh nodal methods have as their primary goal the calculation of the neutron flux inside the reactor core. Many computer systems use a specific form of calculation, which is called a nodal method. In classical computing systems, the criticality search is made after the complete convergence of the iterative process of calculating the neutron flux. In this paper, we propose a new method for the criticality calculation, in which the search is carried out during the iterative process of calculating the neutron flux. Thus, the processing time for calculating the neutron flux was reduced by half compared with the procedure developed by the Nuclear Engineering Program of COPPE/UFRJ (PEN/COPPE/UFRJ). (author)

  18. The Breakthrough Listen Search for Intelligent Life: Data Calibration using Pulsars

    Science.gov (United States)

    Brinkman-Traverse, Casey Lynn; Gajjar, Vishal; BSRC

    2018-01-01

    The ability to distinguish ET signals requires a deep understanding of the radio telescopes with which we search; therefore, before observing stars of interest, the Breakthrough Listen scientists at the Berkeley SETI Research Center first observe a pulsar with well-documented flux and polarization properties. Calibrating the flux and polarization by hand is a lengthy process, so we produced a pipeline code that automatically calibrates the pulsar in under an hour. Using PSRCHIVE, the code coherently dedisperses the pulsed radio signals and then calibrates the flux using observation files with a noise diode turning on and off. The code was developed using PSR B1937+21 and is primarily used on PSR B0329+54. This will expedite the process of assessing the quality of data collected from the Green Bank Telescope in West Virginia and will allow us to more efficiently find life beyond planet Earth. Additionally, the stability of the B0329+54 calibration data will allow us to analyze data taken on FRBs with confidence in their cosmic origin.

  19. Hooke–Jeeves Method-used Local Search in a Hybrid Global Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    V. D. Sulimov

    2014-01-01

    Full Text Available Modern methods for the optimization investigation of complex systems are based on developing and updating mathematical models of the systems by solving the appropriate inverse problems. The input data desirable for the solution are obtained from the analysis of experimentally determined characteristics of a system or a process. The sought causal characteristics include the equation coefficients of the mathematical models of the object, boundary conditions, etc. The optimization approach is one of the main ways to solve inverse problems. In the general case it is necessary to find the global extremum of a criterion function that is not everywhere differentiable. Global optimization methods are widely used in problems of identification and computational diagnosis of systems, as well as in optimal control, computed tomography, image restoration, training neural networks, and other intelligent technologies. The increasing complexity of the systems optimized over the last decades leads to more complicated mathematical models, thereby making the solution of the corresponding extreme problems significantly more difficult. In a great number of practical applications the problem conditions can restrict modeling. As a consequence, in inverse problems the criterion functions can be noisy and not everywhere differentiable. The presence of noise means that calculating the derivatives is difficult and unreliable, which motivates the use of optimization methods that do not require derivatives. The efficiency of deterministic global optimization algorithms is significantly restricted by their dependence on the dimension of the extreme problem. When the number of variables is large, stochastic global optimization algorithms are used. As stochastic algorithms can yield expensive solutions, this drawback restricts their applications. Developing hybrid algorithms that combine a stochastic algorithm for scanning the variable space with a deterministic local search
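
    The deterministic local-search component named in the title, the Hooke–Jeeves pattern search, can be sketched compactly in Python as below; this is a generic textbook-style version, not the authors' hybrid implementation.

        import numpy as np

        def hooke_jeeves(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=10000):
            # Pattern search: exploratory moves along each coordinate, followed by a
            # pattern move in the improving direction; no derivatives are required.
            x = np.asarray(x0, dtype=float)
            fx = f(x)
            for _ in range(max_iter):
                base, fbase = x.copy(), fx
                for i in range(x.size):                       # exploratory moves
                    for delta in (step, -step):
                        trial = base.copy()
                        trial[i] += delta
                        ftrial = f(trial)
                        if ftrial < fbase:
                            base, fbase = trial, ftrial
                            break
                if fbase < fx:
                    pattern = base + (base - x)               # pattern move
                    fpat = f(pattern)
                    x, fx = (pattern, fpat) if fpat < fbase else (base, fbase)
                else:
                    step *= shrink                            # no improvement: shrink step
                    if step < tol:
                        break
            return x, fx

        # Example: minimise a non-smooth function of two variables.
        print(hooke_jeeves(lambda z: abs(z[0] - 3.0) + (z[1] + 1.0) ** 2, [0.0, 0.0]))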

  20. Selected business intelligence methods for decision-making support in a finance institution

    OpenAIRE

    Mezera, Filip; Křupka, Jiří

    2017-01-01

    This article deals with the implementation of decision-making support methods in a medium-size financial company with international operations. The objective of this article is to show the ability of these methods to make management decision-making more precise. At the beginning of the article, the existing situation in this business sector in Central Europe is briefly described. After that, Business Intelligence methods are described, as well as the reasons why these methods have been introd...

  1. Modern architectures for intelligent systems: reusable ontologies and problem-solving methods.

    Science.gov (United States)

    Musen, M A

    1998-01-01

    When interest in intelligent systems for clinical medicine soared in the 1970s, workers in medical informatics became particularly attracted to rule-based systems. Although many successful rule-based applications were constructed, development and maintenance of large rule bases remained quite problematic. In the 1980s, an entire industry dedicated to the marketing of tools for creating rule-based systems rose and fell, as workers in medical informatics began to appreciate deeply why knowledge acquisition and maintenance for such systems are difficult problems. During this time period, investigators began to explore alternative programming abstractions that could be used to develop intelligent systems. The notions of "generic tasks" and of reusable problem-solving methods became extremely influential. By the 1990s, academic centers were experimenting with architectures for intelligent systems based on two classes of reusable components: (1) domain-independent problem-solving methods, that is, standard algorithms for automating stereotypical tasks, and (2) domain ontologies that captured the essential concepts (and relationships among those concepts) in particular application areas. This paper will highlight how intelligent systems for diverse tasks can be efficiently automated using these kinds of building blocks. The creation of domain ontologies and problem-solving methods is the fundamental end product of basic research in medical informatics. Consequently, these concepts need more attention by our scientific community.

  2. A novel method for intelligent fault diagnosis of rolling bearings using ensemble deep auto-encoders

    Science.gov (United States)

    Shao, Haidong; Jiang, Hongkai; Lin, Ying; Li, Xingqiu

    2018-03-01

    Automatic and accurate identification of rolling bearing fault categories, especially the fault severities and fault orientations, is still a major challenge in rotating machinery fault diagnosis. In this paper, a novel method called ensemble deep auto-encoders (EDAEs) is proposed for intelligent fault diagnosis of rolling bearings. Firstly, different activation functions are employed as the hidden functions to design a series of auto-encoders (AEs) with different characteristics. Secondly, EDAEs are constructed with the various auto-encoders for unsupervised feature learning from the measured vibration signals. Finally, a combination strategy is designed to ensure accurate and stable diagnosis results. The proposed method is applied to analyze experimental bearing vibration signals. The results confirm that the proposed method can get rid of the dependence on manual feature extraction and overcome the limitations of individual deep learning models, and is more effective than existing intelligent diagnosis methods.

  3. Towards a New Approach of the Economic Intelligence Process: Basic Concepts, Analysis Methods and Informational Tools

    Directory of Open Access Journals (Sweden)

    Sorin Briciu

    2009-04-01

    Full Text Available One of the obvious trends in the current business environment is increased competition. In this context, organizations are becoming more and more aware of the importance of knowledge as a key factor in obtaining competitive advantage. A possible solution in knowledge management is Economic Intelligence (EI), which involves the collection, evaluation, processing, analysis, and dissemination of economic data (about products, clients, competitors, etc.) inside organizations. The availability of massive quantities of data, correlated with advances in information and communication technology allowing for the filtering and processing of these data, provides new tools for the production of economic intelligence. The research is focused on innovative aspects of the economic intelligence process (models of analysis, activities, methods and informational tools) and provides practical guidelines for initiating this process. In this paper, we try: (a) to contribute to a coherent view of the economic intelligence process (approaches, stages, fields of application); (b) to describe the most important models of analysis related to this process; (c) to analyze the activities, methods and tools associated with each stage of an EI process.

  4. SOLVING TRANSPORT LOGISTICS PROBLEMS IN A VIRTUAL ENTERPRISE THROUGH ARTIFICIAL INTELLIGENCE METHODS

    OpenAIRE

    PAVLENKO, Vitaliy; PAVLENKO, Tetiana; MOROZOVA, Olga; KUZNETSOVA, Anna; VOROPAI, Olena

    2017-01-01

    The paper offers a solution to the problem of material flow allocation within a virtual enterprise by using artificial intelligence methods. The research is based on the use of fuzzy relations when planning for optimal transportation modes to deliver components for manufactured products. The Fuzzy Logic Toolbox is used to determine the optimal route for transportation of components for manufactured products. The methods offered have been exemplified in the present research. The authors have b...

  5. A Reasoning Method of Cyber-Attack Attribution Based on Threat Intelligence

    OpenAIRE

    Li Qiang; Yang Ze-Ming; Liu Bao-Xu; Jiang Zheng-Wei

    2016-01-01

    With the increasing complexity of cyberspace security, cyber-attack attribution has become an important challenge for security protection systems. The difficult points of cyber-attack attribution are the problems of handling huge volumes of data and of missing key data. In view of this situation, this paper presents a reasoning method of cyber-attack attribution based on threat intelligence. The method utilizes the intrusion kill chain model and a Bayesian network to build attack chain a...

  6. Development of a method of continuous improvement of services using the Business Intelligence tools

    Directory of Open Access Journals (Sweden)

    Svetlana V. Kulikova

    2018-01-01

    Full Text Available The purpose of the study was to develop a method for the continuous improvement of services using Business Intelligence tools. Materials and methods: the materials are based on the concept of the Deming Cycle, Business Intelligence methods and technologies, the Agile methodology and SCRUM. Results: the article considers the problem of continuous improvement of services and offers solutions using Business Intelligence methods and technologies. In this case, the purpose of the technology is to support the final decision regarding what needs to be improved in the current organization of services. In other words, Business Intelligence helps the product manager to see what is hidden from the "human eye" on the basis of received and processed data. The method is developed on the basis of the concept of the Deming Cycle, Agile methodologies, and SCRUM. The article describes the main stages of the development of the method based on the activity of the enterprise. It is necessary to fully build the Business Intelligence system in the enterprise to identify bottlenecks and to justify the need for their elimination and, in general, for the continuous improvement of the services. This process is represented in DFD notation. The article presents a scheme for the selection of suitable agile methodologies. The proposed solution concept covers methods for identifying problems through Business Intelligence technology, the development of a system for troubleshooting, and the analysis of the results of the introduced changes. A technical description of the project is given. Conclusion: following the work of the authors, the concept of the method for the continuous improvement of services was formed, using Business Intelligence technology and taking into account the specifics of enterprises offering SaaS solutions. It was also found that when using this method, the recommended development methodology is SCRUM. The result of this scientific

  7. A States of Matter Search-Based Approach for Solving the Problem of Intelligent Power Allocation in Plug-in Hybrid Electric Vehicles

    Directory of Open Access Journals (Sweden)

    Arturo Valdivia-Gonzalez

    2017-01-01

    Full Text Available Recently, many researchers have proved that the electrification of the transport sector is key to reducing both the emission of greenhouse pollutants and the dependence on oil for transportation. As a result, Plug-in Hybrid Electric Vehicles (PHEVs) are receiving unprecedented attention. Consequently, large-scale penetration of PHEVs into the market is expected to take place in the near future; however, an unattended increase in PHEV demand may cause several technical problems which could potentially compromise the stability of power systems. As a result of the growing necessity for addressing such issues, topics related to the optimization of PHEV charging infrastructures have captured the attention of many researchers. Related to this, several state-of-the-art swarm optimization methods (such as the well-known Particle Swarm Optimization (PSO) or the recently proposed Gravitational Search Algorithm (GSA)) have been successfully applied to the optimization of the average State of Charge (SoC), which represents one of the most important performance indicators in the context of intelligent power allocation for PHEVs. Many of these swarm optimization methods, however, are known to be subject to several critical flaws, including premature convergence and a lack of balance between the exploration and exploitation of solutions. Such problems are usually related to the evolutionary operators employed by each of the methods in the exploration and exploitation of new solutions. In this paper, the recently proposed States of Matter Search (SMS) swarm optimization method is proposed for maximizing the average State of Charge of PHEVs within a charging station. In our experiments, several different scenarios consisting of different numbers of PHEVs were considered. To test the feasibility of the proposed approach, comparative experiments were performed against other popular PHEV State of Charge maximization approaches

  8. Intelligent Knowledge Recommendation Methods for R&D Knowledge Portals

    Institute of Scientific and Technical Information of China (English)

    KIM Jongwoo; LEE Hongjoo; PARK Sungjoo

    2004-01-01

    Personalization in knowledge portals and knowledge management systems is mainly performed based on users' explicitly specified categories and keywords. The explicit specification approach requires users' participation to start personalization services and has limitations in adapting to changes in users' preferences. This paper suggests two implicit personalization approaches: an automatic user category assignment method and an automatic keyword profile generation method. The performance of the implicit personalization approaches is compared with that of a traditional personalization approach using an Internet news site experiment. The result of the experiment shows that the suggested personalization approaches provide sufficient recommendation effectiveness while lessening users' unwanted involvement in the personalization process.

  9. Topology optimization based on the harmony search method

    International Nuclear Information System (INIS)

    Lee, Seung-Min; Han, Seog-Young

    2017-01-01

    A new topology optimization scheme based on a Harmony search (HS) as a metaheuristic method was proposed and applied to static stiffness topology optimization problems. To apply the HS to topology optimization, the variables in HS were transformed to those in topology optimization. Compliance was used as an objective function, and harmony memory was defined as the set of the optimized topology. Also, a parametric study for Harmony memory considering rate (HMCR), Pitch adjusting rate (PAR), and Bandwidth (BW) was performed to find the appropriate range for topology optimization. Various techniques were employed such as a filtering scheme, simple average scheme and harmony rate. To provide a robust optimized topology, the concept of the harmony rate update rule was also implemented. Numerical examples are provided to verify the effectiveness of the HS by comparing the optimal layouts of the HS with those of Bidirectional evolutionary structural optimization (BESO) and Artificial bee colony algorithm (ABCA). The following conclusions could be made: (1) The proposed topology scheme is very effective for static stiffness topology optimization problems in terms of stability, robustness and convergence rate. (2) The suggested method provides a symmetric optimized topology despite the fact that the HS is a stochastic method like the ABCA. (3) The proposed scheme is applicable and practical in manufacturing since it produces a solid-void design of the optimized topology. (4) The suggested method appears to be very effective for large scale problems like topology optimization.
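
    A generic harmony search loop, showing the roles of HMCR, PAR and BW on a continuous test function, might look like the Python sketch below; the paper's application of these operators to topology (element density) variables is not reproduced here.

        import numpy as np

        def harmony_search(f, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.1, iters=2000, seed=0):
            # Basic harmony search for minimisation: improvise a new harmony from the
            # memory (HMCR), optionally pitch-adjust it (PAR, BW), and replace the
            # worst stored harmony if the new one is better.
            rng = np.random.default_rng(seed)
            lo, hi = np.array(bounds, dtype=float).T
            dim = len(bounds)
            memory = rng.uniform(lo, hi, size=(hms, dim))        # harmony memory
            scores = np.array([f(h) for h in memory])
            for _ in range(iters):
                new = np.empty(dim)
                for j in range(dim):
                    if rng.random() < hmcr:                      # memory consideration
                        new[j] = memory[rng.integers(hms), j]
                        if rng.random() < par:                   # pitch adjustment
                            new[j] += bw * rng.uniform(-1.0, 1.0)
                    else:                                        # random selection
                        new[j] = rng.uniform(lo[j], hi[j])
                new = np.clip(new, lo, hi)
                fnew = f(new)
                worst = int(np.argmax(scores))
                if fnew < scores[worst]:
                    memory[worst], scores[worst] = new, fnew
            best = int(np.argmin(scores))
            return memory[best], scores[best]

        # Example: minimise the sphere function in 4 dimensions.
        print(harmony_search(lambda z: float(np.sum(z ** 2)), [(-10.0, 10.0)] * 4))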

  10. Topology optimization based on the harmony search method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung-Min; Han, Seog-Young [Hanyang University, Seoul (Korea, Republic of)

    2017-06-15

    A new topology optimization scheme based on a Harmony search (HS) as a metaheuristic method was proposed and applied to static stiffness topology optimization problems. To apply the HS to topology optimization, the variables in HS were transformed to those in topology optimization. Compliance was used as an objective function, and harmony memory was defined as the set of the optimized topology. Also, a parametric study for Harmony memory considering rate (HMCR), Pitch adjusting rate (PAR), and Bandwidth (BW) was performed to find the appropriate range for topology optimization. Various techniques were employed such as a filtering scheme, simple average scheme and harmony rate. To provide a robust optimized topology, the concept of the harmony rate update rule was also implemented. Numerical examples are provided to verify the effectiveness of the HS by comparing the optimal layouts of the HS with those of Bidirectional evolutionary structural optimization (BESO) and Artificial bee colony algorithm (ABCA). The following conclusions could be made: (1) The proposed topology scheme is very effective for static stiffness topology optimization problems in terms of stability, robustness and convergence rate. (2) The suggested method provides a symmetric optimized topology despite the fact that the HS is a stochastic method like the ABCA. (3) The proposed scheme is applicable and practical in manufacturing since it produces a solid-void design of the optimized topology. (4) The suggested method appears to be very effective for large scale problems like topology optimization.

  11. Application of Computational Intelligence Methods to In-Core Fuel Management

    International Nuclear Information System (INIS)

    Erdogan, A.

    2001-01-01

    Loading patterns with k-effective higher than the reference value were stored as candidate optimum patterns. At the last stage of the work, an alternative loading pattern generator based on the genetic algorithm method was developed. In this method, an initial loading pattern is improved by applying the genetic operators to obtain the optimum. The loading patterns obtained from the rule-based and the genetic algorithm methods were compared, and the genetic algorithm was shown to be more successful than the former. It was seen that it is possible to automate in-core fuel management activities by applying artificial intelligence techniques.

  12. Development of Pulsar Detection Methods for a Galactic Center Search

    Science.gov (United States)

    Thornton, Stephen; Wharton, Robert; Cordes, James; Chatterjee, Shami

    2018-01-01

    Finding pulsars within the inner parsec of the galactic center would be incredibly beneficial: for pulsars sufficiently close to Sagittarius A*, extremely precise tests of general relativity in the strong field regime could be performed through measurement of post-Keplerian parameters. Binary pulsar systems with sufficiently short orbital periods could provide the same laboratories with which to test existing theories. Fast and efficient methods are needed to parse large sets of time-domain data from different telescopes to search for periodicity in signals and differentiate radio frequency interference (RFI) from pulsar signals. Here we demonstrate several techniques to reduce red noise (low-frequency interference), generate signals from pulsars in binary orbits, and create plots that allow for fast detection of both RFI and pulsars.

  13. A REVIEW OF VIBRATION MACHINE DIAGNOSTICS BY USING ARTIFICIAL INTELLIGENCE METHODS

    Directory of Open Access Journals (Sweden)

    Grover Zurita

    2016-09-01

    Full Text Available In industry, gear and rolling bearing failures are among the foremost causes of breakdown in rotating machines, reducing production availability and resulting in costly system downtime. Therefore, there is a growing demand for vibration-based condition monitoring of gears and bearings, and any method that can improve the effectiveness, reliability, and accuracy of bearing fault diagnosis ought to be evaluated. In order to perform machine diagnosis efficiently, researchers have extensively investigated different advanced digital signal processing techniques and artificial intelligence methods to accurately extract fault characteristics from vibration signals. The main goal of this article is to present the state-of-the-art development in vibration analysis for machine diagnosis based on artificial intelligence methods.

  14. Intelligent screening of electrofusion-polyethylene joints based on a thermal NDT method

    Science.gov (United States)

    Doaei, Marjan; Tavallali, M. Sadegh

    2018-05-01

    The combination of infrared thermal images and artificial intelligence methods has opened new avenues for pushing the boundaries of available testing methods. Hence, in the current study, a novel thermal non-destructive testing method for polyethylene electrofusion joints was combined with a k-means clustering algorithm as an intelligent screening tool. The experiments focused on ovality of pipes in the coupler, as well as misalignment of pipes and couplers in 25 mm diameter joints. The temperature responses of each joint to an internal heat pulse were recorded by an IR thermal camera and further processed to identify the faulty joints. The results showed a clustering accuracy of 92%, as well as an abnormality detection capability of more than 90%.
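
    A minimal sketch of the screening idea, assuming each joint's recorded temperature response has already been reduced to a couple of summary features (here synthetic stand-ins for the real infrared measurements): cluster the joints with k-means and flag the minority cluster as suspect.

```python
import numpy as np
from sklearn.cluster import KMeans

# Cluster per-joint temperature-response features into two groups and flag
# the minority cluster as potentially faulty. Features are synthetic.

rng = np.random.default_rng(2)
normal = rng.normal([40.0, 5.0], [1.0, 0.5], size=(36, 2))   # peak temp, decay rate
faulty = rng.normal([46.0, 8.0], [1.5, 0.8], size=(4, 2))
features = np.vstack([normal, faulty])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
minority = np.bincount(labels).argmin()
print("joints flagged as abnormal:", np.where(labels == minority)[0])
```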

  15. Evaluation of a new method for librarian-mediated literature searches for systematic reviews

    NARCIS (Netherlands)

    W.M. Bramer (Wichor); Rethlefsen, M.L. (Melissa L.); F. Mast (Frans); J. Kleijnen (Jos)

    2017-01-01

    Objective: To evaluate and validate the time of completion and results of a new method of searching for systematic reviews, the exhaustive search method (ESM), using a pragmatic comparison. Methods: Single-line search strategies were prepared in a text document. Term completeness was

  16. Integrating Symbolic and Statistical Methods for Testing Intelligent Systems Applications to Machine Learning and Computer Vision

    Energy Technology Data Exchange (ETDEWEB)

    Jha, Sumit Kumar [University of Central Florida, Orlando; Pullum, Laura L [ORNL; Ramanathan, Arvind [ORNL

    2016-01-01

    Embedded intelligent systems ranging from tiny implantable biomedical devices to large swarms of autonomous unmanned aerial systems are becoming pervasive in our daily lives. While we depend on the flawless functioning of such intelligent systems, and often take their behavioral correctness and safety for granted, it is notoriously difficult to generate test cases that expose subtle errors in the implementations of machine learning algorithms. Hence, the validation of intelligent systems is usually achieved by studying their behavior on representative data sets, using methods such as cross-validation and bootstrapping. In this paper, we present a new testing methodology for studying the correctness of intelligent systems. Our approach uses symbolic decision procedures coupled with statistical hypothesis testing to expose such errors. We also use our algorithm to analyze the robustness of a human detection algorithm built using the OpenCV open-source computer vision library. We show that the human detection implementation can fail to detect humans in perturbed video frames even when the perturbations are so small that the corresponding frames look identical to the naked eye.

  17. Effects of Cooperative Learning Method Type Stad, Language Aptitude, and Intelligence on the Achievement English Hotel at Medan Tourism Academy

    Directory of Open Access Journals (Sweden)

    Abdul Kadir Ritonga

    2017-01-01

    Full Text Available The STAD cooperative learning method is considered effective in achieving the goal of learning the English language, especially for Tourism Academy students who are required to master English for Specific Purposes (ESP) in accordance with their needs. This study uses a 2x3x3 factorial design, a version of the non-equivalent control group design, analyzed with three-way ANOVA. The subjects were students of MDK III/5 A and B in the MDK III.5 Rooms Division courses of the Hospitality department, Academic Year 2015/2016. The samples are saturated samples. Data were collected through a pretest, a posttest, and Language Aptitude and Intelligence instruments, and analyzed by parametric statistics with a significance level of 0.05. The results showed that: (1) there are differences between the STAD cooperative learning method and the expository method on Hospitality English achievement, (2) there are differences between the students who have high language aptitude and low language aptitude on English achievement, (3) there are differences between students who have high language aptitude and medium language aptitude on Hospitality English achievement, (4) there are differences between students who have medium language aptitude and low language aptitude on Hospitality English achievement, (5) there are differences between students who have high intelligence and low intelligence on Hospitality English achievement, (6) there are no differences between students who have high intelligence and medium intelligence on Hospitality English achievement, (7) there are differences between students who have medium intelligence and low intelligence on Hospitality English achievement, (8) there is no interaction between the learning method and language aptitude on Hospitality English achievement, (9) there is an interaction between the learning method and intelligence on Hospitality English achievement, (10) there is no interaction between intelligence and language aptitude on Hospitality English achievement. (11

  18. Control and Driving Methods for LED Based Intelligent Light Sources

    DEFF Research Database (Denmark)

    Beczkowski, Szymon

    of the diode is controlled either by varying the magnitude of the current or by driving the LED with a pulsed current and regulating the width of the pulse. It has been shown previously that these two methods yield different effects on the diode's efficacy and colour point. A hybrid dimming strategy has been... proposed where two variable quantities control the intensity of the diode. This increases the controllability of the diode, giving new optimisation possibilities. It has been shown that it is possible to compensate for the temperature drift of a white diode's colour point using the hybrid dimming strategy. Also...

  19. Artificial Intelligence Methods Applied to Parameter Detection of Atrial Fibrillation

    Science.gov (United States)

    Arotaritei, D.; Rotariu, C.

    2015-09-01

    In this paper we present a novel method to detect atrial fibrillation (AF) based on statistical descriptors and a hybrid neuro-fuzzy and crisp system. The inference system produces rules of the if-then-else type that are extracted to construct a binary decision system: normal or atrial fibrillation. We use TPR (Turning Point Ratio), SE (Shannon Entropy) and RMSSD (Root Mean Square of Successive Differences) along with a new descriptor, Teager-Kaiser energy, in order to improve the accuracy of detection. The descriptors are calculated over a sliding window that produces a very large number of vectors (a massive dataset) used by the classifier. The length of the window is a crisp descriptor, while the rest of the descriptors are interval-valued. The parameters of the hybrid system are adapted using a Genetic Algorithm (GA) with a single-objective fitness target: the highest values of sensitivity and specificity. The rules are extracted and they are part of the decision system. The proposed method was tested using the Physionet MIT-BIH Atrial Fibrillation Database and the experimental results revealed a good accuracy of AF detection in terms of sensitivity and specificity (above 90%).
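
    Two of the descriptors named above, RMSSD and Shannon entropy, are simple to state explicitly; the sketch below computes them over a sliding window of synthetic RR intervals. The window length and bin count are assumptions, and the TPR and Teager-Kaiser descriptors and the neuro-fuzzy classifier itself are not reproduced here.

```python
import numpy as np

# Two sliding-window descriptors for RR-interval series: RMSSD and the
# Shannon entropy of the interval histogram. The data are synthetic.

def rmssd(rr):
    """Root mean square of successive differences of RR intervals (s)."""
    diff = np.diff(rr)
    return np.sqrt(np.mean(diff ** 2))

def shannon_entropy(rr, bins=16):
    """Shannon entropy (bits) of the RR-interval histogram."""
    hist, _ = np.histogram(rr, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(3)
rr_series = 0.8 + 0.05 * rng.standard_normal(600)   # synthetic RR intervals

window, step = 128, 128
for start in range(0, len(rr_series) - window + 1, step):
    w = rr_series[start:start + window]
    print(f"t={start:4d}  RMSSD={rmssd(w):.4f}  H={shannon_entropy(w):.3f}")
```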

  20. Aerial robot intelligent control method based on back-stepping

    Science.gov (United States)

    Zhou, Jian; Xue, Qian

    2018-05-01

    The aerial robot is characterized by strong nonlinearity, high coupling and parameter uncertainty, so a self-adaptive back-stepping control method based on a neural network is proposed in this paper. The uncertain part of the aerial robot model is compensated online by a Cerebellum Model Articulation Controller neural network, and robust control terms are designed to overcome the uncertainty error of the system during online learning. At the same time, a particle swarm algorithm is used to optimize and set the parameters so as to improve the dynamic performance, and the control law is obtained by back-stepping recursion. Simulation results show that the designed control law has the desired attitude tracking performance and good robustness in the presence of uncertainties and large errors in the model parameters.

  1. Advances in intelligent diagnosis methods for pulmonary ground-glass opacity nodules.

    Science.gov (United States)

    Yang, Jing; Wang, Hailin; Geng, Chen; Dai, Yakang; Ji, Jiansong

    2018-02-07

    The pulmonary nodule is one of the important lesions of lung cancer, mainly divided into two categories: solid nodules and ground-glass nodules. Improving the diagnosis of lung cancer has great clinical significance and can be supported by machine learning techniques. At present, much research has focused on solid nodules, whereas research on ground-glass nodules started late and results are still scarce. This paper summarizes the research progress of intelligent diagnosis methods for pulmonary nodules since 2014. It is described in detail from four aspects: nodular signs, data analysis methods, prediction models and system evaluation. The paper aims to provide research material for researchers working on the clinical diagnosis and intelligent analysis of lung cancer, and to further improve the precision of pulmonary ground-glass nodule diagnosis.

  2. An intelligent detection method for high-field asymmetric waveform ion mobility spectrometry.

    Science.gov (United States)

    Li, Yue; Yu, Jianwen; Ruan, Zhiming; Chen, Chilai; Chen, Ran; Wang, Han; Liu, Youjiang; Wang, Xiaozhi; Li, Shan

    2018-04-01

    In conventional high-field asymmetric waveform ion mobility spectrometry signal acquisition, multi-cycle detection is time consuming and limits somewhat the technique's scope for rapid field detection. In this study, a novel intelligent detection approach has been developed in which a threshold was set on the relative error of α parameters, which can eliminate unnecessary time spent on detection. In this method, two full-spectrum scans were made in advance to obtain the estimated compensation voltage at different dispersion voltages, resulting in a narrowing down of the whole scan area to just the peak area(s) of interest. This intelligent detection method can reduce the detection time to 5-10% of that of the original full-spectrum scan in a single cycle.

  3. A Lateral Control Method of Intelligent Vehicle Based on Fuzzy Neural Network

    Directory of Open Access Journals (Sweden)

    Linhui Li

    2015-01-01

    Full Text Available A lateral control method is proposed for an intelligent vehicle to track the desired trajectory. Firstly, a lateral control model is established based on the visual preview and dynamic characteristics of the intelligent vehicle. Then, the lateral error and orientation error are melded into an integrated error. Considering the system parameter perturbation and the external interference, a sliding mode control is introduced in this paper. In order to design a sliding surface, the integrated error is chosen as the parameter of the sliding mode switching function. The sliding mode switching function and its derivative are selected as the two inputs of the controller, and the front wheel angle is selected as the output. Next, a fuzzy neural network is established, and the self-learning capability of the neural network is utilized to construct the fuzzy rules. Finally, the simulation results demonstrate the effectiveness and robustness of the proposed method.

  4. Condition Monitoring Using Computational Intelligence Methods Applications in Mechanical and Electrical Systems

    CERN Document Server

    Marwala, Tshilidzi

    2012-01-01

    Condition monitoring uses the observed operating characteristics of a machine or structure to diagnose trends in the signal being monitored and to predict the need for maintenance before a breakdown occurs. This reduces the risk, inherent in a fixed maintenance schedule, of performing maintenance needlessly early or of having a machine fail before maintenance is due, either of which can be expensive, with the latter also posing a risk of serious accident, especially in systems like aeroengines in which a catastrophic failure would put lives at risk. The technique also measures responses from the whole of the system under observation, so it can detect the effects of faults which might be hidden deep within a system, out of reach of traditional methods of inspection. Condition Monitoring Using Computational Intelligence Methods promotes the various approaches gathered under the umbrella of computational intelligence to show how condition monitoring can be used to avoid equipment failures and lengthen its useful life, m...

  5. Developing an Intelligent Automatic Appendix Extraction Method from Ultrasonography Based on Fuzzy ART and Image Processing

    Directory of Open Access Journals (Sweden)

    Kwang Baek Kim

    2015-01-01

    Full Text Available Ultrasound examination (US) plays a key role in the diagnosis and management of patients with clinically suspected appendicitis, which is the most common abdominal surgical emergency. Among the various sonographic findings of appendicitis, the outer diameter of the appendix is the most important. Therefore, clear delineation of the appendix on US images is essential. In this paper, we propose a new intelligent method to extract the appendix automatically from abdominal sonographic images as a basic building block for developing such an intelligent tool for medical practitioners. Knowing that the appendix is located in the lower organ area below the bottom fascia line, we apply a series of image processing techniques to find the fascia line correctly. We then apply the fuzzy ART learning algorithm to the organ area in order to extract the appendix accurately. The experiment verifies that the proposed method is highly accurate (successful in 38 out of 40 cases) in extracting the appendix.

  6. Long Term Solar Radiation Forecast Using Computational Intelligence Methods

    Directory of Open Access Journals (Sweden)

    João Paulo Coelho

    2014-01-01

    Full Text Available The point prediction quality is closely related to the model that explains the dynamics of the observed process. Sometimes the model can be obtained by simple algebraic equations but, in the majority of physical systems, the relevant reality is too hard to model with simple ordinary differential or difference equations. This is the case of systems with nonlinear or nonstationary behaviour, which require more complex models. The discrete time-series problem, obtained by sampling the solar radiation, can be framed in this type of situation. By observing the collected data it is possible to distinguish multiple regimes. Additionally, due to atmospheric disturbances such as clouds, the temporal structure between samples is complex and is best described by nonlinear models. This paper reports solar radiation prediction using a hybrid model that combines the support vector regression paradigm and Markov chains. The hybrid model performance is compared with that obtained by using other methods like autoregressive (AR) filters, Markov AR models, and artificial neural networks. The results obtained suggest improved prediction performance of the hybrid model regarding both the prediction error and the dynamic behaviour.
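
    A minimal sketch of the support-vector-regression half of such a hybrid model, assuming the series is predicted one step ahead from a few lagged samples. The Markov-chain regime switching and the AR/ANN baselines from the paper are not reproduced, and the radiation-like series below is synthetic.

```python
import numpy as np
from sklearn.svm import SVR

# One-step-ahead SVR on lagged samples of a synthetic, seasonal series.

rng = np.random.default_rng(6)
days = np.arange(730)
series = 20 + 8 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 1.5, days.size)

lags = 3
X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
y = series[lags:]

split = 600                                         # train on the first ~20 months
model = SVR(kernel="rbf", C=10.0, epsilon=0.5).fit(X[:split], y[:split])
pred = model.predict(X[split:])
rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
print(f"one-step-ahead RMSE on held-out days: {rmse:.2f}")
```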

  7. An Intelligent Fleet Condition-Based Maintenance Decision Making Method Based on Multi-Agent

    OpenAIRE

    Bo Sun; Qiang Feng; Songjie Li

    2012-01-01

    According to the demand for condition-based maintenance online decision making among a mission oriented fleet, an intelligent maintenance decision making method based on Multi-agent and heuristic rules is proposed. The process of condition-based maintenance within an aircraft fleet (each containing one or more Line Replaceable Modules) based on multiple maintenance thresholds is analyzed. Then the process is abstracted into a Multi-Agent Model, a 2-layer model structure containing host negoti...

  8. Method and apparatus for optimizing operation of a power generating plant using artificial intelligence techniques

    Science.gov (United States)

    Wroblewski, David [Mentor, OH; Katrompas, Alexander M [Concord, OH; Parikh, Neel J [Richmond Heights, OH

    2009-09-01

    A method and apparatus for optimizing the operation of a power generating plant using artificial intelligence techniques. One or more decisions D are determined for at least one consecutive time increment, where at least one of the decisions D is associated with a discrete variable for the operation of a power plant device in the power generating plant. In an illustrated embodiment, the power plant device is a soot cleaning device associated with a boiler.

  9. In search of new methods. Qigong in stuttering therapy

    Directory of Open Access Journals (Sweden)

    Paweł Półrola

    2013-10-01

    Full Text Available Introduction: Even though stuttering is probably as old a phenomenon as human speech itself, stuttering therapy is still a challenge for the therapist and requires constant searching for new methods. Qigong may prove to be one of them. Aim of the research: The research paper presents the results of an experimental investigation evaluating the usefulness of qigong practice in stuttering therapy. Material and methods: Two groups of stuttering adults underwent 6-month therapy. In group I, the experimental one (n = 11), the therapy consisted of speech fluency training, psychotherapy and qigong practice. In group II, the control one (n = 12), it included speech fluency training and psychotherapy. In both groups 2-hour sessions of speech fluency training and psychotherapy were conducted twice a week. Two-hour qigong sessions took place once a week. Results: After 6 months the therapy results were compared with regard to the basic stuttering parameters, such as the degree of speech disfluency, the level of logophobia and speech disfluency symptoms. Improvement was observed in both groups, the beneficial effects, however, being more prominent in the qigong-practising group. Conclusions: Qigong exercises used in the therapy of stuttering people along with speech fluency training and psychotherapy give beneficial effects.

  10. Comparison of artificial intelligence methods and empirical equations to estimate daily solar radiation

    Science.gov (United States)

    Mehdizadeh, Saeid; Behmanesh, Javad; Khalili, Keivan

    2016-08-01

    In the present research, three artificial intelligence methods, including Gene Expression Programming (GEP), Artificial Neural Networks (ANN) and the Adaptive Neuro-Fuzzy Inference System (ANFIS), as well as 48 empirical equations (10, 12 and 26 equations were temperature-based, sunshine-based and meteorological parameters-based, respectively) were used to estimate daily solar radiation in Kerman, Iran in the period of 1992-2009. To develop the GEP, ANN and ANFIS models, depending on the used empirical equations, various combinations of minimum air temperature, maximum air temperature, mean air temperature, extraterrestrial radiation, actual sunshine duration, maximum possible sunshine duration, sunshine duration ratio, relative humidity and precipitation were considered as inputs in the mentioned intelligent methods. To compare the accuracy of the empirical equations and intelligent models, the root mean square error (RMSE), mean absolute error (MAE), mean absolute relative error (MARE) and determination coefficient (R2) indices were used. The results showed that, in general, the sunshine-based and meteorological parameters-based scenarios in the ANN and ANFIS models presented higher accuracy than the mentioned empirical equations. Moreover, the most accurate method in the studied region was the ANN11 scenario with five inputs. The values of the RMSE, MAE, MARE and R2 indices for the mentioned model were 1.850 MJ m-2 day-1, 1.184 MJ m-2 day-1, 9.58% and 0.935, respectively.
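
    The four comparison indices can be written out directly; the sketch below assumes placeholder arrays for observed and estimated daily radiation and computes R2 as 1 - SSres/SStot (the study may instead use the squared correlation coefficient).

```python
import numpy as np

# RMSE, MAE, MARE (%) and determination coefficient for an estimate vs. the
# observation. obs/est are placeholder arrays standing in for measured and
# estimated daily solar radiation (MJ m-2 day-1).

def indices(obs, est):
    err = est - obs
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))
    mare = np.mean(np.abs(err) / obs) * 100.0      # percent
    r2 = 1.0 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)
    return rmse, mae, mare, r2

rng = np.random.default_rng(4)
obs = rng.uniform(10.0, 30.0, 365)
est = obs + rng.normal(0.0, 1.8, 365)
print("RMSE=%.3f  MAE=%.3f  MARE=%.2f%%  R2=%.3f" % indices(obs, est))
```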

  11. Searching for Truth: Internet Search Patterns as a Method of Investigating Online Responses to a Russian Illicit Drug Policy Debate

    OpenAIRE

    Zheluk, Andrey; Gillespie, James A; Quinn, Casey

    2012-01-01

    Background This is a methodological study investigating the online responses to a national debate over an important health and social problem in Russia. Russia is the largest Internet market in Europe, exceeding Germany in the absolute number of users. However, Russia is unusual in that the main search provider is not Google, but Yandex. Objective This study had two main objectives. First, to validate Yandex search patterns against those provided by Google, and second, to test this method's a...

  12. Performance comparison of a new hybrid conjugate gradient method under exact and inexact line searches

    Science.gov (United States)

    Ghani, N. H. A.; Mohamed, N. S.; Zull, N.; Shoid, S.; Rivaie, M.; Mamat, M.

    2017-09-01

    The conjugate gradient (CG) method is one of the iterative techniques prominently used for solving unconstrained optimization problems due to its simplicity, low memory requirement, and good convergence analysis. This paper presents a new hybrid conjugate gradient method, named the NRM1 method. The method is analyzed under the exact and inexact line searches in given conditions. Theoretically, proofs show that the NRM1 method satisfies the sufficient descent condition with both line searches. The computational results indicate that the NRM1 method is capable of solving the standard unconstrained optimization problems used. Moreover, the NRM1 method performs better under the inexact line search than under the exact line search.
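
    The NRM1 coefficient itself is not given in the abstract, so the sketch below only illustrates the surrounding framework: a nonlinear conjugate-gradient loop with a standard PR+ coefficient and an Armijo backtracking (inexact) line search on a classical test function. The coefficient, test problem and tolerances are all stand-ins.

```python
import numpy as np

# Nonlinear CG with an inexact (Armijo backtracking) line search on the
# Rosenbrock function. beta is the PR+ coefficient, not the NRM1 one.

def rosenbrock(x):
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

def rosenbrock_grad(x):
    g = np.zeros_like(x)
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1] ** 2) - 2.0 * (1.0 - x[:-1])
    g[1:] += 200.0 * (x[1:] - x[:-1] ** 2)
    return g

def armijo(f, x, d, g, alpha=1.0, rho=0.5, c=1e-4):
    """Backtrack until the Armijo sufficient-decrease condition holds."""
    while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
        alpha *= rho
        if alpha < 1e-12:
            break
    return alpha

x = np.full(10, -1.2)
g = rosenbrock_grad(x)
d = -g
for k in range(2000):
    if g.dot(d) >= 0:                 # safeguard: restart with steepest descent
        d = -g
    alpha = armijo(rosenbrock, x, d, g)
    x_new = x + alpha * d
    g_new = rosenbrock_grad(x_new)
    beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)   # PR+ coefficient
    d = -g_new + beta * d
    x, g = x_new, g_new
    if np.linalg.norm(g) < 1e-6:
        break

print(f"iterations={k}, f={rosenbrock(x):.3e}")
```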

  13. A modified harmony search based method for optimal rural radial ...

    African Journals Online (AJOL)


  14. Application of artificial intelligence methods for prediction of steel mechanical properties

    Directory of Open Access Journals (Sweden)

    Z. Jančíková

    2008-10-01

    Full Text Available The target of the contribution is to outline the possibilities of applying artificial neural networks for the prediction of mechanical steel properties after heat treatment and to judge their prospective use in this field. The achieved models enable the prediction of final mechanical material properties on the basis of the decisive parameters influencing these properties. By applying artificial intelligence methods in combination with mathematical-physical analysis methods, it will be possible to create the facilities for designing a system for the continuous rationalization of existing and newly developing industrial technologies.

  15. SOLVING TRANSPORT LOGISTICS PROBLEMS IN A VIRTUAL ENTERPRISE THROUGH ARTIFICIAL INTELLIGENCE METHODS

    Directory of Open Access Journals (Sweden)

    Vitaliy PAVLENKO

    2017-06-01

    Full Text Available The paper offers a solution to the problem of material flow allocation within a virtual enterprise by using artificial intelligence methods. The research is based on the use of fuzzy relations when planning for optimal transportation modes to deliver components for manufactured products. The Fuzzy Logic Toolbox is used to determine the optimal route for transportation of components for manufactured products. The methods offered have been exemplified in the present research. The authors have built a simulation model for component transportation and delivery for manufactured products using the Simulink graphical environment for building models.

  16. Searching for Truth: Internet Search Patterns as a Method of Investigating Online Responses to a Russian Illicit Drug Policy Debate

    Science.gov (United States)

    Gillespie, James A; Quinn, Casey

    2012-01-01

    Background This is a methodological study investigating the online responses to a national debate over an important health and social problem in Russia. Russia is the largest Internet market in Europe, exceeding Germany in the absolute number of users. However, Russia is unusual in that the main search provider is not Google, but Yandex. Objective This study had two main objectives. First, to validate Yandex search patterns against those provided by Google, and second, to test this method's adequacy for investigating online interest in a 2010 national debate over Russian illicit drug policy. We hoped to learn what search patterns and specific search terms could reveal about the relative importance and geographic distribution of interest in this debate. Methods A national drug debate, centering on the anti-drug campaigner Egor Bychkov, was one of the main Russian domestic news events of 2010. Public interest in this episode was accompanied by increased Internet search. First, we measured the search patterns for 13 search terms related to the Bychkov episode and concurrent domestic events by extracting data from Google Insights for Search (GIFS) and Yandex WordStat (YaW). We conducted Spearman Rank Correlation of GIFS and YaW search data series. Second, we coded all 420 primary posts from Bychkov's personal blog between March 2010 and March 2012 to identify the main themes. Third, we compared GIFS and Yandex policies concerning the public release of search volume data. Finally, we established the relationship between salient drug issues and the Bychkov episode. Results We found a consistent pattern of strong to moderate positive correlations between Google and Yandex for the terms "Egor Bychkov" (r_s = 0.88, P < .001), “Bychkov” (r_s = 0.78, P < .001) and “Khimki” (r_s = 0.92, P < .001). Peak search volumes for the Bychkov episode were comparable to other prominent domestic political events during 2010. Monthly search counts were 146,689 for “Bychkov” and

  17. Intelligent tuning method of PID parameters based on iterative learning control for atomic force microscopy.

    Science.gov (United States)

    Liu, Hui; Li, Yingzi; Zhang, Yingxu; Chen, Yifu; Song, Zihang; Wang, Zhenyu; Zhang, Suoxin; Qian, Jianqiang

    2018-01-01

    Proportional-integral-derivative (PID) parameters play a vital role in the imaging process of an atomic force microscope (AFM). Traditional parameter tuning methods require a lot of manpower, and it is difficult to set PID parameters in unattended working environments. In this manuscript, an intelligent tuning method of PID parameters based on iterative learning control is proposed to self-adjust the PID parameters of the AFM according to the sample topography. This method gathers sufficient information about the PID controller output signals and the tracking error, which is then used to calculate the proper PID parameters, by repeating line scans until convergence before normal scanning in order to learn the topography. Subsequently, the appropriate PID parameters are obtained by a fitting method and then applied to the normal scanning process. The feasibility of the method is demonstrated by the convergence analysis. Simulations and experimental results indicate that the proposed method can intelligently tune the PID parameters of the AFM for imaging different topographies and thus achieve good tracking performance. Copyright © 2017 Elsevier Ltd. All rights reserved.
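
    A sketch of the repeat-and-learn loop the abstract describes, using a P-type iterative learning control update on a toy first-order plant that "scans" the same reference line repeatedly. The AFM dynamics, the controller structure and the fitting step that maps the converged signals to PID gains are not reproduced; all plant and gain values are assumptions.

```python
import numpy as np

# P-type ILC on a toy plant y[k+1] = a*y[k] + b*u[k] tracking a repeated
# reference profile. Convergence requires |1 - gain*b| < 1 (here 0.5).

n, a, b, gain = 200, 0.9, 0.1, 5.0
t = np.arange(n)
ref = 0.5 * np.sin(2 * np.pi * t / n) + 0.1 * (t > n // 2)   # repeated "scan line"

u = np.zeros(n)                                # feedforward input, trial 0
for trial in range(30):
    y = np.zeros(n)
    for k in range(n - 1):                     # simulate one scan line
        y[k + 1] = a * y[k] + b * u[k]
    e = ref - y
    u = u.copy()
    u[:-1] = u[:-1] + gain * e[1:]             # u_{j+1}(t) = u_j(t) + L * e_j(t+1)
    if trial % 10 == 0 or trial == 29:
        print(f"trial {trial:2d}: max |error| = {np.abs(e).max():.5f}")
```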

  18. Cognitive Artificial Intelligence Method for Interpreting Transformer Condition Based on Maintenance Data

    Directory of Open Access Journals (Sweden)

    Karel Octavianus Bachri

    2017-07-01

    Full Text Available A3S (Arwin-Adang-Aciek-Sembiring) is a method of information fusion for a single observation, and OMA3S (Observation Multi-time A3S) is a method of information fusion for time-series data. This paper proposes an OMA3S-based cognitive artificial intelligence method for interpreting transformer condition, calculated from maintenance data of the Indonesia National Electric Company (PLN). First, the proposed method is tested using previously published data, followed by implementation on the maintenance data. Maintenance data are fused to obtain part conditions, and part conditions are fused to obtain the transformer condition. Results show the proposed method is valid for DGA fault identification, with an average accuracy of 91.1%. The proposed method can interpret not only the major fault but also a minor fault occurring along with it, enabling an early-warning feature. Results also show that part conditions can be interpreted using information fusion on maintenance data, and the transformer condition can be interpreted using information fusion on part conditions. Future work on this research is to gather more data, to elaborate more factors to be fused, and to design a cognitive processor that can be used to implement this concept of intelligent instrumentation.

  19. Searching for truth: internet search patterns as a method of investigating online responses to a Russian illicit drug policy debate.

    Science.gov (United States)

    Zheluk, Andrey; Gillespie, James A; Quinn, Casey

    2012-12-13

    This is a methodological study investigating the online responses to a national debate over an important health and social problem in Russia. Russia is the largest Internet market in Europe, exceeding Germany in the absolute number of users. However, Russia is unusual in that the main search provider is not Google, but Yandex. This study had two main objectives. First, to validate Yandex search patterns against those provided by Google, and second, to test this method's adequacy for investigating online interest in a 2010 national debate over Russian illicit drug policy. We hoped to learn what search patterns and specific search terms could reveal about the relative importance and geographic distribution of interest in this debate. A national drug debate, centering on the anti-drug campaigner Egor Bychkov, was one of the main Russian domestic news events of 2010. Public interest in this episode was accompanied by increased Internet search. First, we measured the search patterns for 13 search terms related to the Bychkov episode and concurrent domestic events by extracting data from Google Insights for Search (GIFS) and Yandex WordStat (YaW). We conducted Spearman Rank Correlation of GIFS and YaW search data series. Second, we coded all 420 primary posts from Bychkov's personal blog between March 2010 and March 2012 to identify the main themes. Third, we compared GIFS and Yandex policies concerning the public release of search volume data. Finally, we established the relationship between salient drug issues and the Bychkov episode. We found a consistent pattern of strong to moderate positive correlations between Google and Yandex for the terms "Egor Bychkov" (r(s) = 0.88, P < .001), "Bychkov" (r(s) = .78, P < .001) and "Khimki"(r(s) = 0.92, P < .001). Peak search volumes for the Bychkov episode were comparable to other prominent domestic political events during 2010. Monthly search counts were 146,689 for "Bychkov" and 48,084 for "Egor Bychkov", compared to 53

  20. Research on Intelligent Avoidance Method of Shipwreck Based on Bigdata Analysis

    Directory of Open Access Journals (Sweden)

    Li Wei

    2017-11-01

    Full Text Available To address the low success rate of current shipwreck avoidance methods, this paper proposes an intelligent shipwreck avoidance method based on big data analysis. Firstly, the method uses big data analysis to calculate the safe approach distance of a ship under the head-on, crossing and overtaking situations. On this basis, by calculating the collision risk degree of the ships, the degree of immediate danger is determined. Finally, three kinds of navigation evaluation functions are calculated, and a genetic algorithm is used to realize intelligent shipwreck avoidance. Experimental results show that, compared with the traditional method, two meeting ships can effectively evade each other when the distance to the closest point of approach between them is 0.13 nmile, and the success rate of avoidance is high.

  1. A human-machine interface evaluation method: A difficulty evaluation method in information searching (DEMIS)

    International Nuclear Information System (INIS)

    Ha, Jun Su; Seong, Poong Hyun

    2009-01-01

    A human-machine interface (HMI) evaluation method, named the 'difficulty evaluation method in information searching (DEMIS)', is proposed and demonstrated with an experimental study. The DEMIS is based on a human performance model and two measures of attentional-resource effectiveness in monitoring and detection tasks in nuclear power plants (NPPs). Operator competence and HMI design are modeled as the most significant factors affecting human performance. One of the two effectiveness measures is the fixation-to-importance ratio (FIR), which represents the attentional resource (eye fixations) spent on an information source relative to the importance of that information source. The other measure is selective attention effectiveness (SAE), which incorporates the FIRs for all information sources. The underlying principle of the measures is that an information source should be selectively attended to according to its informational importance. In this study, poor performance in information searching tasks is modeled as being coupled with difficulties caused by poor operator mental models and/or poor HMI design. Human performance in information searching tasks is evaluated by analyzing the FIR and the SAE. Operator mental models are evaluated by a questionnaire-based method. Then difficulties caused by poor HMI design are evaluated by a focused interview based on the FIR evaluation, and the root causes leading to poor performance are identified in a systematic way.
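
    The FIR can be written directly from its description above; the SAE aggregation is not given in the abstract, so the formula used below (a penalty on the importance-weighted deviation of FIR from one) is only an assumed stand-in for illustration, not the published measure. The sources, importance weights and fixation counts are made up.

```python
import numpy as np

# FIR per information source = share of fixations / normalised importance.
# SAE here is an ASSUMED aggregation: 1 when every FIR equals 1, smaller as
# attention departs from the importance profile.

sources = ["pressure", "level", "flow", "alarm panel"]
importance = np.array([0.40, 0.30, 0.20, 0.10])      # normalised importance
fixations = np.array([55, 20, 30, 15], dtype=float)  # eye-tracking counts

fix_share = fixations / fixations.sum()
fir = fix_share / importance

sae = 1.0 / (1.0 + np.sum(importance * np.abs(fir - 1.0)))

for name, f in zip(sources, fir):
    print(f"{name:12s} FIR = {f:.2f}")
print(f"SAE (assumed aggregation) = {sae:.3f}")
```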

  2. Search Method Based on Figurative Indexation of Folksonomic Features of Graphic Files

    Directory of Open Access Journals (Sweden)

    Oleg V. Bisikalo

    2013-11-01

    Full Text Available In this paper a search method based on the figurative indexation of folksonomic characteristics of graphic files is described. The method takes extralinguistic information into account and is based on a model of human figurative thinking. The paper describes the creation of a method for searching image files based on their formal features, including folksonomic clues.

  3. Hybrid Modeling and Optimization of Manufacturing Combining Artificial Intelligence and Finite Element Method

    CERN Document Server

    Quiza, Ramón; Davim, J Paulo

    2012-01-01

    Artificial intelligence (AI) techniques and the finite element method (FEM) are both powerful computing tools, which are extensively used for modeling and optimizing manufacturing processes. The combination of these tools has resulted in a new flexible and robust approach as several recent studies have shown. This book aims to review the work already done in this field as well as to expose the new possibilities and foreseen trends. The book is expected to be useful for postgraduate students and researchers, working in the area of modeling and optimization of manufacturing processes.

  4. Searching methods for biometric identification systems: Fundamental limits

    NARCIS (Netherlands)

    Willems, F.M.J.

    2009-01-01

    We study two-stage search procedures for biometric identification systems in an information-theoretical setting. Our main conclusion is that clustering based on vector-quantization achieves the optimum trade-off between the number of clusters (cluster rate) and the number of individuals within a

  5. Intelligent Detection of Structure from Remote Sensing Images Based on Deep Learning Method

    Science.gov (United States)

    Xin, L.

    2018-04-01

    Utilizing high-resolution remote sensing images for earth observation has become a common method of land use monitoring. Traditional image interpretation requires considerable human participation, is inefficient, and makes it difficult to guarantee accuracy. At present, artificial intelligence methods such as deep learning have many advantages in image recognition. By means of a large number of remote sensing image samples and deep neural network models, objects of interest such as buildings can be rapidly identified. In terms of both efficiency and accuracy, the deep learning method is superior. This paper investigates the deep learning method with a large number of remote sensing image samples and verifies the feasibility of building extraction through experiments.

  6. An Intelligent Method of Product Scheme Design Based on Product Gene

    Directory of Open Access Journals (Sweden)

    Qing Song Ai

    2013-01-01

    Full Text Available Nowadays, in order to have distinctive products, many customers tend to buy customized products instead of common ones from the supermarket. The manufacturing enterprises, with the purpose of improving their competitiveness, are focusing on providing customized products with high quality and low cost as well. At present, how to produce customized products rapidly and cheaply has become the key challenge for manufacturing enterprises. In this paper, an intelligent modeling approach for supporting the modeling of customized products is proposed, which may improve efficiency during the product design process. Specifically, the product gene (PG) method, which is an analogy of biological evolution in the engineering area, is employed to model products in a new way. Based on the product gene, we focus on the intelligent modeling method to generate product schemes rapidly and automatically. The process of our research includes three steps: (1) develop a product gene model for customized products; (2) determine the acquisition and storage method for the product gene; and (3) propose a specific genetic algorithm for calculating the solution of a customized product and generating new product schemes. Finally, a case study is applied to test the usefulness of the approach.

  7. A study of certain Monte Carlo search and optimisation methods

    International Nuclear Information System (INIS)

    Budd, C.

    1984-11-01

    Studies are described which might lead to the development of a search and optimisation facility for the Monte Carlo criticality code MONK. The facility envisaged could be used to maximise a function of k-effective with respect to certain parameters of the system or, alternatively, to find the system (in a given range of systems) for which that function takes a given value. (UK)

  8. Prediction of shear wave velocity using empirical correlations and artificial intelligence methods

    Science.gov (United States)

    Maleki, Shahoo; Moradzadeh, Ali; Riabi, Reza Ghavami; Gholami, Raoof; Sadeghzadeh, Farhad

    2014-06-01

    Good understanding of the mechanical properties of rock formations is essential during the development and production phases of a hydrocarbon reservoir. Conventionally, these properties are estimated from the petrophysical logs, with compression and shear sonic data being the main input to the correlations. This is while in many cases the shear sonic data are not acquired during well logging, which may be for cost saving purposes. In this case, shear wave velocity is estimated using available empirical correlations or artificial intelligence methods proposed during the last few decades. In this paper, petrophysical logs corresponding to a well drilled in the southern part of Iran were used to estimate the shear wave velocity using empirical correlations as well as two robust artificial intelligence methods known as Support Vector Regression (SVR) and Back-Propagation Neural Network (BPNN). Although the results obtained by SVR seem to be reliable, the estimated values are not very precise, and considering the importance of shear sonic data as the input into different models, this study suggests acquiring shear sonic data during well logging. It is important to note that the benefits of having reliable shear sonic data for the estimation of rock formation mechanical properties will compensate for the possible additional costs of acquiring a shear log.

  9. Methods of Computational Intelligence in the Context of Quality Assurance in Foundry Products

    Directory of Open Access Journals (Sweden)

    Rojek G.

    2016-06-01

    Full Text Available One way to ensure the required technical characteristics of castings is the strict control of production parameters affecting the quality of the finished products. If the production process is improperly configured, the resulting defects in castings lead to huge losses. Therefore, from the point of view of economics, it is advisable to use the methods of computational intelligence in the field of quality assurance and adjustment of parameters of future production. At the same time, the development of knowledge in the field of metallurgy, aimed at raising the technical level and efficiency of the manufacture of foundry products, should be followed by the development of information systems to support production processes in order to improve their effectiveness and compliance with the increasingly stringent requirements of ergonomics, occupational safety, environmental protection and quality. This article is a presentation of artificial intelligence methods used in practical applications related to quality assurance. The problem of control of the production process involves the use of tools such as the induction of decision trees, fuzzy logic, rough set theory, artificial neural networks or case-based reasoning.

  10. Prediction of shear wave velocity using empirical correlations and artificial intelligence methods

    Directory of Open Access Journals (Sweden)

    Shahoo Maleki

    2014-06-01

    Full Text Available Good understanding of the mechanical properties of rock formations is essential during the development and production phases of a hydrocarbon reservoir. Conventionally, these properties are estimated from the petrophysical logs, with compression and shear sonic data being the main input to the correlations. This is while in many cases the shear sonic data are not acquired during well logging, which may be for cost saving purposes. In this case, shear wave velocity is estimated using available empirical correlations or artificial intelligence methods proposed during the last few decades. In this paper, petrophysical logs corresponding to a well drilled in the southern part of Iran were used to estimate the shear wave velocity using empirical correlations as well as two robust artificial intelligence methods known as Support Vector Regression (SVR) and Back-Propagation Neural Network (BPNN). Although the results obtained by SVR seem to be reliable, the estimated values are not very precise, and considering the importance of shear sonic data as the input into different models, this study suggests acquiring shear sonic data during well logging. It is important to note that the benefits of having reliable shear sonic data for the estimation of rock formation mechanical properties will compensate for the possible additional costs of acquiring a shear log.

  11. Intelligent Continuous Double Auction method For Service Allocation in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Nima Farajian

    2013-10-01

    Full Text Available The market-oriented approach is an effective method for resource management because of its regulation of supply and demand, and it is suitable for the cloud environment, where computing resources, either software or hardware, are virtualized and allocated as services from providers to users. In this paper a continuous double auction method for efficient cloud service allocation is presented in which (i) consumers can order various resources (services) for workflows and co-allocation, (ii) consumers and providers make bid and request prices based on deadline and workload time, and in addition providers can trade off between utilization time and the price of bids, and (iii) auctioneers can intelligently find the optimum matching by sharing and merging resources, which results in more trades. Experimental results show that the proposed method is efficient in terms of successful allocation rate and resource utilization.
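
    The core clearing step of a continuous double auction is compact enough to sketch: bids and asks sit in price-ordered books and a trade executes whenever the best bid reaches the best ask. The intelligent sharing and merging of resources described above is not reproduced; prices, names and the midpoint pricing rule are made up for illustration.

```python
import heapq

# Order books: bids as a max-heap (negated prices), asks as a min-heap.
bids = []   # (-price, consumer)
asks = []   # (price, provider)

def submit(order_type, price, name):
    """Insert an order and clear any crossing trades."""
    if order_type == "bid":
        heapq.heappush(bids, (-price, name))
    else:
        heapq.heappush(asks, (price, name))
    trades = []
    while bids and asks and -bids[0][0] >= asks[0][0]:
        bid_price, buyer = heapq.heappop(bids)
        ask_price, seller = heapq.heappop(asks)
        clearing = (-bid_price + ask_price) / 2.0      # midpoint pricing
        trades.append((buyer, seller, clearing))
    return trades

for order in [("ask", 6.0, "prov-A"), ("bid", 5.0, "cons-1"),
              ("ask", 4.5, "prov-B"), ("bid", 6.5, "cons-2")]:
    for buyer, seller, price in submit(*order):
        print(f"{buyer} buys from {seller} at {price:.2f}")
```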

  12. Artificial intelligence methods applied for quantitative analysis of natural radioactive sources

    International Nuclear Information System (INIS)

    Medhat, M.E.

    2012-01-01

    Highlights: ► Basic description of artificial neural networks. ► Natural gamma-ray sources and the problem of their detection. ► Application of a neural network for peak detection and activity determination. - Abstract: The artificial neural network (ANN) is one of the artificial intelligence methods used for modeling and uncertainty estimation in different applications. The objective of the proposed work was to apply ANNs to identify isotopes and to predict the uncertainties of their activities for some natural radioactive sources. The method was tested for analyzing gamma-ray spectra emitted from natural radionuclides in soil samples, detected by high-resolution gamma-ray spectrometry based on HPGe (high-purity germanium). The principle of the suggested method is described, including the definition of the relevant input parameters, input data scaling and network training. There is satisfactory agreement between the obtained and predicted results using the neural network.

  13. Risk assessment for pipelines with active defects based on artificial intelligence methods

    Energy Technology Data Exchange (ETDEWEB)

    Anghel, Calin I. [Department of Chemical Engineering, Faculty of Chemistry and Chemical Engineering, University ' Babes-Bolyai' , Cluj-Napoca (Romania)], E-mail: canghel@chem.ubbcluj.ro

    2009-07-15

    The paper provides another insight into pipeline risk assessment for in-service pressure piping containing defects. Besides the traditional analytical approximation methods and sampling-based methods, the safety index and failure probability of pressure piping containing defects are obtained based on a novel type of support vector machine developed in a minimax manner. The safety index or failure probability is computed based on a binary classification approach. The procedure, named the classification reliability procedure and involving a link between artificial intelligence and reliability methods, was developed as a user-friendly computer program in the MATLAB language. To reveal the capacity of the proposed procedure, two comparative numerical examples replicating a previous related work and predicting the failure probabilities of a pressured pipeline with defects are presented.

  14. Risk assessment for pipelines with active defects based on artificial intelligence methods

    International Nuclear Information System (INIS)

    Anghel, Calin I.

    2009-01-01

    The paper provides another insight into pipeline risk assessment for in-service pressure piping containing defects. Besides the traditional analytical approximation methods and sampling-based methods, the safety index and failure probability of pressure piping containing defects are obtained based on a novel type of support vector machine developed in a minimax manner. The safety index or failure probability is computed based on a binary classification approach. The procedure, named the classification reliability procedure and involving a link between artificial intelligence and reliability methods, was developed as a user-friendly computer program in the MATLAB language. To reveal the capacity of the proposed procedure, two comparative numerical examples replicating a previous related work and predicting the failure probabilities of a pressured pipeline with defects are presented.

  15. Study on Fault Diagnostics of a Turboprop Engine Using Inverse Performance Model and Artificial Intelligent Methods

    Science.gov (United States)

    Kong, Changduk; Lim, Semyeong

    2011-12-01

    Recently, health monitoring systems for the major gas path components of gas turbines have mostly used model-based methods such as Gas Path Analysis (GPA). This method finds changes in component performance characteristic parameters, such as isentropic efficiency and mass flow parameter, by comparing measured engine performance parameters (temperatures, pressures, rotational speeds, fuel consumption, etc.) with the fault-free engine performance parameters calculated by a baseline engine performance model. Currently, expert engine diagnostic systems using artificial intelligence methods such as Neural Networks (NNs), Fuzzy Logic and Genetic Algorithms (GAs) have been studied to improve the model-based method. Among them, NNs are most often used in engine fault diagnostic systems due to their good learning performance, but they have drawbacks of low accuracy and long learning times when a large amount of learning data must be used to build the learning database. In addition, a very complex structure is needed to effectively find single-type or multiple-type faults of gas path components. This work inversely builds a baseline performance model of a turboprop engine for a high-altitude-operation UAV using measured performance data, and proposes a fault diagnostic system using the baseline engine performance model and artificial intelligence methods such as Fuzzy Logic and Neural Networks. The proposed diagnostic system first isolates the faulted components using Fuzzy Logic, and then quantifies the faults of the identified components using the NN trained with a fault learning database obtained from the developed baseline performance model. In training the NN, the Feed Forward Back Propagation (FFBP) method is used. Finally, it is verified through several test examples that the component faults implanted arbitrarily in the engine are well isolated and quantified by the proposed diagnostic system.

  16. Learning Search Algorithms: An Educational View

    Directory of Open Access Journals (Sweden)

    Ales Janota

    2014-12-01

    Full Text Available Artificial intelligence methods find their practical usage in many applications, including the maritime industry. The paper concentrates on the methods of uninformed and informed search, potentially usable in solving complex problems based on the state-space representation. The problem of introducing the search algorithms to newcomers has its technical and psychological dimensions. The authors show how it is possible to cope with both of them through the design and use of specialized authoring systems. A typical example of searching for a path through a maze is used to demonstrate how to test, observe and compare the properties of various search strategies. The performance of the search methods is evaluated based on common criteria.
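
    The maze example mentioned above is easy to make concrete: the sketch below compares uninformed breadth-first search with informed A* (Manhattan-distance heuristic) on a small hard-coded maze, reporting path length and the number of nodes expanded by each strategy.

```python
import heapq
from collections import deque

# Uninformed (BFS) vs. informed (A*) search on a small maze.
MAZE = ["S..#....",
        ".#.#.##.",
        ".#...#..",
        ".####.#.",
        "......#G"]
ROWS, COLS = len(MAZE), len(MAZE[0])
start = goal = None
for r, row in enumerate(MAZE):
    for c, ch in enumerate(row):
        if ch == "S": start = (r, c)
        if ch == "G": goal = (r, c)

def neighbours(pos):
    r, c = pos
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < ROWS and 0 <= nc < COLS and MAZE[nr][nc] != "#":
            yield (nr, nc)

def bfs():
    frontier, seen, expanded = deque([(start, 0)]), {start}, 0
    while frontier:
        pos, dist = frontier.popleft()
        expanded += 1
        if pos == goal:
            return dist, expanded
        for nxt in neighbours(pos):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, dist + 1))

def astar():
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    frontier, best, expanded = [(h(start), 0, start)], {start: 0}, 0
    while frontier:
        f, g, pos = heapq.heappop(frontier)
        expanded += 1
        if pos == goal:
            return g, expanded
        for nxt in neighbours(pos):
            if g + 1 < best.get(nxt, float("inf")):
                best[nxt] = g + 1
                heapq.heappush(frontier, (g + 1 + h(nxt), g + 1, nxt))

print("BFS : path length %d, %d nodes expanded" % bfs())
print("A*  : path length %d, %d nodes expanded" % astar())
```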

  17. A comprehensive review of the use of computational intelligence methods in mineral exploration

    Directory of Open Access Journals (Sweden)

    Habibollah Bazdar

    2017-11-01

    Full Text Available Introduction: Mineral exploration is a process by which it is decided, at the end of each stage, whether or not continuing exploration will be cost-effective. This decision depends upon many factors, including technical, economic, social and other related factors. All new methods used in mineral exploration are meant to make this decision making simpler. In recent years, advanced computational intelligence methods have been used for modeling in many disciplines of science, including mineral exploration. Although the results of the application of these methods show a good performance, it is essential to determine the mineral potential in terms of geology, mineralogy, petrology and other factors for a final decision. The purpose of this paper is to provide a comprehensive review of mineral exploration research and the different applications of computational intelligence techniques in this respect during the last decades. Materials and methods: Artificial neural networks and their application in mineral exploration: An artificial neural network (ANN) is a series of communications between units or nodes that try to function like the neurons of the human brain (Jorjani et al., 2008). The network's processing capability stems from the communication between the units and from connection weights that are obtained by learning or are predetermined (Monjezi and Dehghani, 2008). The ANN method has been applied in different branches of mining exploration in the last decades (Brown et al., 2000; Leite and de Souza Filho, 2009; Porwal et al., 2003). Support vector machines (SVM) and their application in mineral exploration: SVM uses a set of examples with known class information to build a linear hyperplane separating samples of different classes. This initial dataset is known as a training set and every sample within it is characterized by features upon which the classification is based (Smirnoff et al., 2008). The SVM classifier is a

  18. On the Application of Formal Methods to Clinical Guidelines, an Artificial Intelligence Perspective

    NARCIS (Netherlands)

    Hommersom, A.J.

    2008-01-01

    In computer science, all kinds of methods and techniques have been developed to study systems, such as simulation of the behaviour of a system. Furthermore, it is possible to study these systems by proving formal properties or by searching through all the possible states that a system may be

  19. Quality control of intelligence research

    International Nuclear Information System (INIS)

    Lu Yan; Xin Pingping; Wu Jian

    2014-01-01

    Quality control of intelligence research is the core issue of intelligence management and a key problem in the study of information science. This paper focuses on the performance of intelligence research to explain the significance of its quality control. On the basis of a summary and analysis of previous results, it discusses quality control methods in intelligence research, introduces the experience of foreign intelligence research quality control, and proposes some recommendations to improve quality control in intelligence research. (authors)

  20. Research on Large-Scale Road Network Partition and Route Search Method Combined with Traveler Preferences

    Directory of Open Access Journals (Sweden)

    De-Xin Yu

    2013-01-01

    Full Text Available Combined with an improved Pallottino parallel algorithm, this paper proposes a large-scale route search method which considers travelers' route choice preferences. The urban road network is effectively decomposed into multiple layers. Utilizing generalized travel time as the road impedance function, the method builds a new multilayer and multitasking road network data storage structure with object-oriented class definition. The proposed path search algorithm is then verified using the real road network of Guangzhou city as an example. Through sensitivity experiments, we compare the proposed path search method with current advanced optimal path algorithms. The results demonstrate that the proposed method can increase the road network search efficiency by more than 16% under different search proportion requests, node numbers, and computing process numbers, respectively. Therefore, this method is a great breakthrough in the route guidance field for urban road networks.
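
    To make the role of the impedance function concrete, here is a minimal label-setting (Dijkstra-style) search sketch in which each edge weight stands for a generalized travel time; the tiny example network and its weights are invented and do not come from the Guangzhou case study or the Pallottino algorithm itself.

```python
import heapq

# Hypothetical road graph: node -> list of (neighbour, generalized travel time).
GRAPH = {
    "A": [("B", 4.0), ("C", 2.0)],
    "B": [("D", 5.0)],
    "C": [("B", 1.0), ("D", 8.0)],
    "D": [],
}

def shortest_route(graph, source, target):
    """Label-setting search using generalized travel time as the road impedance."""
    best = {source: 0.0}
    previous = {}
    heap = [(0.0, source)]
    while heap:
        cost, node = heapq.heappop(heap)
        if node == target:
            route = [node]
            while node in previous:
                node = previous[node]
                route.append(node)
            return cost, route[::-1]
        if cost > best.get(node, float("inf")):
            continue                      # stale queue entry
        for neighbour, impedance in graph[node]:
            candidate = cost + impedance
            if candidate < best.get(neighbour, float("inf")):
                best[neighbour] = candidate
                previous[neighbour] = node
                heapq.heappush(heap, (candidate, neighbour))
    return float("inf"), []

print(shortest_route(GRAPH, "A", "D"))  # (8.0, ['A', 'C', 'B', 'D'])
```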

  1. Intelligence diagnosis method for roller bearings using features of AE signal

    International Nuclear Information System (INIS)

    Pan, J; Wang, H Q; Wang, F; Yang, J F; Liu, W B

    2012-01-01

    Rolling bearings are important components in rotating machines, which are widely used in industrial production. Fault diagnosis technology plays a very important role in the quality and life of machines. Based on symptom parameters of acoustic emission (AE) signals, this paper presents an intelligent diagnosis method for roller bearings using principal component analysis, rough sets, and a BP neural network to detect faults and distinguish fault types. Principal component analysis and the rough sets algorithm are used to reduce the time-domain symptom parameters used for training the BP neural network. The BP neural network, which is used for condition diagnosis of roller bearings, achieves good convergence during learning using the symptom parameters acquired by principal component analysis and the rough sets, and automatically distinguishes fault types during diagnosis. Practical examples are provided to verify the efficiency of the proposed method.
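
    A small sketch of the dimensionality-reduction step: principal component analysis of a few time-domain symptom parameters via the SVD, keeping the leading components before they would be fed to a neural network. The symptom-parameter matrix below is invented, and the rough-sets reduction and BP network are not modeled here.

```python
import numpy as np

# Invented matrix: rows are AE measurements, columns are time-domain symptom parameters
# (e.g. RMS, kurtosis, crest factor, skewness).
symptoms = np.array([
    [0.82, 3.1, 4.0, 0.21],
    [0.79, 3.0, 3.8, 0.19],
    [1.55, 6.2, 7.5, 0.55],
    [1.60, 6.0, 7.2, 0.52],
])

# Principal component analysis via the SVD of the mean-centred data.
centred = symptoms - symptoms.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)

n_keep = 2                                   # keep the two leading components
reduced = centred @ Vt[:n_keep].T            # these scores would feed the BP network
print(explained.round(3))
print(reduced.round(3))
```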

  2. Methods and Technologies of XML Data Modeling for IP Mode Intelligent Measuring and Controlling System

    International Nuclear Information System (INIS)

    Liu, G X; Hong, X B; Liu, J G

    2006-01-01

    This paper presents the IP mode intelligent measuring and controlling system (IMIMCS). Based on the object-oriented modeling technology of UML and XML Schema, innovative methods and technologies for some key problems of XML data modeling in the IMIMCS are discussed, including the refinement of the system business by means of UML use-case diagrams, the confirmation of the content of the XML data model and the logical relationships between the XML Schema objects with the aid of UML class diagrams, and the mapping rules from the UML object model to XML Schema. Finally, the application of the XML-based IMIMCS to a modern greenhouse is presented. The results show that the modeling methods for the measuring and controlling data in the IMIMCS, involving a multi-layer structure and many operating systems, possess strong reliability and flexibility, guarantee the uniformity of complex XML documents and meet the requirement of cross-platform data communication.

  3. Nuclear power plant monitoring and fault diagnosis methods based on the artificial intelligence technique

    International Nuclear Information System (INIS)

    Yoshikawa, S.; Saiki, A.; Ugolini, D.; Ozawa, K.

    1996-01-01

    The main objective of this paper is to develop an advanced diagnosis system based on artificial intelligence techniques to monitor the operation and improve the operational safety of nuclear power plants. Three different methods have been elaborated in this study: an artificial neural network local diagnosis (NN ds) scheme that, acting at the component level, discriminates between normal and abnormal transients; a model-based diagnostic reasoning mechanism that combines a physical causal network model with diagnostic rules; and a knowledge compiler (KC) that generates applicable diagnostic rules from widely accepted physical knowledge. Although the three methods have been developed and verified independently, they are highly correlated and, when connected together, form an effective and robust diagnosis and monitoring tool. (authors)

  4. Text mining for search term development in systematic reviewing: A discussion of some methods and challenges.

    Science.gov (United States)

    Stansfield, Claire; O'Mara-Eves, Alison; Thomas, James

    2017-09-01

    Using text mining to aid the development of database search strings for topics described by diverse terminology has potential benefits for systematic reviews; however, methods and tools for accomplishing this are poorly covered in the research methods literature. We briefly review the literature on applications of text mining for search term development for systematic reviewing. We found that the tools can be used in 5 overarching ways: improving the precision of searches; identifying search terms to improve search sensitivity; aiding the translation of search strategies across databases; searching and screening within an integrated system; and developing objectively derived search strategies. Using a case study and selected examples, we then reflect on the utility of certain technologies (term frequency-inverse document frequency and Termine, term frequency, and clustering) in improving the precision and sensitivity of searches. Challenges in using these tools are discussed. The utility of these tools is influenced by the different capabilities of the tools, the way the tools are used, and the text that is analysed. Increased awareness of how the tools perform facilitates the further development of methods for their use in systematic reviews. Copyright © 2017 John Wiley & Sons, Ltd.
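
    The term frequency-inverse document frequency scoring mentioned above can be illustrated in a few lines of plain Python; the mini-corpus of candidate abstracts below is invented for the example and is not taken from the case study.

```python
import math
from collections import Counter

# Hypothetical mini-corpus of abstracts already retrieved for a review topic.
docs = [
    "text mining for systematic review search strategies",
    "screening and search term development for reviews",
    "clustering of candidate search terms",
]

tokenised = [d.split() for d in docs]
doc_freq = Counter(term for doc in tokenised for term in set(doc))

def tf_idf(doc_tokens):
    """Score each term: frequency within the document times log inverse document frequency."""
    counts = Counter(doc_tokens)
    n_docs = len(tokenised)
    return {t: (c / len(doc_tokens)) * math.log(n_docs / doc_freq[t]) for t, c in counts.items()}

# Terms scoring highly in one document but rare across the corpus are candidate search terms.
for doc in tokenised:
    scores = tf_idf(doc)
    print(sorted(scores, key=scores.get, reverse=True)[:3])
```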

  5. Simulation embedded artificial intelligence search method for supplier trading portfolio decision

    DEFF Research Database (Denmark)

    Feng, Donghan; Yan, Z.; Østergaard, Jacob

    2010-01-01

    An electric power supplier in the deregulated environment needs to allocate its generation capacities to participate in contract and spot markets. Different trading portfolios will provide suppliers with different future revenue streams of various distributions. The classical mean-variance (MV) m...

  6. New hybrid conjugate gradient methods with the generalized Wolfe line search.

    Science.gov (United States)

    Xu, Xiao; Kong, Fan-Yu

    2016-01-01

    The conjugate gradient method is an efficient technique for solving unconstrained optimization problems. In this paper, we form a linear combination, with parameter β k, of the DY method and the HS method, and put forward a hybrid DY-HS method. We also propose a hybrid of FR and PRP by the same means. Additionally, for the two hybrid methods, we generalize the Wolfe line search used to compute the step size α k. With the new Wolfe line search, the descent property and the global convergence of the two hybrid methods can be proved.
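
    A minimal sketch of the idea on a simple quadratic test function, assuming an HS/DY-style hybrid coefficient clipped at zero and a backtracking line search that enforces only the sufficient-decrease (Armijo) part of the Wolfe conditions; the paper's generalized Wolfe search and its exact hybrid parameters are not reproduced here.

```python
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # invented SPD matrix for the quadratic test problem
b = np.array([1.0, 1.0])

def f(x):
    return 0.5 * x @ A @ x - b @ x

def grad(x):
    return A @ x - b

def backtracking(x, d, g, alpha=1.0, rho=0.5, c1=1e-4):
    """Sufficient-decrease backtracking; a simplified stand-in for a Wolfe line search."""
    while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d) and alpha > 1e-12:
        alpha *= rho
    return alpha

def hybrid_cg(x, tol=1e-8, max_iter=100):
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                    # safeguard: restart if d is not a descent direction
            d = -g
        alpha = backtracking(x, d, g)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        # Hybrid of the HS and DY coefficients, clipped at zero (one common combination).
        beta_hs = (g_new @ y) / denom
        beta_dy = (g_new @ g_new) / denom
        beta = max(0.0, min(beta_hs, beta_dy))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

print(hybrid_cg(np.zeros(2)))   # approaches the solution of A x = b, i.e. [0.2, 0.4]
```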

  7. Search for new and improved radiolabeling methods for monoclonal antibodies

    International Nuclear Information System (INIS)

    Hiltunen, J.V.

    1993-01-01

    In this review, the selection of different radioisotopes is discussed, as well as the various traditional and newer methods of introducing the radiolabel into the antibody structure. Labeling methods for radiohalogens, for technetium and rhenium isotopes, and for 3-valent cation radiometals are reviewed. Some of the newer methods offer simplified labeling procedures, but usually the new methods are more complicated than the earlier ones. However, new labeling methods are available for almost any radioelement group, and they may result in better preservation of the original nature of the antibody and lead to better clinical results. (orig./MG)

  8. Intelligence in Artificial Intelligence

    OpenAIRE

    Datta, Shoumen Palit Austin

    2016-01-01

    The elusive quest for intelligence in artificial intelligence prompts us to consider that instituting human-level intelligence in systems may be (still) in the realm of utopia. In about a quarter century, we have witnessed the winter of AI (1990) being transformed and transported to the zenith of tabloid fodder about AI (2015). The discussion at hand is about the elements that constitute the canonical idea of intelligence. The delivery of intelligence as a pay-per-use-service, popping out of ...

  9. On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process

    Science.gov (United States)

    Hongzhi, Zhao; Jian, Zhang

    2018-03-01

    The paper presents an approach for the intelligent design and planning of process routes based on the gun breech machining process, addressing several problems such as the complexity of gun breech machining, tedious route design and the long cycle of the traditional, hard-to-manage process route. Based on the gun breech machining process, an intelligent process route design and planning system is developed using DEST and VC++. The system includes two functional modules: intelligent process route design and process route planning. The intelligent design module, through analysis of the gun breech machining process, summarizes breech process knowledge to complete the design of the knowledge base and inference engine; the gun breech process route is then output intelligently. On the basis of the intelligent design module, the final process route is made, edited and managed in the process route planning module.

  10. Managing Sustainability with the Support of Business Intelligence Methods and Tools

    Science.gov (United States)

    Petrini, Maira; Pozzebon, Marlei

    In this paper we explore the role of business intelligence (BI) in helping to support the management of sustainability in contemporary firms. The concepts of sustainability and corporate social responsibility (CSR) are among the most important themes to have emerged in the last decade at the global level. We suggest that BI methods and tools have an important but not yet well studied role to play in helping organizations implement and monitor sustainable and socially responsible business practices. Using grounded theory, the main contribution of our study is to propose a conceptual model that seeks to support the process of definition and monitoring of socio-environmental indicators and the relationship between their management and business strategy.

  11. An intelligent despeckling method for swept source optical coherence tomography images of skin

    Science.gov (United States)

    Adabi, Saba; Mohebbikarkhoran, Hamed; Mehregan, Darius; Conforto, Silvia; Nasiriavanaki, Mohammadreza

    2017-03-01

    Optical coherence tomography (OCT) is a powerful high-resolution imaging method with broad biomedical application. Nonetheless, OCT images suffer from a multiplicative artefact called speckle, a result of the coherent imaging of the system. Digital filters have become a ubiquitous means of speckle reduction. Noting that there is still room for improvement in OCT despeckling, we propose an intelligent speckle reduction framework based on morphological, textural and optical features of the OCT tissue, in which a trained network selects the winning filter that adaptively suppresses the speckle noise while preserving the structural information of the OCT signal. These parameters are calculated at different steps of the procedure and used in the designed artificial neural network decider, which selects the best denoising technique for each segment of the image. Results of training show that the dominant filter is BM3D, from the last category.

  12. Modification of the Armijo line search to satisfy the convergence properties of HS method

    Directory of Open Access Journals (Sweden)

    Mohammed Belloufi

    2013-07-01

    Full Text Available The Hestenes-Stiefel (HS) conjugate gradient algorithm is a useful tool for unconstrained numerical optimization, which has good numerical performance but no global convergence result under traditional line searches. This paper proposes a line search technique that guarantees the global convergence of the Hestenes-Stiefel (HS) conjugate gradient method. Numerical tests are presented to validate the different approaches.

  13. The use of foresight methods in strategic raw materials intelligence - an international review

    Science.gov (United States)

    Konrat Martins, Marco Antonio; Bodo, Balazs; Falck, Eberhard

    2017-04-01

    Foresight methods are systematic attempts to look into the longer-term future of science, society, economy and technology. There is a range of tools and techniques that can be used individually or in combination, most commonly classified as qualitative, quantitative or semi-quantitative methods, following an exploratory or normative approach. These tools can help to identify longer-term visions, orient policy formulation and decisions, and trigger actions, among other objectives. There is an identified lack of European strategic foresight knowledge in the raw materials domain. Since the European Raw Materials Initiative was launched in 2008, the EU has been attempting to overcome challenges related to future access to non-energy and non-agricultural raw materials. In this context, the ongoing H2020 project MICA (Mineral Intelligence Capacity Analysis, Grant Agreement No. 689648) has been launched to answer stakeholders' needs by consolidating relevant data, determining relevant methods and tools, and investigating Raw Materials Intelligence options for European mineral policy development, all tailored to fit under the umbrella of a European Raw Materials Intelligence Capacity Platform (EU-RMICP). As part of the MICA activities, an assessment of best practices and benchmarks of international raw materials foresight case studies has been carried out in order to review how EU and non-EU countries have employed foresight. A pool of 30 case studies has been collected and reviewed internationally, one third of which were selected for detailed assessment. These were classified according to their background and goals, the methods employed, and the purpose of each method in the study: a total of 12 different methods were identified in these studies. For longer time frames, qualitative predictive methods such as Scenario Development have been repeatedly observed for mineral raw materials foresight studies. Substantial variations were observed in

  14. Intelligent method for diagnosing structural faults of rotating machinery using ant colony optimization.

    Science.gov (United States)

    Li, Ke; Chen, Peng

    2011-01-01

    Structural faults, such as unbalance, misalignment and looseness, etc., often occur in the shafts of rotating machinery. These faults may cause serious machine accidents and lead to great production losses. This paper proposes an intelligent method for diagnosing structural faults of rotating machinery using ant colony optimization (ACO) and relative ratio symptom parameters (RRSPs) in order to detect faults and distinguish fault types at an early stage. New symptom parameters called "relative ratio symptom parameters" are defined for reflecting the features of vibration signals measured in each state. Synthetic detection index (SDI) using statistical theory has also been defined to evaluate the applicability of the RRSPs. The SDI can be used to indicate the fitness of a RRSP for ACO. Lastly, this paper also compares the proposed method with the conventional neural networks (NN) method. Practical examples of fault diagnosis for a centrifugal fan are provided to verify the effectiveness of the proposed method. The verification results show that the structural faults often occurring in the centrifugal fan, such as unbalance, misalignment and looseness states are effectively identified by the proposed method, while these faults are difficult to detect using conventional neural networks.

  15. Program for searching for semiempirical parameters by the MNDO method

    International Nuclear Information System (INIS)

    Bliznyuk, A.A.; Voityuk, A.A.

    1987-01-01

    The authors describe a program for optimizing the atomic parameter sets used in the MNDO method, which varies not only the parameters but also allows simple changes in the calculation scheme. The target function covers properties such as formation enthalpies, dipole moments, ionization potentials, and geometrical parameters. The software used to minimize the target function is based on the Nelder-Mead simplex method and on the Fletcher variable-metric method. The program is written in FORTRAN IV and implemented on the ES computer
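
    The role of a simplex search in fitting semiempirical parameters can be sketched in a few lines; the toy "target function" below (a least-squares misfit against two invented reference values) merely stands in for the real mix of formation enthalpies, dipole moments, ionization potentials and geometries, and SciPy's Nelder-Mead implementation is assumed.

```python
import numpy as np
from scipy.optimize import minimize

# Invented reference data standing in for experimental formation enthalpies, etc.
reference = np.array([1.8, -0.4])

def model(params):
    """Toy stand-in for an MNDO-style property calculation with two adjustable parameters."""
    a, c = params
    return np.array([a * 2.0 + c, a - c * 3.0])

def target(params):
    """Sum of squared deviations between calculated and reference properties."""
    return float(np.sum((model(params) - reference) ** 2))

result = minimize(target, x0=[0.0, 0.0], method="Nelder-Mead")
print(result.x, result.fun)     # optimized parameters and residual misfit
```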

  16. An Intelligent Method for Structural Reliability Analysis Based on Response Surface

    Institute of Scientific and Technical Information of China (English)

    桂劲松; 刘红; 康海贵

    2004-01-01

    As water depth increases, the structural safety and reliability of a system become more and more important and challenging. Therefore, structural reliability methods must be applied in ocean engineering design, such as offshore platform design. If the performance function is known in structural reliability analysis, the first-order second-moment method is often used. If the performance function cannot be expressed explicitly, the response surface method is usually used because it has a very clear train of thought and simple programming. However, the traditional response surface method fits a response surface of quadratic polynomials, where the problem of accuracy cannot be solved, because the true limit state surface can be fitted well only in the area near the checking point. In this paper, an intelligent computing method based on the whole response surface is proposed, which can be used when the performance function cannot be expressed explicitly in structural reliability analysis. In this method, a response surface of the fuzzy neural network for the whole area is constructed first, and then the structural reliability is calculated by the genetic algorithm. In the proposed method, all the sample points for training the network come from the whole area, so the true limit state surface in the whole area can be fitted. Calculational examples and comparative analysis show that the proposed method is much better than the traditional response surface method of quadratic polynomials, because the amount of finite element analysis calculation is largely reduced, the accuracy of calculation is improved, and the true limit state surface can be fitted very well in the whole area. Thus, the method proposed in this paper is suitable for engineering application.

  17. An effective suggestion method for keyword search of databases

    KAUST Repository

    Huang, Hai; Chen, Zonghai; Liu, Chengfei; Huang, He; Zhang, Xiangliang

    2016-01-01

    This paper solves the problem of providing high-quality suggestions for user keyword queries over databases. With the assumption that the returned suggestions are independent, existing query suggestion methods over databases score candidate

  18. Search and foraging behaviors from movement data: A comparison of methods.

    Science.gov (United States)

    Bennison, Ashley; Bearhop, Stuart; Bodey, Thomas W; Votier, Stephen C; Grecian, W James; Wakefield, Ewan D; Hamer, Keith C; Jessopp, Mark

    2018-01-01

    Search behavior is often used as a proxy for foraging effort within studies of animal movement, despite it being only one part of the foraging process, which also includes prey capture. While methods for validating prey capture exist, many studies rely solely on behavioral annotation of animal movement data to identify search and infer prey capture attempts. However, the degree to which search correlates with prey capture is largely untested. This study applied seven behavioral annotation methods to identify search behavior from GPS tracks of northern gannets (Morus bassanus), and compared outputs to the occurrence of dives recorded by simultaneously deployed time-depth recorders. We tested how behavioral annotation methods vary in their ability to identify search behavior leading to dive events. There was considerable variation in the number of dives occurring within search areas across methods. Hidden Markov models proved to be the most successful, with 81% of all dives occurring within areas identified as search. k-Means clustering and first passage time had the highest rates of dives occurring outside identified search behavior. First passage time and hidden Markov models had the lowest rates of false positives, identifying fewer search areas with no dives. All behavioral annotation methods had advantages and drawbacks in terms of the complexity of analysis and ability to reflect prey capture events while minimizing the number of false positives and false negatives. We used these results, with consideration of analytical difficulty, to provide advice on the most appropriate methods for use where prey capture behavior is not available. This study highlights a need to critically assess and carefully choose a behavioral annotation method suitable for the research question being addressed, or resulting species management frameworks established.

  19. The use of artificial intelligence methods for visual analysis of properties of surface layers

    Directory of Open Access Journals (Sweden)

    Tomasz Wójcicki

    2014-12-01

    Full Text Available The article presents a selected area of research on the possibility of automatic prediction of material properties based on the analysis of digital images. An original, holistic model for forecasting the properties of surface layers is presented, based on a multi-step process that includes selected methods of image processing and analysis, inference with the use of a priori knowledge bases and multi-valued fuzzy logic, and simulation with the use of finite element methods. Surface layer characteristics and the core technologies of their production processes, such as mechanical, thermal, thermo-mechanical, thermo-chemical, electrochemical and physical, are discussed. The methods developed in the model for the classification of images of the surface layers are shown. The objectives of the use of selected methods of processing and analysis of digital images, including techniques for improving the quality of images, segmentation, morphological transformation, pattern recognition and simulation of physical phenomena in the structures of materials, are described. Keywords: image analysis, surface layer, artificial intelligence, fuzzy logic

  20. Study (Prediction of Main Pipes Break Rates in Water Distribution Systems Using Intelligent and Regression Methods

    Directory of Open Access Journals (Sweden)

    Massoud Tabesh

    2011-07-01

    Full Text Available Optimum operation of water distribution networks is one of the priorities of sustainable development of water resources, considering the issues of increasing efficiency and decreasing water losses. One of the key subjects in the optimum operational management of water distribution systems is preparing rehabilitation and replacement schemes, predicting pipe break rates and evaluating their reliability. Several approaches have been presented in recent years for predicting pipe failure rates, each of which requires special data sets. Deterministic models based on age, deterministic multi-variable models and stochastic group modeling are examples of solutions which relate pipe break rates to parameters like age, material and diameter. In this paper, besides the mentioned parameters, more factors such as pipe depth and hydraulic pressure are considered as well. Then pipe burst rates are predicted using the multi-variable regression method, intelligent approaches (artificial neural network and neuro-fuzzy models) and the evolutionary polynomial regression (EPR) method. To evaluate the results of the different approaches, a case study is carried out in a part of the Mashhad water distribution network. The results show the capability and advantages of the ANN and EPR methods in predicting pipe break rates, in comparison with the neuro-fuzzy and multi-variable regression methods.
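
    A minimal sketch of the multi-variable regression step, assuming a handful of invented records (age in years, diameter in mm, depth in m, pressure in bar) rather than the Mashhad data set; the ANN, neuro-fuzzy and EPR models are not reproduced here.

```python
import numpy as np

# Invented pipe records: [age (years), diameter (mm), depth (m), pressure (bar)].
X = np.array([
    [35, 100, 1.2, 4.0],
    [12, 200, 1.5, 3.0],
    [48,  80, 1.0, 5.5],
    [22, 150, 1.4, 3.5],
    [40, 100, 1.1, 5.0],
])
y = np.array([0.62, 0.15, 0.88, 0.30, 0.71])   # breaks per km per year (invented)

# Ordinary least squares with an intercept column.
X_design = np.column_stack([np.ones(len(X)), X])
coeffs, *_ = np.linalg.lstsq(X_design, y, rcond=None)

new_pipe = np.array([1, 30, 120, 1.3, 4.2])    # intercept term plus the four predictors
print("predicted break rate:", new_pipe @ coeffs)
```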

  1. Improved Ordinary Measure and Image Entropy Theory based intelligent Copy Detection Method

    Directory of Open Access Journals (Sweden)

    Dengpan Ye

    2011-10-01

    Full Text Available Nowadays, more and more multimedia websites appear in social networks. This brings some security problems, such as privacy, piracy, disclosure of sensitive contents and so on. Aiming at copyright protection, the copy detection technology of multimedia contents has become a hot topic. In our previous work, a new computer-based copyright control system used to detect the media was proposed. Based on this system, this paper proposes an improved media feature matching measure and an entropy-based copy detection method. The Levenshtein distance is used to enhance the matching degree when used as the feature matching measure in copy detection. For entropy-based copy detection, we fuse two features of the extracted entropy matrix. First, we extract the entropy matrix of the image and normalize it. Then, we fuse the eigenvalue feature and the transfer matrix feature of the entropy matrix. The fused features are used for image copy detection. The experiments show that, compared with using these two kinds of features singly for image detection, the feature fusion matching method shows apparent robustness and effectiveness. The fused feature gives a high detection rate for copy images which have undergone attacks such as noise, compression, zoom, rotation and so on. Compared with the referenced methods, the proposed method is more intelligent and can achieve good performance.
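
    The Levenshtein distance used to soften the feature matching can be computed with the classic dynamic-programming recurrence; the two short quantized feature strings below are invented for the example.

```python
def levenshtein(a, b):
    """Minimum number of insertions, deletions and substitutions turning a into b."""
    previous = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        current = [i]
        for j, cb in enumerate(b, start=1):
            current.append(min(
                previous[j] + 1,                     # deletion
                current[j - 1] + 1,                  # insertion
                previous[j - 1] + (ca != cb),        # substitution (0 if characters match)
            ))
        previous = current
    return previous[-1]

# Hypothetical quantized feature strings extracted from an original and a suspected copy.
original_features = "ACBBDAC"
candidate_features = "ACBDDAC"
distance = levenshtein(original_features, candidate_features)
similarity = 1 - distance / max(len(original_features), len(candidate_features))
print(distance, similarity)      # 1 edit -> similarity of roughly 0.857
```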

  2. Artificial intelligence applications in information and communication technologies

    CERN Document Server

    Bouguila, Nizar

    2015-01-01

    This book presents various recent applications of Artificial Intelligence in Information and Communication Technologies, such as Search and Optimization methods, Machine Learning, Data Representation and Ontologies, and Multi-agent Systems. The main aim of this book is to help Information and Communication Technologies (ICT) practitioners in efficiently managing their platforms using AI tools and methods, and to provide them with sufficient Artificial Intelligence background to deal with real-life problems.

  3. Algorithms in ambient intelligence

    NARCIS (Netherlands)

    Aarts, E.H.L.; Korst, J.H.M.; Verhaegh, W.F.J.; Verhaegh, W.F.J.; Aarts, E.H.L.; Korst, J.H.M.

    2004-01-01

    In this chapter, we discuss the new paradigm for user-centered computing known as ambient intelligence and its relation with methods and techniques from the field of computational intelligence, including problem solving, machine learning, and expert systems.

  4. Design of Intelligent Hydraulic Excavator Control System Based on PID Method

    Science.gov (United States)

    Zhang, Jun; Jiao, Shengjie; Liao, Xiaoming; Yin, Penglong; Wang, Yulin; Si, Kuimao; Zhang, Yi; Gu, Hairong

    Most domestically designed hydraulic excavators adopt the constant power design method and set 85%~90% of engine power as the hydraulic system absorption power, which causes high energy loss due to the power mismatch between the engine and the pump. Since the variation of the engine rotational speed can sense the power shift of the load, it provides a new way to adjust the power matching between engine and pump through the engine speed. Based on a negative flux hydraulic system, an intelligent hydraulic excavator control system was designed using the rotational speed sensing method to improve energy efficiency. The control system consists of an engine control module, a pump power adjustment module, an engine idle module and a system fault diagnosis module. A special PLC with CAN bus was used to acquire the sensor signals and adjust the pump absorption power according to load variation. Four energy-saving control strategies combined with the constant power method were employed to improve fuel utilization. Three power modes (H, S and L mode) were designed to meet different working statuses; an auto-idle function was employed to save energy through two pressure switches that detect the work status, with 1300 rpm set as the idle speed according to the engine fuel consumption curve. A transient overload function was designed for deep digging within a short time without spending extra fuel. An incremental PID method was employed to realize power matching between engine and pump; the variation of the rotational speed was taken as the PID algorithm's input, and the current of the proportional valve of the variable displacement pump was the PID's output. The results indicated that auto idle could decrease fuel consumption by 33.33% compared with working at the maximum speed of H mode, and that the PID control method could make full use of the maximum engine power in each power mode and keep the engine speed within a stable range. Application of the rotational speed sensing method provides a reliable method to improve the excavator's energy efficiency and
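
    A minimal sketch of the speed-sensing control loop described above: an incremental PID whose input is the deviation of engine speed from its set point and whose output nudges the pump's proportional-valve current. All gains, set points and the crude first-order "engine" response below are invented for illustration and are not the paper's tuning.

```python
class IncrementalPID:
    """Incremental PID: outputs a change in valve current from successive speed errors."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.e1 = 0.0   # previous error
        self.e2 = 0.0   # error before that

    def step(self, error):
        delta = (self.kp * (error - self.e1)
                 + self.ki * error
                 + self.kd * (error - 2 * self.e1 + self.e2))
        self.e2, self.e1 = self.e1, error
        return delta

# Invented numbers: 1800 rpm set point, crude engine response to pump loading.
set_point_rpm = 1800.0
speed_rpm = 1650.0
valve_current_ma = 400.0
pid = IncrementalPID(kp=0.08, ki=0.02, kd=0.01)

for _ in range(20):
    error = set_point_rpm - speed_rpm
    valve_current_ma -= pid.step(error)          # reduce pump absorption when speed sags
    # Toy plant: speed recovers toward the set point and reacts to the valve current.
    speed_rpm += 0.1 * (set_point_rpm - speed_rpm) + 0.05 * (450.0 - valve_current_ma)

print(round(speed_rpm), round(valve_current_ma))
```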

  5. Short Term Gain, Long Term Pain: Informal Job Search Methods and Post-Displacement Outcomes

    OpenAIRE

    Green, Colin

    2012-01-01

    This paper examines the role of informal job search methods on the labour market outcomes of displaced workers. Informal job search methods could alleviate short-term labour market difficulties of displaced workers by providing information on job opportunities, allowing them to signal their productivity and may mitigate wage losses through better post-displacement job matching. However if displacement results from reductions in demand for specific sectors/skills, the use of informal job searc...

  6. Wide Binaries in TGAS: Search Method and First Results

    Science.gov (United States)

    Andrews, Jeff J.; Chanamé, Julio; Agüeros, Marcel A.

    2018-04-01

    Half of all stars reside in binary systems, many of which have orbital separations in excess of 1000 AU. Such binaries are typically identified in astrometric catalogs by matching the proper motion vectors of close stellar pairs. We present a fully Bayesian method that properly takes into account positions, proper motions, parallaxes, and their correlated uncertainties to identify widely separated stellar binaries. After applying our method to the >2 × 10^6 stars in the Tycho-Gaia astrometric solution from Gaia DR1, we identify over 6000 candidate wide binaries. For those pairs with separations less than 40,000 AU, we determine the contamination rate to be ~5%. This sample has an orbital separation (a) distribution that is roughly flat in log space for separations less than ~5000 AU and follows a power law of a^(-1.6) at larger separations.

  7. Effectiveness of Agile Implementation Methods in Business Intelligence Projects from an End-user Perspective

    Directory of Open Access Journals (Sweden)

    Anna Maria Misiak

    2016-06-01

    Full Text Available The global Business Intelligence (BI) market grew by 10% in 2013 according to the Gartner Report. Today organizations require better use of data and analytics to support their business decisions. Internet power and changing business trends have provided a broad term for data analytics - Big Data. To be able to handle it and leverage the value of having access to Big Data, organizations have no other choice than to get proper systems implemented and working. However, traditional methods are not efficient for changing business needs. The long time between project start and go-live causes a gap between the initial solution blueprint and the actual user requirements at the end of the project. This article presents the latest market trends in BI systems implementation by comparing Agile with traditional methods. It presents a case study conducted in a large telecommunications company (20K employees) and the results of a pilot study conducted in three large companies: telecommunications, digital, and insurance. Both studies show that Agile methods might be more effective in BI projects from an end-user perspective and give first results and added value in a much shorter time compared to a traditional approach.

  8. Intelligent Photovoltaic Systems by Combining the Improved Perturbation Method of Observation and Sun Location Tracking

    Science.gov (United States)

    Wang, Yajie; Shi, Yunbo; Yu, Xiaoyu; Liu, Yongjie

    2016-01-01

    Currently, tracking in photovoltaic (PV) systems suffers from some problems such as high energy consumption, poor anti-interference performance, and large tracking errors. This paper presents a solar PV tracking system on the basis of an improved perturbation and observation method, which maximizes photoelectric conversion efficiency. According to the projection principle, we design a sensor module with a light-intensity-detection module for environmental light-intensity measurement. The effect of environmental factors on the system operation is reduced, and intelligent identification of the weather is realized. This system adopts the discrete-type tracking method to reduce power consumption. A mechanical structure with a level-pitch double-degree-of-freedom is designed, and attitude correction is performed by closed-loop control. A worm-and-gear mechanism is added, and the reliability, stability, and precision of the system are improved. Finally, the perturbation and observation method designed and improved by this study was tested by simulated experiments. The experiments verified that the photoelectric sensor resolution can reach 0.344°, the tracking error is less than 2.5°, the largest improvement in the charge efficiency can reach 44.5%, and the system steadily and reliably works. PMID:27327657
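
    A minimal sketch of the basic perturb-and-observe step underlying such trackers; the paper's improvements, the light-intensity sensor module and the sun-tracking mechanics are not modeled, and the single-peak PV power curve below is invented.

```python
def pv_power(voltage):
    """Invented single-peak PV power curve with a maximum near 17.5 V."""
    return max(0.0, 120.0 - 0.4 * (voltage - 17.5) ** 2)

def perturb_and_observe(voltage=12.0, step=0.5, iterations=40):
    """Classic P&O: keep perturbing in the direction that increased the measured power."""
    power = pv_power(voltage)
    direction = +1
    for _ in range(iterations):
        voltage += direction * step
        new_power = pv_power(voltage)
        if new_power < power:        # power dropped, so reverse the perturbation direction
            direction = -direction
        power = new_power
    return voltage, power

print(perturb_and_observe())         # oscillates around the maximum power point (~17.5 V, 120 W)
```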

  9. Review on applications of artificial intelligence methods for dam and reservoir-hydro-environment models.

    Science.gov (United States)

    Allawi, Mohammed Falah; Jaafar, Othman; Mohamad Hamzah, Firdaus; Abdullah, Sharifah Mastura Syed; El-Shafie, Ahmed

    2018-05-01

    Efficacious operation of dam and reservoir systems could not only guarantee a defence policy against natural hazards but also identify rules to meet the water demand. Successful operation of dam and reservoir systems to ensure optimal use of water resources could be unattainable without accurate and reliable simulation models. Given the highly stochastic nature of hydrologic parameters, developing accurate predictive models that efficiently mimic such complex patterns is a growing domain of research. During the last two decades, artificial intelligence (AI) techniques have been significantly utilized to attain robust models able to handle different stochastic hydrological parameters. AI techniques have also shown considerable progress in finding optimal rules for reservoir operation. This review explores the history of developing AI for reservoir inflow forecasting and the prediction of evaporation from a reservoir, as the major components of reservoir simulation. In addition, a critical assessment of the advantages and disadvantages of AI simulation methods integrated with optimization methods is reported. Future research on the potential of utilizing new innovative methods based on AI techniques for reservoir simulation and optimization models is also discussed. Finally, a proposal for a new mathematical procedure to accomplish a realistic evaluation of overall optimization model performance (reliability, resilience, and vulnerability indices) is recommended.

  10. An Intelligent Fleet Condition-Based Maintenance Decision Making Method Based on Multi-Agent

    Directory of Open Access Journals (Sweden)

    Bo Sun

    2012-01-01

    Full Text Available According to the demand for condition-based maintenance online decision making within a mission-oriented fleet, an intelligent maintenance decision making method based on multi-agent technology and heuristic rules is proposed. The process of condition-based maintenance within an aircraft fleet (each aircraft containing one or more Line Replaceable Modules), based on multiple maintenance thresholds, is analyzed. The process is then abstracted into a multi-agent model, a two-layer model structure containing host negotiation and independent negotiation is established, and the heuristic rules applied to global and local maintenance decision making are proposed. Based on the Contract Net Protocol and the heuristic rules, the maintenance decision making algorithm is put forward. Finally, a fleet consisting of 10 aircraft on a 3-wave continuous mission is used to verify this method. Simulation results indicate that this method can improve the availability of the fleet, meet mission demands, rationalize the utilization of support resources and provide support for online maintenance decision making within a mission-oriented fleet.

  11. New evaluation methods for conceptual design selection using computational intelligence techniques

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Hong Zhong; Liu, Yu; Li, Yanfeng; Wang, Zhonglai [University of Electronic Science and Technology of China, Chengdu (China); Xue, Lihua [Higher Education Press, Beijing (China)

    2013-03-15

    The conceptual design selection, which aims at choosing the best or most desirable design scheme among several candidates for the subsequent detailed design stage, oftentimes requires a set of tools to conduct design evaluation. Using computational intelligence techniques, such as fuzzy logic, neural network, genetic algorithm, and physical programming, several design evaluation methods are put forth in this paper to realize the conceptual design selection under different scenarios. Depending on whether an evaluation criterion can be quantified or not, the linear physical programming (LPP) model and the RAOGA-based fuzzy neural network (FNN) model can be utilized to evaluate design alternatives in conceptual design stage. Furthermore, on the basis of Vanegas and Labib's work, a multi-level conceptual design evaluation model based on the new fuzzy weighted average (NFWA) and the fuzzy compromise decision-making method is developed to solve the design evaluation problem consisting of many hierarchical criteria. The effectiveness of the proposed methods is demonstrated via several illustrative examples.

  12. New evaluation methods for conceptual design selection using computational intelligence techniques

    International Nuclear Information System (INIS)

    Huang, Hong Zhong; Liu, Yu; Li, Yanfeng; Wang, Zhonglai; Xue, Lihua

    2013-01-01

    The conceptual design selection, which aims at choosing the best or most desirable design scheme among several candidates for the subsequent detailed design stage, oftentimes requires a set of tools to conduct design evaluation. Using computational intelligence techniques, such as fuzzy logic, neural network, genetic algorithm, and physical programming, several design evaluation methods are put forth in this paper to realize the conceptual design selection under different scenarios. Depending on whether an evaluation criterion can be quantified or not, the linear physical programming (LPP) model and the RAOGA-based fuzzy neural network (FNN) model can be utilized to evaluate design alternatives in conceptual design stage. Furthermore, on the basis of Vanegas and Labib's work, a multi-level conceptual design evaluation model based on the new fuzzy weighted average (NFWA) and the fuzzy compromise decision-making method is developed to solve the design evaluation problem consisting of many hierarchical criteria. The effectiveness of the proposed methods is demonstrated via several illustrative examples.

  13. Intelligent Photovoltaic Systems by Combining the Improved Perturbation Method of Observation and Sun Location Tracking.

    Directory of Open Access Journals (Sweden)

    Yajie Wang

    Full Text Available Currently, tracking in photovoltaic (PV) systems suffers from some problems such as high energy consumption, poor anti-interference performance, and large tracking errors. This paper presents a solar PV tracking system on the basis of an improved perturbation and observation method, which maximizes photoelectric conversion efficiency. According to the projection principle, we design a sensor module with a light-intensity-detection module for environmental light-intensity measurement. The effect of environmental factors on the system operation is reduced, and intelligent identification of the weather is realized. This system adopts the discrete-type tracking method to reduce power consumption. A mechanical structure with a level-pitch double-degree-of-freedom is designed, and attitude correction is performed by closed-loop control. A worm-and-gear mechanism is added, and the reliability, stability, and precision of the system are improved. Finally, the perturbation and observation method designed and improved by this study was tested by simulated experiments. The experiments verified that the photoelectric sensor resolution can reach 0.344°, the tracking error is less than 2.5°, the largest improvement in the charge efficiency can reach 44.5%, and the system steadily and reliably works.

  14. Intelligent Photovoltaic Systems by Combining the Improved Perturbation Method of Observation and Sun Location Tracking.

    Science.gov (United States)

    Wang, Yajie; Shi, Yunbo; Yu, Xiaoyu; Liu, Yongjie

    2016-01-01

    Currently, tracking in photovoltaic (PV) systems suffers from some problems such as high energy consumption, poor anti-interference performance, and large tracking errors. This paper presents a solar PV tracking system on the basis of an improved perturbation and observation method, which maximizes photoelectric conversion efficiency. According to the projection principle, we design a sensor module with a light-intensity-detection module for environmental light-intensity measurement. The effect of environmental factors on the system operation is reduced, and intelligent identification of the weather is realized. This system adopts the discrete-type tracking method to reduce power consumption. A mechanical structure with a level-pitch double-degree-of-freedom is designed, and attitude correction is performed by closed-loop control. A worm-and-gear mechanism is added, and the reliability, stability, and precision of the system are improved. Finally, the perturbation and observation method designed and improved by this study was tested by simulated experiments. The experiments verified that the photoelectric sensor resolution can reach 0.344°, the tracking error is less than 2.5°, the largest improvement in the charge efficiency can reach 44.5%, and the system steadily and reliably works.

  15. The 2P1/2 → 2P3/2 laser transition in atomic iodine and the problem of search for signals from extraterrestrial intelligence

    International Nuclear Information System (INIS)

    Kutaev, Yu F; Mankevich, S K; Nosach, O Yu; Orlov, E P

    2007-01-01

    It is proposed to search for signals from extraterrestrial intelligence (ETI) at a wavelength of 1.315 μm, the laser 2P1/2 → 2P3/2 transition in atomic iodine, which can be used for this purpose as a natural frequency reference. The search at this wavelength is promising because active quantum filters (AQFs) operating at the quantum sensitivity limit have been developed for it; they are capable of receiving laser signals consisting of only a few photons against the background of emission from the star under study. In addition, high-power iodine lasers emitting diffraction-limited radiation at 1.315 μm have been created, which a highly developed ETI may also possess. If an ETI sends in our direction a diffraction-limited 10-ns, 1-kJ laser pulse with a beam diameter of 10 m, a receiver with an AQF mounted on a ten-metre extra-atmospheric optical telescope can detect this signal at a distance of up to 300 light years, irrespective of the ETI position on the celestial sphere. The realisation of projects for manufacturing optical telescopes of 30 m diameter will increase the search range up to 2700 light years. The weak absorption of 1.315-μm radiation in the Earth's atmosphere (the signal is attenuated by less than 20%) allows the search for ETI signals using ground telescopes equipped with adaptive optical systems. (laser applications and other topics in quantum electronics)

  16. Prediction of 5-year overall survival in cervical cancer patients treated with radical hysterectomy using computational intelligence methods.

    Science.gov (United States)

    Obrzut, Bogdan; Kusy, Maciej; Semczuk, Andrzej; Obrzut, Marzanna; Kluska, Jacek

    2017-12-12

    Computational intelligence methods, including non-linear classification algorithms, can be used in medical research and practice as a decision making tool. This study aimed to evaluate the usefulness of artificial intelligence models for 5-year overall survival prediction in patients with cervical cancer treated by radical hysterectomy. The data set was collected from 102 patients with cervical cancer FIGO stage IA2-IIB who underwent primary surgical treatment. Twenty-three demographic, tumor-related parameters and selected perioperative data of each patient were collected. The simulations involved six computational intelligence methods: the probabilistic neural network (PNN), multilayer perceptron network, gene expression programming classifier, support vector machines algorithm, radial basis function neural network and k-Means algorithm. The prediction ability of the models was determined based on the accuracy, sensitivity, specificity, as well as the area under the receiver operating characteristic curve. The results of the computational intelligence methods were compared with the results of linear regression analysis as a reference model. The best results were obtained by the PNN model. This neural network provided very high prediction ability with an accuracy of 0.892 and sensitivity of 0.975. The area under the receiver operating characteristics curve of PNN was also high, 0.818. The outcomes obtained by other classifiers were markedly worse. The PNN model is an effective tool for predicting 5-year overall survival in cervical cancer patients treated with radical hysterectomy.

  17. Searching for Suicide Methods: Accessibility of Information About Helium as a Method of Suicide on the Internet.

    Science.gov (United States)

    Gunnell, David; Derges, Jane; Chang, Shu-Sen; Biddle, Lucy

    2015-01-01

    Helium gas suicides have increased in England and Wales; easy-to-access descriptions of this method on the Internet may have contributed to this rise. To investigate the availability of information on using helium as a method of suicide and trends in searching about this method on the Internet. We analyzed trends in (a) Google searching (2004-2014) and (b) hits on a Wikipedia article describing helium as a method of suicide (2013-2014). We also investigated the extent to which helium was described as a method of suicide on web pages and discussion forums identified via Google. We found no evidence of rises in Internet searching about suicide using helium. News stories about helium suicides were associated with increased search activity. The Wikipedia article may have been temporarily altered to increase awareness of suicide using helium around the time of a celebrity suicide. Approximately one third of the links retrieved using Google searches for suicide methods mentioned helium. Information about helium as a suicide method is readily available on the Internet; the Wikipedia article describing its use was highly accessed following celebrity suicides. Availability of online information about this method may contribute to rises in helium suicides.

  18. COMPUTER-IMPLEMENTED METHOD OF PERFORMING A SEARCH USING SIGNATURES

    DEFF Research Database (Denmark)

    2017-01-01

    A computer-implemented method of processing a query vector and a data vector, comprising: generating a set of masks and a first set of multiple signatures and a second set of multiple signatures by applying the set of masks to the query vector and the data vector, respectively, and generating candidate pairs, of a first signature and a second signature, by identifying matches of a first signature and a second signature. The set of masks comprises a configuration of the elements that is a Hadamard code; a permutation of a Hadamard code; or a code that deviates from a Hadamard code
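
    A rough sketch of one way to read the idea, assuming sign-based signatures built from the rows of a Sylvester-constructed Hadamard matrix used as masks; the query and data vectors are invented, and the claimed method's actual signature and candidate-pair generation is more involved than this illustration.

```python
import numpy as np

def hadamard(order):
    """Sylvester construction: order must be a power of two."""
    h = np.array([[1]])
    while h.shape[0] < order:
        h = np.block([[h, h], [h, -h]])
    return h

def signature(vector, masks):
    """One bit per mask: the sign of the masked projection of the vector."""
    return tuple(int(masks[i] @ vector > 0) for i in range(masks.shape[0]))

masks = hadamard(8)                          # 8 masks with a Hadamard-code configuration
query = np.array([0.9, -0.2, 0.4, 0.1, -0.5, 0.3, 0.0, 0.7])
data = np.array([0.8, -0.1, 0.5, 0.2, -0.4, 0.2, 0.1, 0.6])   # similar vector

q_sig, d_sig = signature(query, masks), signature(data, masks)
matches = sum(q == d for q, d in zip(q_sig, d_sig))
print(q_sig, d_sig, matches)                 # many matching bits -> candidate pair
```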

  19. An effective suggestion method for keyword search of databases

    KAUST Repository

    Huang, Hai

    2016-09-09

    This paper solves the problem of providing high-quality suggestions for user keyword queries over databases. Under the assumption that the returned suggestions are independent, existing query suggestion methods over databases score candidate suggestions individually and return the top-k best of them. However, the top-k suggestions have high redundancy with respect to the topics. To provide informative suggestions, the returned k suggestions are expected to be diverse, i.e., to simultaneously maximize the relevance to the user query and the diversity with respect to topics that the user might be interested in. In this paper, an objective function considering both factors is defined for evaluating a suggestion set. We show that maximizing the objective function is a submodular function maximization problem subject to n matroid constraints, which is an NP-hard problem. A greedy approximation algorithm with an approximation ratio O((Formula presented.)) is also proposed. Experimental results show that our suggestion method outperforms other methods in providing relevant and diverse suggestions. © 2016 Springer Science+Business Media New York
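
    A toy sketch of the greedy idea: pick suggestions one at a time, each time taking the candidate with the best trade-off between relevance to the query and novelty with respect to what is already selected; the candidates, their scores and the λ weighting are invented and the real objective and matroid constraints are not reproduced.

```python
# Hypothetical candidate suggestions with (relevance, topic) pairs.
candidates = {
    "database keyword search":     (0.95, "search"),
    "keyword query suggestion":    (0.90, "search"),
    "query topic diversification": (0.80, "diversity"),
    "result ranking":              (0.70, "ranking"),
}

def greedy_diverse(candidates, k=3, lam=0.6):
    """Greedy selection balancing relevance and topic diversity (submodular-style objective)."""
    selected = []
    while len(selected) < k and len(selected) < len(candidates):
        def gain(name):
            relevance, topic = candidates[name]
            covered = {candidates[s][1] for s in selected}
            novelty = 0.0 if topic in covered else 1.0
            return lam * relevance + (1 - lam) * novelty
        best = max((c for c in candidates if c not in selected), key=gain)
        selected.append(best)
    return selected

print(greedy_diverse(candidates))
# ['database keyword search', 'query topic diversification', 'result ranking']
```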

  20. Beam angle optimization for intensity-modulated radiation therapy using a guided pattern search method

    International Nuclear Information System (INIS)

    Rocha, Humberto; Dias, Joana M; Ferreira, Brígida C; Lopes, Maria C

    2013-01-01

    Generally, the inverse planning of radiation therapy consists mainly of fluence optimization. Beam angle optimization (BAO) in intensity-modulated radiation therapy (IMRT) consists of selecting appropriate radiation incidence directions and may influence the quality of IMRT plans, both to enhance organ sparing and to improve tumor coverage. However, in clinical practice, most of the time, beam directions continue to be selected manually by the treatment planner without objective and rigorous criteria. The goal of this paper is to introduce a novel approach that uses beam's-eye-view dose ray tracing metrics within a pattern search method framework in the optimization of the highly non-convex BAO problem. Pattern search methods are derivative-free optimization methods that require few function evaluations to progress and converge and have the ability to better avoid local entrapment. The pattern search method framework is composed of a search step and a poll step at each iteration. The poll step performs a local search in a mesh neighborhood and ensures convergence to a local minimizer or stationary point. The search step provides the flexibility for a global search since it allows searches away from the neighborhood of the current iterate. Beam's-eye-view dose metrics assign a score to each radiation beam direction and can be used within the pattern search framework, furnishing a priori knowledge of the problem so that directions with larger dosimetric scores are tested first. A set of clinical cases of head-and-neck tumors treated at the Portuguese Institute of Oncology of Coimbra is used to discuss the potential of this approach in the optimization of the BAO problem. (paper)
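
    A stripped-down sketch of the poll step in a pattern (compass) search on a smooth two-variable test function; the BAO objective, the beam's-eye-view scoring used in the search step and the clinical data are of course not represented here.

```python
def objective(x, y):
    """Smooth two-variable stand-in for the non-convex BAO objective."""
    return (x - 1.0) ** 2 + (y + 2.0) ** 2 + 0.3 * x * y

def pattern_search(x, y, step=1.0, tol=1e-4, max_iter=500):
    best = objective(x, y)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        # Poll step: evaluate the mesh neighbours along the coordinate directions.
        for dx, dy in ((step, 0), (-step, 0), (0, step), (0, -step)):
            value = objective(x + dx, y + dy)
            if value < best:
                x, y, best = x + dx, y + dy, value
                improved = True
                break
        if not improved:
            step *= 0.5        # unsuccessful poll: refine the mesh
    return x, y, best

print(pattern_search(0.0, 0.0))   # converges near the minimizer of the quadratic
```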

  1. Monitoring of operation with artificial intelligence methods; Betriebsueberwachung mit Verfahren der Kuenstlichen Intelligenz

    Energy Technology Data Exchange (ETDEWEB)

    Bruenninghaus, H. [DMT-Gesellschaft fuer Forschung und Pruefung mbH, Essen (Germany). Geschaeftsbereich Systemtechnik

    1999-03-11

    Taking the applications `early detection of fires` and `reduction of bursts of messages` as examples, the usability of artificial intelligence (AI) methods in the monitoring of operation was examined in an R and D project. The contribution describes the concept, development and evaluation of solutions to the specified problems. As a basis for the project, a platform had to be created which made it possible to investigate different AI methods (in particular artificial neural networks). At the same time, ventilation data had to be acquired and processed so that the networks could classify the relationships between the ventilation measuring points along the air path. (orig.)

  2. Residential building energy estimation method based on the application of artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Marshall, S.; Kajl, S.

    1999-07-01

    The energy requirements of a residential building five to twenty-five stories high can be estimated using a newly proposed analytical method based on artificial intelligence. The method is fast and provides a wide range of results such as total energy consumption values, power surges, and heating or cooling consumption values. A series of databases was created to take into account the particularities which influence the energy consumption of a building. In this study, DOE-2 software was used with 8 apartment models. A total of 27 neural networks were used, 3 for the estimation of energy consumption in the corridors, and 24 for inside the apartments. Three user interfaces were created to facilitate the estimation of energy consumption. These were named the Energy Estimation Assistance System (EEAS) interfaces and are only accessible using MATLAB software. The input parameters for the EEAS are: climatic region, exterior wall resistance, roofing resistance, type of windows, infiltration, number of storeys, and corridor ventilation system operating schedule. By changing the parameters, the EEAS can determine annual heating, cooling and basic energy consumption levels for apartments and corridors. 2 tabs., 2 figs.

  3. Computational Intelligence Method for Early Diagnosis Dengue Haemorrhagic Fever Using Fuzzy on Mobile Device

    Directory of Open Access Journals (Sweden)

    Salman Afan

    2014-03-01

    Full Text Available Mortality from Dengue Haemorrhagic Fever (DHF) is still increasing in Indonesia, particularly in Jakarta. Diagnosis of dengue shall be made as early as possible so that first aid can be given in the expectation of decreasing the risk of death. The study was conducted by developing an expert system based on a computational intelligence method. In the first year, the study used the Fuzzy Inference System (FIS) method to diagnose Dengue Haemorrhagic Fever, particularly on mobile devices such as smartphones. An expert system application using a fuzzy system can be applied on a mobile device and is useful for making an early diagnosis of Dengue Haemorrhagic Fever that produces an outcome faster than a laboratory test. The evaluation of this application is conducted by performing an accuracy test before and after validation using data of patients who have Dengue Haemorrhagic Fever. This expert system application is easy, convenient, and practical to use, and is also capable of making the early diagnosis of Dengue Haemorrhagic Fever needed to avoid mortality in the first stage.
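
    A minimal sketch of fuzzy inference of the kind described above is given below. The membership functions, the two inputs (days of fever, platelet count) and the rule base are hypothetical teaching values, not the clinical rules of the authors' system; defuzzification uses a simple weighted average.

        def tri(x, a, b, c):
            """Triangular membership function with support [a, c] and peak at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def dengue_risk(fever_days, platelet_k):
            # Fuzzify the crisp inputs into membership grades in [0, 1].
            fever_long = tri(fever_days, 2, 5, 8)
            fever_short = tri(fever_days, 0, 1, 3)
            plt_low = tri(platelet_k, 0, 80, 150)
            plt_normal = tri(platelet_k, 120, 250, 450)

            # Hypothetical rule base: min for AND, each rule maps to a crisp risk level.
            rules = [
                (min(fever_long, plt_low), 0.9),     # long fever AND low platelets    -> high risk
                (min(fever_long, plt_normal), 0.5),  # long fever AND normal platelets -> medium risk
                (min(fever_short, plt_normal), 0.1), # short fever AND normal platelets -> low risk
            ]
            # Weighted-average defuzzification.
            num = sum(w * r for w, r in rules)
            den = sum(w for w, _ in rules)
            return num / den if den > 0 else 0.0

        print(dengue_risk(fever_days=4, platelet_k=70))   # ~0.9, i.e. advise urgent referral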

  4. Advances in intelligent process-aware information systems concepts, methods, and technologies

    CERN Document Server

    Oberhauser, Roy; Reichert, Manfred

    2017-01-01

    This book provides a state-of-the-art perspective on intelligent process-aware information systems and presents chapters on specific facets and approaches applicable to such systems. Further, it highlights novel advances and developments in various aspects of intelligent process-aware information systems and business process management systems. Intelligence capabilities are increasingly being integrated into or created in many of today’s software products and services. Process-aware information systems provide critical computing infrastructure to support the various processes involved in the creation and delivery of business products and services. Yet the integration of intelligence capabilities into process-aware information systems is a non-trivial yet necessary evolution of these complex systems. The book’s individual chapters address adaptive process management, case management processes, autonomically-capable processes, process-oriented information logistics, process recommendations, reasoning over ...

  5. On the Need for Artificial Intelligence and Advanced Test and Evaluation Methods for Space Exploration

    Science.gov (United States)

    Scheidt, D. H.; Hibbitts, C. A.; Chen, M. H.; Paxton, L. J.; Bekker, D. L.

    2017-02-01

    Implementing mature artificial intelligence would create the ability to significantly increase the science return from a mission, while potentially saving costs in mission and instrument operations, and solving currently intractable problems.

  6. Application of artificial intelligence (AI) methods for designing and analysis of reconfigurable cellular manufacturing system (RCMS)

    CSIR Research Space (South Africa)

    Xing, B

    2009-12-01

    Full Text Available This work focuses on the design and control of a novel hybrid manufacturing system: Reconfigurable Cellular Manufacturing System (RCMS) by using Artificial Intelligence (AI) approach. It is hybrid as it combines the advantages of Cellular...

  7. Operation Iraqi Freedom 04 - 06: Opportunities to Apply Quantitative Methods to Intelligence Analysis

    National Research Council Canada - National Science Library

    Hansen, Eric C

    2005-01-01

    The purpose of this presentation is to illustrate the need for a quantitative analytical capability within organizations and staffs that provide intelligence analysis to Army, Joint, and Coalition Force headquarters...

  8. Intelligent System Design Using Hyper-Heuristics

    Directory of Open Access Journals (Sweden)

    Nelishia Pillay

    2015-07-01

    Full Text Available Determining the most appropriate search method or artificial intelligence technique to solve a problem is not always evident and usually requires implementation of the different approaches to ascertain this. In some instances a single approach may not be sufficient and hybridization of methods may be needed to find a solution. This process can be time consuming. The paper proposes the use of hyper-heuristics as a means of identifying which method or combination of approaches is needed to solve a problem. The research presented forms part of a larger initiative aimed at using hyper-heuristics to develop intelligent hybrid systems. As an initial step in this direction, this paper investigates this for classical artificial intelligence uninformed and informed search methods, namely depth first search, breadth first search, best first search, hill-climbing and the A* algorithm. The hyper-heuristic determines the search or combination of searches to use to solve the problem. An evolutionary algorithm hyper-heuristic is implemented for this purpose and its performance is evaluated in solving the 8-Puzzle, Towers of Hanoi and Blocks World problems. The hyper-heuristic employs a generational evolutionary algorithm which iteratively refines an initial population using tournament selection to select parents, which the mutation and crossover operators are applied to for regeneration. The hyper-heuristic was able to identify a search or combination of searches to produce solutions for the twenty 8-Puzzle, five Towers of Hanoi and five Blocks World problems. Furthermore, admissible solutions were produced for all problem instances.
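
    The generational evolutionary-algorithm hyper-heuristic described above can be sketched on a much smaller scale. In the toy example below, an individual is a sequence of low-level searches (BFS, DFS, greedy best-first) applied in turn to a tiny graph problem, and fitness is the number of node expansions spent before a search succeeds; the graph, budgets, penalties and GA settings are illustrative assumptions and do not reproduce the 8-Puzzle, Towers of Hanoi or Blocks World experiments.

        import random
        from collections import deque

        # Toy problem: find node 'G' from 'S' in a small graph; ORDER gives a crude heuristic.
        GRAPH = {'S': ['A', 'B'], 'A': ['C'], 'B': ['C', 'D'], 'C': ['G'], 'D': ['G'], 'G': []}
        ORDER = {node: i for i, node in enumerate(GRAPH)}
        SEARCHES = ['bfs', 'dfs', 'greedy']

        def run_search(kind, start='S', goal='G', budget=10):
            """Run one low-level search; return the number of node expansions, or None on failure."""
            frontier, seen, expansions = deque([start]), {start}, 0
            while frontier and expansions < budget:
                if kind == 'dfs':
                    node = frontier.pop()
                elif kind == 'greedy':
                    node = min(frontier, key=lambda n: abs(ORDER[n] - ORDER[goal]))
                    frontier.remove(node)
                else:                                   # 'bfs'
                    node = frontier.popleft()
                expansions += 1
                if node == goal:
                    return expansions
                for nxt in GRAPH[node]:
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append(nxt)
            return None

        def fitness(chromosome):
            """Expansions spent until a search in the sequence solves the problem (lower is better)."""
            total = 0
            for kind in chromosome:
                cost = run_search(kind)
                if cost is not None:
                    return total + cost
                total += 10                             # full budget wasted on a failed search
            return total + 100                          # penalty: no search in the sequence succeeded

        def evolve(pop_size=10, generations=20, p_mut=0.2, seed=1):
            random.seed(seed)
            pop = [[random.choice(SEARCHES) for _ in range(2)] for _ in range(pop_size)]
            for _ in range(generations):
                offspring = []
                while len(offspring) < pop_size:
                    parent_a = min(random.sample(pop, 2), key=fitness)   # tournament selection
                    parent_b = min(random.sample(pop, 2), key=fitness)
                    child = parent_a[:1] + parent_b[1:]                  # one-point crossover
                    if random.random() < p_mut:
                        child[random.randrange(len(child))] = random.choice(SEARCHES)  # mutation
                    offspring.append(child)
                pop = offspring                          # generational replacement
            return min(pop, key=fitness)

        print(evolve())   # e.g. ['dfs', 'greedy']: a cheap combination for this toy problem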

  9. Emotional intelligence among medical students: a mixed methods study from Chennai, India.

    Science.gov (United States)

    Sundararajan, Subashini; Gopichandran, Vijayaprasad

    2018-05-04

    Emotional Intelligence is the ability of a person to understand and respond to one's own and others' emotions and use this understanding to guide one's thoughts and actions. The aims were to assess the level of emotional intelligence of medical students in a medical college in Chennai and to explore their understanding of the role of emotions in medical practice. A quantitative, cross-sectional, questionnaire-based survey was conducted among 207 medical students in a college in Chennai, India, using the Quick Emotional Intelligence Self Assessment Test and some hypothetical emotional clinical vignettes. This was followed by a qualitative moderated fish-bowl discussion to elicit the opinion of medical students on the role of emotions in the practice of medicine. The mean score of Emotional Intelligence was 107.58 (SD 16.44) out of a maximum possible score of 160. Students who went to government schools for high school education had greater emotional intelligence than students from private schools (p = 0.044) and women were more emotionally intelligent in their response to emotional vignettes than men (p = 0.056). The fish-bowl discussion highlighted several positive and negative impacts of emotions in clinical care. The students concluded at the end of the discussion that emotions are inevitable in the practice of medicine and a good physician should know how to handle them. Medical students, both men and women, had a good level of emotional intelligence in the college that was studied. Students from collectivist social settings like government high schools have better emotional intelligence, which may indicate that a collectivist, community-oriented medical education can serve the same purpose. Though students have diverse opinions on the role of emotions in clinical care, cognitive reflection exercises can help them understand its importance.

  10. Algorithms in ambient intelligence

    NARCIS (Netherlands)

    Aarts, E.H.L.; Korst, J.H.M.; Verhaegh, W.F.J.; Weber, W.; Rabaey, J.M.; Aarts, E.

    2005-01-01

    We briefly review the concept of ambient intelligence and discuss its relation with the domain of intelligent algorithms. By means of four examples of ambient intelligent systems, we argue that new computing methods and quantification measures are needed to bridge the gap between the class of

  11. Advance in study of intelligent diagnostic method for nuclear power plant

    International Nuclear Information System (INIS)

    Zhou Gang; Yang Li

    2008-01-01

    The advance of research on the application of three types of intelligent diagnostic approach, based on artificial neural networks (ANN), fuzzy logic and expert systems, to the operation status monitoring and fault diagnosis of nuclear power plants (NPP) was reviewed. The research status and characteristics of status monitoring and fault diagnosis approaches based on neural networks, fuzzy logic and expert systems for nuclear power plants were analyzed. The development trend of applied research on intelligent diagnostic approaches for nuclear power plants was explored. The analysis results show that the research achievements on intelligent diagnostic approaches based on fuzzy logic and expert systems for nuclear power plants are relatively few. Research on intelligent diagnostic approaches for nuclear power plants concentrates on operation status monitoring and fault diagnosis based on neural networks. The development trend of intelligent diagnostic approaches for nuclear power plants is the combination of various intelligent diagnostic approaches: the combination of neural network diagnostic approaches with other diagnostic approaches, as well as multiple neural network diagnostic approaches. (authors)

  12. Intelligent self-organization methods for wireless ad hoc sensor networks based on limited resources

    Science.gov (United States)

    Hortos, William S.

    2006-05-01

    A wireless ad hoc sensor network (WSN) is a configuration for area surveillance that affords rapid, flexible deployment in arbitrary threat environments. There is no infrastructure support and sensor nodes communicate with each other only when they are in transmission range. To a greater degree than the terminals found in mobile ad hoc networks (MANETs) for communications, sensor nodes are resource-constrained, with limited computational processing, bandwidth, memory, and power, and are typically unattended once in operation. Consequently, the level of information exchange among nodes needed to support complex adaptive algorithms for establishing network connectivity and optimizing throughput not only depletes those limited resources and creates high overhead in narrowband communications, but also increases network vulnerability to eavesdropping by malicious nodes. Cooperation among nodes, critical to the mission of sensor networks, can thus be disrupted by an inappropriate choice of the method for self-organization. Recently published contributions to the self-configuration of ad hoc sensor networks, e.g., self-organizing mapping and swarm intelligence techniques, have been based on the adaptive control of the cross-layer interactions found in MANET protocols to achieve one or more performance objectives: connectivity, intrusion resistance, power control, throughput, and delay. However, few studies have examined the performance of these algorithms when implemented with the limited resources of WSNs. In this paper, self-organization algorithms for the initiation, operation and maintenance of a network topology from a collection of wireless sensor nodes are proposed that improve the performance metrics significant to WSNs. The intelligent algorithm approach emphasizes low computational complexity, energy efficiency and robust adaptation to change, allowing distributed implementation with the actual limited resources of the cooperative nodes of the network. Extensions of the

  13. A comparison of performance of several artificial intelligence methods for forecasting monthly discharge time series

    Science.gov (United States)

    Wang, Wen-Chuan; Chau, Kwok-Wing; Cheng, Chun-Tian; Qiu, Lin

    2009-08-01

    Developing a hydrological forecasting model based on past records is crucial to effective hydropower reservoir management and scheduling. Traditionally, time series analysis and modeling are used for building mathematical models to generate hydrologic records in hydrology and water resources. Artificial intelligence (AI), as a branch of computer science, is capable of analyzing long-series and large-scale hydrological data. In recent years, applying AI technology to hydrological forecasting modeling has become one of the key research topics. In this paper, autoregressive moving-average (ARMA) models, artificial neural network (ANN) approaches, adaptive neural-based fuzzy inference system (ANFIS) techniques, genetic programming (GP) models and the support vector machine (SVM) method are examined using long-term observations of monthly river flow discharges. Four quantitative standard statistical performance evaluation measures, the coefficient of correlation (R), the Nash-Sutcliffe efficiency coefficient (E), the root mean squared error (RMSE) and the mean absolute percentage error (MAPE), are employed to evaluate the performances of the various models developed. Two case study river sites are also provided to illustrate their respective performances. The results indicate that the best performance can be obtained by ANFIS, GP and SVM, in terms of different evaluation criteria, during the training and validation phases.
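
    The four evaluation measures named above are straightforward to compute; the sketch below implements them with NumPy on a toy pair of observed and simulated monthly discharge series (the numbers are made up for illustration).

        import numpy as np

        def forecast_metrics(obs, sim):
            """Coefficient of correlation R, Nash-Sutcliffe E, RMSE and MAPE."""
            obs, sim = np.asarray(obs, dtype=float), np.asarray(sim, dtype=float)
            r = np.corrcoef(obs, sim)[0, 1]                                        # correlation coefficient
            e = 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)   # Nash-Sutcliffe efficiency
            rmse = np.sqrt(np.mean((obs - sim) ** 2))                              # root mean squared error
            mape = 100.0 * np.mean(np.abs((obs - sim) / obs))                      # mean absolute percentage error
            return {"R": r, "E": e, "RMSE": rmse, "MAPE": mape}

        # Toy monthly discharges (m^3/s): observed vs. values simulated by some model.
        print(forecast_metrics([120, 95, 60, 45, 80, 150], [110, 100, 55, 50, 75, 140]))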

  14. An intelligent fault diagnosis method of rolling bearings based on regularized kernel Marginal Fisher analysis

    International Nuclear Information System (INIS)

    Jiang Li; Shi Tielin; Xuan Jianping

    2012-01-01

    Generally, the vibration signals of faulty bearings are non-stationary and highly nonlinear under complicated operating conditions. Thus, it is a big challenge to extract optimal features for improving classification while simultaneously decreasing the feature dimension. Kernel Marginal Fisher analysis (KMFA) is a novel supervised manifold learning algorithm for feature extraction and dimensionality reduction. In order to avoid the small-sample-size problem in KMFA, we propose regularized KMFA (RKMFA). A simple and efficient intelligent fault diagnosis method based on RKMFA is put forward and applied to fault recognition of rolling bearings. So as to directly extract nonlinear features from the original high-dimensional vibration signals, RKMFA constructs two graphs describing the intra-class compactness and the inter-class separability, by combining a traditional manifold learning algorithm with the Fisher criterion. Therefore, the optimal low-dimensional features are obtained for better classification and finally fed into the simplest K-nearest neighbor (KNN) classifier to recognize different fault categories of bearings. The experimental results demonstrate that the proposed approach improves the fault classification performance and outperforms the other conventional approaches.

  15. Low-Mode Conformational Search Method with Semiempirical Quantum Mechanical Calculations: Application to Enantioselective Organocatalysis.

    Science.gov (United States)

    Kamachi, Takashi; Yoshizawa, Kazunari

    2016-02-22

    A conformational search program for finding low-energy conformations of large noncovalent complexes has been developed. A quantitatively reliable semiempirical quantum mechanical PM6-DH+ method, which is able to accurately describe noncovalent interactions at a low computational cost, was employed in contrast to conventional conformational search programs in which molecular mechanical methods are usually adopted. Our approach is based on the low-mode method whereby an initial structure is perturbed along one of its low-mode eigenvectors to generate new conformations. This method was applied to determine the most stable conformation of transition state for enantioselective alkylation by the Maruoka and cinchona alkaloid catalysts and Hantzsch ester hydrogenation of imines by chiral phosphoric acid. Besides successfully reproducing the previously reported most stable DFT conformations, the conformational search with the semiempirical quantum mechanical calculations newly discovered a more stable conformation at a low computational cost.
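
    The low-mode idea, perturbing an optimized structure along a low-curvature eigenvector of the Hessian and re-minimizing, can be sketched on a toy analytic surface. The two-variable potential below stands in for a PM6-DH+ energy call, and all step sizes and iteration counts are illustrative assumptions.

        import numpy as np
        from scipy.optimize import minimize

        def energy(x):
            """Toy double-well surface standing in for a semiempirical (PM6-DH+) energy call."""
            return (x[0] ** 2 - 1.0) ** 2 + 5.0 * x[1] ** 2 + 0.05 * x[0]

        def hessian(f, x, h=1e-4):
            """Numerical Hessian by central finite differences."""
            n = len(x)
            H = np.zeros((n, n))
            for i in range(n):
                for j in range(n):
                    ei, ej = np.eye(n)[i] * h, np.eye(n)[j] * h
                    H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                               - f(x - ei + ej) + f(x - ei - ej)) / (4.0 * h * h)
            return H

        def low_mode_search(x0, steps=20, step_size=1.2, seed=0):
            rng = np.random.default_rng(seed)
            best = minimize(energy, x0).x                     # relax the starting conformation
            best_e = energy(best)
            for _ in range(steps):
                vals, vecs = np.linalg.eigh(hessian(energy, best))
                mode = vecs[:, 0]                             # lowest-curvature ("low mode") direction
                trial = best + rng.choice([-1.0, 1.0]) * step_size * mode
                trial = minimize(energy, trial).x             # re-minimize the perturbed structure
                if energy(trial) < best_e - 1e-9:
                    best, best_e = trial, energy(trial)       # keep the lower-energy conformation
            return best, best_e

        print(low_mode_search(np.array([0.9, 0.0])))          # typically ends near the deeper well (x0 ~ -1)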

  16. A cross-correlation method to search for gravitational wave bursts with AURIGA and Virgo

    NARCIS (Netherlands)

    Bignotto, M.; Bonaldi, M.; Camarda, M.; Cerdonio, M.; Conti, L.; Drago, M.; Falferi, P.; Liguori, N.; Longo, S.; Mezzena, R.; Mion, A.; Ortolan, A.; Prodi, G. A.; Re, V.; Salemi, F.; Taffarello, L.; Vedovato, G.; Vinante, A.; Vitale, S.; Zendri, J. -P.; Acernese, F.; Alshourbagy, Mohamed; Amico, Paolo; Antonucci, Federica; Aoudia, S.; Astone, P.; Avino, Saverio; Baggio, L.; Ballardin, G.; Barone, F.; Barsotti, L.; Barsuglia, M.; Bauer, Th. S.; Bigotta, Stefano; Birindelli, Simona; Boccara, Albert-Claude; Bondu, F.; Bosi, Leone; Braccini, Stefano; Bradaschia, C.; Brillet, A.; Brisson, V.; Buskulic, D.; Cagnoli, G.; Calloni, E.; Campagna, Enrico; Carbognani, F.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cesarini, E.; Chassande-Mottin, E.; Clapson, A-C; Cleva, F.; Coccia, E.; Corda, C.; Corsi, A.; Cottone, F.; Coulon, J. -P.; Cuoco, E.; D'Antonio, S.; Dari, A.; Dattilo, V.; Davier, M.; Rosa, R.; Del Prete, M.; Di Fiore, L.; Di Lieto, A.; Emilio, M. Di Paolo; Di Virgilio, A.; Evans, M.; Fafone, V.; Ferrante, I.; Fidecaro, F.; Fiori, I.; Flaminio, R.; Fournier, J. -D.; Frasca, S.; Frasconi, F.; Gammaitoni, L.; Garufi, F.; Genin, E.; Gennai, A.; Giazotto, A.; Giordano, L.; Granata, V.; Greverie, C.; Grosjean, D.; Guidi, G.; Hamdani, S.U.; Hebri, S.; Heitmann, H.; Hello, P.; Huet, D.; Kreckelbergh, S.; La Penna, P.; Laval, M.; Leroy, N.; Letendre, N.; Lopez, B.; Lorenzini, M.; Loriette, V.; Losurdo, G.; Mackowski, J. -M.; Majorana, E.; Man, C. N.; Mantovani, M.; Marchesoni, F.; Marion, F.; Marque, J.; Martelli, F.; Masserot, A.; Menzinger, F.; Milano, L.; Minenkov, Y.; Moins, C.; Moreau, J.; Morgado, N.; Mosca, S.; Mours, B.; Neri, I.; Nocera, F.; Pagliaroli, G.; Palomba, C.; Paoletti, F.; Pardi, S.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Piergiovanni, F.; Pinard, L.; Poggiani, R.; Punturo, M.; Puppo, P.; Rapagnani, P.; Regimbau, T.; Remillieux, A.; Ricci, F.; Ricciardi, I.; Rocchi, A.; Rolland, L.; Romano, R.; Ruggi, P.; Russo, G.; Solimeno, S.; Spallicci, A.; Swinkels, B. L.; Tarallo, M.; Terenzi, R.; Toncelli, A.; Tonelli, M.; Tournefier, E.; Travasso, F.; Vajente, G.; van den Brand, J. F. J.; van der Putten, S.; Verkindt, D.; Vetrano, F.; Vicere, A.; Vinet, J. -Y.; Vocca, H.; Yvert, M.

    2008-01-01

    We present a method to search for transient gravitational waves using a network of detectors with different spectral and directional sensitivities: the interferometer Virgo and the bar detector AURIGA. The data analysis method is based on the measurements of the correlated energy in the network by

  17. Cargo flows distribution over the loading sites of enterprises by using methods of artificial intelligence

    Directory of Open Access Journals (Sweden)

    Олександр Павлович Кіркін

    2017-06-01

    Full Text Available Development of information technologies and market requirements in effective control over cargo flows, forces enterprises to look for new ways and methods of automated control over the technological operations. For rail transportation one of the most complicated tasks of automation is the cargo flows distribution over the sites of loading and unloading. In this article the solution with the use of one of the methods of artificial intelligence – a fuzzy inference has been proposed. The analysis of the last publications showed that the fuzzy inference method is effective for the solution of similar tasks, it makes it possible to accumulate experience, it is stable to temporary impacts of the environmental conditions. The existing methods of the cargo flows distribution over the sites of loading and unloading are too simplified and can lead to incorrect decisions. The purpose of the article is to create a distribution model of cargo flows of the enterprises over the sites of loading and unloading, basing on the fuzzy inference method and to automate the control. To achieve the objective a mathematical model of the cargo flows distribution over the sites of loading and unloading has been made using fuzzy logic. The key input parameters of the model are: «number of loading sites», «arrival of the next set of cars», «availability of additional operations». The output parameter is «a variety of set of cars». Application of the fuzzy inference method made it possible to reduce loading time by 15% and to reduce costs for preparatory operations before loading by 20%. Thus this method is an effective means and holds the greatest promise for railway competitiveness increase. Interaction between different types of transportation and their influence on the cargo flows distribution over the sites of loading and unloading hasn’t been considered. These sites may be busy transshipping at that very time which is characteristic of large enterprises

  18. A three-term conjugate gradient method under the strong-Wolfe line search

    Science.gov (United States)

    Khadijah, Wan; Rivaie, Mohd; Mamat, Mustafa

    2017-08-01

    Recently, numerous studies have been concerned with conjugate gradient methods for solving large-scale unconstrained optimization problems. In this paper, a three-term conjugate gradient method which always satisfies the sufficient descent condition, named Three-Term Rivaie-Mustafa-Ismail-Leong (TTRMIL), is proposed for unconstrained optimization. Under standard conditions, the TTRMIL method is proved to be globally convergent under the strong-Wolfe line search. Finally, numerical results are provided for the purpose of comparison.
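
    For reference, a strong-Wolfe line search accepts a step length alpha along a descent direction d only if it gives sufficient decrease and satisfies a curvature bound on the directional derivative at the new point. The sketch below merely checks these two conditions on a toy quadratic; it is not the TTRMIL update formula, and the constants c1 and c2 are the usual illustrative choices.

        import numpy as np

        def strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
            """Check the two strong-Wolfe conditions for step length alpha along direction d."""
            fx, gx = f(x), grad(x)
            x_new = x + alpha * d
            sufficient_decrease = f(x_new) <= fx + c1 * alpha * gx.dot(d)     # Armijo condition
            curvature = abs(grad(x_new).dot(d)) <= c2 * abs(gx.dot(d))        # strong curvature condition
            return sufficient_decrease and curvature

        # Toy quadratic f(x) = 0.5 x^T A x with a steepest-descent direction at x0.
        A = np.array([[3.0, 0.0], [0.0, 1.0]])
        f = lambda x: 0.5 * x.dot(A).dot(x)
        grad = lambda x: A.dot(x)
        x0 = np.array([1.0, 2.0])
        d = -grad(x0)
        print([(a, strong_wolfe(f, grad, x0, d, a)) for a in (0.1, 0.4, 1.0)])   # only 0.4 passes here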

  19. Thermal Unit Commitment Scheduling Problem in Utility System by Tabu Search Embedded Genetic Algorithm Method

    Directory of Open Access Journals (Sweden)

    C. Christober Asir Rajan

    2008-06-01

    Full Text Available The objective of this paper is to find the generation schedule such that the total operating cost can be minimized when subjected to a variety of constraints. This also means that it is desirable to find the optimal unit commitment in the power system for the next H hours. A 66-bus utility power system in India demonstrates the effectiveness of the proposed approach; extensive studies have also been performed for different IEEE test systems consisting of 24, 57 and 175 buses. Numerical results are shown comparing the cost solutions and computation time obtained by different intelligence and conventional methods.

  20. The Breakthrough Listen Search for Intelligent Life: A Wideband Data Recorder System for the Robert C. Byrd Green Bank Telescope

    Science.gov (United States)

    MacMahon, David H. E.; Price, Danny C.; Lebofsky, Matthew; Siemion, Andrew P. V.; Croft, Steve; DeBoer, David; Enriquez, J. Emilio; Gajjar, Vishal; Hellbourg, Gregory; Isaacson, Howard; Werthimer, Dan; Abdurashidova, Zuhra; Bloss, Marty; Brandt, Joe; Creager, Ramon; Ford, John; Lynch, Ryan S.; Maddalena, Ronald J.; McCullough, Randy; Ray, Jason; Whitehead, Mark; Woody, Dave

    2018-04-01

    The Breakthrough Listen Initiative is undertaking a comprehensive search for radio and optical signatures from extraterrestrial civilizations. An integral component of the project is the design and implementation of wide-bandwidth data recorder and signal processing systems. The capabilities of these systems, particularly at radio frequencies, directly determine survey speed; further, given a fixed observing time and spectral coverage, they determine sensitivity as well. Here, we detail the Breakthrough Listen wide-bandwidth data recording system deployed at the 100 m aperture Robert C. Byrd Green Bank Telescope. The system digitizes up to 6 GHz of bandwidth at 8 bits for both polarizations, storing the resultant 24 GB s‑1 of data to disk. This system is among the highest data rate baseband recording systems in use in radio astronomy. A future system expansion will double recording capacity, to achieve a total Nyquist bandwidth of 12 GHz in two polarizations. In this paper, we present details of the system architecture, along with salient configuration and disk-write optimizations used to achieve high-throughput data capture on commodity compute servers and consumer-class hard disk drives.
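
    The quoted data rate follows from simple arithmetic, assuming the 6 GHz band is sampled at the Nyquist rate with one byte (8 bits) per sample: 2 x 6 GHz x 1 byte per sample x 2 polarizations = 24 GB/s. On the same assumptions, the planned expansion to 12 GHz of total Nyquist bandwidth in two polarizations would roughly double this to about 48 GB/s.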

  1. Advanced intelligent systems

    CERN Document Server

    Ryoo, Young; Jang, Moon-soo; Bae, Young-Chul

    2014-01-01

    Intelligent systems have been initiated with the attempt to imitate the human brain. People wish to let machines perform intelligent works. Many techniques of intelligent systems are based on artificial intelligence. According to changing and novel requirements, the advanced intelligent systems cover a wide spectrum: big data processing, intelligent control, advanced robotics, artificial intelligence and machine learning. This book focuses on coordinating intelligent systems with highly integrated and foundationally functional components. The book consists of 19 contributions that features social network-based recommender systems, application of fuzzy enforcement, energy visualization, ultrasonic muscular thickness measurement, regional analysis and predictive modeling, analysis of 3D polygon data, blood pressure estimation system, fuzzy human model, fuzzy ultrasonic imaging method, ultrasonic mobile smart technology, pseudo-normal image synthesis, subspace classifier, mobile object tracking, standing-up moti...

  2. Intelligent inversion method for pre-stack seismic big data based on MapReduce

    Science.gov (United States)

    Yan, Xuesong; Zhu, Zhixin; Wu, Qinghua

    2018-01-01

    Seismic exploration is a method of oil exploration that uses seismic information; that is, according to the inversion of seismic information, useful information about the reservoir parameters can be obtained to carry out exploration effectively. Pre-stack data are characterised by a large amount of data, abundant information, and so on, and their inversion yields rich information about the reservoir parameters. Owing to the large amount of pre-stack seismic data, existing single-machine environments cannot meet the computational needs, so an efficient and fast method for solving the inversion problem of pre-stack seismic data is urgently needed. The optimisation of the elastic parameters by using a genetic algorithm easily falls into a local optimum, which degrades the inversion results, especially for the density. Therefore, an intelligent optimisation algorithm is proposed in this paper and used for the elastic parameter inversion of pre-stack seismic data. This algorithm improves the population initialisation strategy by using the Gardner formula and improves the genetic operations of the algorithm, and the improved algorithm obtains better inversion results when carrying out a model test with logging data. The elastic parameters obtained by inversion fit the logging curves of the theoretical model well, which effectively improves the inversion precision of the density. This algorithm was implemented with a MapReduce model to solve the seismic big data inversion problem. The experimental results show that the parallel model can effectively reduce the running time of the algorithm.
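
    As an illustration of the Gardner-based initialization mentioned above, the sketch below draws a population of (Vp, Vs, density) candidates with density tied to Vp through the commonly quoted Gardner relation (rho ~ 0.31 Vp^0.25, with Vp in m/s and rho in g/cm^3). The parameter ranges, the Vp/Vs ratio and the density jitter are illustrative assumptions, not the paper's actual settings.

        import numpy as np

        def init_population(pop_size, vp_range=(2500.0, 4500.0), vs_ratio=(0.45, 0.60),
                            density_jitter=0.05, seed=0):
            """Initialize (Vp, Vs, rho) individuals, tying density to Vp via Gardner's relation."""
            rng = np.random.default_rng(seed)
            vp = rng.uniform(*vp_range, pop_size)                      # P-wave velocity, m/s
            vs = vp * rng.uniform(*vs_ratio, pop_size)                 # S-wave velocity from a Vp/Vs ratio
            rho = 0.31 * vp ** 0.25                                    # Gardner: rho [g/cm^3] ~ 0.31 Vp^0.25
            rho *= 1.0 + rng.normal(0.0, density_jitter, pop_size)     # small spread around the trend
            return np.column_stack([vp, vs, rho])

        population = init_population(pop_size=50)
        print(population[:3])    # each row is one candidate model: [Vp, Vs, rho]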

  3. New Internet search volume-based weighting method for integrating various environmental impacts

    Energy Technology Data Exchange (ETDEWEB)

    Ji, Changyoon, E-mail: changyoon@yonsei.ac.kr; Hong, Taehoon, E-mail: hong7@yonsei.ac.kr

    2016-01-15

    Weighting is one of the steps in life cycle impact assessment that integrates various characterized environmental impacts as a single index. Weighting factors should be based on the society's preferences. However, most previous studies consider only the opinion of some people. Thus, this research proposes a new weighting method that determines the weighting factors of environmental impact categories by considering public opinion on environmental impacts using the Internet search volumes for relevant terms. To validate the new weighting method, the weighting factors for six environmental impacts calculated by the new weighting method were compared with the existing weighting factors. The resulting Pearson's correlation coefficient between the new and existing weighting factors was from 0.8743 to 0.9889. It turned out that the new weighting method presents reasonable weighting factors. It also requires less time and lower cost compared to existing methods and likewise meets the main requirements of weighting methods such as simplicity, transparency, and reproducibility. The new weighting method is expected to be a good alternative for determining the weighting factor. - Highlight: • A new weighting method using Internet search volume is proposed in this research. • The new weighting method reflects the public opinion using Internet search volume. • The correlation coefficient between new and existing weighting factors is over 0.87. • The new weighting method can present the reasonable weighting factors. • The proposed method can be a good alternative for determining the weighting factors.
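
    A minimal sketch of the idea, with hypothetical search volumes and hypothetical existing weighting factors (the study's actual categories, volumes and panel-based weights are not reproduced here), normalizes the volumes into weights and then computes the Pearson correlation used in the validation step.

        import numpy as np

        # Hypothetical monthly Internet search volumes for terms tied to six impact categories.
        search_volume = {
            "global warming": 820_000, "ozone depletion": 45_000, "acidification": 18_000,
            "eutrophication": 21_000, "photochemical smog": 30_000, "resource depletion": 66_000,
        }
        volumes = np.array(list(search_volume.values()), dtype=float)
        new_weights = volumes / volumes.sum()                  # normalize volumes into weighting factors

        # Hypothetical existing (panel-based) weighting factors for the same six categories.
        existing_weights = np.array([0.40, 0.08, 0.06, 0.07, 0.09, 0.30])

        r = np.corrcoef(new_weights, existing_weights)[0, 1]   # Pearson correlation, as in the validation step
        print(np.round(new_weights, 3), round(r, 4))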

  4. New Internet search volume-based weighting method for integrating various environmental impacts

    International Nuclear Information System (INIS)

    Ji, Changyoon; Hong, Taehoon

    2016-01-01

    Weighting is one of the steps in life cycle impact assessment that integrates various characterized environmental impacts as a single index. Weighting factors should be based on the society's preferences. However, most previous studies consider only the opinion of some people. Thus, this research proposes a new weighting method that determines the weighting factors of environmental impact categories by considering public opinion on environmental impacts using the Internet search volumes for relevant terms. To validate the new weighting method, the weighting factors for six environmental impacts calculated by the new weighting method were compared with the existing weighting factors. The resulting Pearson's correlation coefficient between the new and existing weighting factors was from 0.8743 to 0.9889. It turned out that the new weighting method presents reasonable weighting factors. It also requires less time and lower cost compared to existing methods and likewise meets the main requirements of weighting methods such as simplicity, transparency, and reproducibility. The new weighting method is expected to be a good alternative for determining the weighting factor. - Highlight: • A new weighting method using Internet search volume is proposed in this research. • The new weighting method reflects the public opinion using Internet search volume. • The correlation coefficient between new and existing weighting factors is over 0.87. • The new weighting method can present the reasonable weighting factors. • The proposed method can be a good alternative for determining the weighting factors.

  5. A conjugate gradient method with descent properties under strong Wolfe line search

    Science.gov (United States)

    Zull, N.; ‘Aini, N.; Shoid, S.; Ghani, N. H. A.; Mohamed, N. S.; Rivaie, M.; Mamat, M.

    2017-09-01

    The conjugate gradient (CG) method is one of the optimization methods that are often used in practical applications. The continuous and numerous studies conducted on the CG method have led to vast improvements in its convergence properties and efficiency. In this paper, a new CG method possessing the sufficient descent and global convergence properties is proposed. The efficiency of the new CG algorithm relative to the existing CG methods is evaluated by testing them all on a set of test functions using MATLAB. The tests are measured in terms of iteration numbers and CPU time under the strong Wolfe line search. Overall, this new method performs efficiently and is comparable to the other well-known methods.

  6. Surfing for suicide methods and help: content analysis of websites retrieved with search engines in Austria and the United States.

    Science.gov (United States)

    Till, Benedikt; Niederkrotenthaler, Thomas

    2014-08-01

    The Internet provides a variety of resources for individuals searching for suicide-related information. Structured content-analytic approaches to assess intercultural differences in web contents retrieved with method-related and help-related searches are scarce. We used the 2 most popular search engines (Google and Yahoo/Bing) to retrieve US-American and Austrian search results for the term suicide, method-related search terms (e.g., suicide methods, how to kill yourself, painless suicide, how to hang yourself), and help-related terms (e.g., suicidal thoughts, suicide help) on February 11, 2013. In total, 396 websites retrieved with US search engines and 335 websites from Austrian searches were analyzed with content analysis on the basis of current media guidelines for suicide reporting. We assessed the quality of websites and compared findings across search terms and between the United States and Austria. In both countries, protective outweighed harmful website characteristics by approximately 2:1. Websites retrieved with method-related search terms (e.g., how to hang yourself) contained more harmful (United States: P search engines generally had more protective characteristics (P search engines. Resources with harmful characteristics were better ranked than those with protective characteristics (United States: P < .01, Austria: P < .05). The quality of suicide-related websites obtained depends on the search terms used. Preventive efforts to improve the ranking of preventive web content, particularly regarding method-related search terms, seem necessary. © Copyright 2014 Physicians Postgraduate Press, Inc.

  7. OAST Space Theme Workshop. Volume 2: Theme summary. 3: Search for extraterrestrial intelligence (no. 9). A: Theme statement. B. 26 April 1976 presentation. C. Summary. D. Newer initiatives (form 4). E. Initiative actions (form 5)

    Science.gov (United States)

    1976-01-01

    Preliminary (1977-1983), intermediate (1982-1988), and long term (1989+) phases of the search for extraterrestrial intelligence (SETI) program are examined as well as the benefits to be derived in radioastronomy and the problems to be surmounted in radio frequency interference. The priorities, intrinsic value, criteria, and strategy for the search are discussed for both terrestrial and lunar-based CYCLOPS and for a space SETI system located at lunar libration point L4. New initiatives related to antenna independent technology, multichannel analyzers, and radio frequency interference shielding are listed. Projected SETI program costs are included.

  8. Distributed Cooperative Search Control Method of Multiple UAVs for Moving Target

    Directory of Open Access Journals (Sweden)

    Chang-jian Ru

    2015-01-01

    Full Text Available To reduce the impact of uncertainties caused by unknown motion parameters on the search plan for moving targets and to improve the search efficiency of UAVs, a novel distributed multi-UAV cooperative search control method for moving targets is proposed in this paper. Based on the detection results of onboard sensors, the target probability map is updated using Bayesian theory. A Gaussian distribution of the target transition probability density function is introduced to calculate the prediction probability of moving target existence, and then the target probability map can be further updated in real time. A performance index function combining target cost, environment cost, and cooperative cost is constructed, and the cooperative search problem is transformed into a central optimization problem. To improve computational efficiency, the distributed model predictive control method is presented, and thus the control command of each UAV can be obtained. The simulation results have verified that the proposed method can reduce the blindness of UAV searching and effectively improve the overall efficiency of the team.
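
    The probability-map mechanics described above can be sketched compactly: prediction diffuses the map with a Gaussian kernel (standing in for the Gaussian transition probability density), and a Bayesian update folds in each sensor observation. The grid size, detection and false-alarm probabilities and observed cell are illustrative assumptions.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def predict(prob_map, sigma=1.0):
            """Motion prediction: diffuse the target probability map with a Gaussian kernel."""
            p = gaussian_filter(prob_map, sigma=sigma, mode="constant")
            return p / p.sum()

        def update(prob_map, cell, detected, p_d=0.9, p_fa=0.05):
            """Bayesian measurement update for the cell a UAV sensor just observed."""
            background = p_fa if detected else 1.0 - p_fa          # likelihood if the target is elsewhere
            likelihood = np.full_like(prob_map, background)
            likelihood[cell] = p_d if detected else 1.0 - p_d      # likelihood if the target is in this cell
            posterior = likelihood * prob_map
            return posterior / posterior.sum()

        # 20 x 20 search area with a uniform prior over the target's location.
        prob = np.full((20, 20), 1.0 / 400)
        for _ in range(5):
            prob = predict(prob)                                   # the target may have moved
            prob = update(prob, cell=(10, 10), detected=False)     # a UAV scans cell (10, 10), sees nothing
        print(prob[10, 10], prob.max())                            # mass shifts away from the searched cell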

  9. A semantics-based method for clustering of Chinese web search results

    Science.gov (United States)

    Zhang, Hui; Wang, Deqing; Wang, Li; Bi, Zhuming; Chen, Yong

    2014-01-01

    Information explosion is a critical challenge to the development of modern information systems. In particular, when the application of an information system is over the Internet, the amount of information on the web has been increasing exponentially and rapidly. Search engines, such as Google and Baidu, are essential tools for people to find information on the Internet. Valuable information, however, is still likely to be submerged in the ocean of search results from those tools. By automatically clustering the results into different groups based on subjects, a search engine with a clustering feature allows users to select the most relevant results quickly. In this paper, we propose an online semantics-based method to cluster Chinese web search results. First, we employ the generalised suffix tree to extract the longest common substrings (LCSs) from search snippets. Second, we use the HowNet to calculate the similarities of the words derived from the LCSs, and extract the most representative features by constructing the vocabulary chain. Third, we construct a vector of text features and calculate snippets' semantic similarities. Finally, we improve the Chameleon algorithm to cluster snippets. Extensive experimental results have shown that the proposed algorithm outperforms the suffix tree clustering method and other traditional clustering methods.

  10. The search conference as a method in planning community health promotion actions

    Directory of Open Access Journals (Sweden)

    Eva Magnus

    2016-08-01

    Full Text Available Aims: The aim of this article is to describe and discuss how the search conference can be used as a method for planning health promotion actions in local communities. Design and methods: The article draws on experiences with using the method for an innovative project in health promotion in three Norwegian municipalities. The method is described both in general and how it was specifically adopted for the project. Results and conclusions: The search conference as a method was used to develop evidence-based health promotion action plans. With its use of both bottom-up and top-down approaches, this method is a relevant strategy for involving a community in the planning stages of health promotion actions in line with political expectations of participation, ownership, and evidence-based initiatives.

  11. Computer aided diagnosis based on medical image processing and artificial intelligence methods

    Science.gov (United States)

    Stoitsis, John; Valavanis, Ioannis; Mougiakakou, Stavroula G.; Golemati, Spyretta; Nikita, Alexandra; Nikita, Konstantina S.

    2006-12-01

    Advances in imaging technology and computer science have greatly enhanced interpretation of medical images, and contributed to early diagnosis. The typical architecture of a Computer Aided Diagnosis (CAD) system includes image pre-processing, definition of region(s) of interest, features extraction and selection, and classification. In this paper, the principles of CAD systems design and development are demonstrated by means of two examples. The first one focuses on the differentiation between symptomatic and asymptomatic carotid atheromatous plaques. For each plaque, a vector of texture and motion features was estimated, which was then reduced to the most robust ones by means of ANalysis of VAriance (ANOVA). Using fuzzy c-means, the features were then clustered into two classes. Clustering performances of 74%, 79%, and 84% were achieved for texture only, motion only, and combinations of texture and motion features, respectively. The second CAD system presented in this paper supports the diagnosis of focal liver lesions and is able to characterize liver tissue from Computed Tomography (CT) images as normal, hepatic cyst, hemangioma, and hepatocellular carcinoma. Five texture feature sets were extracted for each lesion, while a genetic algorithm based feature selection method was applied to identify the most robust features. The selected feature set was fed into an ensemble of neural network classifiers. The achieved classification performance was 100%, 93.75% and 90.63% in the training, validation and testing set, respectively. It is concluded that computerized analysis of medical images in combination with artificial intelligence can be used in clinical practice and may contribute to more efficient diagnosis.
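
    The fuzzy c-means step used for the plaque features can be illustrated with a short NumPy implementation of the standard algorithm; the two-dimensional "texture and motion" feature vectors below are made-up toy data, not the study's measurements.

        import numpy as np

        def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
            """Standard fuzzy c-means: returns cluster centers and the membership matrix U."""
            rng = np.random.default_rng(seed)
            U = rng.random((len(X), c))
            U /= U.sum(axis=1, keepdims=True)                     # memberships sum to 1 per sample
            for _ in range(iters):
                W = U ** m                                        # fuzzified memberships
                centers = (W.T @ X) / W.sum(axis=0)[:, None]      # weighted cluster means
                dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
                U = 1.0 / dist ** (2.0 / (m - 1.0))
                U /= U.sum(axis=1, keepdims=True)                 # renormalize memberships
            return centers, U

        # Toy 2-D feature vectors (e.g. one texture and one motion feature per plaque).
        X = np.array([[0.10, 0.20], [0.20, 0.10], [0.15, 0.25],
                      [0.90, 0.80], [0.85, 0.95], [0.80, 0.90]])
        centers, U = fuzzy_c_means(X, c=2)
        print(np.round(centers, 2))
        print(np.round(U, 2))                                     # soft assignment of each sample to a class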

  12. Computer aided diagnosis based on medical image processing and artificial intelligence methods

    International Nuclear Information System (INIS)

    Stoitsis, John; Valavanis, Ioannis; Mougiakakou, Stavroula G.; Golemati, Spyretta; Nikita, Alexandra; Nikita, Konstantina S.

    2006-01-01

    Advances in imaging technology and computer science have greatly enhanced interpretation of medical images, and contributed to early diagnosis. The typical architecture of a Computer Aided Diagnosis (CAD) system includes image pre-processing, definition of region(s) of interest, features extraction and selection, and classification. In this paper, the principles of CAD systems design and development are demonstrated by means of two examples. The first one focuses on the differentiation between symptomatic and asymptomatic carotid atheromatous plaques. For each plaque, a vector of texture and motion features was estimated, which was then reduced to the most robust ones by means of ANalysis of VAriance (ANOVA). Using fuzzy c-means, the features were then clustered into two classes. Clustering performances of 74%, 79%, and 84% were achieved for texture only, motion only, and combinations of texture and motion features, respectively. The second CAD system presented in this paper supports the diagnosis of focal liver lesions and is able to characterize liver tissue from Computed Tomography (CT) images as normal, hepatic cyst, hemangioma, and hepatocellular carcinoma. Five texture feature sets were extracted for each lesion, while a genetic algorithm based feature selection method was applied to identify the most robust features. The selected feature set was fed into an ensemble of neural network classifiers. The achieved classification performance was 100%, 93.75% and 90.63% in the training, validation and testing set, respectively. It is concluded that computerized analysis of medical images in combination with artificial intelligence can be used in clinical practice and may contribute to more efficient diagnosis

  13. Computer aided diagnosis based on medical image processing and artificial intelligence methods

    Energy Technology Data Exchange (ETDEWEB)

    Stoitsis, John [National Technical University of Athens, School of Electrical and Computer Engineering, Athens 157 71 (Greece)]. E-mail: stoitsis@biosim.ntua.gr; Valavanis, Ioannis [National Technical University of Athens, School of Electrical and Computer Engineering, Athens 157 71 (Greece); Mougiakakou, Stavroula G. [National Technical University of Athens, School of Electrical and Computer Engineering, Athens 157 71 (Greece); Golemati, Spyretta [National Technical University of Athens, School of Electrical and Computer Engineering, Athens 157 71 (Greece); Nikita, Alexandra [University of Athens, Medical School 152 28 Athens (Greece); Nikita, Konstantina S. [National Technical University of Athens, School of Electrical and Computer Engineering, Athens 157 71 (Greece)

    2006-12-20

    Advances in imaging technology and computer science have greatly enhanced interpretation of medical images, and contributed to early diagnosis. The typical architecture of a Computer Aided Diagnosis (CAD) system includes image pre-processing, definition of region(s) of interest, features extraction and selection, and classification. In this paper, the principles of CAD systems design and development are demonstrated by means of two examples. The first one focuses on the differentiation between symptomatic and asymptomatic carotid atheromatous plaques. For each plaque, a vector of texture and motion features was estimated, which was then reduced to the most robust ones by means of ANalysis of VAriance (ANOVA). Using fuzzy c-means, the features were then clustered into two classes. Clustering performances of 74%, 79%, and 84% were achieved for texture only, motion only, and combinations of texture and motion features, respectively. The second CAD system presented in this paper supports the diagnosis of focal liver lesions and is able to characterize liver tissue from Computed Tomography (CT) images as normal, hepatic cyst, hemangioma, and hepatocellular carcinoma. Five texture feature sets were extracted for each lesion, while a genetic algorithm based feature selection method was applied to identify the most robust features. The selected feature set was fed into an ensemble of neural network classifiers. The achieved classification performance was 100%, 93.75% and 90.63% in the training, validation and testing set, respectively. It is concluded that computerized analysis of medical images in combination with artificial intelligence can be used in clinical practice and may contribute to more efficient diagnosis.

  14. A new greedy search method for the design of digital IIR filter

    Directory of Open Access Journals (Sweden)

    Ranjit Kaur

    2015-07-01

    Full Text Available A new greedy search method is applied in this paper to design the optimal digital infinite impulse response (IIR) filter. The greedy search method is based on binary successive approximation (BSA) and evolutionary search (ES). The suggested greedy search method optimizes the magnitude response and the phase response simultaneously and also finds the lowest order of the filter. The order of the filter is controlled by a control gene whose value is optimized along with the filter coefficients to obtain the optimum order of the designed IIR filter. The stability constraints of the IIR filter are taken care of during the design procedure. To determine the trade-off relationship between conflicting objectives in the non-inferior domain, the weighting method is exploited. The proposed approach is effectively applied to solve the multiobjective optimization problems of designing digital low-pass (LP), high-pass (HP), bandpass (BP), and bandstop (BS) filters. It has been demonstrated that this technique not only fulfills all types of filter performance requirements, but also finds the lowest order of the filter. The computational experiments show that the proposed approach gives better digital IIR filters than the existing evolutionary algorithm (EA) based methods.

  15. Introducing PALETTE: an iterative method for conducting a literature search for a review in palliative care.

    Science.gov (United States)

    Zwakman, Marieke; Verberne, Lisa M; Kars, Marijke C; Hooft, Lotty; van Delden, Johannes J M; Spijker, René

    2018-06-02

    In the rapidly developing specialty of palliative care, literature reviews have become increasingly important to inform and improve the field. When applying widely used methods for literature reviews developed for intervention studies onto palliative care, challenges are encountered such as the heterogeneity of palliative care in practice (wide range of domains in patient characteristics, stages of illness and stakeholders), the explorative character of review questions, and the poorly defined keywords and concepts. To overcome the challenges and to provide guidance for researchers to conduct a literature search for a review in palliative care, the Palliative cAre Literature rEview iTeraTive mEthod (PALETTE), a pragmatic framework, was developed. We assessed PALETTE with a detailed description. PALETTE consists of four phases; developing the review question, building the search strategy, validating the search strategy and performing the search. The framework incorporates different information retrieval techniques: contacting experts, pearl growing, citation tracking and Boolean searching in a transparent way to maximize the retrieval of literature relevant to the topic of interest. The different components and techniques are repeated until no new articles are qualified for inclusion. The phases within PALETTE are interconnected by a recurrent process of validation on 'golden bullets' (articles that undoubtedly should be part of the review), citation tracking and concept terminology reflecting the review question. To give insight in the value of PALETTE, we compared PALETTE with the recommended search method for reviews of intervention studies. By using PALETTE on two palliative care literature reviews, we were able to improve our review questions and search strategies. Moreover, in comparison with the recommended search for intervention reviews, the number of articles needed to be screened was decreased whereas more relevant articles were retrieved. Overall, PALETTE

  16. Methods for Model-Based Reasoning within Agent-Based Ambient Intelligence Applications

    NARCIS (Netherlands)

    Bosse, T.; Both, F.; Gerritsen, C.; Hoogendoorn, M.; Treur, J.

    2012-01-01

    Within agent-based Ambient Intelligence applications agents react to humans based on information obtained by sensoring and their knowledge about human functioning. Appropriate types of reactions depend on the extent to which an agent understands the human and is able to interpret the available

  17. The Effects of Presentation Method and Information Density on Visual Search Ability and Working Memory Load

    Science.gov (United States)

    Chang, Ting-Wen; Kinshuk; Chen, Nian-Shing; Yu, Pao-Ta

    2012-01-01

    This study investigates the effects of successive and simultaneous information presentation methods on learner's visual search ability and working memory load for different information densities. Since the processing of information in the brain depends on the capacity of visual short-term memory (VSTM), the limited information processing capacity…

  18. System and method for improving video recorder performance in a search mode

    NARCIS (Netherlands)

    2000-01-01

    A method and apparatus wherein video images are recorded on a plurality of tracks of a tape such that, for playback in a search mode at a speed higher than the recording speed, the displayed image will consist of a plurality of contiguous parts, some of the parts being read out from tracks each

  19. System and method for improving video recorder performance in a search mode

    NARCIS (Netherlands)

    1991-01-01

    A method and apparatus wherein video images are recorded on a plurality of tracks of a tape such that, for playback in a search mode at a speed higher than the recording speed the displayed image will consist of a plurality of contiguous parts, some of the parts being read out from tracks each

  20. A Teaching Approach from the Exhaustive Search Method to the Needleman-Wunsch Algorithm

    Science.gov (United States)

    Xu, Zhongneng; Yang, Yayun; Huang, Beibei

    2017-01-01

    The Needleman-Wunsch algorithm has become one of the core algorithms in bioinformatics; however, teaching it requires explanations suited to students with different major backgrounds. By supposing sample sequences and using a simple store system, the connection between the exhaustive search method and the Needleman-Wunsch algorithm…
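
    For readers who want to connect the description above to working code, a compact dynamic-programming version of the global alignment score is given below; the scoring parameters and the two sample sequences are arbitrary teaching choices.

        def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
            """Global alignment score via dynamic programming (the 'simple store system' is the matrix F)."""
            n, m = len(a), len(b)
            F = [[0] * (m + 1) for _ in range(n + 1)]
            for i in range(1, n + 1):
                F[i][0] = i * gap                      # aligning a prefix of a against gaps only
            for j in range(1, m + 1):
                F[0][j] = j * gap
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    diag = F[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                    F[i][j] = max(diag, F[i - 1][j] + gap, F[i][j - 1] + gap)   # align, gap in b, or gap in a
            return F[n][m]

        print(needleman_wunsch("GATTACA", "GCATGCU"))   # optimal global alignment score for the toy pair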

  1. Innovative issues in intelligent systems

    CERN Document Server

    Yager, Ronald; Kacprzyk, Janusz; Jotsov, Vladimir

    2016-01-01

    In this book, a broad variety of contemporary IT methods and applications in intelligent systems is presented. Every book chapter represents a detailed, specific, far-reaching and original piece of research in a respective scientific and practical field. However, all of the chapters share the common point of strong similarity in the sense of being innovative, applicable and mutually compatible with each other. In other words, the methods from the different chapters can be viewed as bricks for building the next generation of “thinking machines” as well as other futuristic logical applications that are rapidly changing our world nowadays.

  2. Fetal Intelligent Navigation Echocardiography (FINE): a novel method for rapid, simple, and automatic examination of the fetal heart.

    Science.gov (United States)

    Yeo, Lami; Romero, Roberto

    2013-09-01

    To describe a novel method (Fetal Intelligent Navigation Echocardiography (FINE)) for visualization of standard fetal echocardiography views from volume datasets obtained with spatiotemporal image correlation (STIC) and application of 'intelligent navigation' technology. We developed a method to: 1) demonstrate nine cardiac diagnostic planes; and 2) spontaneously navigate the anatomy surrounding each of the nine cardiac diagnostic planes (Virtual Intelligent Sonographer Assistance (VIS-Assistance®)). The method consists of marking seven anatomical structures of the fetal heart. The following echocardiography views are then automatically generated: 1) four chamber; 2) five chamber; 3) left ventricular outflow tract; 4) short-axis view of great vessels/right ventricular outflow tract; 5) three vessels and trachea; 6) abdomen/stomach; 7) ductal arch; 8) aortic arch; and 9) superior and inferior vena cava. The FINE method was tested in a separate set of 50 STIC volumes of normal hearts (18.6-37.2 weeks of gestation), and visualization rates for fetal echocardiography views using diagnostic planes and/or VIS-Assistance® were calculated. To examine the feasibility of identifying abnormal cardiac anatomy, we tested the method in four cases with proven congenital heart defects (coarctation of aorta, tetralogy of Fallot, transposition of great vessels and pulmonary atresia with intact ventricular septum). In normal cases, the FINE method was able to generate nine fetal echocardiography views using: 1) diagnostic planes in 78-100% of cases; 2) VIS-Assistance® in 98-100% of cases; and 3) a combination of diagnostic planes and/or VIS-Assistance® in 98-100% of cases. In all four abnormal cases, the FINE method demonstrated evidence of abnormal fetal cardiac anatomy. The FINE method can be used to visualize nine standard fetal echocardiography views in normal hearts by applying 'intelligent navigation' technology to STIC volume datasets. This method can simplify

  3. Mathematical programming models for solving unequal-sized facilities layout problems. A genetic search method

    International Nuclear Information System (INIS)

    Tavakkoli-Moghaddam, R.

    1999-01-01

    This paper presents unequal-sized facilities layout solutions generated by a genetic search program (Layout Design using a Genetic Algorithm). The generalized quadratic assignment problem, requiring pre-determined distance and material flow matrices as the input data, and the continuous plane model, employing a dynamic distance measure and a material flow matrix, are discussed. Computational results on test problems are reported in comparison with layout solutions generated by the branch-and-bound algorithm, a hybrid method merging simulated annealing and local search techniques, and an optimization process of an enveloped block

  4. An Efficient Hybrid Conjugate Gradient Method with the Strong Wolfe-Powell Line Search

    Directory of Open Access Journals (Sweden)

    Ahmad Alhawarat

    2015-01-01

    Full Text Available The conjugate gradient (CG) method is an interesting tool to solve optimization problems in many fields, such as design, economics, physics, and engineering. In this paper, we present a new hybrid CG method which relates to the famous Polak-Ribière-Polyak (PRP) formula. It provides a remedy for the PRP case, which is not globally convergent with the strong Wolfe-Powell (SWP) line search. The new formula possesses the sufficient descent condition and the global convergence properties. In addition, we further explain the cases where the PRP method fails with the SWP line search. Furthermore, we provide numerical computations for the new hybrid CG method, which is generally better than other related PRP formulas in both the number of iterations and the CPU time under some standard test functions.

  5. A peak value searching method of the MCA based on digital logic devices

    International Nuclear Information System (INIS)

    Sang Ziru; Huang Shanshan; Chen Lian; Jin Ge

    2010-01-01

    Digital multi-channel analyzers (MCAs) play an increasingly important role in multi-channel pulse-height analysis. The move to digital processing is characterized by powerful pulse-processing ability, high throughput, and improved stability and flexibility. This paper introduces an FPGA-based digital-logic method for searching the peak value of a waveform. The method reduces the dead time, and offline data correction improves the non-linearity of the MCA. The α energy spectrum of 241Am obtained with the system is presented. (authors)
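    The record does not give implementation details, but the peak-search logic such digital MCAs implement can be sketched in software as a streaming threshold-and-maximum detector; the waveform and threshold below are made-up examples, and each latched peak height would subsequently be histogrammed into a spectrum channel.

```python
def find_pulse_peaks(samples, threshold):
    """Return (index, height) of the maximum of each pulse that crosses the threshold.

    Mimics a streaming comparator: while the signal stays above the threshold the
    running maximum is tracked; when it drops back below, the peak is latched out.
    """
    peaks = []
    in_pulse = False
    peak_val = peak_idx = 0
    for i, s in enumerate(samples):
        if s > threshold:
            if not in_pulse or s > peak_val:
                peak_val, peak_idx = s, i
            in_pulse = True
        elif in_pulse:                    # falling edge: latch the peak (one MCA event)
            peaks.append((peak_idx, peak_val))
            in_pulse = False
    return peaks

# Toy waveform: two pulses riding on a flat baseline.
waveform = [0, 1, 5, 9, 14, 9, 4, 1, 0, 0, 2, 7, 11, 6, 2, 0]
print(find_pulse_peaks(waveform, threshold=3))
```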

  6. Local Path Planning of Driverless Car Navigation Based on Jump Point Search Method Under Urban Environment

    Directory of Open Access Journals (Sweden)

    Kaijun Zhou

    2017-09-01

    Full Text Available The Jump Point Search (JPS) algorithm, a fast search method for path planning, is adopted for local path planning of a driverless car in an urban environment. Firstly, a vector Geographic Information System (GIS) map, including Global Positioning System (GPS) position, direction, and lane information, is built for global path planning. Secondly, the GIS map database is utilized in global path planning for the driverless car. Then, the JPS algorithm is adopted to avoid the obstacle ahead and to find an optimal local path for the driverless car in the urban environment. Finally, 125 different simulation experiments in the urban environment demonstrate that JPS can successfully find an optimal and safe path, while having a lower time complexity than the Vector Field Histogram (VFH), the Rapidly Exploring Random Tree (RRT), A*, and the Probabilistic Roadmaps (PRM) algorithms. Furthermore, JPS is shown to be useful in the structured urban environment.
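    JPS is an optimization of A* on uniform-cost grids: it keeps the same open list and heuristic but jumps over runs of symmetric grid cells instead of expanding each one. The sketch below shows the plain 4-connected grid A* baseline that JPS accelerates (not the jump-point pruning itself); the grid, start and goal are toy values.

```python
import heapq

def astar(grid, start, goal):
    """Plain A* on a 4-connected grid (0 = free, 1 = obstacle).

    Jump Point Search keeps this same frontier/heuristic machinery but, instead of
    pushing every neighbour, "jumps" in a straight line until a forced neighbour or
    the goal is met, pruning the symmetric paths that A* expands one by one.
    """
    rows, cols = len(grid), len(grid[0])
    manhattan = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    frontier = [(manhattan(start), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while frontier:
        _, g, node, parent = heapq.heappop(frontier)
        if node in came_from:                  # already expanded
            continue
        came_from[node] = parent
        if node == goal:                       # rebuild the path by walking parents back
            path = [node]
            while came_from[path[-1]] is not None:
                path.append(came_from[path[-1]])
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(frontier, (ng + manhattan((nr, nc)), ng, (nr, nc), node))
    return None

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
print(astar(grid, (0, 0), (3, 3)))
```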

  7. Realization of Personalized Services for Intelligent Residential Space based on User Identification Method using Sequential Walking Footprints

    Directory of Open Access Journals (Sweden)

    Jin-Woo Jung

    2005-04-01

    Full Text Available Intelligent Sweet Home (ISH) is a human-friendly assistive home environment developed at KAIST, Korea, for testing advanced concepts for independent living of the elderly and the physically handicapped. The concept of ISH is to consider the home itself as an intelligent robot. ISH continuously checks the intention and health status of the resident, so it can actively provide the most appropriate services for the resident's lifestyle based on the detected intention or emergency information. However, when there are two or more residents, ISH cannot take the residents' individual characteristics or tastes into account unless it can first identify who each resident is. To realize a personalized service system in an intelligent residential space like ISH, we deal with a human-friendly user identification method for the ubiquitous computing environment, focused in particular on dynamic human footprint recognition. We then address some case studies of personalized services carried out by the Human-friendly Welfare Robot System research center, KAIST.

  8. Analysis of operator support method based on intelligent dynamic interlock in lead-cooled fast reactor simulator

    International Nuclear Information System (INIS)

    Xu, Peng; Wang, Jianye; Yang, Minghan; Wang, Weitian; Bai, Yunqing; Song, Yong

    2017-01-01

    Highlights: • We develop an operator support method based on intelligent dynamic interlock. • We offer an integrated aid system to reduce the workload of operators. • The method can help operators avoid dangerous, irreversible operations. • The method could also be used in fusion research reactors in the future. - Abstract: In nuclear systems, operators have to carry out corrective actions when abnormal situations occur. However, operators might make mistakes under pressure. In order to avoid the serious consequences of such human errors, a new method for operator support based on intelligent dynamic interlock was proposed. The new method, based on a fully digital instrumentation and control system, contains a real-time alarm analysis process, a decision support process and an automatic safety interlock process. Once abnormal conditions occur, the necessary safety interlock parameters, derived from the analysis of real-time alarms and the decision support process, can be loaded into the human-machine interfaces and controllers automatically, effectively preventing human errors. Furthermore, recommendations are made for further use and development of this technique in nuclear power plants or fusion research reactors.

  9. An adaptive bin framework search method for a beta-sheet protein homopolymer model

    Directory of Open Access Journals (Sweden)

    Hoos Holger H

    2007-04-01

    Full Text Available Abstract Background The problem of protein structure prediction consists of predicting the functional or native structure of a protein given its linear sequence of amino acids. This problem has played a prominent role in the fields of biomolecular physics and algorithm design for over 50 years. Additionally, its importance increases continually as a result of an exponential growth over time in the number of known protein sequences in contrast to a linear increase in the number of determined structures. Our work focuses on the problem of searching an exponentially large space of possible conformations as efficiently as possible, with the goal of finding a global optimum with respect to a given energy function. This problem plays an important role in the analysis of systems with complex search landscapes, and particularly in the context of ab initio protein structure prediction. Results In this work, we introduce a novel approach for solving this conformation search problem based on the use of a bin framework for adaptively storing and retrieving promising locally optimal solutions. Our approach provides a rich and general framework within which a broad range of adaptive or reactive search strategies can be realized. Here, we introduce adaptive mechanisms for choosing which conformations should be stored, based on the set of conformations already stored in memory, and for biasing choices when retrieving conformations from memory in order to overcome search stagnation. Conclusion We show that our bin framework combined with a widely used optimization method, Monte Carlo search, achieves significantly better performance than state-of-the-art generalized ensemble methods for a well-known protein-like homopolymer model on the face-centered cubic lattice.
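    The following toy sketch illustrates the general idea of combining Monte Carlo search with an adaptive memory of promising solutions, which is the spirit of the bin framework; it is not the authors' algorithm or their protein lattice model, just a Metropolis walk on a made-up one-dimensional energy landscape that occasionally restarts from stored low-energy states to escape stagnation.

```python
import math
import random

def energy(x):
    """Toy rugged landscape: many local minima, global minimum near x = 0."""
    return 0.05 * x * x + math.sin(3.0 * x)

def memory_mc(steps=20000, temperature=0.5, memory_size=10, restart_every=500):
    x = random.uniform(-20, 20)
    e = energy(x)
    memory = []                                    # stored (energy, state), lowest first
    best = (e, x)
    for step in range(1, steps + 1):
        x_new = x + random.gauss(0.0, 0.5)         # local Metropolis move
        e_new = energy(x_new)
        if e_new < e or random.random() < math.exp((e - e_new) / temperature):
            x, e = x_new, e_new
            memory.append((e, x))
            memory = sorted(memory)[:memory_size]  # keep only the lowest-energy states
            best = min(best, memory[0])
        if step % restart_every == 0 and memory:   # anti-stagnation: restart from memory
            e, x = random.choice(memory)
    return best

print(memory_mc())
```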

  10. A comparison of two search methods for determining the scope of systematic reviews and health technology assessments.

    Science.gov (United States)

    Forsetlund, Louise; Kirkehei, Ingvild; Harboe, Ingrid; Odgaard-Jensen, Jan

    2012-01-01

    This study aims to compare two different search methods for determining the scope of a requested systematic review or health technology assessment. The first method (called the Direct Search Method) included performing direct searches in the Cochrane Database of Systematic Reviews (CDSR), the Database of Abstracts of Reviews of Effects (DARE) and the Health Technology Assessments (HTA) database. Using the comparison method (called the NHS Search Engine) we performed searches by means of the search engine of the British National Health Service, NHS Evidence. We used an adapted cross-over design with a random allocation of fifty-five requests for systematic reviews. The main analyses were based on repeated measurements adjusted for the order in which the searches were conducted. The Direct Search Method generated on average fewer hits (48 percent [95 percent confidence interval {CI} 6 percent to 72 percent]), had a higher precision (0.22 [95 percent CI, 0.13 to 0.30]) and more unique hits than searching by means of the NHS Search Engine (50 percent [95 percent CI, 7 percent to 110 percent]). On the other hand, the Direct Search Method took longer (14.58 minutes [95 percent CI, 7.20 to 21.97]) and was perceived as somewhat less user-friendly than the NHS Search Engine (-0.60 [95 percent CI, -1.11 to -0.09]). Although the Direct Search Method had some drawbacks, being more time-consuming and less user-friendly, it generated more unique hits than the NHS Search Engine, retrieved on average fewer references and returned fewer irrelevant results.

  11. Study of Fuze Structure and Reliability Design Based on the Direct Search Method

    Science.gov (United States)

    Lin, Zhang; Ning, Wang

    2017-03-01

    Redundant design is one of the important methods to improve the reliability of a system, but the mutual coupling of multiple factors is often involved in the design. In this study, the Direct Search Method is introduced into the optimum redundancy configuration for design optimization, in which the reliability, cost, structural weight and other factors can be taken into account simultaneously, and the redundancy allocation and reliability design of an aircraft critical system are computed. The results show that this method is convenient and workable, and, upon appropriate modification, is applicable to the redundancy configuration and optimization of various designs. The method therefore has good practical value.

  12. Algorithms and architectures of artificial intelligence

    CERN Document Server

    Tyugu, E

    2007-01-01

    This book gives an overview of methods developed in artificial intelligence for search, learning, problem solving and decision-making. It gives an overview of algorithms and architectures of artificial intelligence that have reached the degree of maturity when a method can be presented as an algorithm, or when a well-defined architecture is known, e.g. in neural nets and intelligent agents. It can be used as a handbook for a wide audience of application developers who are interested in using artificial intelligence methods in their software products. Parts of the text are rather independent, so that one can look into the index and go directly to a description of a method presented in the form of an abstract algorithm or an architectural solution. The book can be used also as a textbook for a course in applied artificial intelligence. Exercises on the subject are added at the end of each chapter. Neither programming skills nor specific knowledge in computer science are expected from the reader. However, some p...

  13. A fast tomographic method for searching the minimum free energy path

    International Nuclear Information System (INIS)

    Chen, Changjun; Huang, Yanzhao; Xiao, Yi; Jiang, Xuewei

    2014-01-01

    The Minimum Free Energy Path (MFEP) provides much important information about a chemical reaction, such as the free energy barrier, the location of the transition state, and the relative stability of reactant and product. With the MFEP, one can study the mechanism of the reaction in an efficient way. Due to the large number of degrees of freedom, searching for the MFEP is a very time-consuming process. Here, we present a fast tomographic method to perform the search. Our approach first calculates the free energy surfaces in a sequence of hyperplanes perpendicular to a transition path. Based on an objective function and the free energy gradient, the transition path is then optimized iteratively in the collective variable space. Applications of the present method to model systems show that our method is practical and can serve as an alternative approach for finding the state-to-state MFEP.

  14. Active Search on Carcasses versus Pitfall Traps: a Comparison of Sampling Methods.

    Science.gov (United States)

    Zanetti, N I; Camina, R; Visciarelli, E C; Centeno, N D

    2016-04-01

    The study of insect succession in cadavers and the classification of arthropods have mostly been done by placing a carcass in a cage, protected from vertebrate scavengers, which is then visited periodically. An alternative is to use specific traps. Few studies on carrion ecology and forensic entomology involving the carcasses of large vertebrates have employed pitfall traps. The aims of this study were to compare both sampling methods (active search on a carcass and pitfall trapping) for each coleopteran family, and to establish whether there is a discrepancy (underestimation and/or overestimation) in the presence of each family by either method. A great discrepancy was found for almost all families, with some of them being more abundant in samples obtained through active search on carcasses and others in samples from traps, whereas two families did not show any bias towards a given sampling method. The fact that families may be underestimated or overestimated by the type of sampling technique highlights the importance of combining both methods, active search on carcasses and pitfall traps, in order to obtain more complete information on decomposition, carrion habitat and cadaveric families or species. Furthermore, a hypothesis is advanced on the reasons why either sampling method shows biases towards certain families. Information about the sampling techniques is provided, indicating which would be more appropriate to detect or find a particular family.

  15. Event classification and optimization methods using artificial intelligence and other relevant techniques: Sharing the experiences

    Science.gov (United States)

    Mohamed, Abdul Aziz; Hasan, Abu Bakar; Ghazali, Abu Bakar Mhd.

    2017-01-01

    Classification of large data into respective classes or groups can be carried out with the help of artificial intelligence (AI) tools readily available in the market. To get the optimum or best results, optimization tools can be applied to those data. Classification and optimization have been used by researchers throughout their work, and the outcomes were very encouraging indeed. Here, the authors share what they have experienced in three different areas of applied research.

  16. Novel citation-based search method for scientific literature: application to meta-analyses.

    Science.gov (United States)

    Janssens, A Cecile J W; Gwinn, M

    2015-10-13

    Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of co-citation with one or more "known" articles before reviewing their eligibility. In two independent studies, we aimed to reproduce the results of literature searches for sets of published meta-analyses (n = 10 and n = 42). For each meta-analysis, we extracted co-citations for the randomly selected 'known' articles from the Web of Science database, counted their frequencies and screened all articles with a score above a selection threshold. In the second study, we extended the method by retrieving direct citations for all selected articles. In the first study, we retrieved 82% of the studies included in the meta-analyses while screening only 11% as many articles as were screened for the original publications. Articles that we missed were published in non-English languages, published before 1975, published very recently, or available only as conference abstracts. In the second study, we retrieved 79% of included studies while screening half the original number of articles. Citation searching appears to be an efficient and reasonably accurate method for finding articles similar to one or more articles of interest for meta-analysis and reviews.
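    The core of the ranking step can be sketched as a simple co-citation count; the citation dictionary below is a made-up stand-in for the Web of Science data used in the study.

```python
from collections import Counter

# Hypothetical citation data: citing article id -> set of references it cites.
CITATIONS = {
    "A1": {"K1", "K2", "X1"},
    "A2": {"K1", "X2"},
    "A3": {"K2", "X1", "X3"},
    "A4": {"X2", "X4"},
}

def cocitation_scores(known_articles):
    """Count, for every other cited article, how often it is co-cited with the known ones.

    An article scores one point for each citing paper in which it appears alongside
    at least one 'known' article; candidates are then screened in score order.
    """
    scores = Counter()
    for refs in CITATIONS.values():
        if refs & known_articles:                      # this paper cites a known article
            for ref in refs - known_articles:
                scores[ref] += 1
    return scores.most_common()

print(cocitation_scores({"K1", "K2"}))
# e.g. [('X1', 2), ('X2', 1), ('X3', 1)] -> screen candidates above a chosen threshold
```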

  17. A Novel Strain-Based Method to Estimate Tire Conditions Using Fuzzy Logic for Intelligent Tires

    Directory of Open Access Journals (Sweden)

    Daniel Garcia-Pozuelo

    2017-02-01

    Full Text Available The so-called intelligent tires are one of the most promising research fields for automotive engineers. These tires are equipped with sensors which provide information about vehicle dynamics. Up to now, the commercial intelligent tires only provide information about inflation pressure and their contribution to stability control systems is currently very limited. Nowadays one of the major problems for intelligent tire development is how to embed feasible and low cost sensors to obtain reliable information such as inflation pressure, vertical load or rolling speed. These parameters provide key information for vehicle dynamics characterization. In this paper, we propose a novel algorithm based on fuzzy logic to estimate the mentioned parameters by means of a single strain-based system. Experimental tests have been carried out in order to prove the suitability and durability of the proposed on-board strain sensor system, as well as its low cost advantages, and the accuracy of the obtained estimations by means of fuzzy logic.

  18. Development of a simplified method for intelligent glazed façade design under different control strategies and verified by building simulation tool BSim

    DEFF Research Database (Denmark)

    Liu, Mingzhe; Wittchen, Kim Bjarne; Heiselberg, Per

    2014-01-01

    The research aims to develop a simplified calculation method for intelligent glazed facade under different control conditions (night shutter, solar shading and natural ventilation) to simulate the energy performance and indoor environment of an office room installed with the intelligent facade......, it is possible to calculate the whole year performance of a room or building with intelligent glazed façade, which makes it a less time consuming tool to investigate the performance of the intelligent façade under different control strategies in the design stage with acceptable accuracy. Results showed good....... The method took the angle dependence of the solar characteristic into account, including the simplified hourly building model developed according to EN 13790 to evaluate the influence of the controlled façade on both the indoor environment (indoor air temperature, solar transmittance through the façade...

  19. Fast optimization of binary clusters using a novel dynamic lattice searching method

    International Nuclear Information System (INIS)

    Wu, Xia; Cheng, Wen

    2014-01-01

    Global optimization of binary clusters has been a difficult task despite much effort and many efficient methods. To address the two types of elements in binary clusters (i.e., the homotop problem), two classes of virtual dynamic lattices are constructed and a modified dynamic lattice searching (DLS) method, i.e., the binary DLS (BDLS) method, is developed. However, it was found that the BDLS can only be utilized for the optimization of binary clusters of small sizes, because the homotop problem is hard to solve without an atomic exchange operation. Therefore, the iterated local search (ILS) method is adopted to solve the homotop problem, and an efficient method based on BDLS and ILS, named BDLS-ILS, is presented for global optimization of binary clusters. In order to assess the efficiency of the proposed method, binary Lennard-Jones clusters with up to 100 atoms are investigated. Results show that the method is efficient. Furthermore, the BDLS-ILS method is also adopted to study the geometrical structures of (AuPd)79 clusters with DFT-fitted parameters of the Gupta potential.

  20. Frequency domain optical tomography using a conjugate gradient method without line search

    International Nuclear Information System (INIS)

    Kim, Hyun Keol; Charette, Andre

    2007-01-01

    A conjugate gradient method without line search (CGMWLS) is presented. This method is used to retrieve the local maps of absorption and scattering coefficients inside the tissue-like test medium, with the synthetic data. The forward problem is solved with a discrete-ordinates finite-difference method based on the frequency domain formulation of radiative transfer equation. The inversion results demonstrate that the CGMWLS can retrieve simultaneously the spatial distributions of optical properties inside the medium within a reasonable accuracy, by reducing cross-talk between absorption and scattering coefficients

  1. Non-contact method of search and analysis of pulsating vessels

    Science.gov (United States)

    Avtomonov, Yuri N.; Tsoy, Maria O.; Postnov, Dmitry E.

    2018-04-01

    Despite the variety of existing methods of recording the human pulse and a solid history of their development, there is still considerable interest in this topic. The development of new non-contact methods, based on advanced image processing, caused a new wave of interest in this issue. We present a simple but quite effective method for analyzing the mechanical pulsations of blood vessels lying close to the surface of the skin. Our technique is a modification of imaging (or remote) photoplethysmography (i-PPG). We supplemented this method with the addition of a laser light source, which made it possible to use other methods of searching for the proposed pulsation zone. During the testing of the method, several series of experiments were carried out with both artificial oscillating objects as well as with the target signal source (human wrist). The obtained results show that our method allows correct interpretation of complex data. To summarize, we proposed and tested an alternative method for the search and analysis of pulsating vessels.

  2. An Efficient Method to Search Real-Time Bulk Data for an Information Processing System

    International Nuclear Information System (INIS)

    Kim, Seong Jin; Kim, Jong Myung; Suh, Yong Suk; Keum, Jong Yong; Park, Heui Youn

    2005-01-01

    The Man Machine Interface System (MMIS) of the System-integrated Modular Advanced ReacTor (SMART) is designed with fully digitalized features. The Information Processing System (IPS) of the MMIS acquires and processes plant data from other systems. In addition, the IPS provides plant operation information to operators in the control room. The IPS is required to process bulky data in real time, so it is necessary to consider a special processing method with regard to flexibility and performance, because more than a few thousand plant information points converge on the IPS. Among other things, the time spent searching the bulk data is much longer than the other processing times. Thus, this paper explores an efficient method for the search and examines its feasibility
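    The record does not specify the data structure, but the performance issue it describes is the classic contrast between scanning a bulk buffer and looking records up through an index; the sketch below, with hypothetical tag names and record counts, shows that contrast using a plain dictionary as the index.

```python
import random
import time

# Hypothetical plant data: a few thousand acquired points (tag id, value, quality).
points = [(f"PT-{i:05d}", random.random(), "GOOD") for i in range(5000)]

def linear_lookup(tag):
    for rec in points:                     # O(n): scans the whole bulk buffer
        if rec[0] == tag:
            return rec
    return None

index = {rec[0]: rec for rec in points}    # built once per acquisition cycle

def indexed_lookup(tag):
    return index.get(tag)                  # O(1) average: hash lookup

wanted = [f"PT-{random.randrange(5000):05d}" for _ in range(2000)]

t0 = time.perf_counter(); [linear_lookup(t) for t in wanted]; t1 = time.perf_counter()
t2 = time.perf_counter(); [indexed_lookup(t) for t in wanted]; t3 = time.perf_counter()
print(f"linear: {t1 - t0:.3f}s   indexed: {t3 - t2:.3f}s")
```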

  3. Neural Based Tabu Search method for solving unit commitment problem with cooling-banking constraints

    Directory of Open Access Journals (Sweden)

    Rajan Asir Christober Gnanakkan Charles

    2009-01-01

    Full Text Available This paper presents a new approach to solving the short-term unit commitment problem (UCP) using Neural Based Tabu Search (NBTS) with cooling and banking constraints. The objective is to find a generation schedule such that the total operating cost is minimized, subject to a variety of constraints; in other words, to find the optimal generating unit commitment in the power system for the next H hours. A 7-unit utility power system in India is used to demonstrate the effectiveness of the proposed approach; extensive studies have also been performed for different IEEE test systems consisting of 10, 26 and 34 units. Numerical results are presented showing the superiority of the cost solutions obtained over those from the Tabu Search (TS), Dynamic Programming (DP) and Lagrangian Relaxation (LR) methods in reaching a proper unit commitment.
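    As a simplified illustration of the tabu-search part only (not the neural component, the multi-period constraints or the paper's cost model), the sketch below runs a tabu search over on/off decisions for a single hour with made-up unit capacities, costs and demand.

```python
import random

# Toy single-period unit commitment: capacities, running costs and demand (all made up).
CAPACITY = [100, 80, 60, 50, 40, 30, 20]
COST     = [ 50, 45, 38, 30, 26, 22, 20]
DEMAND = 210
PENALTY = 1000            # cost per MW of unmet demand

def schedule_cost(u):
    supplied = sum(c for c, on in zip(CAPACITY, u) if on)
    running  = sum(c for c, on in zip(COST, u) if on)
    return running + PENALTY * max(0, DEMAND - supplied)

def tabu_search(iters=200, tenure=5):
    current = [random.randint(0, 1) for _ in CAPACITY]
    best = current[:]
    tabu = {}                                   # unit index -> iteration until which it is tabu
    for it in range(iters):
        candidates = []
        for i in range(len(current)):           # neighbourhood: flip one unit on/off
            if tabu.get(i, -1) >= it:           # skip tabu moves (no aspiration rule here)
                continue
            neighbour = current[:]
            neighbour[i] ^= 1
            candidates.append((schedule_cost(neighbour), i, neighbour))
        if not candidates:
            continue
        cost_val, i, current = min(candidates)  # best admissible neighbour, even if worse
        tabu[i] = it + tenure                   # forbid re-flipping unit i for `tenure` iterations
        if cost_val < schedule_cost(best):
            best = current[:]
    return best, schedule_cost(best)

print(tabu_search())
```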

  4. The Search Conference as a Method in Planning Community Health Promotion Actions

    Science.gov (United States)

    Magnus, Eva; Knudtsen, Margunn Skjei; Wist, Guri; Weiss, Daniel; Lillefjell, Monica

    2016-01-01

    Aims: The aim of this article is to describe and discuss how the search conference can be used as a method for planning health promotion actions in local communities. Design and methods: The article draws on experiences with using the method for an innovative project in health promotion in three Norwegian municipalities. The method is described both in general and as it was specifically adopted for the project. Results and conclusions: The search conference as a method was used to develop evidence-based health promotion action plans. With its use of both bottom-up and top-down approaches, this method is a relevant strategy for involving a community in the planning stages of health promotion actions, in line with political expectations of participation, ownership, and evidence-based initiatives. Significance for public health: This article describes and discusses how the search conference can be used as a method when working with knowledge-based health promotion actions in local communities. It describes the sequences of the conference and shows how these were adapted when planning and prioritizing health promotion actions in three Norwegian municipalities. The significance of the article is that it shows how central elements in the planning of health promotion actions, such as participation and involvement as well as evidence, were fundamental to how the conference was carried out. The article goes on to discuss how the method functions as both a top-down and a bottom-up strategy, and in what ways working evidence-based can conflict with a bottom-up strategy. The experiences described can be used as guidance when planning knowledge-based health promotion actions in communities. PMID:27747199

  5. Engineering Bacteria to Search for Specific Concentrations of Molecules by a Systematic Synthetic Biology Design Method.

    Science.gov (United States)

    Tien, Shin-Ming; Hsu, Chih-Yuan; Chen, Bor-Sen

    2016-01-01

    Bacteria navigate environments full of various chemicals to seek favorable places for survival by controlling the flagella's rotation using a complicated signal transduction pathway. By influencing the pathway, bacteria can be engineered to search for specific molecules, which has great potential for application to biomedicine and bioremediation. In this study, genetic circuits were constructed to make bacteria search for a specific molecule at particular concentrations in their environment through a synthetic biology method. In addition, by replacing the "brake component" in the synthetic circuit with some specific sensitivities, the bacteria can be engineered to locate areas containing specific concentrations of the molecule. Measured by the swarm assay qualitatively and microfluidic techniques quantitatively, the characteristics of each "brake component" were identified and represented by a mathematical model. Furthermore, we established another mathematical model to anticipate the characteristics of the "brake component". Based on this model, an abundant component library can be established to provide adequate component selection for different searching conditions without identifying all components individually. Finally, a systematic design procedure was proposed. Following this systematic procedure, one can design a genetic circuit for bacteria to rapidly search for and locate different concentrations of particular molecules by selecting the most adequate "brake component" in the library. Moreover, following simple procedures, one can also establish an exclusive component library suitable for other cultivated environments, promoter systems, or bacterial strains.

  6. Engineering Bacteria to Search for Specific Concentrations of Molecules by a Systematic Synthetic Biology Design Method.

    Directory of Open Access Journals (Sweden)

    Shin-Ming Tien

    Full Text Available Bacteria navigate environments full of various chemicals to seek favorable places for survival by controlling the flagella's rotation using a complicated signal transduction pathway. By influencing the pathway, bacteria can be engineered to search for specific molecules, which has great potential for application to biomedicine and bioremediation. In this study, genetic circuits were constructed to make bacteria search for a specific molecule at particular concentrations in their environment through a synthetic biology method. In addition, by replacing the "brake component" in the synthetic circuit with some specific sensitivities, the bacteria can be engineered to locate areas containing specific concentrations of the molecule. Measured by the swarm assay qualitatively and microfluidic techniques quantitatively, the characteristics of each "brake component" were identified and represented by a mathematical model. Furthermore, we established another mathematical model to anticipate the characteristics of the "brake component". Based on this model, an abundant component library can be established to provide adequate component selection for different searching conditions without identifying all components individually. Finally, a systematic design procedure was proposed. Following this systematic procedure, one can design a genetic circuit for bacteria to rapidly search for and locate different concentrations of particular molecules by selecting the most adequate "brake component" in the library. Moreover, following simple procedures, one can also establish an exclusive component library suitable for other cultivated environments, promoter systems, or bacterial strains.

  7. Study on intelligence fault diagnosis method for nuclear power plant equipment based on rough set and fuzzy neural network

    International Nuclear Information System (INIS)

    Liu Yongkuo; Xia Hong; Xie Chunli; Chen Zhihui; Chen Hongxia

    2007-01-01

    Rough set theory and fuzzy neural networks are combined to take full advantage of both. Based on the knowledge-reduction technology of the rough set method, and by drawing simple rules from a large number of initial data, a fuzzy neural network was set up with a better topological structure, improved learning speed, accurate judgment, strong fault tolerance and greater practicality. In order to test the validity of the method, the inverted U-tube break accident of the steam generator and other faults are used as examples, and many simulation experiments are performed. The test results show that it is feasible to incorporate this fault intelligence diagnosis method, based on rough set and fuzzy neural network, into nuclear power plant equipment, and that the method is simple and convenient, with a small amount of calculation and reliable results. (authors)

  8. Modeling, control, and simulation of grid connected intelligent hybrid battery/photovoltaic system using new hybrid fuzzy-neural method.

    Science.gov (United States)

    Rezvani, Alireza; Khalili, Abbas; Mazareie, Alireza; Gandomkar, Majid

    2016-07-01

    Nowadays, photovoltaic (PV) generation is growing increasingly fast as a renewable energy source. Nevertheless, the drawback of the PV system is its dependence on weather conditions. Therefore, battery energy storage (BES) can be considered to assist in providing a stable and reliable output from the PV generation system for loads and to improve the dynamic performance of the whole generation system in grid-connected mode. In this paper, a novel topology of an intelligent hybrid generation system with PV and BES in a DC-coupled structure is presented. Each photovoltaic cell has a specific point, named the maximum power point, on its operational curve (i.e. current-voltage or power-voltage curve) at which it can generate maximum power. Irradiance and temperature changes affect these operational curves, and the nonlinear dependence of the maximum power point on the environment has therefore led to the development of different maximum power point tracking (MPPT) techniques. In order to capture the maximum power point (MPP), a hybrid fuzzy-neural MPPT method is applied in the PV system. The obtained results demonstrate the effectiveness and superiority of the proposed method, and the average tracking efficiency of the hybrid fuzzy-neural approach is about two percentage points higher than that of the conventional methods. It has the advantages of robustness, fast response and good performance. A detailed mathematical model and a control approach of a three-phase grid-connected intelligent hybrid system have been developed using Matlab/Simulink. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
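    For context, the conventional baseline that fuzzy-neural trackers are usually compared against is the perturb-and-observe MPPT loop; a minimal sketch on a toy power-voltage curve (not a real PV model, and not the paper's hybrid method) is shown below.

```python
def pv_power(v):
    """Toy power-voltage curve with a single maximum near 17 V (not a real PV model)."""
    return max(0.0, -(v - 17.0) ** 2 + 60.0)

def perturb_and_observe(v0=10.0, step=0.2, iterations=100):
    v, p = v0, pv_power(v0)
    direction = +1.0
    for _ in range(iterations):
        v_new = v + direction * step
        p_new = pv_power(v_new)
        if p_new < p:              # power dropped: reverse the perturbation direction
            direction = -direction
        v, p = v_new, p_new
    return v, p

print(perturb_and_observe())       # settles close to the maximum power point (17 V, 60 W)
```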

  9. An Effective Wormhole Attack Defence Method for a Smart Meter Mesh Network in an Intelligent Power Grid

    Directory of Open Access Journals (Sweden)

    Jungtaek Seo

    2012-08-01

    Full Text Available Smart meters are one of the key components of intelligent power grids. Wireless mesh networks based on smart meters could provide customer-oriented information on electricity use to the operational control systems, which monitor power grid status and estimate electric power demand. Using this information, an operational control system could regulate devices within the smart grid in order to provide electricity in a cost-efficient manner. Ensuring the availability of the smart meter mesh network is therefore a critical factor in securing the soundness of an intelligent power system. Wormhole attacks can be one of the most difficult-to-address threats to the availability of mesh networks, and although many methods to nullify wormhole attacks have been tried, these have been limited by high computational resource requirements and unnecessary overhead, as well as by the lack of ability of such methods to respond to attacks. In this paper, an effective defense mechanism that both detects and responds to wormhole attacks is proposed. In the proposed system, each device maintains information on its neighbors, allowing each node to identify replayed packets. The effectiveness and efficiency of the proposed method is analyzed in light of additional computational message and memory complexities.

  10. An intelligent service matching method for mechanical equipment condition monitoring using the fibre Bragg grating sensor network

    Science.gov (United States)

    Zhang, Fan; Zhou, Zude; Liu, Quan; Xu, Wenjun

    2017-02-01

    Due to the advantages of being able to function under harsh environmental conditions and serving as a distributed condition information source in a networked monitoring system, the fibre Bragg grating (FBG) sensor network has attracted considerable attention for equipment online condition monitoring. To provide an overall conditional view of the mechanical equipment operation, a networked service-oriented condition monitoring framework based on FBG sensing is proposed, together with an intelligent matching method for supporting monitoring service management. In the novel framework, three classes of progressive service matching approaches, including service-chain knowledge database service matching, multi-objective constrained service matching and workflow-driven human-interactive service matching, are developed and integrated with an enhanced particle swarm optimisation (PSO) algorithm as well as a workflow-driven mechanism. Moreover, the manufacturing domain ontology, FBG sensor network structure and monitoring object are considered to facilitate the automatic matching of condition monitoring services to overcome the limitations of traditional service processing methods. The experimental results demonstrate that FBG monitoring services can be selected intelligently, and the developed condition monitoring system can be re-built rapidly as new equipment joins the framework. The effectiveness of the service matching method is also verified by implementing a prototype system together with its performance analysis.

  11. Efficient and accurate Greedy Search Methods for mining functional modules in protein interaction networks.

    Science.gov (United States)

    He, Jieyue; Li, Chaojun; Ye, Baoliu; Zhong, Wei

    2012-06-25

    Most computational algorithms mainly focus on detecting highly connected subgraphs in PPI networks as protein complexes but ignore their inherent organization. Furthermore, many of these algorithms are computationally expensive. However, recent analysis indicates that experimentally detected protein complexes generally contain Core/attachment structures. In this paper, a Greedy Search Method based on Core-Attachment structure (GSM-CA) is proposed. The GSM-CA method detects densely connected regions in large protein-protein interaction networks based on the edge weight and two criteria for determining core nodes and attachment nodes. The GSM-CA method improves the prediction accuracy compared to other similar module detection approaches, however it is computationally expensive. Many module detection approaches are based on the traditional hierarchical methods, which is also computationally inefficient because the hierarchical tree structure produced by these approaches cannot provide adequate information to identify whether a network belongs to a module structure or not. In order to speed up the computational process, the Greedy Search Method based on Fast Clustering (GSM-FC) is proposed in this work. The edge weight based GSM-FC method uses a greedy procedure to traverse all edges just once to separate the network into the suitable set of modules. The proposed methods are applied to the protein interaction network of S. cerevisiae. Experimental results indicate that many significant functional modules are detected, most of which match the known complexes. Results also demonstrate that the GSM-FC algorithm is faster and more accurate as compared to other competing algorithms. Based on the new edge weight definition, the proposed algorithm takes advantages of the greedy search procedure to separate the network into the suitable set of modules. Experimental analysis shows that the identified modules are statistically significant. The algorithm can reduce the

  12. A method for untriggered time-dependent searches for multiple flares from neutrino point sources

    International Nuclear Information System (INIS)

    Gora, D.; Bernardini, E.; Cruz Silva, A.H.

    2011-04-01

    A method for a time-dependent search for flaring astrophysical sources which can be potentially detected by large neutrino experiments is presented. The method uses a time-clustering algorithm combined with an unbinned likelihood procedure. By including in the likelihood function a signal term which describes the contribution of many small clusters of signal-like events, this method provides an effective way for looking for weak neutrino flares over different time-scales. The method is sensitive to an overall excess of events distributed over several flares which are not individually detectable. For standard cases (one flare) the discovery potential of the method is worse than a standard time-dependent point source analysis with unknown duration of the flare by a factor depending on the signal-to-background level. However, for flares sufficiently shorter than the total observation period, the method is more sensitive than a time-integrated analysis. (orig.)

  13. A method for untriggered time-dependent searches for multiple flares from neutrino point sources

    Energy Technology Data Exchange (ETDEWEB)

    Gora, D. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Institute of Nuclear Physics PAN, Cracow (Poland); Bernardini, E.; Cruz Silva, A.H. [Institute of Nuclear Physics PAN, Cracow (Poland)

    2011-04-15

    A method for a time-dependent search for flaring astrophysical sources which can be potentially detected by large neutrino experiments is presented. The method uses a time-clustering algorithm combined with an unbinned likelihood procedure. By including in the likelihood function a signal term which describes the contribution of many small clusters of signal-like events, this method provides an effective way for looking for weak neutrino flares over different time-scales. The method is sensitive to an overall excess of events distributed over several flares which are not individually detectable. For standard cases (one flare) the discovery potential of the method is worse than a standard time-dependent point source analysis with unknown duration of the flare by a factor depending on the signal-to-background level. However, for flares sufficiently shorter than the total observation period, the method is more sensitive than a time-integrated analysis. (orig.)

  14. Development and evaluation of a novel lossless image compression method (AIC: artificial intelligence compression method) using neural networks as artificial intelligence

    International Nuclear Information System (INIS)

    Fukatsu, Hiroshi; Naganawa, Shinji; Yumura, Shinnichiro

    2008-01-01

    This study aimed to validate the performance of a novel image compression method using a neural network to achieve lossless compression. The encoding consists of the following blocks: a prediction block; a residual data calculation block; a transformation and quantization block; an organization and modification block; and an entropy encoding block. The predicted image is divided into four macro-blocks, using the original image for training, and then redivided into sixteen sub-blocks. The predicted image is compared to the original image to create the residual image. The spatial and frequency data of the residual image are compared and transformed. Chest radiography, computed tomography (CT), magnetic resonance imaging, positron emission tomography, radioisotope mammography, ultrasonography, and digital subtraction angiography images were compressed using the AIC lossless compression method, and the compression rates were calculated. The compression rates were around 15:1 for chest radiography and mammography, 12:1 for CT, and around 6:1 for the other images. This method thus enables greater lossless compression than the conventional methods. This novel method should improve the efficiency of handling the increasing volume of medical imaging data. (author)

  15. An efficient search method for finding the critical slip surface using the compositional Monte Carlo technique

    International Nuclear Information System (INIS)

    Goshtasbi, K.; Ahmadi, M; Naeimi, Y.

    2008-01-01

    Locating the critical slip surface and the associated minimum factor of safety are two complementary parts of a slope stability analysis. A large number of computer programs exist to solve slope stability problems. Most of these programs, however, have used inefficient and unreliable search procedures to locate the global minimum factor of safety. This paper presents an efficient and reliable method to determine the global minimum factor of safety, coupled with a modified version of the Monte Carlo technique. Examples are presented to illustrate the reliability of the proposed method.

  16. Generalized Pattern Search methods for a class of nonsmooth optimization problems with structure

    Science.gov (United States)

    Bogani, C.; Gasparo, M. G.; Papini, A.

    2009-07-01

    We propose a Generalized Pattern Search (GPS) method to solve a class of nonsmooth minimization problems, where the set of nondifferentiability is included in the union of known hyperplanes and, therefore, is highly structured. Both unconstrained and linearly constrained problems are considered. At each iteration the set of poll directions is enforced to conform to the geometry of both the nondifferentiability set and the boundary of the feasible region, near the current iterate. This is the key issue to guarantee the convergence of certain subsequences of iterates to points which satisfy first-order optimality conditions. Numerical experiments on some classical problems validate the method.
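    A compass search, the simplest member of the GPS family, polls the coordinate directions and halves the step size after an unsuccessful poll; the sketch below applies it to a small nonsmooth test function and omits the problem-specific poll directions that the paper adds to conform to the nondifferentiability set.

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Poll the 2n coordinate directions; accept any improving point, else shrink the mesh."""
    x = list(x0)
    fx = f(x)
    n = len(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for i in range(n):
            for sign in (+1.0, -1.0):
                trial = x[:]
                trial[i] += sign * step
                f_trial = f(trial)
                if f_trial < fx:            # successful poll: move, keep the step size
                    x, fx, improved = trial, f_trial, True
                    break
            if improved:
                break
        if not improved:                    # unsuccessful poll: refine the mesh
            step *= 0.5
    return x, fx

# Nonsmooth test function |x| + |y - 1| (kinks along coordinate hyperplanes).
print(compass_search(lambda p: abs(p[0]) + abs(p[1] - 1.0), [3.0, -2.0]))
```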

  17. Searching in the Context of a Task: A Review of Methods and Tools

    Directory of Open Access Journals (Sweden)

    Ana Maguitman

    2018-04-01

    Full Text Available Contextual information extracted from the user task can help to better target retrieval to task-relevant content. In particular, topical context can be exploited to identify the subject of the information needs, contributing to reduce the information overload problem. A great number of methods exist to extract raw context data and contextual interaction patterns from the user task and to model this information using higher-level representations. Context can then be used as a source for automatic query generation, or as a means to refine or disambiguate user-generated queries. It can also be used to filter and rank results as well as to select domain-specific search engines with better capabilities to satisfy specific information requests. This article reviews methods that have been applied to deal with the problem of reflecting the current and long-term interests of a user in the search process. It discusses major difficulties encountered in the research area of context-based information retrieval and presents an overview of tools proposed since the mid-nineties to deal with the problem of context-based search.

  18. Search method for long-duration gravitational-wave transients from neutron stars

    International Nuclear Information System (INIS)

    Prix, R.; Giampanis, S.; Messenger, C.

    2011-01-01

    We introduce a search method for a new class of gravitational-wave signals, namely, long-duration O(hours-weeks) transients from spinning neutron stars. We discuss the astrophysical motivation from glitch relaxation models and we derive a rough estimate for the maximal expected signal strength based on the superfluid excess rotational energy. The transient signal model considered here extends the traditional class of infinite-duration continuous-wave signals by a finite start-time and duration. We derive a multidetector Bayes factor for these signals in Gaussian noise using F-statistic amplitude priors, which simplifies the detection statistic and allows for an efficient implementation. We consider both a fully coherent statistic, which is computationally limited to directed searches for known pulsars, and a cheaper semicoherent variant, suitable for wide parameter-space searches for transients from unknown neutron stars. We have tested our method by Monte-Carlo simulation, and we find that it outperforms orthodox maximum-likelihood approaches both in sensitivity and in parameter-estimation quality.

  19. Firefly as a novel swarm intelligence variable selection method in spectroscopy.

    Science.gov (United States)

    Goodarzi, Mohammad; dos Santos Coelho, Leandro

    2014-12-10

    A critical step in multivariate calibration is wavelength selection, which is used to build models with better prediction performance when applied to spectral data. Up to now, many feature selection techniques have been developed. Among the different types of feature selection techniques, those based on swarm intelligence optimization methodologies are particularly interesting, since they are usually modelled on animal and insect behaviour, e.g., finding the shortest path between a food source and the nest. The decision is made by a crowd, leading to a more robust model that is less prone to falling into local minima during the optimization cycle. This paper presents a novel feature selection approach for spectroscopic data, leading to more robust calibration models. The performance of the firefly algorithm, a swarm intelligence paradigm, was evaluated and compared with the genetic algorithm and particle swarm optimization. All three techniques were coupled with partial least squares (PLS) and applied to three spectroscopic data sets. They demonstrate improved prediction results in comparison to a PLS model built using all wavelengths. Results show that the firefly algorithm, as a novel swarm paradigm, leads to a lower number of selected wavelengths while the prediction performance of the built PLS model stays the same. Copyright © 2014. Published by Elsevier B.V.
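    A bare-bones continuous firefly algorithm is sketched below to show the attraction-based update the paper relies on; it optimizes a toy objective and is not coupled to PLS or to the binary wavelength-selection encoding used in the study.

```python
import math
import random

def sphere(x):                      # toy objective to minimize
    return sum(v * v for v in x)

def firefly(f, dim=5, n=20, iters=200, alpha=0.2, beta0=1.0, gamma=1.0):
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    light = [f(x) for x in pop]     # lower objective value = brighter firefly
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:                       # j is brighter: i moves toward j
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)      # attractiveness decays with distance
                    pop[i] = [a + beta * (b - a) + alpha * (random.random() - 0.5)
                              for a, b in zip(pop[i], pop[j])]
                    light[i] = f(pop[i])
        alpha *= 0.97                                         # cool the random walk over time
    best = min(range(n), key=lambda k: light[k])
    return pop[best], light[best]

print(firefly(sphere))
```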

  20. Enhancing the stabilization of aircraft pitch motion control via intelligent and classical method

    Science.gov (United States)

    Lukman, H.; Munawwarah, S.; Azizan, A.; Yakub, F.; Zaki, S. A.; Rasid, Z. A.

    2017-12-01

    The pitching movement of an aircraft is very important to ensure passengers are intrinsically safe and that the aircraft achieves its maximum stability. The equations governing the motion of an aircraft are a complex set of six nonlinear coupled differential equations. Under certain assumptions, they can be decoupled and linearized into longitudinal and lateral equations. Pitch control is a longitudinal problem, and thus only the longitudinal dynamics equations are involved in this system. It is a third-order nonlinear system, which is linearized about the operating point. The system is also inherently unstable due to the presence of a free integrator. Because of this, a feedback controller is added in order to solve this problem and enhance the system performance. This study uses two approaches to controller design: a conventional controller and an intelligent controller. The pitch control scheme consists of a proportional-integral-derivative (PID) controller as the conventional approach and fuzzy logic control (FLC) as the intelligent approach. Throughout the paper, the performance of the presented controllers is investigated and compared based on the common step-response criteria. Simulation results have been obtained and analysed using Matlab and Simulink software. The study shows that the FLC controller has a higher ability to control and stabilize the aircraft's pitch angle compared to the PID controller.
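    As a minimal illustration of the conventional side of the comparison, the sketch below closes a discrete PID loop around a crude toy pitch model (a first-order pitch-rate response followed by the free integrator mentioned in the abstract); the gains and plant constants are illustrative, not the paper's aircraft model.

```python
def simulate_pid(setpoint=0.2, kp=9.0, ki=2.0, kd=4.0, dt=0.01, t_end=10.0):
    """Discrete PID on a toy pitch model: elevator -> pitch rate (first order) -> pitch angle."""
    theta, q = 0.0, 0.0                    # pitch angle [rad] and pitch rate [rad/s]
    integral, prev_err = 0.0, setpoint     # prev_err preset to avoid a derivative kick
    history = []
    for _ in range(int(t_end / dt)):
        err = setpoint - theta
        integral += err * dt
        derivative = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * derivative     # elevator command
        prev_err = err
        q += dt * (-q + u)                 # toy first-order pitch-rate response to the elevator
        theta += dt * q                    # free integrator: pitch angle integrates pitch rate
        history.append(theta)
    return history

resp = simulate_pid()
print(f"final pitch angle: {resp[-1]:.3f} rad (commanded 0.200 rad)")
```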

  1. Utilizing mixed methods research in analyzing Iranian researchers’ information search behaviour on the Web and presenting the current pattern

    Directory of Open Access Journals (Sweden)

    Maryam Asadi

    2015-12-01

    Full Text Available Using a mixed methods research design, the current study analyzed Iranian researchers' information searching behaviour on the Web, and based on the extracted concepts, a model of their information searching behaviour was developed. Forty-four participants, including academic staff from universities and research centers, were recruited for this study, selected by purposive sampling. Data were gathered from a questionnaire including ten questions and from semi-structured interviews. Each participant's memos were analyzed using grounded theory methods adapted from Strauss & Corbin (1998). Results showed that the main objectives of the subjects in using the Web were doing research, writing a paper, studying, doing assignments, downloading files and acquiring public information. The most important ways of learning how to search and retrieve information among the subjects were trial and error and getting help from friends. Information resources are identified by searching in information resources (e.g. search engines, references in papers, and online databases), communication facilities and tools (e.g. contact with colleagues, seminars and workshops, social networking), and information services (e.g. RSS, alerting, and SDI). Findings also indicated that searching with search engines, reviewing references, searching in online databases, contacting colleagues and studying the latest issues of electronic journals were the most important approaches to searching. The most important strategies were using search engines and scientific tools such as Google Scholar. In addition, the simple (quick) search method was the most common among the subjects. Topic, keywords and paper title were the most important elements for retrieving information. Analysis of the interviews showed that there were nine stages in the researchers' information searching behaviour: topic selection, initiating search, formulating search query, information retrieval, access to information

  2. IP-MLI: An Independency of Learning Materials from Platforms in a Mobile Learning using Intelligent Method

    Directory of Open Access Journals (Sweden)

    Mohammed Abdallh Otair

    2006-06-01

    Full Text Available Delivering a monolithic mobile learning system is too inflexible in view of the heterogeneous mixture of hardware and services available, the desirability of facilitating blended approaches to learning delivery, and the difficulty of building learning materials that run on all platforms [1]. This paper proposes a framework for a mobile learning system using an intelligent method (IP-MLI). A fuzzy matching method is used to find a suitable learning material design, providing the best match for each specific platform type for each learner. The main contribution of the proposed method is the use of a software layer to insulate learning materials from device-specific features. Consequently, many versions of learning materials can be designed to work on many platform types.

  3. (Re)interpreting LHC New Physics Search Results : Tools and Methods, 3rd Workshop

    CERN Document Server

    The quest for new physics beyond the SM is arguably the driving topic for LHC Run2. LHC collaborations are pursuing searches for new physics in a vast variety of channels. Although collaborations provide various interpretations for their search results, the full understanding of these results requires a much wider interpretation scope involving all kinds of theoretical models. This is a very active field, with close theory-experiment interaction. In particular, development of dedicated methodologies and tools is crucial for such scale of interpretation. Recently, a Forum was initiated to host discussions among LHC experimentalists and theorists on topics related to the BSM (re)interpretation of LHC data, and especially on the development of relevant interpretation tools and infrastructure: https://twiki.cern.ch/twiki/bin/view/LHCPhysics/InterpretingLHCresults Two meetings were held at CERN, where active discussions and concrete work on (re)interpretation methods and tools took place, with valuable cont...

  4. Electricity price forecast using Combinatorial Neural Network trained by a new stochastic search method

    International Nuclear Information System (INIS)

    Abedinia, O.; Amjady, N.; Shafie-khah, M.; Catalão, J.P.S.

    2015-01-01

    Highlights: • Presenting a Combinatorial Neural Network. • Suggesting a new stochastic search method. • Adapting the suggested method as a training mechanism. • Proposing a new forecast strategy. • Testing the proposed strategy on real-world electricity markets. - Abstract: Electricity price forecast is key information for successful operation of electricity market participants. However, the time series of electricity price has nonlinear, non-stationary and volatile behaviour and so its forecast method should have high learning capability to extract the complex input/output mapping function of electricity price. In this paper, a Combinatorial Neural Network (CNN) based forecasting engine is proposed to predict the future values of price data. The CNN-based forecasting engine is equipped with a new training mechanism for optimizing the weights of the CNN. This training mechanism is based on an efficient stochastic search method, which is a modified version of chemical reaction optimization algorithm, giving high learning ability to the CNN. The proposed price forecast strategy is tested on the real-world electricity markets of Pennsylvania–New Jersey–Maryland (PJM) and mainland Spain and its obtained results are extensively compared with the results obtained from several other forecast methods. These comparisons illustrate effectiveness of the proposed strategy.

  5. A Method for Estimating View Transformations from Image Correspondences Based on the Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    Erik Cuevas

    2015-01-01

    Full Text Available In this paper, a new method for robustly estimating multiple view relations from point correspondences is presented. The approach combines the popular random sampling consensus (RANSAC algorithm and the evolutionary method harmony search (HS. With this combination, the proposed method adopts a different sampling strategy than RANSAC to generate putative solutions. Under the new mechanism, at each iteration, new candidate solutions are built taking into account the quality of the models generated by previous candidate solutions, rather than purely random as it is the case of RANSAC. The rules for the generation of candidate solutions (samples are motivated by the improvisation process that occurs when a musician searches for a better state of harmony. As a result, the proposed approach can substantially reduce the number of iterations still preserving the robust capabilities of RANSAC. The method is generic and its use is illustrated by the estimation of homographies, considering synthetic and real images. Additionally, in order to demonstrate the performance of the proposed approach within a real engineering application, it is employed to solve the problem of position estimation in a humanoid robot. Experimental results validate the efficiency of the proposed method in terms of accuracy, speed, and robustness.
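    The hypothesize-and-verify loop that the harmony-search sampling plugs into is ordinary RANSAC; the sketch below shows that loop for simple 2-D line fitting with purely random sampling (i.e. the baseline the paper improves on), not the homography estimation or the HS-guided candidate generation.

```python
import random

def fit_line(p, q):
    """Line through two points as (a, b, c) with a*x + b*y + c = 0, normalized."""
    (x1, y1), (x2, y2) = p, q
    a, b = y2 - y1, x1 - x2
    norm = (a * a + b * b) ** 0.5
    return a / norm, b / norm, -(a * x1 + b * y1) / norm

def ransac_line(points, iters=200, threshold=0.5):
    best_model, best_inliers = None, []
    for _ in range(iters):
        sample = random.sample(points, 2)                    # hypothesize from a minimal sample
        a, b, c = fit_line(*sample)
        inliers = [p for p in points
                   if abs(a * p[0] + b * p[1] + c) <= threshold]
        if len(inliers) > len(best_inliers):                 # verify: keep the largest consensus
            best_model, best_inliers = (a, b, c), inliers
    return best_model, best_inliers

# Points near y = 2x + 1 plus a few gross outliers.
data = [(x, 2 * x + 1 + random.uniform(-0.3, 0.3)) for x in range(20)]
data += [(5, 40), (12, -10), (3, 25)]
model, inliers = ransac_line(data, iters=300)
print("line (a, b, c):", model, "| inliers:", len(inliers))
```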

  6. Search method optimization technique for thermal design of high power RFQ structure

    International Nuclear Information System (INIS)

    Sharma, N.K.; Joshi, S.C.

    2009-01-01

    RRCAT has taken up the development of a 3 MeV RFQ structure for the low energy part of a 100 MeV H⁻ ion injector linac. The RFQ is a precision-machined resonating structure designed for a high rf duty factor. RFQ structural stability during high rf power operation is an important design issue. The thermal analysis of the RFQ has been performed using the ANSYS finite element analysis software, and optimization of various parameters is attempted using the Search Method optimization technique. It is an effective optimization technique for systems governed by a large number of independent variables. The method involves examining a number of combinations of values of the independent variables and drawing conclusions from the magnitude of the objective function at these combinations. In these methods there is continuous improvement of the objective function throughout the course of the search, and hence they are very efficient. The method has been employed in the optimization of various parameters (the independent variables) of the RFQ involved in its thermal design, such as cooling water flow rate, cooling water inlet temperature, cavity thickness, etc. The temperature rise within the RFQ structure is the objective function in the thermal design. Using the ANSYS Parametric Design Language (APDL), various iterative programmes are written and the analyses are performed to minimize the objective function. The dependency of the objective function on the various independent variables is established and the optimum values of the parameters are evaluated. The results of the analysis are presented in the paper. (author)

  7. Sliding surface searching method for slopes containing a potential weak structural surface

    Directory of Open Access Journals (Sweden)

    Aijun Yao

    2014-06-01

    Full Text Available A weak structural surface is one of the key factors controlling the stability of slopes. The stability of rock slopes is in general governed by sets of discontinuities; in soft rocks, however, failure can occur along surfaces approaching a circular failure surface. To better understand the position of the potential sliding surface, a new method called the simplex-finite stochastic tracking method is proposed. This method divides the sliding surface into two parts: one is a smooth curve obtained by random searching, and the other is a polyline formed by the weak structural surface. Single or multiple sliding surfaces can be considered, and consequently several types of combined sliding surfaces can be simulated. The paper adopts the arc-polyline combination to simulate the potential sliding surface and analyzes the searching process for the sliding surface. Accordingly, software for slope stability analysis using this method was developed and applied to real cases. The results show that, using the simplex-finite stochastic tracking method, it is possible to locate the position of a potential sliding surface in the slope.

  8. Basic study on intelligent materialization of glass; Glass no intelligent ko zairyoka ni kansuru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-10-31

    This is report No. 98 issued by the Inorganic Material Research Institute. An intelligent material is a substance and/or material that responds intelligently to environmental conditions and exhibits corresponding functions. One of the features of amorphous materials, including amorphous glass, is a large degree of freedom in chemical composition. These materials maintain short-range order but, as a whole, have a disordered atomic arrangement. Therefore, wide latitude in selecting the composition and diverse synthesis methods are available. A wide range of uses may be conceived, such as the introduction of electronic states with different valences into a structure and diverse chemical combinations. The arrangement of polyhedra with different orientations, and how they are connected, correlates closely with the external environment. Intelligent materials have a high degree of freedom with respect to changes in the external environment and are well suited to exhibiting intelligent functions. Taking heat and light as the external conditions, attempts have been made to search for and create intelligent materials based on state changes induced by interactions between these two factors. Fundamental studies have been made on the synthesis of various environment-responsive glasses and films, and on the factors and phenomena underlying the intelligent behaviour. 62 refs., 91 figs., 8 tabs.

  9. Searching for rigour in the reporting of mixed methods population health research: a methodological review.

    Science.gov (United States)

    Brown, K M; Elliott, S J; Leatherdale, S T; Robertson-Wilson, J

    2015-12-01

    The environments in which population health interventions occur shape both their implementation and outcomes. Hence, when evaluating these interventions, we must explore both intervention content and context. Mixed methods (integrating quantitative and qualitative methods) provide this opportunity. However, although criteria exist for establishing rigour in quantitative and qualitative research, there is poor consensus regarding rigour in mixed methods. Using the empirical example of school-based obesity interventions, this methodological review examined how mixed methods have been used and reported, and how rigour has been addressed. Twenty-three peer-reviewed mixed methods studies were identified through a systematic search of five databases and appraised using the guidelines for Good Reporting of a Mixed Methods Study. In general, more detailed description of data collection and analysis, integration, inferences and justifying the use of mixed methods is needed. Additionally, improved reporting of methodological rigour is required. This review calls for increased discussion of practical techniques for establishing rigour in mixed methods research, beyond those for quantitative and qualitative criteria individually. A guide for reporting mixed methods research in population health should be developed to improve the reporting quality of mixed methods studies. Through improved reporting, mixed methods can provide strong evidence to inform policy and practice. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  10. Artificial intelligence

    CERN Document Server

    Hunt, Earl B

    1975-01-01

    Artificial Intelligence provides information pertinent to the fundamental aspects of artificial intelligence. This book presents the basic mathematical and computational approaches to problems in the artificial intelligence field.Organized into four parts encompassing 16 chapters, this book begins with an overview of the various fields of artificial intelligence. This text then attempts to connect artificial intelligence problems to some of the notions of computability and abstract computing devices. Other chapters consider the general notion of computability, with focus on the interaction bet

  11. Intelligent mechatronics; Intelligent mechatronics

    Energy Technology Data Exchange (ETDEWEB)

    Hashimoto, H. [The University of Tokyo, Tokyo (Japan). Institute of Industrial Science

    1995-10-01

    Intelligent mechatronics (IM) is explained as follows: the study of IM ultimately targets the realization of robots, but at the present stage the target is the creation of new value through the intellectualization of machines, that is, the combination of the information infrastructure and intelligent machine systems. IM is also regarded as being built from the active use of computers together with micromechatronics. The paper then introduces examples of IM research, mainly work the author has been involved in: sensor gloves, robot hands, robot eyes, teleoperation, three-dimensional object recognition, mobile robots, magnetic bearings, construction of a remote-controlled unmanned dam, robot networks, sensitivity communication using Neuro Baby, etc. 27 figs.

  12. An image segmentation method based on fuzzy C-means clustering and Cuckoo search algorithm

    Science.gov (United States)

    Wang, Mingwei; Wan, Youchuan; Gao, Xianjun; Ye, Zhiwei; Chen, Maolin

    2018-04-01

    Image segmentation is a significant step in image analysis and machine vision. Many approaches have been presented on this topic; among them, fuzzy C-means (FCM) clustering is one of the most widely used methods because of its efficiency and its ability to handle the ambiguity of images. However, the success of FCM cannot be guaranteed because it is easily trapped in local optima. Cuckoo search (CS) is a novel evolutionary algorithm that has been tested on several optimization problems and proved to be highly efficient. Therefore, a new segmentation technique blending FCM with the CS algorithm is put forward in the paper. Further, the proposed method has been evaluated on several images and compared with other existing FCM techniques, such as genetic algorithm (GA) based FCM and particle swarm optimization (PSO) based FCM, in terms of fitness value. Experimental results indicate that the proposed method is robust, adaptive and exhibits better performance than the other methods considered in the paper.
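
    As a hedged illustration of how cuckoo search can drive FCM, the sketch below uses Levy-flight moves to pick FCM cluster centres for one-dimensional grey-level data, scoring candidates with the standard FCM objective; the nest count, step size and toy histogram are invented for the example, and the paper's exact blending scheme is not reproduced.

```python
# Minimal sketch (a simplified stand-in, not the paper's exact formulation):
# cuckoo search with Levy flights used to choose fuzzy C-means (FCM) cluster
# centres for 1-D grey-level data, scored by the FCM objective function.
import numpy as np
from math import gamma

rng = np.random.default_rng(2)

def fcm_objective(centres, data, m=2.0, eps=1e-9):
    d = np.abs(data[:, None] - centres[None, :]) + eps        # (n, k) distances
    u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
    return float(np.sum((u ** m) * d ** 2))                   # FCM cost J

def levy(size, beta=1.5):
    # Mantegna's algorithm for Levy-distributed steps
    sigma = (gamma(1 + beta) * np.sin(np.pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    return rng.normal(0, sigma, size) / np.abs(rng.normal(0, 1, size)) ** (1 / beta)

def cuckoo_fcm(data, k=3, n_nests=15, iters=200, pa=0.25, alpha=0.01):
    lo, hi = data.min(), data.max()
    nests = rng.uniform(lo, hi, (n_nests, k))
    fit = np.array([fcm_objective(np.sort(n), data) for n in nests])
    for _ in range(iters):
        # generate a cuckoo by a Levy flight around a random nest
        i = rng.integers(n_nests)
        cand = np.clip(nests[i] + alpha * (hi - lo) * levy(k), lo, hi)
        f = fcm_objective(np.sort(cand), data)
        j = rng.integers(n_nests)
        if f < fit[j]:
            nests[j], fit[j] = cand, f
        # abandon a fraction pa of the worst nests and replace them randomly
        n_drop = int(pa * n_nests)
        worst = np.argsort(fit)[-n_drop:]
        nests[worst] = rng.uniform(lo, hi, (n_drop, k))
        fit[worst] = [fcm_objective(np.sort(n), data) for n in nests[worst]]
    return np.sort(nests[np.argmin(fit)])

# toy usage: grey levels drawn from three intensity modes
data = np.concatenate([rng.normal(40, 5, 300), rng.normal(120, 8, 300),
                       rng.normal(200, 6, 300)])
print("estimated cluster centres:", np.round(cuckoo_fcm(data), 1))
```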

  13. A comparison of methods for gravitational wave burst searches from LIGO and Virgo

    International Nuclear Information System (INIS)

    Beauville, F; Buskulic, D; Grosjean, D; Bizouard, M-A; Cavalier, F; Clapson, A-C; Hello, P; Blackburn, L; Katsavounidis, E; Bosi, L; Brocco, L; Brown, D A; Chatterji, S; Christensen, N; Knight, M; Fairhurst, S; Guidi, G; Heng, S; Hewitson, M; Klimenko, S

    2008-01-01

    The search procedure for burst gravitational waves has been studied using 24 h of simulated data in a network of three interferometers (Hanford 4 km, Livingston 4 km and Virgo 3 km are the example interferometers). Several methods to detect burst events developed in the LIGO Scientific Collaboration (LSC) and Virgo Collaboration have been studied and compared. We have performed coincidence analysis of the triggers obtained in the different interferometers with and without simulated signals added to the data. The benefits of having multiple interferometers of similar sensitivity are demonstrated by comparing the detection performance of the joint coincidence analysis with LSC and Virgo only burst searches. Adding Virgo to the LIGO detector network can increase by 50% the detection efficiency for this search. Another advantage of a joint LIGO-Virgo network is the ability to reconstruct the source sky position. The reconstruction accuracy depends on the timing measurement accuracy of the events in each interferometer, and is displayed in this paper with a fixed source position example

  14. A comparison of methods for gravitational wave burst searches from LIGO and Virgo

    Energy Technology Data Exchange (ETDEWEB)

    Beauville, F; Buskulic, D; Grosjean, D [Laboratoire d' Annecy-le-Vieux de Physique des Particules, Chemin de Bellevue, BP 110, 74941 Annecy-le-Vieux Cedex (France); Bizouard, M-A; Cavalier, F; Clapson, A-C; Hello, P [Laboratoire de l' Accelerateur Lineaire, IN2P3/CNRS-Universite de Paris XI, BP 34, 91898 Orsay Cedex (France); Blackburn, L; Katsavounidis, E [LIGO-Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Bosi, L [INFN Sezione di Perugia and/or Universita di Perugia, Via A Pascoli, I-06123 Perugia (Italy); Brocco, L [INFN Sezione di Roma and/or Universita ' La Sapienza' , P le A Moro 2, I-00185 Roma (Italy); Brown, D A; Chatterji, S [LIGO-California Institute of Technology, Pasadena, CA 91125 (United States); Christensen, N; Knight, M [Carleton College, Northfield, MN 55057 (United States); Fairhurst, S [University of Wisconsin-Milwaukee, Milwaukee, WI 53201 (United States); Guidi, G [INFN Sezione Firenze/Urbino Via G Sansone 1, I-50019 Sesto Fiorentino (Italy); and/or Universita di Firenze, Largo E Fermi 2, I-50125 Firenze and/or Universita di Urbino, Via S Chiara 27, I-61029 Urbino (Italy); Heng, S; Hewitson, M [University of Glasgow, Glasgow, G12 8QQ (United Kingdom); Klimenko, S [University of Florida-Gainesville, FL 32611 (United States)] (and others)

    2008-02-21

    The search procedure for burst gravitational waves has been studied using 24 h of simulated data in a network of three interferometers (Hanford 4 km, Livingston 4 km and Virgo 3 km are the example interferometers). Several methods to detect burst events developed in the LIGO Scientific Collaboration (LSC) and Virgo Collaboration have been studied and compared. We have performed coincidence analysis of the triggers obtained in the different interferometers with and without simulated signals added to the data. The benefits of having multiple interferometers of similar sensitivity are demonstrated by comparing the detection performance of the joint coincidence analysis with LSC and Virgo only burst searches. Adding Virgo to the LIGO detector network can increase by 50% the detection efficiency for this search. Another advantage of a joint LIGO-Virgo network is the ability to reconstruct the source sky position. The reconstruction accuracy depends on the timing measurement accuracy of the events in each interferometer, and is displayed in this paper with a fixed source position example.

  15. Does emotional intelligence influence success during medical school admissions and program matriculation?: a systematic review

    Directory of Open Access Journals (Sweden)

    Christian Jaeger Cook

    2016-11-01

    Full Text Available Purpose This study aimed to determine whether emotional intelligence is a predictor of success in a medical school program and whether the emotional intelligence construct correlates with other markers for admission into medical school. Methods Three databases (PubMed, CINAHL, and ERIC) were searched up to and including July 2016, using relevant terms. Studies written in English were selected if they included emotional intelligence as a predictor of success in medical school, markers of success such as examination scores and grade point average, associations with success defined through traditional medical school admission criteria and failures, and details about the sample. Data extraction included the study authors and year, population description, emotional intelligence tool, outcome variables, and results. Associations between emotional intelligence scores and reported data were extracted and recorded. Results Six manuscripts were included. Overall, study quality was high. Four of the manuscripts examined emotional intelligence as a predictor of success while in medical school. Three of these four studies supported a weak positive relationship between emotional intelligence scores and success during matriculation. Two of the manuscripts examined the relationship of emotional intelligence to medical school admissions. There were no significant correlations between emotional intelligence and medical school admission selection. Conclusion Emotional intelligence was correlated with some, but not all, measures of success during medical school matriculation and with none of the measures associated with medical school admissions. Variability in success measures across studies likely explains the variable findings.

  16. Intelligent robot action planning

    Energy Technology Data Exchange (ETDEWEB)

    Vamos, T; Siegler, A

    1982-01-01

    Action planning methods used in intelligent robot control are discussed. Planning is accomplished through environment understanding, environment representation, task understanding and planning, motion analysis and man-machine communication. These fields are analysed in detail. The frames of an intelligent motion planning system are presented. Graphic simulation of the robot's environment and motion is used to support the planning. 14 references.

  17. Phase boundary estimation in electrical impedance tomography using the Hooke and Jeeves pattern search method

    International Nuclear Information System (INIS)

    Khambampati, Anil Kumar; Kim, Kyung Youn; Ijaz, Umer Zeeshan; Lee, Jeong Seong; Kim, Sin

    2010-01-01

    In industrial processes, monitoring of heterogeneous phases is crucial to the safety and operation of engineering structures. In particular, the visualization of voids and air bubbles is advantageous. As a result, many studies have appeared in the literature that offer varying degrees of functionality. Electrical impedance tomography (EIT) has already proved to be a hallmark for process monitoring: it offers not only visualization of the resistivity profile for a given flow mixture but is also used for the detection of phase boundaries. Iterative image reconstruction algorithms, such as the modified Newton–Raphson (mNR) method, are commonly used as inverse solvers. However, their utility is problematic in the sense that they require an initial solution in close proximity to the ground truth. Furthermore, they also rely on gradient information about the objective function to be minimized. Therefore, in this paper, we address these issues by employing a direct search algorithm, namely the Hooke and Jeeves pattern search method, to estimate the phase boundaries; it directly minimizes the cost function and does not require gradient information. It is assumed that the resistivity profile is known a priori, and therefore the unknown information is the size and location of the object. The boundary coefficients are parameterized using truncated Fourier series and are estimated using the relationship between the measured voltages and injected currents. Through extensive simulations and experimental results, and by comparison with mNR, we show that the Hooke and Jeeves pattern search method offers a promising prospect for process monitoring.
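
    The Hooke and Jeeves method itself is simple to state; the sketch below is a compact, generic variant (exploratory coordinate moves plus a pattern move, with step shrinking on failure) applied to a standard test function rather than to the EIT boundary-estimation cost used in the paper.

```python
# Minimal sketch of the Hooke and Jeeves pattern search: derivative-free
# exploratory moves along each coordinate, followed by a pattern move in the
# direction of improvement, with step-size reduction when no move helps.
import numpy as np

def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=5000):
    x_base = np.asarray(x0, dtype=float)
    f_base = f(x_base)
    for _ in range(max_iter):
        # exploratory move around the current base point
        x_new, f_new = x_base.copy(), f_base
        for i in range(len(x_new)):
            for delta in (step, -step):
                trial = x_new.copy()
                trial[i] += delta
                f_trial = f(trial)
                if f_trial < f_new:
                    x_new, f_new = trial, f_trial
                    break
        if f_new < f_base:
            # pattern move: jump further along the successful direction
            x_pattern = x_new + (x_new - x_base)
            x_base, f_base = x_new, f_new
            f_pattern = f(x_pattern)
            if f_pattern < f_base:
                x_base, f_base = x_pattern, f_pattern
        else:
            step *= shrink          # no improvement: refine the mesh
            if step < tol:
                break
    return x_base, f_base

# toy usage on the Rosenbrock function (minimum at (1, 1))
rosen = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
x_min, f_min = hooke_jeeves(rosen, [-1.2, 1.0])
print(np.round(x_min, 3), round(f_min, 6))
```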

  18. Artificial Intelligence and Moral intelligence

    OpenAIRE

    Laura Pana

    2008-01-01

    We discuss the thesis that the implementation of a moral code in the behaviour of artificial intelligent systems needs a specific form of human and artificial intelligence, not just an abstract intelligence. We present intelligence as a system with an internal structure and the structural levels of the moral system, as well as certain characteristics of artificial intelligent agents which can/must be treated as 1- individual entities (with a complex, specialized, autonomous or self-determined,...

  19. All roads lead to Rome - New search methods for the optimal triangulation problem

    Czech Academy of Sciences Publication Activity Database

    Ottosen, T. J.; Vomlel, Jiří

    2012-01-01

    Roč. 53, č. 9 (2012), s. 1350-1366 ISSN 0888-613X R&D Projects: GA MŠk 1M0572; GA ČR GEICC/08/E010; GA ČR GA201/09/1891 Grant - others:GA MŠk(CZ) 2C06019 Institutional support: RVO:67985556 Keywords : Bayesian networks * Optimal triangulation * Probabilistic inference * Cliques in a graph Subject RIV: BD - Theory of Information Impact factor: 1.729, year: 2012 http://library.utia.cas.cz/separaty/2012/MTR/vomlel-all roads lead to rome - new search methods for the optimal triangulation problem.pdf

  20. Hybrid Genetic Algorithm - Local Search Method for Ground-Water Management

    Science.gov (United States)

    Chiu, Y.; Nishikawa, T.; Martin, P.

    2008-12-01

    Ground-water management problems commonly are formulated as a mixed-integer, non-linear programming problem (MINLP). Relying only on conventional gradient-search methods to solve the management problem is computationally fast; however, the methods may become trapped in a local optimum. Global-optimization schemes can identify the global optimum, but the convergence is very slow when the optimal solution approaches the global optimum. In this study, we developed a hybrid optimization scheme, which includes a genetic algorithm and a gradient-search method, to solve the MINLP. The genetic algorithm identifies a near-optimal solution, and the gradient search uses the near optimum to identify the global optimum. Our methodology is applied to a conjunctive-use project in the Warren ground-water basin, California. Hi-Desert Water District (HDWD), the primary water manager in the basin, plans to construct a wastewater treatment plant to reduce future septic-tank effluent from reaching the ground-water system. The treated wastewater instead will recharge the ground-water basin via percolation ponds as part of a larger conjunctive-use strategy, subject to State regulations (e.g. minimum distances and travel times). HDWD wishes to identify the least-cost conjunctive-use strategies that control ground-water levels, meet regulations, and identify new production-well locations. As formulated, the MINLP objective is to minimize water-delivery costs subject to constraints including pump capacities, available recharge water, water-supply demand, water-level constraints, and potential new-well locations. The methodology was demonstrated by an enumerative search of the entire feasible solution space and by comparing the optimum solution with results from the branch-and-bound algorithm. The results also indicate that the hybrid method identifies the global optimum within an affordable computation time. Sensitivity analyses, which include testing different recharge-rate scenarios, pond
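
    The two-stage idea, a genetic algorithm to reach a near optimum followed by gradient polishing, can be illustrated independently of the ground-water model. The sketch below uses an invented multimodal cost surface and a finite-difference gradient step as stand-ins; it is not the authors' formulation.

```python
# Minimal sketch of the hybrid idea (illustrative, not the ground-water model):
# a real-coded genetic algorithm finds a near-optimal region, then a simple
# finite-difference gradient descent polishes the best individual.
import numpy as np

rng = np.random.default_rng(3)

def objective(x):
    # stand-in cost surface with many local minima (Rastrigin-like)
    return float(np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x) + 10))

def genetic_search(f, dim=2, pop_size=40, gens=100, bounds=(-5.12, 5.12)):
    lo, hi = bounds
    pop = rng.uniform(lo, hi, (pop_size, dim))
    for _ in range(gens):
        fit = np.array([f(ind) for ind in pop])
        # tournament selection of parents
        parents = pop[[min(rng.integers(pop_size, size=2), key=lambda i: fit[i])
                       for _ in range(pop_size)]]
        # arithmetic crossover with a random partner, then Gaussian mutation
        alpha = rng.random((pop_size, 1))
        children = alpha * parents + (1 - alpha) * parents[rng.permutation(pop_size)]
        children += rng.normal(0, 0.1, children.shape)
        children = np.clip(children, lo, hi)
        children[0] = pop[np.argmin(fit)]        # elitism
        pop = children
    fit = np.array([f(ind) for ind in pop])
    return pop[np.argmin(fit)]

def gradient_polish(f, x, lr=0.002, steps=300, h=1e-5):
    x = x.astype(float).copy()
    for _ in range(steps):
        # central finite-difference gradient
        grad = np.array([(f(x + h * e) - f(x - h * e)) / (2 * h)
                         for e in np.eye(len(x))])
        x -= lr * grad
    return x

x_ga = genetic_search(objective)
x_opt = gradient_polish(objective, x_ga)
print("GA estimate:", np.round(x_ga, 3), "polished:", np.round(x_opt, 3))
```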

  1. PMSVM: An Optimized Support Vector Machine Classification Algorithm Based on PCA and Multilevel Grid Search Methods

    Directory of Open Access Journals (Sweden)

    Yukai Yao

    2015-01-01

    Full Text Available We propose an optimized Support Vector Machine classifier, named PMSVM, in which System Normalization, PCA, and Multilevel Grid Search methods are comprehensively considered for data preprocessing and parameter optimization, respectively. The main goals of this study are to improve the classification efficiency and accuracy of SVM. Sensitivity, specificity, precision, ROC curves, and so forth are adopted to appraise the performance of PMSVM. Experimental results show that PMSVM has relatively better accuracy and remarkably higher efficiency compared with traditional SVM algorithms.
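
    A hedged sketch of the general recipe follows: standardization, PCA, and an RBF SVM tuned by a coarse grid search that is then refined around the best coarse point ("multilevel" here simply means two levels). It uses scikit-learn and a built-in dataset, not the data, normalization scheme or exact grids of the paper.

```python
# Sketch of the standardise -> PCA -> SVM pipeline with a coarse-to-fine grid search.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

pipe = Pipeline([
    ("scale", StandardScaler()),        # stand-in for "system normalization"
    ("pca", PCA(n_components=10)),      # dimensionality reduction
    ("svc", SVC(kernel="rbf")),
])

# level 1: coarse logarithmic grid over C and gamma
coarse = {"svc__C": 10.0 ** np.arange(-2, 4),
          "svc__gamma": 10.0 ** np.arange(-4, 1)}
search = GridSearchCV(pipe, coarse, cv=5).fit(X, y)
best_C = search.best_params_["svc__C"]
best_g = search.best_params_["svc__gamma"]

# level 2: finer grid centred on the coarse optimum
fine = {"svc__C": best_C * np.array([0.25, 0.5, 1, 2, 4]),
        "svc__gamma": best_g * np.array([0.25, 0.5, 1, 2, 4])}
search = GridSearchCV(pipe, fine, cv=5).fit(X, y)

print("best parameters:", search.best_params_)
print("cross-validated accuracy: %.3f" % search.best_score_)
```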

  2. Intelligent Systems For Aerospace Engineering: An Overview

    Science.gov (United States)

    KrishnaKumar, K.

    2003-01-01

    Intelligent systems are nature-inspired, mathematically sound, computationally intensive problem solving tools and methodologies that have become extremely important for advancing the current trends in information technology. Artificially intelligent systems currently utilize computers to emulate various faculties of human intelligence and biological metaphors. They use a combination of symbolic and sub-symbolic systems capable of evolving human cognitive skills and intelligence, not just systems capable of doing things humans do not do well. Intelligent systems are ideally suited for tasks such as search and optimization, pattern recognition and matching, planning, uncertainty management, control, and adaptation. In this paper, the intelligent system technologies and their application potential are highlighted via several examples.

  3. Comparing the Precision of Information Retrieval of MeSH-Controlled Vocabulary Search Method and a Visual Method in the Medline Medical Database.

    Science.gov (United States)

    Hariri, Nadjla; Ravandi, Somayyeh Nadi

    2014-01-01

    Medline is one of the most important databases in the biomedical field. One of the most important hosts for Medline is Elton B. Stephens Co. (EBSCO), which has presented different search methods that can be used based on the needs of the users. Visual search and MeSH-controlled search methods are among the most common methods. The goal of this research was to compare the precision of the retrieved sources in the EBSCO Medline database using the MeSH-controlled and visual search methods. This research was a semi-empirical study. In training workshops held in 2012, 70 students of higher education in different educational departments of Kashan University of Medical Sciences were taught the MeSH-controlled and visual search methods. Then, the precision of 300 searches made by these students was calculated based on the Best Precision, Useful Precision, and Objective Precision formulas and analyzed in SPSS software using the independent-samples t-test; the three precisions obtained with the three formulas were compared between the two search methods. The mean precision of the visual method was greater than that of the MeSH-controlled search for all three types of precision, i.e. Best Precision, Useful Precision, and Objective Precision, and their mean precisions were significantly different across the searches. Fifty-three percent of the participants in the research also mentioned that the use of a combination of the two methods produced better results. For users, it is more appropriate to use a natural-language-based method, such as the visual method, in the EBSCO Medline host than to use the controlled method, which requires users to use special keywords. The potential reason for their preference was that the visual method allowed them more freedom of action.

  4. A dynamic lattice searching method with rotation operation for optimization of large clusters

    International Nuclear Information System (INIS)

    Wu Xia; Cai Wensheng; Shao Xueguang

    2009-01-01

    Global optimization of large clusters has been a difficult task, though much effort has been devoted to it and many efficient methods have been proposed. In this work, a rotation operation (RO) is designed to realize the structural transformation from decahedra to icosahedra for the optimization of large clusters, by rotating the atoms below the center atom through a definite angle around the fivefold axis. Based on the RO, a development of the previous dynamic lattice searching method with constructed core (DLSc), named DLSc-RO, is presented. With an investigation of the method for the optimization of Lennard-Jones (LJ) clusters, i.e., LJ500, LJ561, LJ600, LJ665-667, LJ670, LJ685, and LJ923, Morse clusters, silver clusters with the Gupta potential, and aluminum clusters with the NP-B potential, it was found that global minima with both icosahedral and decahedral motifs can be obtained, and the method is proved to be efficient and universal.

  5. MRS algorithm: a new method for searching myocardial region in SPECT myocardial perfusion images.

    Science.gov (United States)

    He, Yuan-Lie; Tian, Lian-Fang; Chen, Ping; Li, Bin; Mao, Zhong-Yuan

    2005-10-01

    First, the necessity of automatically segmenting the myocardium from myocardial SPECT images is discussed in Section 1. To eliminate the influence of the background, the optimal threshold segmentation method modified for the MRS algorithm is explained in Section 2. Then, an image erosion structure is applied to identify the myocardium region and the liver region. The contour tracing method is introduced to extract the myocardial contour. To locate the centroid of the myocardium, the myocardial centroid searching method is developed. The protocol of the MRS algorithm is summarized in Section 6. The performance of the MRS algorithm is investigated and the conclusion is drawn in Section 7. Finally, the importance of the MRS algorithm and possible improvements to it are discussed.
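
    The overall flow (threshold, erode, select the myocardial region, locate its centroid) can be sketched with generic image operations; the snippet below is such a rough stand-in using scipy.ndimage on a synthetic ring image, not the MRS algorithm's specific thresholding and tracing rules.

```python
# Rough sketch of the flow: threshold, erode, keep the main blob, find its centroid.
import numpy as np
from scipy import ndimage

def find_region_centroid(image, threshold=None):
    # 1. threshold to suppress background (simple percentile stand-in for the
    #    paper's optimal-threshold step)
    if threshold is None:
        threshold = np.percentile(image, 90)
    mask = image > threshold
    # 2. erode to separate the target region from neighbouring structures
    mask = ndimage.binary_erosion(mask, iterations=2)
    # 3. keep the largest connected component as the candidate myocardium
    labels, n = ndimage.label(mask)
    if n == 0:
        return None, None
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    region = labels == (np.argmax(sizes) + 1)
    # 4. centroid of the selected region
    return region, ndimage.center_of_mass(region)

# toy usage: a synthetic bright ring on a dark background
yy, xx = np.mgrid[0:128, 0:128]
r = np.hypot(yy - 64, xx - 70)
image = np.exp(-((r - 20) ** 2) / 30.0) + 0.05 * np.random.default_rng(4).random((128, 128))
region, centroid = find_region_centroid(image)
print("estimated centroid (row, col):", tuple(round(c, 1) for c in centroid))
```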

  6. Intelligence, Academic Self-Concept, and Information Literacy: The Role of Adequate Perceptions of Academic Ability in the Acquisition of Knowledge about Information Searching

    Science.gov (United States)

    Rosman, Tom; Mayer, Anne-Kathrin; Krampen, Günter

    2015-01-01

    Introduction: The present paper argues that adequate self-perceptions of academic ability are essential for students' realization of their intellectual potential, thereby fostering learning of complex skills, e.g., information-seeking skills. Thus, academic self-concept should moderate the relationship between intelligence and information…

  7. An R-peak detection method that uses an SVD filter and a search back system.

    Science.gov (United States)

    Jung, Woo-Hyuk; Lee, Sang-Goog

    2012-12-01

    In this paper, we present a method for detecting the R-peak of an ECG signal by using a singular value decomposition (SVD) filter and a search back system. The ECG signal was processed in two phases: the pre-processing phase and the decision phase. The pre-processing phase consisted of the stages for the SVD filter, Butterworth high-pass filter (HPF), moving average (MA), and squaring, whereas the decision phase consisted of a single stage that detected the R-peak. In the pre-processing phase, the SVD filter removed noise while the Butterworth HPF eliminated baseline wander. The MA removed the noise remaining in the signal after the SVD filter to make the signal smooth, and squaring played a role in strengthening the signal. In the decision phase, a threshold was used to set the interval before detecting the R-peak. Whereas Hamilton et al. suggest triggering a search back when the latest R-R interval (RRI) is greater than 150% of the previous RRI, here the criterion was modified to 150% or more of the smaller of the two most recent RRIs. When the modified search back system was used, the error rate of the peak detection decreased to 0.29%, compared to 1.34% when the modified search back system was not used. Consequently, the sensitivity was 99.47%, the positive predictivity was 99.47%, and the detection error was 1.05%. Furthermore, the quality of the signal in data with a substantial amount of noise was improved, and thus the R-peak was detected effectively. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
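
    A hedged sketch of a similar processing chain follows; the SVD denoising stage is omitted, the filter orders, thresholds and the synthetic test signal are assumptions for the example, and the search-back rule is a simplified version of the criterion described above.

```python
# Sketch: high-pass filter -> moving average -> squaring -> threshold detection
# with a simple search-back rule on the R-R interval (RRI).
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_r_peaks(ecg, fs=250.0):
    # high-pass Butterworth filter to remove baseline wander
    b, a = butter(2, 0.5 / (fs / 2), btype="highpass")
    x = filtfilt(b, a, ecg)
    # moving-average smoothing and squaring to emphasise QRS energy
    x = np.convolve(x, np.ones(5) / 5, mode="same") ** 2
    # primary detection with a global threshold and a refractory distance
    threshold = 0.3 * np.max(x)
    peaks, _ = find_peaks(x, height=threshold, distance=int(0.25 * fs))
    peaks = list(peaks)
    # search back: if an RRI is more than 150% of the previous one, re-scan the
    # gap with a lowered threshold (simplified version of the rule in the abstract)
    i = 2
    while i < len(peaks):
        rri = peaks[i] - peaks[i - 1]
        prev = peaks[i - 1] - peaks[i - 2]
        if rri > 1.5 * prev:
            seg = x[peaks[i - 1]:peaks[i]]
            extra, _ = find_peaks(seg, height=0.5 * threshold, distance=int(0.25 * fs))
            for e in sorted(extra):
                cand = peaks[i - 1] + int(e)
                if cand not in peaks:
                    peaks.insert(i, cand)
                    i += 1
        i += 1
    return np.array(sorted(set(peaks)))

# toy usage: synthetic spike train at 1 Hz with one low-amplitude beat that the
# search back should recover
fs = 250.0
ecg = np.zeros(2500)
ecg[np.arange(1, 10) * 250] = 1.0
ecg[5 * 250] = 0.5
print("detected beats:", len(detect_r_peaks(ecg, fs)))
```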

  8. Coupling artificial intelligence and numerical computation for engineering design (Invited paper)

    Science.gov (United States)

    Tong, S. S.

    1986-01-01

    The possibility of combining artificial intelligence (AI) systems and numerical computation methods for engineering designs is considered. Attention is given to three possible areas of application involving fan design, controlled vortex design of turbine stage blade angles, and preliminary design of turbine cascade profiles. Among the AI techniques discussed are: knowledge-based systems; intelligent search; and pattern recognition systems. The potential cost and performance advantages of an AI-based design-generation system are discussed in detail.

  9. Methods to filter out spurious disturbances in continuous-wave searches from gravitational-wave detectors

    International Nuclear Information System (INIS)

    Leaci, Paola

    2015-01-01

    Semicoherent all-sky searches over year-long observation times for continuous gravitational wave signals produce various thousands of potential periodic source candidates. Efficient methods able to discard false candidate events are crucial in order to put all the efforts into a computationally intensive follow-up analysis for the remaining most promising candidates (Shaltev et al 2014 Phys. Rev. D 89 124030). In this paper we present a set of techniques able to fulfill such requirements, identifying and eliminating false candidate events, reducing thus the bulk of candidate sets that need to be further investigated. Some of these techniques were also used to streamline the candidate sets returned by the Einstein@Home hierarchical searches presented in (Aasi J et al (The LIGO Scientific Collaboration and the Virgo Collaboration) 2013 Phys. Rev. D 87 042001). These powerful methods and the benefits originating from their application to both simulated and on detector data from the fifth LIGO science run are illustrated and discussed. (paper)

  10. Trends in ambient intelligent systems the role of computational intelligence

    CERN Document Server

    Khan, Mohammad; Abraham, Ajith

    2016-01-01

    This book demonstrates the success of Ambient Intelligence in providing possible solutions for the daily needs of humans. The book addresses implications of ambient intelligence in areas of domestic living, elderly care, robotics, communication, philosophy and others. The objective of this edited volume is to show that Ambient Intelligence is a boon to humanity with conceptual, philosophical, methodical and applicative understanding. The book also aims to schematically demonstrate developments in the direction of augmented sensors, embedded systems and behavioral intelligence towards Ambient Intelligent Networks or Smart Living Technology. It contains chapters in the field of Ambient Intelligent Networks, which received highly positive feedback during the review process. The book contains research work, with in-depth state of the art from augmented sensors, embedded technology and artificial intelligence along with cutting-edge research and development of technologies and applications of Ambient Intelligent N...

  11. PERSONALIZED MEDICINE: GENOME, ELECTRONIC HEALTH AND INTELLIGENT SYSTEMS. PART 2. MOLECULAR GENETICS AND METHODS OF INTELLECTUAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    B. A. Kobrinskii

    2017-01-01

    Full Text Available In practical terms, the transition to personalized medicine should combine the problem of molecular-genetic predisposition to diseases with that of transient states of the organism that may develop toward pathology. Classification and monitoring of the patient's state can be effectively carried out using artificial intelligence methods. Various intelligent approaches to patient monitoring under different conditions are considered.

  12. Attitude Determination Method by Fusing Single Antenna GPS and Low Cost MEMS Sensors Using Intelligent Kalman Filter Algorithm

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2017-01-01

    Full Text Available To meet the cost and size demands of micro-navigation systems, a combined attitude determination approach is proposed that fuses a low-cost Micro-Electro-Mechanical System (MEMS) gyroscope, accelerometer, and magnetometer with a single-antenna Global Positioning System (GPS) receiver through a sensor fusion algorithm and an intelligent Kalman filter (IKF). An effective calibration method is applied to compensate for the errors of the low-cost MEMS Inertial Measurement Unit (IMU). Different control strategies for fusing the MEMS multi-sensors are designed. The yaw angle is estimated accurately by the algorithm fusing gyroscope, accelerometer, and magnetometer data under GPS failure and when sideslip information is unavailable. To maintain robustness under uncertain noise statistics, the gain scaling of the IKF is adjusted by a fuzzy controller during the transition process and in the steady state to achieve faster convergence and accurate estimation. Experiments comparing different MEMS sensors and fusion algorithms were carried out to verify the validity of the proposed approach.
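
    As a very rough stand-in for the idea of adapting the filter gain to the observed conditions, the sketch below runs a one-dimensional yaw filter that predicts with the gyro rate and corrects with a magnetometer heading, inflating the measurement noise when the innovation is large; the fuzzy controller, GPS aiding and calibration of the paper are not reproduced, and all noise levels are invented.

```python
# 1-D yaw filter: gyro prediction, magnetometer correction, with a crude
# rule-based inflation of the measurement noise R (a proxy for fuzzy gain tuning).
import numpy as np

rng = np.random.default_rng(5)

def adaptive_yaw_filter(gyro_rate, mag_yaw, dt=0.01, q=0.01, r0=4.0):
    yaw, p = mag_yaw[0], 1.0
    estimates = []
    for w, z in zip(gyro_rate, mag_yaw):
        # prediction with the gyro rate
        yaw += w * dt
        p += q
        # distrust the magnetometer when the innovation is much larger than
        # its nominal standard deviation
        innovation = z - yaw
        r = r0 * (1.0 + (innovation / np.sqrt(r0)) ** 2)
        # standard scalar Kalman update
        k = p / (p + r)
        yaw += k * innovation
        p *= (1 - k)
        estimates.append(yaw)
    return np.array(estimates)

# toy usage: constant 5 deg/s turn, noisy magnetometer with occasional spikes
n, dt = 2000, 0.01
true_yaw = 5.0 * dt * np.arange(n)
gyro = 5.0 + rng.normal(0, 0.2, n)
mag = true_yaw + rng.normal(0, 2.0, n)
mag[100::200] += 30.0                  # magnetic disturbance spikes
est = adaptive_yaw_filter(gyro, mag, dt)
print("RMS yaw error (deg): %.2f" % np.sqrt(np.mean((est - true_yaw) ** 2)))
```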

  13. Semen parameters can be predicted from environmental factors and lifestyle using artificial intelligence methods.

    Science.gov (United States)

    Girela, Jose L; Gil, David; Johnsson, Magnus; Gomez-Torres, María José; De Juan, Joaquín

    2013-04-01

    Fertility rates have dramatically decreased in the last two decades, especially in men. It has been described that environmental factors as well as life habits may affect semen quality. In this paper we use artificial intelligence techniques in order to predict semen characteristics resulting from environmental factors, life habits, and health status, with these techniques constituting a possible decision support system that can help in the study of male fertility potential. A total of 123 young, healthy volunteers provided a semen sample that was analyzed according to the World Health Organization 2010 criteria. They also were asked to complete a validated questionnaire about life habits and health status. Sperm concentration and percentage of motile sperm were related to sociodemographic data, environmental factors, health status, and life habits in order to determine the predictive accuracy of a multilayer perceptron network, a type of artificial neural network. In conclusion, we have developed an artificial neural network that can predict the results of the semen analysis based on the data collected by the questionnaire. The semen parameter that is best predicted using this methodology is the sperm concentration. Although the accuracy for motility is slightly lower than that for concentration, it is possible to predict it with a significant degree of accuracy. This methodology can be a useful tool in early diagnosis of patients with seminal disorders or in the selection of candidates to become semen donors.

  14. Intelligent Method for Identifying Driving Risk Based on V2V Multisource Big Data

    Directory of Open Access Journals (Sweden)

    Jinshuan Peng

    2018-01-01

    Full Text Available Risky driving behavior is a major cause of traffic conflicts, which can develop into road traffic accidents, making the timely and accurate identification of such behavior essential to road safety. A platform was therefore established for analyzing the driving behavior of 20 professional drivers in field tests, in which overclose car following and lane departure were used as typical risky driving behaviors. Characterization parameters for identification were screened and used to determine threshold values and an appropriate time window for identification. A neural network-Bayesian filter identification model was established and data samples were selected to identify risky driving behavior and evaluate the identification efficiency of the model. The results obtained indicated a successful identification rate of 83.6% when the neural network model was solely used to identify risky driving behavior, but this could be increased to 92.46% once corrected by the Bayesian filter. This has important theoretical and practical significance in relation to evaluating the efficiency of existing driver assist systems, as well as the development of future intelligent driving systems.

  15. Reporting Quality of Search Methods in Systematic Reviews of HIV Behavioral Interventions (2000–2010): Are the Searches Clearly Explained, Systematic and Reproducible?

    Science.gov (United States)

    Mullins, Mary M.; DeLuca, Julia B.; Crepaz, Nicole; Lyles, Cynthia M.

    2018-01-01

    Systematic reviews are an essential tool for researchers, prevention providers and policy makers who want to remain current with the evidence in the field. Systematic reviews must adhere to strict standards, as the results can provide a more objective appraisal of evidence for making scientific decisions than traditional narrative reviews. An integral component of a systematic review is the development and execution of a comprehensive systematic search to collect available and relevant information. A number of reporting guidelines have been developed to ensure quality publications of systematic reviews. These guidelines provide the essential elements to include in the review process and report in the final publication for complete transparency. We identified the common elements of reporting guidelines and examined the reporting quality of search methods in HIV behavioral intervention literature. Consistent with the findings from previous evaluations of reporting search methods of systematic reviews in other fields, our review shows a lack of full and transparent reporting within systematic reviews even though a plethora of guidelines exist. This review underscores the need for promoting the completeness of and adherence to transparent systematic search reporting within systematic reviews. PMID:26052651

  16. LITERATURE SEARCH FOR METHODS FOR HAZARD ANALYSES OF AIR CARRIER OPERATIONS.

    Energy Technology Data Exchange (ETDEWEB)

    MARTINEZ - GURIDI,G.; SAMANTA,P.

    2002-07-01

    Representatives of the Federal Aviation Administration (FAA) and several air carriers under Title 14 of the Code of Federal Regulations (CFR) Part 121 developed a system-engineering model of the functions of air-carrier operations. Their analyses form the foundation or basic architecture upon which other task areas are based: hazard analyses, performance measures, and risk indicator design. To carry out these other tasks, models may need to be developed using the basic architecture of the Air Carrier Operations System Model (ACOSM). Since ACOSM encompasses various areas of air-carrier operations and can be used to address different task areas with differing but interrelated objectives, the modeling needs are broad. A literature search was conducted to identify and analyze the existing models that may be applicable for pursuing the task areas in ACOSM. The intent of the literature search was not necessarily to identify a specific model that can be directly used, but rather to identify relevant ones that have similarities with the processes and activities defined within ACOSM. Such models may provide useful inputs and insights in structuring ACOSM models. ACOSM simulates processes and activities in air-carrier operation, but, in a general framework, it has similarities with other industries where attention also has been paid to hazard analyses, emphasizing risk management, and in designing risk indicators. To assure that efforts in other industries are adequately considered, the literature search includes publications from other industries, e.g., chemical, nuclear, and process industries. This report discusses the literature search, the relevant methods identified and provides a preliminary assessment of their use in developing the models needed for the ACOSM task areas. A detailed assessment of the models has not been made. Defining those applicable for ACOSM will need further analyses of both the models and tools identified. The report is organized in four chapters

  17. A business intelligence approach using web search tools and online data reduction techniques to examine the value of product-enabled services

    DEFF Research Database (Denmark)

    Tanev, Stoyan; Liotta, Giacomo; Kleismantas, Andrius

    2015-01-01

    in Canada and Europe. It adopts an innovative methodology based on online textual data that could be implemented in advanced business intelligence tools aiming at the facilitation of innovation, marketing and business decision making. Combinations of keywords referring to different aspects of service value......-service innovation as a competitive advantage on the marketplace. On the other hand, the focus of EU firms on innovative hybrid offerings is not explicitly related to business differentiation and competitiveness....

  18. Artificial Intelligence.

    Science.gov (United States)

    Information Technology Quarterly, 1985

    1985-01-01

    This issue of "Information Technology Quarterly" is devoted to the theme of "Artificial Intelligence." It contains two major articles: (1) Artificial Intelligence and Law" (D. Peter O'Neill and George D. Wood); (2) "Artificial Intelligence: A Long and Winding Road" (John J. Simon, Jr.). In addition, it contains two sidebars: (1) "Calculating and…

  19. Competitive Intelligence.

    Science.gov (United States)

    Bergeron, Pierrette; Hiller, Christine A.

    2002-01-01

    Reviews the evolution of competitive intelligence since 1994, including terminology and definitions and analytical techniques. Addresses the issue of ethics; explores how information technology supports the competitive intelligence process; and discusses education and training opportunities for competitive intelligence, including core competencies…

  20. Search methods that people use to find owners of lost pets.

    Science.gov (United States)

    Lord, Linda K; Wittum, Thomas E; Ferketich, Amy K; Funk, Julie A; Rajala-Schultz, Päivi J

    2007-06-15

    To characterize the process by which people who find lost pets search for the owners. Cross-sectional study. Sample Population: 188 individuals who found a lost pet in Dayton, Ohio, between March 1 and June 30, 2006. Procedures: Potential participants were identified as a result of contact with a local animal agency or placement of an advertisement in the local newspaper. A telephone survey was conducted to identify methods participants used to find the pets' owners. 156 of 188 (83%) individuals completed the survey. Fifty-nine of the 156 (38%) pets were reunited with their owners; median time to reunification was 2 days (range, 0.5 to 45 days). Only 1 (3%) cat owner was found, compared with 58 (46%) dog owners. Pet owners were found as a result of information provided by an animal agency (25%), placement of a newspaper advertisement (24%), walking the neighborhood (19%), signs in the neighborhood (15%), information on a pet tag (10%), and other methods (7%). Most finders (87%) considered it extremely important to find the owner, yet only 13 (8%) initially surrendered the found pet to an animal agency. The primary reason people did not surrender found pets was fear of euthanasia (57%). Only 97 (62%) individuals were aware they could run a found-pet advertisement in the newspaper at no charge, and only 1 person who was unaware of the no-charge policy placed an advertisement. Veterinarians and shelters can help educate people who find lost pets about methods to search for the pets' owners.

  1. A Novel Hybrid Intelligent Indoor Location Method for Mobile Devices by Zones Using Wi-Fi Signals.

    Science.gov (United States)

    Castañón-Puga, Manuel; Salazar, Abby Stephanie; Aguilar, Leocundo; Gaxiola-Pacheco, Carelia; Licea, Guillermo

    2015-12-02

    The increasing use of mobile devices in indoor spaces brings challenges to location methods. This work presents a hybrid intelligent method based on data mining and Type-2 fuzzy logic to locate mobile devices in an indoor space by zones using Wi-Fi signals from selected access points (APs). This approach takes advantage of wireless local area networks (WLANs) over other types of architectures and implements the complete method in a mobile application using the developed tools. Besides, the proposed approach is validated by experimental data obtained from case studies and the cross-validation technique. For the purpose of generating the fuzzy rules that conform to the Takagi-Sugeno fuzzy system structure, a semi-supervised data mining technique called subtractive clustering is used. This algorithm finds centers of clusters from the radius map given by the collected signals from APs. Measurements of Wi-Fi signals can be noisy due to several factors mentioned in this work, so this method proposed the use of Type-2 fuzzy logic for modeling and dealing with such uncertain information.

  2. A Novel Hybrid Intelligent Indoor Location Method for Mobile Devices by Zones Using Wi-Fi Signals

    Directory of Open Access Journals (Sweden)

    Manuel Castañón–Puga

    2015-12-01

    Full Text Available The increasing use of mobile devices in indoor spaces brings challenges to location methods. This work presents a hybrid intelligent method based on data mining and Type-2 fuzzy logic to locate mobile devices in an indoor space by zones using Wi-Fi signals from selected access points (APs. This approach takes advantage of wireless local area networks (WLANs over other types of architectures and implements the complete method in a mobile application using the developed tools. Besides, the proposed approach is validated by experimental data obtained from case studies and the cross-validation technique. For the purpose of generating the fuzzy rules that conform to the Takagi–Sugeno fuzzy system structure, a semi-supervised data mining technique called subtractive clustering is used. This algorithm finds centers of clusters from the radius map given by the collected signals from APs. Measurements of Wi-Fi signals can be noisy due to several factors mentioned in this work, so this method proposed the use of Type-2 fuzzy logic for modeling and dealing with such uncertain information.
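
    The rule-generation step mentioned above relies on subtractive clustering. The sketch below implements a generic version of that algorithm (Chiu's potential-based centre selection with a simple stopping threshold) on invented RSSI-like samples; the radii and threshold are assumptions, and the Type-2 fuzzy modelling itself is not shown.

```python
# Minimal sketch of subtractive clustering: each sample's "potential" reflects
# its density of neighbours, the highest-potential point becomes a cluster
# centre, and its influence is subtracted before selecting the next centre.
import numpy as np

def subtractive_clustering(X, ra=0.5, rb=None, eps=0.15):
    if rb is None:
        rb = 1.5 * ra
    X = np.asarray(X, dtype=float)
    # normalise features to [0, 1] so the radii are comparable across dimensions
    X = (X - X.min(axis=0)) / (np.ptp(X, axis=0) + 1e-12)
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    potential = np.sum(np.exp(-4.0 * d2 / ra ** 2), axis=1)
    centres = []
    p_first = potential.max()
    while True:
        k = int(np.argmax(potential))
        if potential[k] < eps * p_first or len(centres) >= len(X):
            break
        centres.append(X[k].copy())
        # subtract the influence of the new centre from all potentials
        potential -= potential[k] * np.exp(-4.0 * d2[k] / rb ** 2)
    return np.array(centres)

# toy usage: Wi-Fi RSSI-like samples from three zones (two access points)
rng = np.random.default_rng(6)
zones = [(-40, -70), (-65, -45), (-55, -55)]
X = np.vstack([rng.normal(z, 2.0, (50, 2)) for z in zones])
print("cluster centres (normalised coordinates):")
print(np.round(subtractive_clustering(X, ra=0.4), 2))
```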

  3. Exploring genomic dark matter: A critical assessment of the performance of homology search methods on noncoding RNA

    DEFF Research Database (Denmark)

    Freyhult, E.; Bollback, J. P.; Gardner, P. P.

    2006-01-01

    Homology search is one of the most ubiquitous bioinformatic tasks, yet it is unknown how effective the currently available tools are for identifying noncoding RNAs (ncRNAs). In this work, we use reliable ncRNA data sets to assess the effectiveness of methods such as BLAST, FASTA, HMMer, and Infernal. Surprisingly, the most popular homology search methods are often the least accurate. As a result, many studies have used inappropriate tools for their analyses. On the basis of our results, we suggest homology search strategies using the currently available tools and some directions for future...

  4. Intelligent automation of high-performance liquid chromatography method development by means of a real-time knowledge-based approach.

    Science.gov (United States)

    I, Ting-Po; Smith, Randy; Guhan, Sam; Taksen, Ken; Vavra, Mark; Myers, Douglas; Hearn, Milton T W

    2002-09-27

    We describe the development, attributes and capabilities of a novel type of artificial intelligence system, called LabExpert, for automation of HPLC method development. Unlike other computerised method development systems, LabExpert operates in real-time, using an artificial intelligence system and design engine to provide experimental decision outcomes relevant to the optimisation of complex separations as well as the control of the instrumentation, column selection, mobile phase choice and other experimental parameters. LabExpert manages every input parameter to a HPLC data station and evaluates each output parameter of the HPLC data station in real-time as part of its decision process. Based on a combination of inherent and user-defined evaluation criteria, the artificial intelligence system programs use a reasoning process, applying chromatographic principles and acquired experimental observations to iteratively provide a regime for a priori development of an acceptable HPLC separation method. Because remote monitoring and control are also functions of LabExpert, the system allows full-time utilisation of analytical instrumentation and associated laboratory resources. Based on our experience with LabExpert with a wide range of analyte mixtures, this artificial intelligence system consistently identified in a similar or faster time-frame preferred sets of analytical conditions that are equal in resolution, efficiency and throughput to those empirically determined by highly experienced chromatographic scientists. An illustrative example, demonstrating the potential of LabExpert in the process of method development of drug substances, is provided.

  5. Optimal correction and design parameter search by modern methods of rigorous global optimization

    International Nuclear Information System (INIS)

    Makino, K.; Berz, M.

    2011-01-01

    Frequently the design of schemes for correction of aberrations or the determination of possible operating ranges for beamlines and cells in synchrotrons exhibit multitudes of possibilities for their correction, usually appearing in disconnected regions of parameter space which cannot be directly qualified by analytical means. In such cases, frequently an abundance of optimization runs are carried out, each of which determines a local minimum depending on the specific chosen initial conditions. Practical solutions are then obtained through an often extended interplay of experienced manual adjustment of certain suitable parameters and local searches by varying other parameters. However, in a formal sense this problem can be viewed as a global optimization problem, i.e. the determination of all solutions within a certain range of parameters that lead to a specific optimum. For example, it may be of interest to find all possible settings of multiple quadrupoles that can achieve imaging; or to find ahead of time all possible settings that achieve a particular tune; or to find all possible manners to adjust nonlinear parameters to achieve correction of high order aberrations. These tasks can easily be phrased in terms of such an optimization problem; but while mathematically this formulation is often straightforward, it has been common belief that it is of limited practical value since the resulting optimization problem cannot usually be solved. However, recent significant advances in modern methods of rigorous global optimization make these methods feasible for optics design for the first time. The key ideas of the method lie in an interplay of rigorous local underestimators of the objective functions, and by using the underestimators to rigorously iteratively eliminate regions that lie above already known upper bounds of the minima, in what is commonly known as a branch-and-bound approach. Recent enhancements of the Differential Algebraic methods used in particle

  6. A meta-heuristic method for solving scheduling problem: crow search algorithm

    Science.gov (United States)

    Adhi, Antono; Santosa, Budi; Siswanto, Nurhadi

    2018-04-01

    Scheduling is one of the most important processes in industry, in both manufacturing and services. Scheduling is the process of selecting resources to perform operations on tasks. Resources can be machines, people, tasks, jobs or operations. The selection of an optimum sequence of jobs from a permutation is an essential issue in all research on scheduling problems, since the optimum sequence constitutes the optimum solution of the scheduling problem. Scheduling becomes an NP-hard problem when the number of jobs in the sequence is larger than what an exact algorithm can process. In order to obtain optimum results, a method is needed that is capable of solving complex scheduling problems in an acceptable time. Meta-heuristics are the methods usually used to solve scheduling problems. The recently published method called the Crow Search Algorithm (CSA) is adopted in this research to solve a scheduling problem. CSA is an evolutionary meta-heuristic method based on the behaviour of crows in flocks. The results of CSA for solving the scheduling problem are compared with other algorithms. From the comparison, it is found that CSA performs better than the other algorithms in terms of solution quality and computation time.
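
    The core CSA update is easy to sketch in a few lines. The version below is a generic continuous-variable CSA on a test function, with the flight length and awareness probability chosen arbitrarily; applying it to job sequences as in the paper would additionally require a permutation encoding (for example random keys), which is not shown.

```python
# Minimal sketch of the crow search algorithm (CSA) on a continuous test function.
import numpy as np

rng = np.random.default_rng(7)

def crow_search(f, dim=5, n_crows=20, iters=300, fl=2.0, ap=0.1, bounds=(-10, 10)):
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_crows, dim))       # positions
    mem = x.copy()                                # each crow's best-known hiding place
    mem_fit = np.array([f(m) for m in mem])
    for _ in range(iters):
        for i in range(n_crows):
            j = rng.integers(n_crows)             # crow i follows a random crow j
            if rng.random() >= ap:
                # crow j is unaware: move toward crow j's memory
                new = x[i] + rng.random() * fl * (mem[j] - x[i])
            else:
                # crow j is aware: crow i is deceived and relocates randomly
                new = rng.uniform(lo, hi, dim)
            new = np.clip(new, lo, hi)
            x[i] = new
            fit = f(new)
            if fit < mem_fit[i]:                  # update memory if improved
                mem[i], mem_fit[i] = new, fit
    best = np.argmin(mem_fit)
    return mem[best], mem_fit[best]

# toy usage: minimise the sphere function
best_x, best_f = crow_search(lambda v: float(np.sum(v ** 2)))
print("best value found: %.4f" % best_f)
```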

  7. Intelligent Information Systems Institute

    National Research Council Canada - National Science Library

    Gomes, Carla

    2004-01-01

    ...) at Cornell during the first three years of operation. IISI's mandate is threefold: To perform and stimulate research in computational and data-intensive methods for intelligent decision making systems...

  8. Inverse atmospheric radiative transfer problems - A nonlinear minimization search method of solution. [aerosol pollution monitoring

    Science.gov (United States)

    Fymat, A. L.

    1976-01-01

    The paper studies the inversion of the radiative transfer equation describing the interaction of electromagnetic radiation with atmospheric aerosols. The interaction can be considered as the propagation in the aerosol medium of two light beams: the direct beam in the line-of-sight attenuated by absorption and scattering, and the diffuse beam arising from scattering into the viewing direction, which propagates more or less in random fashion. The latter beam has single scattering and multiple scattering contributions. In the former case and for single scattering, the problem is reducible to first-kind Fredholm equations, while for multiple scattering it is necessary to invert partial integrodifferential equations. A nonlinear minimization search method, applicable to the solution of both types of problems has been developed, and is applied here to the problem of monitoring aerosol pollution, namely the complex refractive index and size distribution of aerosol particles.

  9. Adjusting the Parameters of Metal Oxide Gapless Surge Arresters’ Equivalent Circuits Using the Harmony Search Method

    Directory of Open Access Journals (Sweden)

    Christos A. Christodoulou

    2017-12-01

    Full Text Available The appropriate circuit modeling of metal oxide gapless surge arresters is critical for insulation coordination studies. Metal oxide arresters present a dynamic behavior for fast front surges; namely, their residual voltage depends on the peak value as well as the duration of the injected impulse current, and they should therefore not be represented by non-linear elements alone. The aim of the current work is to adjust the parameters of the most frequently used surge arrester circuit models by considering the magnitude of the residual voltage as well as the dissipated energy for given pulses. To this end, the harmony search method is implemented to adjust the parameter values of the arrester equivalent circuit models. It works by minimizing a defined objective function that compares the simulation outcomes with the manufacturer's data and with the results obtained from previous methodologies.

  10. A method in search of a theory: peer education and health promotion.

    Science.gov (United States)

    Turner, G; Shepherd, J

    1999-04-01

    Peer education has grown in popularity and practice in recent years in the field of health promotion. However, advocates of peer education rarely make reference to theories in their rationale for particular projects. In this paper the authors review a selection of commonly cited theories, and examine to what extent they have value and relevance to peer education in health promotion. Beginning from an identification of 10 claims made for peer education, each theory is examined in terms of the scope of the theory and evidence to support it in practice. The authors conclude that, whilst most theories have something to offer towards an explanation of why peer education might be effective, most theories are limited in scope and there is little empirical evidence in health promotion practice to support them. Peer education would seem to be a method in search of a theory rather than the application of theory to practice.

  11. Validation of the Child Premorbid Intelligence Estimate method to predict premorbid Wechsler Intelligence Scale for Children-Fourth Edition Full Scale IQ among children with brain injury.

    Science.gov (United States)

    Schoenberg, Mike R; Lange, Rael T; Saklofske, Donald H; Suarez, Mariann; Brickell, Tracey A

    2008-12-01

    Determination of neuropsychological impairment involves contrasting obtained performances with a comparison standard, which is often an estimate of premorbid IQ. M. R. Schoenberg, R. T. Lange, T. A. Brickell, and D. H. Saklofske (2007) proposed the Child Premorbid Intelligence Estimate (CPIE) to predict premorbid Full Scale IQ (FSIQ) using the Wechsler Intelligence Scale for Children-4th Edition (WISC-IV; Wechsler, 2003). The CPIE includes 12 algorithms to predict FSIQ, 1 using demographic variables and 11 algorithms combining WISC-IV subtest raw scores with demographic variables. The CPIE was applied to a sample of children with acquired traumatic brain injury (TBI sample; n = 40) and a healthy demographically matched sample (n = 40). Paired-samples t tests found estimated premorbid FSIQ differed from obtained FSIQ when applied to the TBI sample (ps < .02). The demographic only algorithm performed well at a group level, but estimates were restricted in range. Algorithms combining single subtest scores with demographics performed adequately. Results support the clinical application of the CPIE algorithms. However, limitations to estimating individual premorbid ability, including statistical and developmental factors, must be considered. (c) 2008 APA, all rights reserved.

  12. Methods for estimating residential building energy consumption by application of artificial intelligence; Methode d'estimation energetique des batiments d'habitation basee sur l'application de l'intelligence artificielle

    Energy Technology Data Exchange (ETDEWEB)

    Kajl, S.; Roberge, M-A. [Quebec Univ., Ecole de technologie superieure, Montreal, PQ (Canada)

    1999-02-01

    A method for estimating energy requirements in buildings five to twenty-five stories in height using artificial intelligence techniques is proposed. In developing this technique, the pre-requisites specified were rapid execution, the ability to generate a wide range of results, including total energy consumption, power demands, heating and cooling consumption, and accuracy comparable to that of detailed building energy simulation software. The method proposed encompasses (1) the creation of various databases, including classification of the parameters used in the energy simulation, modelling using the Department of Energy (DOE)-2 software and validation of the DOE-2 models; (2) application of neural networks, including training the network and validating its learning; (3) designing an energy estimate assessment (EEA) system for residential buildings; and (4) validation of the EEA system. The system has been developed in the MATLAB software environment, specifically for the climate in the Ottawa region. For use under different climatic conditions, appropriate adjustments need to be made for the heating and cooling consumption. 12 refs., tabs., figs., 2 appendices.
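
    A hedged, self-contained illustration of the neural-network step is shown below; the building features, synthetic data and network size are invented for the example and do not come from the DOE-2 databases described in the record.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        # Hypothetical features: floor area, glazing ratio, envelope U-value, occupancy.
        rng = np.random.default_rng(0)
        X = rng.uniform(size=(500, 4))
        # Synthetic "annual energy consumption" target with some noise.
        y = 80 + 120 * X[:, 0] + 40 * X[:, 1] - 30 * X[:, 2] + rng.normal(0, 5, 500)

        model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
        model.fit(X[:400], y[:400])                    # train on 400 cases
        print("validation R^2:", model.score(X[400:], y[400:]))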

  13. EFFECTIVENESS OF AGILE COMPARED TO WATERFALL IMPLEMENTATION METHODS IN IT PROJECTS: ANALYSIS BASED ON BUSINESS INTELLIGENCE PROJECTS

    Directory of Open Access Journals (Sweden)

    Kisielnicki Jerzy

    2017-10-01

    Full Text Available The global Business Intelligence (BI) market grew by 7.3% in 2016 according to the Gartner report (2017). Today, organizations require better use of data and analytics to support their business decisions. Internet power and business trend changes have provided a broad term for data analytics - Big Data. To be able to handle it and leverage the value of having access to Big Data, organizations have no other choice than to get proper systems implemented and working. However, traditional methods are not efficient for changing business needs. The long time between project start and go-live causes a gap between the initial solution blueprint and actual user requirements at the end of the project. This article presents the latest market trends in BI systems implementation by comparing agile with traditional methods. It presents a case study conducted in a large telecommunications company (350 BI users) and the results of pilot research conducted in three large companies: media, digital, and insurance. Both studies show that agile methods can be more effective in BI projects from an end-user perspective and deliver first results and added value in a much shorter time than a traditional approach.

  14. A new family of Polak-Ribiere-Polyak conjugate gradient method with the strong-Wolfe line search

    Science.gov (United States)

    Ghani, Nur Hamizah Abdul; Mamat, Mustafa; Rivaie, Mohd

    2017-08-01

    Conjugate gradient (CG) methods are an important technique in unconstrained optimization, due to their effectiveness and low memory requirements. The focus of this paper is to introduce a new CG method for solving large scale unconstrained optimization problems. Theoretical proofs show that the new method fulfills the sufficient descent condition if the strong Wolfe-Powell inexact line search is used. Besides, computational results show that our proposed method outperforms other existing CG methods.
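
    For context, the classical Polak-Ribiere-Polyak update and the strong Wolfe conditions that such methods build on are reproduced below in standard form; the paper's own modified formula is not given in this record:

        d_0 = -g_0,   d_{k+1} = -g_{k+1} + \beta_k^{PRP} d_k,   \beta_k^{PRP} = \frac{g_{k+1}^T (g_{k+1} - g_k)}{\|g_k\|^2}

        f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k g_k^T d_k,   |g(x_k + \alpha_k d_k)^T d_k| \le c_2 |g_k^T d_k|,   0 < c_1 < c_2 < 1

    where g_k denotes the gradient at the iterate x_k, d_k the search direction, and \alpha_k the step length accepted by the line search.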

  15. Integration of first-principles methods and crystallographic database searches for new ferroelectrics: Strategies and explorations

    International Nuclear Information System (INIS)

    Bennett, Joseph W.; Rabe, Karin M.

    2012-01-01

    In this concept paper, the development of strategies for the integration of first-principles methods with crystallographic database mining for the discovery and design of novel ferroelectric materials is discussed, drawing on the results and experience derived from exploratory investigations on three different systems: (1) the double perovskite Sr(Sb1/2Mn1/2)O3 as a candidate semiconducting ferroelectric; (2) polar derivatives of schafarzikite MSb2O4; and (3) ferroelectric semiconductors with formula M2P2(S,Se)6. A variety of avenues for further research and investigation are suggested, including automated structure type classification, low-symmetry improper ferroelectrics, and high-throughput first-principles searches for additional representatives of structural families with desirable functional properties. - Graphical abstract: Integration of first-principles methods with crystallographic database mining, for the discovery and design of novel ferroelectric materials, could potentially lead to new classes of multifunctional materials. Highlights: ► Integration of first-principles methods and database mining. ► Minor structural families with desirable functional properties. ► Survey of polar entries in the Inorganic Crystal Structural Database.

  16. Overview of intelligent data retrieval methods for waveforms and images in massive fusion databases

    Energy Technology Data Exchange (ETDEWEB)

    Vega, J. [JET-EFDA, Culham Science Center, OX14 3DB Abingdon (United Kingdom); Asociacion EURATOM/CIEMAT para Fusion, Avda. Complutense 22, 28040 Madrid (Spain)], E-mail: jesus.vega@ciemat.es; Murari, A. [JET-EFDA, Culham Science Center, OX14 3DB Abingdon (United Kingdom); Consorzio RFX-Associazione EURATOM ENEA per la Fusione, I-35127 Padua (Italy); Pereira, A.; Portas, A.; Ratta, G.A.; Castro, R. [JET-EFDA, Culham Science Center, OX14 3DB Abingdon (United Kingdom); Asociacion EURATOM/CIEMAT para Fusion, Avda. Complutense 22, 28040 Madrid (Spain)

    2009-06-15

    The JET database contains more than 42 Tbytes of data (waveforms and images) and it doubles its size about every 2 years. The ITER database is expected to be orders of magnitude above this quantity. Therefore, data access in such huge databases can no longer be efficiently based on shot number or temporal interval. Taking into account that diagnostics generate reproducible signal patterns (structural shapes) for similar physical behaviour, high level data access systems can be developed. In these systems, the input parameter is a pattern and the outputs are the shot numbers and the temporal locations where similar patterns appear inside the database. These pattern oriented techniques can be used for first data screening of any type of morphological aspect of waveforms and images. The article presents a new technique to look for similar images in huge databases in a fast and efficient way. Also, previous techniques to search for similar waveforms and to retrieve time-series data or images containing any kind of patterns are reviewed.

  17. Intelligence Naturelle et Intelligence Artificielle

    OpenAIRE

    Dubois, Daniel

    2011-01-01

    This article presents a systemic approach to the concept of natural intelligence, with the objective of creating an artificial intelligence. Natural intelligence, human and non-human animal, is thus a function composed of faculties for knowing and understanding. Moreover, natural intelligence remains inseparable from its structure, namely the organs of the brain and the body. The temptation is great to endow computer systems with an artificial intelligence ...

  18. The Nigerian health care system: Need for integrating adequate medical intelligence and surveillance systems

    Directory of Open Access Journals (Sweden)

    Menizibeya Osain Welcome

    2011-01-01

    Full Text Available Objectives: As an important element of national security, public health not only functions to provide adequate and timely medical care but also to track, monitor, and control disease outbreaks. The Nigerian health care system has suffered several infectious disease outbreaks year after year. Hence, there is a need to tackle the problem. This study aims to review the state of the Nigerian health care system and to provide possible recommendations for the worsening state of health care in the country. To give up-to-date recommendations for the Nigerian health care system, this study also aims to review the dynamics of health care in the United States, Britain, and Europe with regard to methods of medical intelligence/surveillance. Materials and Methods: Databases were searched for relevant literature using the following keywords: Nigerian health care, Nigerian health care system, and Nigerian primary health care system. Additional keywords used in the search were as follows: United States (OR Europe) health care dynamics, Medical Intelligence, Medical Intelligence systems, Public health surveillance systems, Nigerian medical intelligence, Nigerian surveillance systems, and Nigerian health information system. Literature was searched in the scientific databases PubMed and African Journals OnLine. Internet searches were based on Google and Search Nigeria. Results: Medical intelligence and surveillance represent a very useful component in the health care system for controlling disease outbreaks, bioattacks, etc. There is an increasing role for automated medical intelligence and surveillance systems, in addition to the traditional manual pattern of document retrieval, in advanced medical settings such as those in western and European countries. Conclusion: The Nigerian health care system is poorly developed. No adequate and functional surveillance systems have been developed. To achieve success in health care in this modern era, a system well grounded in routine

  19. An Intelligent Optimization Method for Vortex-Induced Vibration Reducing and Performance Improving in a Large Francis Turbine

    Directory of Open Access Journals (Sweden)

    Xuanlin Peng

    2017-11-01

    Full Text Available In this paper, a new methodology is proposed to reduce the vortex-induced vibration (VIV) and improve the performance of the stay vane in a 200-MW Francis turbine. The process can be divided into two parts. Firstly, a diagnosis method for stay vane vibration based on field experiments and a finite element method (FEM) is presented. It is found that the resonance between the Kármán vortex and the stay vane is the main cause for the undesired vibration. Then, we focus on establishing an intelligent optimization model of the stay vane’s trailing edge profile. To this end, an approach combining factorial experiments, extreme learning machine (ELM) and particle swarm optimization (PSO) is implemented. Three kinds of improved profiles of the stay vane are proposed and compared. Finally, the profile with a Donaldson trailing edge is adopted as the best solution for the stay vane, and verifications such as computational fluid dynamics (CFD) simulations, structural analysis and fatigue analysis are performed to validate the optimized geometry.
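
    The optimization step mentioned above can be illustrated with a standard particle swarm sketch; the inertia and acceleration coefficients, bounds and objective are generic placeholders, and in the paper's setting the particles would encode trailing-edge profile parameters evaluated through the ELM surrogate (an assumption based on the abstract).

        import numpy as np

        def pso(objective, dim, n_particles=30, n_iter=200,
                w=0.7, c1=1.5, c2=1.5, lower=-1.0, upper=1.0, seed=0):
            # Standard particle swarm optimisation: each particle tracks its own
            # best position (pbest) and the swarm tracks a global best (gbest).
            rng = np.random.default_rng(seed)
            x = rng.uniform(lower, upper, (n_particles, dim))
            v = np.zeros_like(x)
            pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
            g = pbest_f.argmin()
            gbest, gbest_f = pbest[g].copy(), pbest_f[g]
            for _ in range(n_iter):
                r1, r2 = rng.random((2, n_particles, dim))
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
                x = np.clip(x + v, lower, upper)
                f = np.array([objective(p) for p in x])
                improved = f < pbest_f
                pbest[improved], pbest_f[improved] = x[improved], f[improved]
                g = pbest_f.argmin()
                if pbest_f[g] < gbest_f:
                    gbest, gbest_f = pbest[g].copy(), pbest_f[g]
            return gbest, gbest_f

        # toy usage: minimise the sphere function in 3 dimensions
        print(pso(lambda p: float(np.sum(p**2)), dim=3))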

  20. Evolutionary Policy Transfer and Search Methods for Boosting Behavior Quality: RoboCup Keep-Away Case Study

    Directory of Open Access Journals (Sweden)

    Geoff Nitschke

    2017-11-01

    Full Text Available This study evaluates various evolutionary search methods to direct neural controller evolution in company with policy (behavior) transfer across increasingly complex collective robotic (RoboCup) keep-away tasks. Robot behaviors are first evolved in a source task and then transferred for further evolution to more complex target tasks. Evolutionary search methods tested include objective-based search (fitness function), behavioral and genotypic diversity maintenance, and hybrids of such diversity maintenance and objective-based search. Evolved behavior quality is evaluated according to effectiveness and efficiency. Effectiveness is the average task performance of transferred and evolved behaviors, where task performance is the average time the ball is controlled by a keeper team. Efficiency is the average number of generations taken for the fittest evolved behaviors to reach a minimum task performance threshold given policy transfer. Results indicate that policy transfer coupled with hybridized evolution (behavioral diversity maintenance and objective-based search) addresses the bootstrapping problem for increasingly complex keep-away tasks. That is, this hybrid method (coupled with policy transfer) evolves behaviors that could not otherwise be evolved. Also, this hybrid evolutionary search was demonstrated as consistently evolving topologically simple neural controllers that elicited high-quality behaviors.

  1. Challenging problems and solutions in intelligent systems

    CERN Document Server

    Grzegorzewski, Przemysław; Kacprzyk, Janusz; Owsiński, Jan; Penczek, Wojciech; Zadrożny, Sławomir

    2016-01-01

    This volume presents recent research, challenging problems and solutions in Intelligent Systems– covering the following disciplines: artificial and computational intelligence, fuzzy logic and other non-classic logics, intelligent database systems, information retrieval, information fusion, intelligent search (engines), data mining, cluster analysis, unsupervised learning, machine learning, intelligent data analysis, (group) decision support systems, intelligent agents and multi-agent systems, knowledge-based systems, imprecision and uncertainty handling, electronic commerce, distributed systems, etc. The book defines a common ground for sometimes seemingly disparate problems and addresses them by using the paradigm of broadly perceived intelligent systems. It presents a broad panorama of a multitude of theoretical and practical problems which have been successfully dealt with using the paradigm of intelligent computing.

  2. Intelligent environmental data warehouse

    International Nuclear Information System (INIS)

    Ekechukwu, B.

    1998-01-01

    Making quick and effective decisions in environmental management depends on multiple and complex parameters, and a data warehouse is a powerful tool for the overall management of massive amounts of environmental information. Selecting the right data from a warehouse is an important consideration for end-users. This paper proposes an intelligent environmental data warehouse system. It consists of a data warehouse that feeds environmental researchers and managers with the environmental information they need for their research studies and decisions, in the form of geometric and attribute data for the study area, together with metadata for other sources of environmental information. In addition, the proposed intelligent search engine works according to a set of rules, which enables the system to be aware of the environmental data wanted by the end-user. The system development process passes through four stages: data preparation, warehouse development, intelligent engine development and Internet platform development. (author)

  3. Investigations on search methods for speech recognition using weighted finite state transducers

    OpenAIRE

    Rybach, David

    2014-01-01

    The search problem in the statistical approach to speech recognition is to find the most likely word sequence for an observed speech signal using a combination of knowledge sources, i.e. the language model, the pronunciation model, and the acoustic models of phones. The resulting search space is enormous. Therefore, an efficient search strategy is required to compute the result with a feasible amount of time and memory. The structured statistical models as well as their combination, the searc...

  4. Elliptical tiling method to generate a 2-dimensional set of templates for gravitational wave search

    International Nuclear Information System (INIS)

    Arnaud, Nicolas; Barsuglia, Matteo; Bizouard, Marie-Anne; Brisson, Violette; Cavalier, Fabien; Davier, Michel; Hello, Patrice; Kreckelbergh, Stephane; Porter, Edward K.

    2003-01-01

    Searching for a signal depending on unknown parameters in a noisy background with matched filtering techniques always requires an analysis of the data with several templates in parallel in order to ensure a proper match between the filter and the real waveform. The key feature of such an implementation is the design of the filter bank which must be small to limit the computational cost while keeping the detection efficiency as high as possible. This paper presents a geometrical method that allows one to cover the corresponding physical parameter space by a set of ellipses, each of them being associated with a given template. After the description of the main characteristics of the algorithm, the method is applied in the field of gravitational wave (GW) data analysis, for the search of damped sine signals. Such waveforms are expected to be produced during the deexcitation phase of black holes - the so-called 'ringdown' signals - and are also encountered in some numerically computed supernova signals. First, the number of templates N computed by the method is similar to its analytical estimation, despite the overlaps between neighbor templates and the border effects. Moreover, N is small enough to test for the first time the performances of the set of templates for different choices of the minimal match MM, the parameter used to define the maximal allowed loss of signal-to-noise ratio (SNR) due to the mismatch between real signals and templates. The main result of this analysis is that the fraction of SNR recovered is on average much higher than MM, which dramatically decreases the mean percentage of false dismissals. Indeed, it goes well below its estimated value of 1 − MM³ used as input of the algorithm. Thus, as this feature should be common to any tiling algorithm, it seems possible to reduce the constraint on the value of MM - and indeed the number of templates and the computing power - without losing as many events as expected on average. This should be of great

  5. The optimal design support system for shell components of vehicles using the methods of artificial intelligence

    Science.gov (United States)

    Szczepanik, M.; Poteralski, A.

    2016-11-01

    The paper is devoted to an application of evolutionary methods and the finite element method to the optimization of shell structures. Optimization of the thickness of a car wheel (shell) by minimization of a stress functional is considered. The car wheel geometry is built from three surfaces of revolution: the central surface with the holes destined for the fastening bolts, the surface of the ring of the wheel and the surface connecting the two mentioned earlier. The last one is subjected to the optimization process. The structures are discretized by triangular finite elements and subjected to volume constraints. Using the proposed method, material properties or thicknesses of finite elements are changed evolutionarily and some of them are eliminated. As a result the optimal shape, topology and material or thickness of the structures are obtained. The numerical examples demonstrate that the method based on evolutionary computation is an effective technique for solving computer aided optimal design problems.

  6. The intelligence of dual simplex method to solve linear fractional fuzzy transportation problem.

    Science.gov (United States)

    Narayanamoorthy, S; Kalyani, S

    2015-01-01

    An approach is presented to solve a fuzzy transportation problem with a linear fractional fuzzy objective function. In this proposed approach the fractional fuzzy transportation problem is decomposed into two linear fuzzy transportation problems. The two linear fuzzy transportation problems are solved by the dual simplex method, and from them the optimal solution of the fractional fuzzy transportation problem is obtained. The proposed method is explained in detail with an example.

  7. The Intelligence of Dual Simplex Method to Solve Linear Fractional Fuzzy Transportation Problem

    Directory of Open Access Journals (Sweden)

    S. Narayanamoorthy

    2015-01-01

    Full Text Available An approach is presented to solve a fuzzy transportation problem with a linear fractional fuzzy objective function. In this proposed approach the fractional fuzzy transportation problem is decomposed into two linear fuzzy transportation problems. The two linear fuzzy transportation problems are solved by the dual simplex method, and from them the optimal solution of the fractional fuzzy transportation problem is obtained. The proposed method is explained in detail with an example.

  8. Application of a heuristic search method for generation of fuel reload configurations

    International Nuclear Information System (INIS)

    Galperin, A.; Nissan, E.

    1988-01-01

    A computerized heuristic search method for the generation and optimization of fuel reload configurations is proposed and investigated. The heuristic knowledge is expressed modularly in the form of "IF-THEN" production rules. The method was implemented in a program coded in the Franz LISP programming language and executed under the UNIX operating system. A test problem was formulated, based on a typical light water reactor reload problem with a few simplifications assumed, in order to allow formulation of the reload strategy into a relatively small number of rules. A computer run of the problem was performed on a VAX-780 machine. A set of 312 solutions was generated in approximately 20 min of execution time. Testing of a few arbitrarily chosen configurations demonstrated reasonably good performance for the computer-generated solutions. A computerized generator of reload configurations may be used for the fast generation or modification of reload patterns and as a tool for the formulation, tuning, and testing of the heuristic knowledge rules used by an "expert" fuel manager.

  9. Gravity Search Algorithm hybridized Recursive Least Square method for power system harmonic estimation

    Directory of Open Access Journals (Sweden)

    Santosh Kumar Singh

    2017-06-01

    Full Text Available This paper presents a new hybrid method based on the Gravity Search Algorithm (GSA) and Recursive Least Squares (RLS), known as GSA-RLS, to solve harmonic estimation problems for time-varying power signals in the presence of different noises. GSA is based on Newton’s law of gravity and mass interactions. In the proposed method, the searcher agents are a collection of masses that interact with each other using Newton’s laws of gravity and motion. The basic GSA strategy is combined sequentially with the RLS algorithm in an adaptive way to update the unknown parameters (weights) of the harmonic signal. Simulation and practical validation are carried out using real-time data obtained from a heavy paper industry. The performance of the proposed algorithm is compared with other recently reported algorithms such as Differential Evolution (DE), Particle Swarm Optimization (PSO), Bacteria Foraging Optimization (BFO), Fuzzy-BFO (F-BFO) hybridized with Least Squares (LS), and BFO hybridized with RLS, which reveals that the proposed GSA-RLS algorithm is the best in terms of accuracy, convergence and computational time.
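
    The RLS half of the hybrid can be illustrated compactly. The sketch below estimates harmonic amplitudes at known frequencies with a standard exponentially weighted RLS update; the forgetting factor, the sin/cos signal model and the toy data are assumptions for illustration, and the GSA step the paper couples with it is omitted.

        import numpy as np

        def rls_harmonic(y, t, freqs, lam=0.98, delta=100.0):
            # Recursive least squares estimation of harmonic amplitudes/phases:
            # the signal is modelled as a sum of sin/cos terms at known frequencies.
            n_terms = 2 * len(freqs)
            w = np.zeros(n_terms)                    # weights: [a1, b1, a2, b2, ...]
            P = delta * np.eye(n_terms)              # inverse correlation matrix
            for yk, tk in zip(y, t):
                phi = np.concatenate([[np.sin(2 * np.pi * f * tk),
                                       np.cos(2 * np.pi * f * tk)] for f in freqs])
                K = P @ phi / (lam + phi @ P @ phi)  # gain vector
                w += K * (yk - phi @ w)              # weight update
                P = (P - np.outer(K, phi @ P)) / lam # covariance update
            amps = [np.hypot(w[2 * i], w[2 * i + 1]) for i in range(len(freqs))]
            return w, amps

        # toy usage: 50 Hz fundamental plus a 3rd harmonic, sampled at 5 kHz
        t = np.arange(0, 0.2, 1 / 5000)
        y = 1.0 * np.sin(2 * np.pi * 50 * t + 0.3) + 0.2 * np.sin(2 * np.pi * 150 * t)
        print(rls_harmonic(y, t, freqs=[50, 150])[1])   # roughly [1.0, 0.2]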

  10. Search for the top quark at D0 using multivariate methods

    International Nuclear Information System (INIS)

    Bhat, P.C.

    1995-07-01

    We report on the search for the top quark in pp̄ collisions at the Fermilab Tevatron (√s = 1.8 TeV) in the di-lepton and lepton+jets channels using multivariate methods. An H-matrix analysis of the eμ data corresponding to an integrated luminosity of 13.5±1.6 pb⁻¹ yields one event whose likelihood to be a top quark event, assuming m_top = 180 GeV/c², is ten times more than that of WW and eighteen times more than that of Z → ττ. A neural network analysis of the e+jets channel using a data sample corresponding to an integrated luminosity of 47.9±5.7 pb⁻¹ shows an excess of events in the signal region and yields a cross-section for tt̄ production of 6.7±2.3 (stat.) pb, assuming a top mass of 200 GeV/c². An analysis of the e+jets data using the probability density estimation method yields a cross-section that is consistent with the above result.

  11. Application of Artificial Intelligence Methods for Analysis of Material and Non-material Determinants of Functioning of Young Europeans in Times of Crisis in the Eurozone

    OpenAIRE

    Gawlik, Remigiusz

    2014-01-01

    The study presents an analysis of possible applications of artificial intelligence methods for understanding, structuring and supporting the decision-making processes of European Youth in times of crisis in the Eurozone. Its main purpose is selecting a research method suitable for grasping and explaining the relations between social, economic and psychological premises when taking important life decisions by young Europeans at the beginning of their adult life. The interdisciplinary ap...

  12. Validation of a search strategy to identify nutrition trials in PubMed using the relative recall method.

    Science.gov (United States)

    Durão, Solange; Kredo, Tamara; Volmink, Jimmy

    2015-06-01

    To develop, assess, and maximize the sensitivity of a search strategy to identify diet and nutrition trials in PubMed using relative recall. We developed a search strategy to identify diet and nutrition trials in PubMed. We then constructed a gold standard reference set to validate the identified trials using the relative recall method. Relative recall was calculated by dividing the number of references from the gold standard that our search strategy identified by the total number of references in the gold standard. Our gold standard comprised 298 trials, derived from 16 included systematic reviews. The initial search strategy identified 242 of 298 references, with a relative recall of 81.2% [95% confidence interval (CI): 76.3%, 85.5%]. We analyzed titles and abstracts of the 56 missed references for possible additional terms. We then modified the search strategy accordingly. The relative recall of the final search strategy was 88.6% (95% CI: 84.4%, 91.9%). We developed a search strategy to identify diet and nutrition trials in PubMed with a high relative recall (sensitivity). This could be useful for establishing a nutrition trials register to support the conduct of future research, including systematic reviews. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
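
    The relative recall calculation reported above is simple enough to reproduce as a worked example; the Wilson score interval is assumed here because it closely reproduces the reported confidence limits, but the record does not state which interval the authors used.

        from math import sqrt

        def relative_recall(found, gold_total, z=1.96):
            # Relative recall = gold-standard references retrieved by the search
            # strategy / all references in the gold standard, with a Wilson score
            # 95% confidence interval (assumed, not stated in the record).
            p = found / gold_total
            denom = 1 + z**2 / gold_total
            centre = p + z**2 / (2 * gold_total)
            half = z * sqrt(p * (1 - p) / gold_total + z**2 / (4 * gold_total**2))
            return p, (centre - half) / denom, (centre + half) / denom

        print(relative_recall(242, 298))   # initial strategy: ~0.812 (CI ~0.764-0.852)
        print(relative_recall(264, 298))   # 264/298 ~ 0.886; 264 is inferred from the
                                           # reported 88.6%, not stated in the record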

  13. Artificial Intelligence Methods in Analysis of Morphology of Selected Structures in Medical Images

    Directory of Open Access Journals (Sweden)

    Ryszard Tadeusiewicz

    2001-01-01

    Full Text Available The goal of this paper is the presentation of the possibilities of application of syntactic methods of computer image analysis for recognition of local stenoses of coronary artery lumens and detection of pathological signs in upper parts of ureter ducts and renal calyxes. Analysis of the correct morphology of these structures is possible thanks to the application of sequence and tree methods from the group of syntactic methods of pattern recognition. In the case of analysis of coronary artery images the main objective is computer-aided early diagnosis of different forms of ischemic cardiovascular disease. Such diseases may manifest in the form of stable or unstable disturbances of heart rhythm or infarction. In analysis of kidney radiograms the main goal is recognition of local irregularities in ureter lumens and examination of the morphology of renal pelvises and calyxes.

  14. A multi-agent based intelligent configuration method for aircraft fleet maintenance personnel

    Directory of Open Access Journals (Sweden)

    Feng Qiang

    2014-04-01

    Full Text Available A multi-agent based fleet maintenance personnel configuration method is proposed to solve the mission oriented aircraft fleet maintenance personnel configuration problem. The maintenance process of an aircraft fleet is analyzed first. In this process each aircraft contains multiple parts, and different parts are repaired by personnel with different majors and levels. The factors and their relationships involved in the maintenance process are analyzed and discussed. Then the whole maintenance process is described as a 3-layer multi-agent system (MAS) model. A communication and reasoning strategy among the agents is put forward. A fleet maintenance personnel configuration algorithm is proposed based on the contract net protocol (CNP). Finally, a fleet of 10 aircraft is studied for verification purposes. A mission type with 3 waves of continuous dispatch is considered. Compared with traditional methods that can only provide configuration results, the proposed method can provide optimal maintenance strategies as well.

  15. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Full Text Available Background: Competitive intelligence (CI) provides actionable intelligence, which gives enterprises a competitive edge. However, without a proper process, it is difficult to develop actionable intelligence. There are disagreements about how the CI process should be structured. For CI professionals to focus on producing actionable intelligence, and to do so with simplicity, they need a common CI process model. Objectives: The purpose of this research is to review the current literature on CI with the aim of identifying and analysing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases, where the output of one phase is the input of the next phase. Conclusion: The CI process is a cycle of interrelated phases in which the output of one phase is the input of the next. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  16. Artificial Intelligence in Civil Engineering

    OpenAIRE

    Lu, Pengzhen; Chen, Shengyong; Zheng, Yujun

    2012-01-01

    Artificial intelligence is a branch of computer science involved in the research, design, and application of intelligent computers. Traditional methods for modeling and optimizing complex structural systems require huge amounts of computing resources, and artificial-intelligence-based solutions can often provide valuable alternatives for efficiently solving problems in civil engineering. This paper summarizes recently developed methods and theories in the developing direction for applicati...

  17. The future of active and intelligent packaging industry

    Directory of Open Access Journals (Sweden)

    Renata Dobrucka

    2013-06-01

    Full Text Available Background: Innovation in food and beverage packaging is mostly driven by consumer needs and demands influenced by changing global trends, such as increased life expectancy and fewer organizations investing in food production and distribution. The food industry has seen great advances in the packaging sector since its inception in the 18th century, with most active and intelligent innovations occurring during the past century. These advances have led to improved food quality and safety. Active and intelligent packaging is a new and exciting area of technology which responds efficiently to contemporary consumer demands. Materials and methods: On the basis of a broad review of the current state of the art in the world literature, the market for active and intelligent packaging is discussed. Results: This paper presents current innovation in the market for active and intelligent packaging. Conclusion: Research and development in the field of active and intelligent packaging materials is very dynamic and develops in relation to the search for environmentally friendly packaging solutions. Besides, active and intelligent packaging is becoming more and more widely used for food products. The future of this type of packaging system seems to be very interesting.

  18. AN INTELLIGENT NEURO-FUZZY TERMINAL SLIDING MODE CONTROL METHOD WITH APPLICATION TO ATOMIC FORCE MICROSCOPE

    Directory of Open Access Journals (Sweden)

    Seied Yasser Nikoo

    2016-11-01

    Full Text Available In this paper, a neuro-fuzzy fast terminal sliding mode control method is proposed for controlling a class of nonlinear systems with bounded uncertainties and disturbances. In this method, a nonlinear terminal sliding surface is first designed. Then, this sliding surface is considered as input for an adaptive neuro-fuzzy inference system, which is the main controller. A proportional-integral-derivative controller is also used to assist the neuro-fuzzy controller in order to improve the performance of the system at the beginning stage of control operation. In addition, the bee algorithm is used in this paper to update the weights of the neuro-fuzzy system as well as the parameters of the proportional-integral-derivative controller. The proposed control scheme is simulated for vibration control in a model of an atomic force microscope system and the results are compared with conventional sliding mode controllers. The simulation results show that the chattering effect in the proposed controller is decreased in comparison with the sliding mode and the terminal sliding mode controllers. Also, the method provides the advantages of fast convergence and low model dependency compared to the conventional methods.
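
    For orientation, a commonly used fast terminal sliding surface for a scalar tracking error e has the standard textbook form below; the exact surface, gains and exponents used in the paper are not given in this record:

        s = \dot{e} + \alpha e + \beta e^{q/p},   with \alpha, \beta > 0 and p > q > 0 odd integers,

    which drives the error to zero in finite time once the system is kept on s = 0.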

  19. Relating business intelligence and enterprise architecture - A method for combining operational data with architectural metadata

    NARCIS (Netherlands)

    Veneberg, R.K.M.; Iacob, Maria Eugenia; van Sinderen, Marten J.; Bodenstaff, L.

    Combining enterprise architecture and operational data is complex (especially when considering the actual ‘matching’ of data with enterprise architecture elements), and little has been written on how to do this. In this paper we aim to fill this gap, and propose a method to combine operational data

  20. Computational Intelligence in Intelligent Data Analysis

    CERN Document Server

    Nürnberger, Andreas

    2013-01-01

    Complex systems and their phenomena are ubiquitous as they can be found in biology, finance, the humanities, management sciences, medicine, physics and similar fields. For many problems in these fields, there are no conventional ways to mathematically or analytically solve them completely at low cost. On the other hand, nature already solved many optimization problems efficiently. Computational intelligence attempts to mimic nature-inspired problem-solving strategies and methods. These strategies can be used to study, model and analyze complex systems such that it becomes feasible to handle them. Key areas of computational intelligence are artificial neural networks, evolutionary computation and fuzzy systems. As only a few researchers in that field, Rudolf Kruse has contributed in many important ways to the understanding, modeling and application of computational intelligence methods. On occasion of his 60th birthday, a collection of original papers of leading researchers in the field of computational intell...

  1. An Exploration of Retrieval-Enhancing Methods for Integrated Search in a Digital Library

    DEFF Research Database (Denmark)

    Sørensen, Diana Ransgaard; Bogers, Toine; Larsen, Birger

    2012-01-01

    Integrated search is defined as searching across different document types and representations simultaneously, with the goal of presenting the user with a single ranked result list containing the optimal mix of document types. In this paper, we compare various approaches to integrating three diffe...

  2. Novel citation-based search method for scientific literature: application to meta-analyses

    NARCIS (Netherlands)

    Janssens, A.C.J.W.; Gwinn, M.

    2015-01-01

    Background: Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of

  3. Intelligent Lighting Control System

    OpenAIRE

    García, Elena; Rodríguez González, Sara; de Paz Santana, Juan F.; Bajo Pérez, Javier

    2014-01-01

    This paper presents an adaptive architecture that allows centralized control of public lighting and intelligent management, in order to economise on lighting and maintain maximum comfort in the illuminated areas. To carry out this management, the architecture merges various techniques of artificial intelligence (AI) and statistics such as artificial neural networks (ANN), multi-agent systems (MAS), the EM algorithm, methods based on ANOVA and a Service Oriented Approach (SOA). It performs optim...

  4. A multi-agent based intelligent configuration method for aircraft fleet maintenance personnel

    OpenAIRE

    Feng, Qiang; Li, Songjie; Sun, Bo

    2014-01-01

    A multi-agent based fleet maintenance personnel configuration method is proposed to solve the mission oriented aircraft fleet maintenance personnel configuration problem. The maintenance process of an aircraft fleet is analyzed first. In the process each aircraft contains multiple parts, and different parts are repaired by personnel with different majors and levels. The factors and their relationship involved in the process of maintenance are analyzed and discussed. Then the whole maintenance...

  5. INNOVATIVE FORMS SUPPORTING SAFE METHODS OF WORK IN SAFETY ENGINEERING FOR THE DEVELOPMENT OF INTELLIGENT SPECIALIZATIONS

    Directory of Open Access Journals (Sweden)

    Anna GEMBALSKA-KWIECIEŃ

    2016-10-01

    Full Text Available The article discusses innovative forms of participation of employees in the work safety system. It also presents the advantages of these forms of employee involvement. The aim of the empirical studies was the analysis of employees' behavior and attitudes towards health and safety at work. The issues considered in the article have a significant impact on the improvement of prevention methods related to work safety and aid the creation of a healthy society.

  6. A Method Based on Artificial Intelligence To Fully Automatize The Evaluation of Bovine Blastocyst Images.

    Science.gov (United States)

    Rocha, José Celso; Passalia, Felipe José; Matos, Felipe Delestro; Takahashi, Maria Beatriz; Ciniciato, Diego de Souza; Maserati, Marc Peter; Alves, Mayra Fernanda; de Almeida, Tamie Guibu; Cardoso, Bruna Lopes; Basso, Andrea Cristina; Nogueira, Marcelo Fábio Gouveia

    2017-08-09

    Morphological analysis is the standard method of assessing embryo quality; however, its inherent subjectivity tends to generate discrepancies among evaluators. Using genetic algorithms and artificial neural networks (ANNs), we developed a new method for embryo analysis that is more robust and reliable than standard methods. Bovine blastocysts produced in vitro were classified as grade 1 (excellent or good), 2 (fair), or 3 (poor) by three experienced embryologists according to the International Embryo Technology Society (IETS) standard. The images (n = 482) were subjected to automatic feature extraction, and the results were used as input for a supervised learning process. One part of the dataset (15%) was used for a blind test posterior to the fitting, for which the system had an accuracy of 76.4%. Interestingly, when the same embryologists evaluated a sub-sample (10%) of the dataset, there was only 54.0% agreement with the standard (mode for grades). However, when using the ANN to assess this sub-sample, there was 87.5% agreement with the modal values obtained by the evaluators. The presented methodology is covered by National Institute of Industrial Property (INPI) and World Intellectual Property Organization (WIPO) patents and is currently undergoing a commercial evaluation of its feasibility.

  7. Intelligent Condition Diagnosis Method Based on Adaptive Statistic Test Filter and Diagnostic Bayesian Network.

    Science.gov (United States)

    Li, Ke; Zhang, Qiuju; Wang, Kun; Chen, Peng; Wang, Huaqing

    2016-01-08

    A new fault diagnosis method for rotating machinery based on adaptive statistic test filter (ASTF) and Diagnostic Bayesian Network (DBN) is presented in this paper. ASTF is proposed to obtain weak fault features under background noise; it is based on statistical hypothesis testing in the frequency domain to evaluate the similarity between a reference signal (noise signal) and the original signal, and to remove the components of high similarity. The optimal level of significance α is obtained using particle swarm optimization (PSO). To evaluate the performance of the ASTF, an evaluation factor Ipq is also defined. In addition, a simulation experiment is designed to verify the effectiveness and robustness of ASTF. A sensitivity evaluation method using principal component analysis (PCA) is proposed to evaluate the sensitiveness of symptom parameters (SPs) for condition diagnosis. In this way, the SPs that have high sensitiveness for condition diagnosis can be selected. A three-layer DBN is developed to identify the condition of rotating machinery based on Bayesian Belief Network (BBN) theory. A condition diagnosis experiment for rolling element bearings demonstrates the effectiveness of the proposed method.
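
    As a hedged illustration of the symptom-parameter idea mentioned above, the sketch below computes a few symptom parameters that are commonly used for bearing vibration signals and inspects their principal-component loadings; the specific SPs, data and sensitivity criterion used in the paper are not given in this record.

        import numpy as np
        from sklearn.decomposition import PCA

        def symptom_parameters(x):
            # Illustrative symptom parameters for a vibration frame:
            # RMS, kurtosis, crest factor and shape factor.
            x = np.asarray(x, float)
            rms = np.sqrt(np.mean(x ** 2))
            kurt = np.mean((x - x.mean()) ** 4) / x.std() ** 4
            crest = np.max(np.abs(x)) / rms
            shape = rms / np.mean(np.abs(x))
            return np.array([rms, kurt, crest, shape])

        # Stack SPs from many measurement frames and inspect how strongly each SP
        # loads on the leading principal components (a rough proxy for a
        # PCA-based sensitivity ranking, which is an assumption here).
        frames = np.random.default_rng(0).normal(size=(100, 2048))
        sp_matrix = np.vstack([symptom_parameters(f) for f in frames])
        pca = PCA(n_components=2).fit(sp_matrix)
        print(pca.explained_variance_ratio_)
        print(abs(pca.components_))        # larger magnitude = more influential SP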

  8. Intelligent Condition Diagnosis Method Based on Adaptive Statistic Test Filter and Diagnostic Bayesian Network

    Directory of Open Access Journals (Sweden)

    Ke Li

    2016-01-01

    Full Text Available A new fault diagnosis method for rotating machinery based on adaptive statistic test filter (ASTF) and Diagnostic Bayesian Network (DBN) is presented in this paper. ASTF is proposed to obtain weak fault features under background noise; it is based on statistical hypothesis testing in the frequency domain to evaluate the similarity between a reference signal (noise signal) and the original signal, and to remove the components of high similarity. The optimal level of significance α is obtained using particle swarm optimization (PSO). To evaluate the performance of the ASTF, an evaluation factor Ipq is also defined. In addition, a simulation experiment is designed to verify the effectiveness and robustness of ASTF. A sensitivity evaluation method using principal component analysis (PCA) is proposed to evaluate the sensitiveness of symptom parameters (SPs) for condition diagnosis. In this way, the SPs that have high sensitiveness for condition diagnosis can be selected. A three-layer DBN is developed to identify the condition of rotating machinery based on Bayesian Belief Network (BBN) theory. A condition diagnosis experiment for rolling element bearings demonstrates the effectiveness of the proposed method.

  9. Intelligent Condition Diagnosis Method Based on Adaptive Statistic Test Filter and Diagnostic Bayesian Network

    Science.gov (United States)

    Li, Ke; Zhang, Qiuju; Wang, Kun; Chen, Peng; Wang, Huaqing

    2016-01-01

    A new fault diagnosis method for rotating machinery based on adaptive statistic test filter (ASTF) and Diagnostic Bayesian Network (DBN) is presented in this paper. ASTF is proposed to obtain weak fault features under background noise; it is based on statistical hypothesis testing in the frequency domain to evaluate the similarity between a reference signal (noise signal) and the original signal, and to remove the components of high similarity. The optimal level of significance α is obtained using particle swarm optimization (PSO). To evaluate the performance of the ASTF, an evaluation factor Ipq is also defined. In addition, a simulation experiment is designed to verify the effectiveness and robustness of ASTF. A sensitivity evaluation method using principal component analysis (PCA) is proposed to evaluate the sensitiveness of symptom parameters (SPs) for condition diagnosis. In this way, the SPs that have high sensitiveness for condition diagnosis can be selected. A three-layer DBN is developed to identify the condition of rotating machinery based on Bayesian Belief Network (BBN) theory. A condition diagnosis experiment for rolling element bearings demonstrates the effectiveness of the proposed method. PMID:26761006

  10. A Tabu Search WSN Deployment Method for Monitoring Geographically Irregular Distributed Events

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Full Text Available In this paper, we address the Wireless Sensor Network (WSN) deployment issue. We assume that the observed area is characterized by the geographical irregularity of the sensed events. Formally, we consider that each point in the deployment area is associated with a differentiated detection probability threshold, which must be satisfied by our deployment method. Our resulting WSN deployment problem is formulated as a Multi-Objective Optimization problem, which seeks to reduce the gap between the generated events detection probabilities and the required thresholds while minimizing the number of deployed sensors. To overcome the computational complexity of an exact resolution, we propose an original pseudo-random approach based on the Tabu Search heuristic. Simulations show that our proposal achieves better performances than several other approaches proposed in the literature. In the last part of this paper, we generalize the deployment problem by including the wireless communication network connectivity constraint. Thus, we extend our proposal to ensure that the resulting WSN topology is connected even if a sensor communication range takes small values.
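
    The record above (and its duplicate below) describes the deployment optimizer only in outline. A generic tabu search skeleton of the kind referred to is sketched here; the neighbourhood generator, cost function, tabu-list length and iteration budget are placeholders, not the paper's pseudo-random candidate generation or multi-objective cost.

        import random

        def tabu_search(initial, neighbours, cost, n_iter=500, tabu_len=20, seed=0):
            # Generic tabu search: move to the best non-tabu neighbour at each step,
            # keep a short memory of recent solutions to avoid cycling, and allow a
            # tabu move only if it beats the best cost seen so far (aspiration).
            rng = random.Random(seed)
            current = best = initial
            best_cost = cost(best)
            tabu = []                                  # recently visited solutions
            for _ in range(n_iter):
                candidates = sorted(neighbours(current, rng), key=cost)
                for cand in candidates:
                    if cand not in tabu or cost(cand) < best_cost:
                        current = cand
                        break
                tabu.append(current)
                if len(tabu) > tabu_len:
                    tabu.pop(0)
                if cost(current) < best_cost:
                    best, best_cost = current, cost(current)
            return best, best_cost

    Solutions must be hashable and comparable (for example tuples of candidate sensor positions) for the membership test on the tabu list to work; a real deployment cost would combine the coverage-gap and sensor-count terms described in the abstract.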

  11. A Tabu Search WSN Deployment Method for Monitoring Geographically Irregular Distributed Events.

    Science.gov (United States)

    Aitsaadi, Nadjib; Achir, Nadjib; Boussetta, Khaled; Pujolle, Guy

    2009-01-01

    In this paper, we address the Wireless Sensor Network (WSN) deployment issue. We assume that the observed area is characterized by the geographical irregularity of the sensed events. Formally, we consider that each point in the deployment area is associated with a differentiated detection probability threshold, which must be satisfied by our deployment method. Our resulting WSN deployment problem is formulated as a Multi-Objective Optimization problem, which seeks to reduce the gap between the generated events detection probabilities and the required thresholds while minimizing the number of deployed sensors. To overcome the computational complexity of an exact resolution, we propose an original pseudo-random approach based on the Tabu Search heuristic. Simulations show that our proposal achieves better performances than several other approaches proposed in the literature. In the last part of this paper, we generalize the deployment problem by including the wireless communication network connectivity constraint. Thus, we extend our proposal to ensure that the resulting WSN topology is connected even if a sensor communication range takes small values.

  12. Searching for beyond the Standard Model physics using direct and indirect methods at LHCb

    CERN Document Server

    Hall, Samuel C P; Golutvin, Andrey

    It is known that the Standard Model of particle physics is incomplete in its description of nature at a fundamental level. For example, the Standard Model can neither incorporate dark matter nor explain the matter dominated nature of the Universe. This thesis presents three analyses undertaken using data collected by the LHCb detector. Each analysis searches for indications of physics beyond the Standard Model in different decays of B mesons, using different techniques. Notably, two analyses look for indications of new physics using indirect methods, and one uses a direct approach. The first analysis shows evidence for the rare decay $B^{+} \rightarrow D^{+}_{s}\phi$ with greater than 3 $\sigma$ significance; this also constitutes the first evidence for a fully hadronic annihilation-type decay of a $B^{+}$ meson. A measurement of the branching fraction of the decay $B^{+} \rightarrow D^{+}_{s}\phi$ is seen to be higher than, but still compatible with, Standard Model predictions. The CP-asymmetry of the decay is also ...

  13. Minimization of municipal solid waste transportation route in West Jakarta using Tabu Search method

    Science.gov (United States)

    Chaerul, M.; Mulananda, A. M.

    2018-04-01

    Indonesia still adopts the collect-haul-dispose concept for municipal solid waste handling, which leads to queues of waste trucks at the final disposal site (TPA). The study aims to minimize the total distance of the waste transportation system by applying a transshipment model. In this case, the analogue of a transshipment point is a compaction facility (SPA). Small-capacity trucks collect the waste from temporary collection points (TPS) and bring it to the compaction facility, which is located near the waste generators. After compaction, the waste is transported using large-capacity trucks to the final disposal site, which is located far from the city. Problems related to waste transportation can be solved using the Vehicle Routing Problem (VRP). In this study, the shortest route distances from the truck pool to TPS, TPS to SPA, and SPA to TPA were determined using a meta-heuristic method, namely two-phase Tabu Search. The TPS studied are of the container type, 43 units in total throughout West Jakarta City, served by 38 Armroll trucks with a capacity of 10 m3 each. The result determines the assignment of each truck from the pool to the selected TPS, SPA and TPA with a total minimum distance of 2,675.3 km. Minimizing this distance also minimizes the total waste transportation cost to be spent by the government.

  14. FEM-based Printhead Intelligent Adjusting Method for Printing Conduct Material

    Directory of Open Access Journals (Sweden)

    Liang Xiaodan

    2017-01-01

    Full Text Available Ink-jet printing of circuit boards has advantages such as non-contact manufacturing, high manufacturing accuracy and low pollution. In order to improve the printing precision, finite element technology is adopted to model the piezoelectric print heads, and a new bacteria foraging algorithm with a lifecycle strategy is proposed to optimize the parameters of the driving waveforms so as to obtain the desired droplet characteristics. Results of the numerical simulation show that the algorithm performs well. Additionally, the droplet jetting simulation results and measured results confirm that the method precisely achieves the desired droplet characteristics.

  15. Development of dose assessment method for high-energy neutrons using intelligent neutron monitor

    International Nuclear Information System (INIS)

    Satoh, Daiki; Sato, Tatsuhiko; Endo, Akira; Yamaguchi, Yasuhiro; Matsufuji, N.; Sato, S.; Takada, M.

    2006-01-01

    The light output of the liquid organic scintillator NE213 has been measured for protons, deuterons, tritons, 3He nuclei and alpha particles. A thick graphite target was bombarded with 400-MeV/u C ions to produce the charged particles. The time-of-flight method was adopted to determine the kinetic energy of the charged particles. The light output for protons was also measured using mono-energetic beams of 100 and 160 MeV. The experimental results give a new database of light output. (author)

  16. Reverse screening methods to search for the protein targets of chemopreventive compounds

    Science.gov (United States)

    Huang, Hongbin; Zhang, Guigui; Zhou, Yuquan; Lin, Chenru; Chen, Suling; Lin, Yutong; Mai, Shangkang; Huang, Zunnan

    2018-05-01

    This article is a systematic review of reverse screening methods used to search for the protein targets of chemopreventive compounds or drugs. Typical chemopreventive compounds include components of traditional Chinese medicine, natural compounds and Food and Drug Administration (FDA)-approved drugs. Such compounds are somewhat selective but are predisposed to bind multiple protein targets distributed throughout diverse signaling pathways in human cells. In contrast to conventional virtual screening, which identifies the ligands of a targeted protein from a compound database, reverse screening is used to identify the potential targets or unintended targets of a given compound from a large number of receptors by examining their known ligands or crystal structures. This method, also known as in silico or computational target fishing, is highly valuable for discovering the target receptors of query molecules from terrestrial or marine natural products, exploring the molecular mechanisms of chemopreventive compounds, finding alternative indications of existing drugs by drug repositioning, and detecting adverse drug reactions and drug toxicity. Reverse screening can be divided into three major groups: shape screening, pharmacophore screening and reverse docking. Several large software packages, such as Schrödinger and Discovery Studio; typical software/network services such as ChemMapper, PharmMapper, idTarget and INVDOCK; and practical databases of known target ligands and receptor crystal structures, such as ChEMBL, BindingDB and the Protein Data Bank (PDB), are available for use in these computational methods. Different programs, online services and databases have different applications and constraints. Here, we conducted a systematic analysis and multilevel classification of the computational programs, online services and compound libraries available for shape screening, pharmacophore screening and reverse docking to enable non-specialist users to quickly learn and
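
    A minimal flavour of ligand-based target fishing (one simple stand-in for the reverse-screening idea reviewed above; PharmMapper, idTarget, INVDOCK and the other services mentioned use far richer shape, pharmacophore and docking models) can be sketched with RDKit, assuming it is installed; the target names, ligands and query below are invented for illustration.

        from rdkit import Chem
        from rdkit.Chem import AllChem, DataStructs

        # Hypothetical mini-library: a few targets with known ligand SMILES.
        target_ligands = {
            "COX (illustrative)":  ["CC(=O)Oc1ccccc1C(=O)O"],            # aspirin
            "EGFR (illustrative)": ["COc1cc2ncnc(Nc3cccc(Cl)c3)c2cc1"],   # anilinoquinazoline fragment
        }

        def fingerprint(smiles):
            mol = Chem.MolFromSmiles(smiles)
            return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

        def rank_targets(query_smiles, library):
            # Score each target by the best Tanimoto similarity between the query
            # and any of its known ligands, then rank the targets by that score.
            q = fingerprint(query_smiles)
            scores = {t: max(DataStructs.TanimotoSimilarity(q, fingerprint(s)) for s in ligs)
                      for t, ligs in library.items()}
            return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

        # toy query: the methyl ester of aspirin should rank the COX entry first
        print(rank_targets("CC(=O)Oc1ccccc1C(=O)OC", target_ligands))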

  17. DESIGN OF A WEB SEMI-INTELLIGENT METADATA SEARCH MODEL APPLIED IN DATA WAREHOUSING SYSTEMS DISEÑO DE UN MODELO SEMIINTELIGENTE DE BÚSQUEDA DE METADATOS EN LA WEB, APLICADO A SISTEMAS DATA WAREHOUSING

    Directory of Open Access Journals (Sweden)

    Enrique Luna Ramírez

    2008-12-01

    Full Text Available In this paper, the design of a Web metadata search model with semi-intelligent features is proposed. The search model is oriented to retrieving the metadata associated with a data warehouse in a fast, flexible and reliable way. Our proposal includes a set of distinctive functionalities, consisting of the temporary storage of frequently used metadata in an exclusive store, separate from the global data warehouse metadata store, and of the use of control processes to retrieve information from both stores through aliases of concepts.
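
    As an informal sketch of the two-store idea described above (not the authors' implementation), the code below resolves concept aliases, serves frequently requested metadata from a small exclusive store, and falls back to the global data warehouse metadata store on a miss; all names and the promotion threshold are assumptions.

    ```python
    class MetadataSearch:
        """Toy two-store lookup: an exclusive store for frequent metadata plus a
        global store, with concept aliases resolved before either lookup."""

        def __init__(self, global_store, aliases, promote_after=3):
            self.global_store = global_store      # full data-warehouse metadata
            self.frequent_store = {}              # exclusive store for hot metadata
            self.aliases = aliases                # alias -> canonical concept name
            self.hits = {}                        # lookup counts per concept
            self.promote_after = promote_after    # promotion threshold (assumed)

        def lookup(self, concept):
            key = self.aliases.get(concept.lower(), concept.lower())
            self.hits[key] = self.hits.get(key, 0) + 1
            if key in self.frequent_store:                    # fast path
                return self.frequent_store[key]
            value = self.global_store.get(key)                # fall back to global store
            if value is not None and self.hits[key] >= self.promote_after:
                self.frequent_store[key] = value              # promote hot metadata
            return value

    global_store = {"sales_fact": {"grain": "order line", "refresh": "daily"}}
    aliases = {"ventas": "sales_fact", "sales": "sales_fact"}
    search = MetadataSearch(global_store, aliases)
    for _ in range(4):
        print(search.lookup("ventas"))
    ```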

  18. Pathway Detection from Protein Interaction Networks and Gene Expression Data Using Color-Coding Methods and A* Search Algorithms

    Directory of Open Access Journals (Sweden)

    Cheng-Yu Yeh

    2012-01-01

    Full Text Available With the wide availability of protein interaction networks and microarray data, identifying linear paths of biological significance in the search for a potential pathway is a challenging issue. We proposed a color-coding method based on the characteristics of biological network topology and applied heuristic search to speed up the color-coding method. In the experiments, we tested our methods on two datasets: yeast and human prostate cancer networks with gene expression data. Comparisons of our method with other existing methods on known yeast MAPK pathways in terms of precision and recall show that we can find the maximum number of proteins and perform comparably well. On the other hand, our method is more efficient than previous ones and detects paths of length 10 within 40 seconds on an Intel 1.73 GHz CPU with 1 GB of main memory running the Windows operating system.
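
    The record gives no pseudocode, so the following is only a compact sketch of the classic color-coding idea it builds on: vertices are randomly k-colored and dynamic programming over color subsets finds a high-scoring colorful (hence simple) path. The toy weighted interaction graph is invented, and the heuristic A* layer from the paper is omitted.

    ```python
    import random

    def best_colorful_path(graph, weights, k, trials=200, seed=1):
        """Color-coding search for the highest-weight simple path on k vertices.
        graph: dict node -> set of neighbours; weights: dict edge (u, v) -> score."""
        rng = random.Random(seed)
        nodes = list(graph)
        best_score, best_path = float("-inf"), None
        for _ in range(trials):
            color = {v: rng.randrange(k) for v in nodes}       # random k-coloring
            # dp[(v, colorset)] = (score, path) of best colorful path ending at v
            dp = {(v, frozenset([color[v]])): (0.0, (v,)) for v in nodes}
            for size in range(1, k):
                for (v, used), (score, path) in list(dp.items()):
                    if len(used) != size:
                        continue
                    for u in graph[v]:
                        if color[u] in used:
                            continue                            # keep the path colorful
                        w = weights.get((v, u), weights.get((u, v), 0.0))
                        key = (u, used | {color[u]})
                        cand = (score + w, path + (u,))
                        if key not in dp or cand[0] > dp[key][0]:
                            dp[key] = cand
            for (v, used), (score, path) in dp.items():
                if len(used) == k and score > best_score:
                    best_score, best_path = score, path
        return best_path, best_score

    # Toy protein-interaction graph with edge confidence scores (assumed values).
    edges = {("A", "B"): 0.9, ("B", "C"): 0.8, ("C", "D"): 0.7, ("A", "C"): 0.4, ("B", "D"): 0.3}
    graph = {}
    for u, v in edges:
        graph.setdefault(u, set()).add(v)
        graph.setdefault(v, set()).add(u)
    print(best_colorful_path(graph, edges, k=4))
    ```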

  19. Application of stochastic and artificial intelligence methods for nuclear material identification

    International Nuclear Information System (INIS)

    Pozzi, S.; Segovia, F.J.

    1999-01-01

    Nuclear materials safeguard efforts necessitate the use of non-destructive methods to determine the attributes of fissile samples enclosed in special, non-accessible containers. To this end, a large variety of methods has been developed at the Oak Ridge National Laboratory (ORNL) and elsewhere. Usually, a given set of statistics of the stochastic neutron-photon coupled field, such as source-detector and detector-detector cross correlation functions and multiplicities, is measured over a range of known samples to develop calibration algorithms. In this manner, the attributes of unknown samples can be inferred by the use of the calibration results. The organization of this paper is as follows: Section 2 describes the Monte Carlo simulations of source-detector cross correlation functions for a set of uranium metallic samples interrogated by the neutrons and photons from a 252Cf source. From this database, a set of features is extracted in Section 3. The use of neural networks (NN) and genetic programming to provide sample mass and enrichment values from the input sets of features is illustrated in Sections 4 and 5, respectively. Section 6 is a comparison of the results, while Section 7 is a brief summary of the work

  20. Application of pattern search method to power system security constrained economic dispatch with non-smooth cost function

    International Nuclear Information System (INIS)

    Al-Othman, A.K.; El-Naggar, K.M.

    2008-01-01

    Direct search (DS) methods are derivative-free optimization algorithms: they do not require any information about the gradient of the objective function at hand while searching for an optimum solution. One such method is the Pattern Search (PS) algorithm. This paper presents a new approach based on a constrained pattern search algorithm to solve the security constrained power system economic dispatch (SCED) problem with a non-smooth cost function. Operation of power systems demands a high degree of security to keep the system operating satisfactorily when subjected to disturbances, while at the same time paying attention to economic aspects. A pattern recognition technique is used first to assess dynamic security. Linear classifiers that determine the stability of the electric power system are presented and added to the other system stability and operational constraints. The problem is formulated as a constrained optimization problem in a way that ensures secure and economic system operation. The pattern search method is then applied to solve the constrained optimization formulation. In particular, the method is tested using three different test systems. Simulation results of the proposed approach are compared with those reported in the literature. The outcome is very encouraging and shows that pattern search (PS) is well suited to solving the security constrained power system economic dispatch (SCED) problem. In addition, valve-point effect loading and total system losses are considered to further investigate the potential of the PS technique. Based on the results, it can be concluded that the PS has demonstrated its ability to handle the highly nonlinear, discontinuous, non-smooth cost function of the SCED. (author)
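
    The paper's SCED formulation is not reproduced in the abstract, so the sketch below only illustrates a basic compass-style pattern search on a small three-unit dispatch problem with a valve-point (non-smooth) cost term and a penalty on the power-balance constraint; the coefficients and the 850 MW demand are textbook-style assumptions rather than the paper's data.

    ```python
    import math

    # Three-unit fuel cost with a valve-point (rectified sine) term: non-smooth, non-convex.
    units = [  # (a, b, c, e, f, Pmin, Pmax) -- illustrative coefficients
        (0.0016, 7.92, 561.0, 300.0, 0.0315, 100.0, 600.0),
        (0.0048, 7.97, 78.0, 150.0, 0.0630, 50.0, 200.0),
        (0.0019, 7.85, 310.0, 200.0, 0.0420, 100.0, 400.0),
    ]
    DEMAND = 850.0  # MW, assumed

    def dispatch_cost(p, penalty=1e4):
        cost = 0.0
        for (a, b, c, e, f, pmin, pmax), pi in zip(units, p):
            cost += a * pi**2 + b * pi + c + abs(e * math.sin(f * (pmin - pi)))
            cost += penalty * (max(0.0, pmin - pi) + max(0.0, pi - pmax))
        cost += penalty * abs(sum(p) - DEMAND)       # power balance as a penalty
        return cost

    def pattern_search(f, x0, step=50.0, min_step=1e-3, shrink=0.5):
        """Compass-style pattern search: poll +/- step along each coordinate,
        accept improving moves, otherwise shrink the step size."""
        x, fx = list(x0), f(x0)
        while step > min_step:
            improved = False
            for i in range(len(x)):
                for delta in (step, -step):
                    trial = list(x)
                    trial[i] += delta
                    ft = f(trial)
                    if ft < fx:
                        x, fx, improved = trial, ft, True
            if not improved:
                step *= shrink
        return x, fx

    x, fx = pattern_search(dispatch_cost, [300.0, 150.0, 400.0])
    print("dispatch (MW):", [round(v, 1) for v in x], "cost:", round(fx, 2))
    ```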

  1. Searching for Innovations and Methods of Using the Cultural Heritage on the Example of Upper Silesia

    Science.gov (United States)

    Wagner, Tomasz

    2017-10-01

    The basic subject of this paper is the historical and cultural heritage of parts of Upper Silesia that are bound by a common history and face similar problems at present. The paper presents selected historical phenomena that have influenced the contemporary space mentioned above, as well as contemporary issues of heritage protection in Upper Silesia. Since 1989, interpretations of Silesian architecture have been strongly coloured by ideological and national ideas. The last 25 years represent a further stage of development, marked by rapid transformation of space driven by successive economic transformations. In this period we can observe landscape transformations, the liquidation of objects and historical structures, the loss of regional features, spontaneous processes of adapting objects, and many methods of implementing forms of protection and of using cultural resources. Upheavals linked to changes in state borders, political systems, the economy and ethnic composition have made the former Upper Silesian border area a focal point of phenomena that also exist in other, similar European areas lying at the junction of cultures and traditions. The latest period in the history of Upper Silesia gives us time to reflect on the character of changes in the architecture and city planning of the area and to appraise the efficiency of those practices connected with cultural heritage preservation. The phenomena of the last decades include the decline of regional features, the elimination of objects that were key elements of the regional cultural heritage, the deformation of historically shaped forms, and attempts to use those elements of cultural heritage that are widely recognized as cultural values. In this situation, it is important to seek creative solutions that will neutralize adverse processes resulting from bad law and practice. The most important phenomenon of contemporary space is the search for innovative fields and methods for using cultural resources. An important part of the article is

  2. Identifying complications of interventional procedures from UK routine healthcare databases: a systematic search for methods using clinical codes.

    Science.gov (United States)

    Keltie, Kim; Cole, Helen; Arber, Mick; Patrick, Hannah; Powell, John; Campbell, Bruce; Sims, Andrew

    2014-11-28

    Several authors have developed and applied methods to routine data sets to identify the nature and rate of complications following interventional procedures. But, to date, there has been no systematic search for such methods. The objective of this article was to find, classify and appraise published methods, based on analysis of clinical codes, which used routine healthcare databases in a United Kingdom setting to identify complications resulting from interventional procedures. A literature search strategy was developed to identify published studies that referred, in the title or abstract, to the name or acronym of a known routine healthcare database and to complications from procedures or devices. The following data sources were searched in February and March 2013: Cochrane Methods Register, Conference Proceedings Citation Index - Science, Econlit, EMBASE, Health Management Information Consortium, Health Technology Assessment database, MathSciNet, MEDLINE, MEDLINE in-process, OAIster, OpenGrey, Science Citation Index Expanded and ScienceDirect. Of the eligible papers, those which reported methods using clinical coding were classified and summarised in tabular form using the following headings: routine healthcare database; medical speciality; method for identifying complications; length of follow-up; method of recording comorbidity. The benefits and limitations of each approach were assessed. From 3688 papers identified from the literature search, 44 reported the use of clinical codes to identify complications, from which four distinct methods were identified: 1) searching the index admission for specified clinical codes, 2) searching a sequence of admissions for specified clinical codes, 3) searching for specified clinical codes for complications from procedures and devices within the International Classification of Diseases 10th revision (ICD-10) coding scheme which is the methodology recommended by NHS Classification Service, and 4) conducting manual clinical

  3. Assessment of methods for computing the closest point projection, penetration, and gap functions in contact searching problems

    Czech Academy of Sciences Publication Activity Database

    Kopačka, Ján; Gabriel, Dušan; Plešek, Jiří; Ulbin, M.

    2016-01-01

    Roč. 105, č. 11 (2016), s. 803-833 ISSN 0029-5981 R&D Projects: GA ČR(CZ) GAP101/12/2315; GA MŠk(CZ) ME10114 Institutional support: RVO:61388998 Keywords : closest point projection * local contact search * quadratic elements * Newtons methods * geometric iteration methods * simplex method Subject RIV: JC - Computer Hardware ; Software Impact factor: 2.162, year: 2016 http://onlinelibrary.wiley.com/doi/10.1002/nme.4994/abstract

  4. Infrared thermography based on artificial intelligence as a screening method for carpal tunnel syndrome diagnosis.

    Science.gov (United States)

    Jesensek Papez, B; Palfy, M; Mertik, M; Turk, Z

    2009-01-01

    This study further evaluated a computer-based infrared thermography (IRT) system, which employs artificial neural networks for the diagnosis of carpal tunnel syndrome (CTS), using a large database of 502 thermal images of the dorsal and palmar sides of 132 healthy and 119 pathological hands. It confirmed the hypothesis that the dorsal side of the hand is of greater importance than the palmar side when diagnosing CTS thermographically. Using this method it was possible to correctly classify 72.2% of all hands (healthy and pathological) based on dorsal images and > 80% of hands when only severely affected and healthy hands were considered. Compared with the gold standard electromyographic diagnosis of CTS, IRT cannot be recommended as an adequate diagnostic tool when an exact severity-level diagnosis is required; however, we conclude that IRT could be used as a screening tool for severe cases in populations with high ergonomic risk factors for CTS.

  5. Novel activity classification and occupancy estimation methods for intelligent HVAC (heating, ventilation and air conditioning) systems

    International Nuclear Information System (INIS)

    Rana, Rajib; Kusy, Brano; Wall, Josh; Hu, Wen

    2015-01-01

    Reductions in HVAC (heating, ventilation and air conditioning) energy consumption can be achieved by limiting heating in the winter or cooling in the summer. However, the resulting low thermal comfort of building occupants may lead to an override of the HVAC control, which defeats its original purpose. This has led to an increased interest in modeling and real-time tracking of the location, activity and thermal comfort of building occupants for HVAC energy management. While thermal comfort is well understood, it is difficult to measure in real-time environments where user context changes dynamically. Encouragingly, the plethora of sensors available on smartphones makes it possible to measure user context in real time. An important piece of contextual information for measuring thermal comfort is the metabolic rate, which changes with current physical activity. To measure physical activity, we develop an activity classifier, which achieves 10% higher accuracy than Support Vector Machines and k-Nearest Neighbors. Office occupancy is another piece of contextual information for energy-efficient HVAC control. Most phone-based occupancy estimation techniques fail to determine occupancy when phones are left at the desk while sitting or attending meetings. We propose a novel sensor fusion method to detect whether a user is near the phone, which achieves more than 90% accuracy. By determining activity and occupancy, our proposed algorithms can help maintain thermal comfort while reducing HVAC energy consumption. - Highlights: • We propose activity and occupancy detection for efficient HVAC control. • Activity classifier achieves 10% higher accuracy than SVM and kNN. • For occupancy detection we propose a novel sensor fusion method. • Using Weighted Majority Voting we fuse microphone and accelerometer data on the phone. • We achieve more than 90% accuracy in detecting occupancy.
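
    The abstract does not give the fusion details, so the snippet below is only a toy version of the weighted-majority-voting idea mentioned in the highlights: an accelerometer-based vote and a microphone-based vote on "user near phone" are combined with per-sensor weights; all thresholds and weights are invented.

    ```python
    def accel_votes_near(accel_samples, var_threshold=0.02):
        """Vote 'near' if the accelerometer shows recent motion (variance above threshold)."""
        mean = sum(accel_samples) / len(accel_samples)
        var = sum((a - mean) ** 2 for a in accel_samples) / len(accel_samples)
        return var > var_threshold

    def mic_votes_near(mic_rms, rms_threshold=0.1):
        """Vote 'near' if ambient audio energy suggests someone is at the desk."""
        return mic_rms > rms_threshold

    def weighted_majority_near(accel_samples, mic_rms, w_accel=0.6, w_mic=0.4):
        """Fuse the two binary votes with per-sensor weights (assumed values)."""
        score = w_accel * accel_votes_near(accel_samples) + w_mic * mic_votes_near(mic_rms)
        return score >= 0.5

    # Phone left on the desk (no motion) but conversation nearby: with these weights
    # the microphone vote alone is not enough to flip the decision to "near".
    print(weighted_majority_near([0.01, 0.012, 0.011, 0.013], mic_rms=0.25))
    ```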

  6. Narrowing of the middle cerebral artery: artificial intelligence methods and comparison of transcranial color coded duplex sonography with conventional TCD.

    Science.gov (United States)

    Swiercz, Miroslaw; Swiat, Maciej; Pawlak, Mikolaj; Weigele, John; Tarasewicz, Roman; Sobolewski, Andrzej; Hurst, Robert W; Mariak, Zenon D; Melhem, Elias R; Krejza, Jaroslaw

    2010-01-01

    The goal of the study was to compare the performance of transcranial color-coded duplex sonography (TCCS) and transcranial Doppler sonography (TCD) in the diagnosis of middle cerebral artery (MCA) narrowing in the same population of patients, using statistical and nonstatistical intelligent models for data analysis. We prospectively collected data from 179 consecutive routine digital subtraction angiography (DSA) procedures performed in 111 patients (mean age 54.17+/-14.4 years; 59 women, 52 men) who underwent TCD and TCCS examinations simultaneously. Each patient was examined independently using both ultrasound techniques; 267 M1 segments of the MCA were assessed and narrowings were classified relative to a 50% lumen reduction. Diagnostic performance was estimated by two statistical and two artificial neural network (ANN) classification methods. Separate models were constructed for the TCD and TCCS sonographic data, as well as for detection of "any narrowing" and "severe narrowing" of the MCA. The input for each classifier consisted of the peak-systolic, mean and end-diastolic velocities measured with each sonographic method; the output was MCA narrowing. Arterial narrowings of 50% or less lumen reduction were found in 55 and >50% narrowings in 26 of the 267 arteries, as indicated by DSA. In the category of "any narrowing" the rate of correct assignment by all models was 82% to 83% for TCCS and 79% to 81% for TCD. In the diagnosis of >50% narrowing the overall classification accuracy remained in the range of 89% to 90% for TCCS data and 90% to 91% for TCD data. For the diagnosis of any narrowing, the sensitivity of TCCS was significantly higher than that of TCD, while for the diagnosis of >50% MCA narrowing, the sensitivity of TCCS was similar to that of TCD. Our study showed that TCCS outperforms conventional TCD in the diagnosis of >50% MCA narrowing. (E-mail: jaroslaw.krejza@uphs.upenn.edu).

  7. Study of the Appropriate and Inappropriate Methods of Visual Arts Education in the Primary Schools According to the Types of Multiple Intelligences

    Directory of Open Access Journals (Sweden)

    Atena Salehi Baladehi

    2017-01-01

    Full Text Available In the current changing world, named the era of the knowledge explosion, specialists and those involved in education have been drawn to finding an answer to the question: what should we teach today's students so that it will be useful to them in their future lives? The main objective of this study is to investigate the appropriate and inappropriate methods of visual arts education in pre-school. According to the types of multiple intelligences, reaching this goal requires careful planning, proper training and content selection aligned with the talents and interests of learners, along with the use of appropriate practical training and educational staff training. The research employs descriptive and analytic methods as well as the academic literature. The results suggest the importance of understanding the multiple intelligences in visual arts education.

  8. Rapid Automatic Lighting Control of a Mixed Light Source for Image Acquisition using Derivative Optimum Search Methods

    Directory of Open Access Journals (Sweden)

    Kim HyungTae

    2015-01-01

    Full Text Available Automatic lighting (auto-lighting) is a function that maximizes the image quality of a vision inspection system by adjusting the light intensity and color. In most inspection systems, a single-color light source is used, and an equal-step search is employed to determine the maximum image quality. However, when a mixed light source is used, the number of iterations becomes large, and therefore a rapid search method must be applied to reduce their number. Derivative optimum search methods follow the tangential direction of a function and are usually faster than other methods. In this study, multi-dimensional forms of derivative optimum search methods are applied to obtain the maximum image quality considering a mixed light source. The auto-lighting algorithms were derived from the steepest descent and conjugate gradient methods, which take an N-dimensional input of driving voltages and a single output of image quality. Experiments in which the proposed algorithm was applied to semiconductor patterns showed that a reduced number of iterations is required to determine the locally maximized image quality.
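
    The abstract describes only the general scheme (steepest descent or conjugate gradient over N driving voltages with a single image-quality output), so this is a small illustrative sketch of steepest ascent with a finite-difference gradient on a mock image-quality function; the quadratic quality model, step size and tolerance are assumptions.

    ```python
    import numpy as np

    def image_quality(v):
        """Mock image-quality metric peaking at an (assumed) optimal RGB drive voltage."""
        optimum = np.array([2.4, 1.8, 3.1])
        return 100.0 - np.sum((v - optimum) ** 2)

    def numerical_gradient(f, v, h=1e-3):
        grad = np.zeros_like(v)
        for i in range(len(v)):
            step = np.zeros_like(v)
            step[i] = h
            grad[i] = (f(v + step) - f(v - step)) / (2 * h)
        return grad

    def steepest_ascent(f, v0, rate=0.2, tol=1e-4, max_iter=200):
        """Follow the gradient of image quality with respect to the driving voltages."""
        v = np.asarray(v0, dtype=float)
        for it in range(max_iter):
            g = numerical_gradient(f, v)
            if np.linalg.norm(g) < tol:
                break
            v = v + rate * g
        return v, f(v), it + 1

    v, q, iters = steepest_ascent(image_quality, [1.0, 1.0, 1.0])
    print("voltages:", np.round(v, 3), "quality:", round(q, 3), "iterations:", iters)
    ```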

  9. Artificial Intelligence.

    Science.gov (United States)

    Wash, Darrel Patrick

    1989-01-01

    Making a machine seem intelligent is not easy. As a consequence, demand has been rising for computer professionals skilled in artificial intelligence and is likely to continue to go up. These workers develop expert systems and solve the mysteries of machine vision, natural language processing, and neural networks. (Editor)

  10. Intelligent Design

    DEFF Research Database (Denmark)

    Hjorth, Poul G.

    2005-01-01

    The notion that nature is designed by a divine 'intelligence' is a beautiful philosophical principle. Theories of Intelligent Design as a purportedly scientific theory, on the other hand, are utterly dreadful.

  11. An automated and efficient conformation search of L-cysteine and L,L-cystine using the scaled hypersphere search method

    Science.gov (United States)

    Kishimoto, Naoki; Waizumi, Hiroki

    2017-10-01

    Stable conformers of L-cysteine and L,L-cystine were explored using an automated and efficient conformational searching method. The Gibbs energies of the stable conformers of L-cysteine and L,L-cystine were calculated with G4 and MP2 methods, respectively, at 450, 298.15, and 150 K. By assuming thermodynamic equilibrium and the barrier energies for the conformational isomerization pathways, the estimated ratios of the stable conformers of L-cysteine were compared with those determined by microwave spectroscopy in a previous study. Equilibrium structures of 1:1 and 2:1 cystine-Fe complexes were also calculated, and the energy of insertion of Fe into the disulfide bond was obtained.
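
    As a worked illustration of the thermodynamic-equilibrium assumption used to estimate conformer ratios (not the authors' G4/MP2 results), the snippet below converts relative Gibbs energies into Boltzmann populations at the three temperatures mentioned in the record; the energy values are invented.

    ```python
    import math

    R = 8.314462618e-3  # gas constant, kJ mol^-1 K^-1

    def boltzmann_populations(rel_gibbs_kj, temperature_k):
        """Equilibrium populations p_i proportional to exp(-dG_i / RT)."""
        weights = [math.exp(-g / (R * temperature_k)) for g in rel_gibbs_kj]
        total = sum(weights)
        return [w / total for w in weights]

    # Hypothetical relative Gibbs energies (kJ/mol) of four conformers.
    rel_g = [0.0, 1.2, 2.5, 4.0]
    for T in (450.0, 298.15, 150.0):
        pops = boltzmann_populations(rel_g, T)
        print(T, "K:", [round(p, 3) for p in pops])
    ```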

  12. Artificial intelligence/fuzzy logic method for analysis of combined signals from heavy metal chemical sensors

    International Nuclear Information System (INIS)

    Turek, M.; Heiden, W.; Riesen, A.; Chhabda, T.A.; Schubert, J.; Zander, W.; Krueger, P.; Keusgen, M.; Schoening, M.J.

    2009-01-01

    The cross-sensitivity of chemical sensors for several metal ions resembles in a way the overlapping sensitivity of some biological sensors, like the optical colour receptors of human retinal cone cells. While it is difficult to assign crisp classification values to measurands based on complex overlapping sensory signals, fuzzy logic offers a possibility to mathematically model such systems. Current work goes in the direction of mixed heavy metal solutions and the combination of fuzzy logic with heavy metal-sensitive, silicon-based chemical sensors for training scenarios of arbitrary sensor/probe combinations in terms of an electronic tongue. Heavy metals play an important role in environmental analysis. They occur in the environment as trace elements and as water impurities released from industrial processes. In this work, the development of a new fuzzy logic method based on potentiometric measurements performed with three different miniaturised chalcogenide glass sensors in different heavy metal solutions will be presented. The critical validation of the developed fuzzy logic program will be demonstrated by means of measurements in unknown single- and multi-component heavy metal solutions. Limitations of this program and a comparison between calculated and expected values in terms of analyte composition and heavy metal ion concentration will be shown and discussed.

  13. Development of a diagnostic expert system for eddy current data analysis using applied artificial intelligence methods

    International Nuclear Information System (INIS)

    Upadhyaya, B.R.; Yan, W.; Henry, G.

    1999-01-01

    A diagnostic expert system that integrates database management methods, artificial neural networks, and decision-making using fuzzy logic has been developed for the automation of steam generator eddy current test (ECT) data analysis. The new system, known as EDDYAI, considers the following key issues: (1) digital eddy current test data calibration, compression, and representation; (2) development of robust neural networks with low probability of misclassification for flaw depth estimation; (3) flaw detection using fuzzy logic; (4) development of an expert system for database management, compilation of a trained neural network library, and a decision module; and (5) evaluation of the integrated approach using eddy current data. The implementation to field test data includes the selection of proper feature vectors for ECT data analysis, development of a methodology for large eddy current database management, artificial neural networks for flaw depth estimation, and a fuzzy logic decision algorithm for flaw detection. A large eddy current inspection database from the Electric Power Research Institute NDE Center is being utilized in this research towards the development of an expert system for steam generator tube diagnosis. The integration of ECT data pre-processing as part of the data management, fuzzy logic flaw detection technique, and tube defect parameter estimation using artificial neural networks are the fundamental contributions of this research. (orig.)

  14. Artificial intelligence/fuzzy logic method for analysis of combined signals from heavy metal chemical sensors

    Energy Technology Data Exchange (ETDEWEB)

    Turek, M. [Institute of Nano- and Biotechnologies (INB), Aachen University of Applied Sciences, Campus Juelich, Juelich (Germany); Institute of Bio- and Nanosystems (IBN), Research Centre Juelich GmbH, Juelich (Germany); Heiden, W.; Riesen, A. [Bonn-Rhein-Sieg University of Applied Sciences, Sankt Augustin (Germany); Chhabda, T.A. [Institute of Nano- and Biotechnologies (INB), Aachen University of Applied Sciences, Campus Juelich, Juelich (Germany); Schubert, J.; Zander, W. [Institute of Bio- and Nanosystems (IBN), Research Centre Juelich GmbH, Juelich (Germany); Krueger, P. [Institute of Biochemistry and Molecular Biology, RWTH Aachen, Aachen (Germany); Keusgen, M. [Institute for Pharmaceutical Chemistry, Philipps-University Marburg, Marburg (Germany); Schoening, M.J. [Institute of Nano- and Biotechnologies (INB), Aachen University of Applied Sciences, Campus Juelich, Juelich (Germany); Institute of Bio- and Nanosystems (IBN), Research Centre Juelich GmbH, Juelich (Germany)], E-mail: m.j.schoening@fz-juelich.de

    2009-10-30

    The cross-sensitivity of chemical sensors for several metal ions resembles in a way the overlapping sensitivity of some biological sensors, like the optical colour receptors of human retinal cone cells. While it is difficult to assign crisp classification values to measurands based on complex overlapping sensory signals, fuzzy logic offers a possibility to mathematically model such systems. Current work goes in the direction of mixed heavy metal solutions and the combination of fuzzy logic with heavy metal-sensitive, silicon-based chemical sensors for training scenarios of arbitrary sensor/probe combinations in terms of an electronic tongue. Heavy metals play an important role in environmental analysis. They occur in the environment as trace elements and as water impurities released from industrial processes. In this work, the development of a new fuzzy logic method based on potentiometric measurements performed with three different miniaturised chalcogenide glass sensors in different heavy metal solutions will be presented. The critical validation of the developed fuzzy logic program will be demonstrated by means of measurements in unknown single- and multi-component heavy metal solutions. Limitations of this program and a comparison between calculated and expected values in terms of analyte composition and heavy metal ion concentration will be shown and discussed.

  15. Fault detection and analysis in nuclear research facility using artificial intelligence methods

    Energy Technology Data Exchange (ETDEWEB)

    Ghazali, Abu Bakar, E-mail: Abakar@uniten.edu.my [Department of Electronics & Communication, College of Engineering, Universiti Tenaga Nasional, 43009 Kajang, Selangor (Malaysia); Ibrahim, Maslina Mohd [Instrumentation Program, Malaysian Nuclear Agency, Bangi (Malaysia)

    2016-01-22

    In this article, online detection of transducer and actuator condition is discussed. The case study concerns the reading of the area radiation monitor (ARM) installed at the chimney of the PUSPATI TRIGA nuclear reactor building, located at Bangi, Malaysia. There are at least five categories of abnormal ARM readings that can occur during transducer failure: the reading becomes very high, very low or zero, or shows high fluctuation and noise; moreover, the reading may be significantly higher or significantly lower than the normal reading. An artificial neural network (ANN) and an adaptive neuro-fuzzy inference system (ANFIS) are good methods for modeling the plant dynamics. Equipment failure is assessed from the ARM reading by comparing it with the ARM data estimated by the ANN/ANFIS model; the failure categories, in either a 'yes' or 'no' state, are obtained from this comparison between the actual online data and the estimated output. It is found that this system design can correctly report the condition of the ARM equipment in a simulated environment and can later be implemented for online monitoring. This approach can also be extended to other transducers, such as the temperature profile of the reactor core, and to other critical actuator conditions, such as the valves and pumps in the reactor facility, provided that the failure symptoms are clearly defined.
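
    The abstract only outlines the idea (compare the measured ARM reading with the ANN/ANFIS estimate and flag failure categories), so the snippet below is a simplified residual check with hand-picked thresholds standing in for a trained model; the estimated readings and all thresholds are assumptions.

    ```python
    import statistics

    def classify_arm_reading(measured, estimated, rel_tol=0.5, noise_tol=0.2):
        """Toy failure categorisation from a window of measured vs. model-estimated readings."""
        flags = {}
        mean_meas = statistics.fmean(measured)
        mean_est = statistics.fmean(estimated)
        residual = mean_meas - mean_est
        flags["very_low_or_zero"] = mean_meas < 0.05 * mean_est
        flags["very_high"] = mean_meas > (1 + 2 * rel_tol) * mean_est
        flags["significantly_low"] = (not flags["very_low_or_zero"]) and residual < -rel_tol * mean_est
        flags["significantly_high"] = (not flags["very_high"]) and residual > rel_tol * mean_est
        flags["high_fluctuation"] = statistics.pstdev(measured) > noise_tol * mean_est
        return flags

    # Simulated window: the model expects ~10 units, the sensor drifts high.
    estimated = [10.0, 10.1, 9.9, 10.0, 10.2]
    measured = [16.5, 18.0, 15.2, 19.1, 17.4]
    for category, raised in classify_arm_reading(measured, estimated).items():
        print(f"{category}: {'yes' if raised else 'no'}")
    ```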

  16. Development of a diagnostic expert system for eddy current data analysis using applied artificial intelligence methods

    Energy Technology Data Exchange (ETDEWEB)

    Upadhyaya, B.R.; Yan, W. [Tennessee Univ., Knoxville, TN (United States). Dept. of Nuclear Engineering; Behravesh, M.M. [Electric Power Research Institute, Palo Alto, CA (United States); Henry, G. [EPRI NDE Center, Charlotte, NC (United States)

    1999-09-01

    A diagnostic expert system that integrates database management methods, artificial neural networks, and decision-making using fuzzy logic has been developed for the automation of steam generator eddy current test (ECT) data analysis. The new system, known as EDDYAI, considers the following key issues: (1) digital eddy current test data calibration, compression, and representation; (2) development of robust neural networks with low probability of misclassification for flaw depth estimation; (3) flaw detection using fuzzy logic; (4) development of an expert system for database management, compilation of a trained neural network library, and a decision module; and (5) evaluation of the integrated approach using eddy current data. The implementation to field test data includes the selection of proper feature vectors for ECT data analysis, development of a methodology for large eddy current database management, artificial neural networks for flaw depth estimation, and a fuzzy logic decision algorithm for flaw detection. A large eddy current inspection database from the Electric Power Research Institute NDE Center is being utilized in this research towards the development of an expert system for steam generator tube diagnosis. The integration of ECT data pre-processing as part of the data management, fuzzy logic flaw detection technique, and tube defect parameter estimation using artificial neural networks are the fundamental contributions of this research. (orig.)

  17. Handbook of Intelligent Vehicles

    CERN Document Server

    2012-01-01

    The Handbook of Intelligent Vehicles provides complete coverage of the fundamentals, new technologies, and sub-areas essential to the development of intelligent vehicles; it also includes advances made to date, challenges, and future trends. Significant strides in the field have been made to date; however, so far there has been no single book or volume which captures these advances in a comprehensive format, addressing all essential components and subspecialties of intelligent vehicles, as this book does. Since the intended users are engineering practitioners, as well as researchers and graduate students, the book chapters not only cover fundamentals, methods, and algorithms but also describe how software and hardware are implemented, and demonstrate the advances along with their present challenges. Research at both the component and systems levels is required to advance the functionality of intelligent vehicles. This volume covers both of these aspects in addition to the fundamentals listed above.

  18. Modelling intelligent behavior

    Science.gov (United States)

    Green, H. S.; Triffet, T.

    1993-01-01

    An introductory discussion of the related concepts of intelligence and consciousness suggests criteria to be met in the modeling of intelligence and the development of intelligent materials. Methods for the modeling of actual structure and activity of the animal cortex have been found, based on present knowledge of the ionic and cellular constitution of the nervous system. These have led to the development of a realistic neural network model, which has been used to study the formation of memory and the process of learning. An account is given of experiments with simple materials which exhibit almost all properties of biological synapses and suggest the possibility of a new type of computer architecture to implement an advanced type of artificial intelligence.

  19. Emotional Intelligence: Requiring Attention

    Directory of Open Access Journals (Sweden)

    Monica Tudor

    2016-01-01

    Full Text Available This article aims to highlight the need for emotional intelligence. Two methods of measurement are presented in this research, in order to better understand the necessity of a correct result. The results of the research can lead to recommendations for improving levels of emotional intelligence and are useful for obtaining data to better compare past and present results. The papers presented in this research are significant for future study of this subject. The first paper presents the evolution of emotional intelligence over the past two years, more specifically its decrease concerning certain characteristics. The second one presents research on the differences between generations. The third one shows a difference in emotional intelligence levels of children from rural versus urban environments and the obstacles that they encounter in their own development.

  20. Decentralized cooperative unmanned aerial vehicles conflict resolution by neural network-based tree search method

    Directory of Open Access Journals (Sweden)

    Jian Yang

    2016-09-01

    Full Text Available In this article, a tree search algorithm is proposed to find near-optimal conflict avoidance solutions for unmanned aerial vehicles. In a dynamic environment, unmodeled elements, such as wind, can make UAVs deviate from their nominal trajectories, which complicates conflict detection and resolution. Back-propagation neural networks are utilized to approximate the unmodeled dynamics of the environment. To satisfy the online planning requirement, the search length of the tree search algorithm is limited; therefore, the algorithm may not be able to reach the goal states within the search process. A midterm reward function for assessing each node is devised, with consideration given to two factors, namely the safe separation requirement and the mission of each unmanned aerial vehicle. Simulation examples and comparisons with previous approaches are provided to illustrate the smooth and convincing behaviour of the proposed algorithm.
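
    The reward is not spelled out in the abstract, so the following is only a toy version of the two ingredients it mentions: a midterm reward that trades off safe separation against progress toward the goal, evaluated by a depth-limited search over discrete heading changes; the kinematics, weights and separation distance are assumptions.

    ```python
    import math

    SAFE_SEP = 1.0   # required separation (assumed units)
    STEP = 0.5       # distance travelled per decision step

    def midterm_reward(pos, goal, intruders, w_sep=5.0, w_goal=1.0):
        """Reward = -w_goal * distance-to-goal - w_sep * separation violation."""
        d_goal = math.dist(pos, goal)
        violation = sum(max(0.0, SAFE_SEP - math.dist(pos, q)) for q in intruders)
        return -w_goal * d_goal - w_sep * violation

    def step(pos, heading):
        return (pos[0] + STEP * math.cos(heading), pos[1] + STEP * math.sin(heading))

    def best_heading(pos, heading, goal, intruders, depth=3, turns=(-0.4, 0.0, 0.4)):
        """Depth-limited tree search over heading changes, scored by the midterm reward."""
        def search(p, h, d):
            if d == 0:
                return midterm_reward(p, goal, intruders)
            return max(search(step(p, h + t), h + t, d - 1) for t in turns)
        scores = {t: search(step(pos, heading + t), heading + t, depth - 1) for t in turns}
        return max(scores, key=scores.get)

    # Own UAV heads east toward the goal; one intruder sits on the direct route,
    # so the best first action is a turn rather than flying straight.
    print(best_heading(pos=(0.0, 0.0), heading=0.0, goal=(5.0, 0.0), intruders=[(2.0, 0.0)]))
    ```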

  1. Effectiveness of artificial intelligence methods in applications to burning optimization and coal mills diagnostics on the basis of IASE's experiences in Turow Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Pollak, J.; Wozniak, A.W.; Dynia, Z.; Lipanowicz, T.

    2004-07-01

    Modern methods referred to as 'artificial intelligence' have been applied to combustion optimization and to the implementation of selected diagnostic functions for the milling system of a pulverized lignite-fired boiler. The results of combustion optimization have shown a significant improvement in efficiency and a reduction of NOx emissions. Fuzzy logic has been used to develop, among other things, a fan mill overload detection system.

  2. Artificial intelligence approaches in statistics

    International Nuclear Information System (INIS)

    Phelps, R.I.; Musgrove, P.B.

    1986-01-01

    The role of pattern recognition and knowledge representation methods from Artificial Intelligence within statistics is considered. Two areas of potential use are identified and one, data exploration, is used to illustrate the possibilities. A method is presented to identify and separate overlapping groups within cluster analysis, using an AI approach. The potential of such 'intelligent' approaches is stressed.

  3. Intelligent error correction method applied on an active pixel sensor based star tracker

    Science.gov (United States)

    Schmidt, Uwe

    2005-10-01

    Star trackers are opto-electronic sensors used on board satellites for autonomous inertial attitude determination. In recent years star trackers have become more and more important in the field of attitude and orbit control system (AOCS) sensors. High-performance star trackers are to date based on charge-coupled device (CCD) optical camera heads. Active pixel sensor (APS) technology, introduced in the early 1990s, now allows the beneficial replacement of CCD detectors by APS detectors with respect to performance, reliability, power, mass and cost. The company's heritage in star tracker design started in the early 1980s with the launch of the world's first fully autonomous star tracker system, ASTRO1, to the Russian MIR space station. Jena-Optronik recently developed an active pixel sensor based autonomous star tracker, "ASTRO APS", as the successor of the CCD-based star tracker product series ASTRO1, ASTRO5, ASTRO10 and ASTRO15. Key features of the APS detector technology are true xy-address random access, multiple-window readout and on-chip signal processing including analogue-to-digital conversion. These features can be used for robust star tracking at high slew rates and under adverse conditions such as stray light and solar-flare-induced single event upsets. A special algorithm has been developed to manage the typical APS detector error contributors such as fixed pattern noise (FPN), dark signal non-uniformity (DSNU) and white spots. The algorithm works fully autonomously and adapts automatically to, for example, increasing DSNU and newly appearing white spots without ground maintenance or re-calibration. In contrast to conventional correction methods, the described algorithm does not need calibration data memory such as full-image-sized calibration data sets. The application of the presented algorithm, managing the typical APS detector error contributors, is a key element in the design of star trackers for long-term satellite applications like
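
    The on-board algorithm itself is not disclosed in the record, so the snippet below only illustrates one typical ingredient of such corrections: masking 'white spot' pixels whose dark level drifts far above a robust median/MAD estimate built from recent star-free frames; the window size and threshold are assumptions and this is not the ASTRO APS algorithm.

    ```python
    import numpy as np

    def update_white_spot_mask(dark_frames, k=8.0):
        """Flag pixels whose dark level sits k robust sigmas above the frame median.
        dark_frames: array of shape (n_frames, rows, cols) from star-free exposures."""
        level = np.median(dark_frames, axis=0)                 # per-pixel dark level
        med = np.median(level)
        mad = np.median(np.abs(level - med)) + 1e-9            # robust spread estimate
        return level > med + k * 1.4826 * mad                  # True where a white spot is suspected

    rng = np.random.default_rng(0)
    frames = rng.normal(100.0, 2.0, size=(16, 64, 64))
    frames[:, 10, 20] += 500.0      # simulate a radiation-induced white spot
    mask = update_white_spot_mask(frames)
    print("flagged pixels:", np.argwhere(mask))
    ```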

  4. Pep-3D-Search: a method for B-cell epitope prediction based on mimotope analysis.

    Science.gov (United States)

    Huang, Yan Xin; Bao, Yong Li; Guo, Shu Yan; Wang, Yan; Zhou, Chun Guang; Li, Yu Xin

    2008-12-16

    The prediction of conformational B-cell epitopes is one of the most important goals in immunoinformatics. The solution to this problem, even if approximate, would help in designing experiments to precisely map the residues of interaction between an antigen and an antibody. Consequently, this area of research has received considerable attention from immunologists, structural biologists and computational biologists. Phage-displayed random peptide libraries are powerful tools used to obtain mimotopes that are selected by binding to a given monoclonal antibody (mAb) in a similar way to the native epitope. These mimotopes can be considered as functional epitope mimics. Mimotope analysis based methods can predict not only linear but also conformational epitopes and this has been the focus of much research in recent years. Though some algorithms based on mimotope analysis have been proposed, the precise localization of the interaction site mimicked by the mimotopes is still a challenging task. In this study, we propose a method for B-cell epitope prediction based on mimotope analysis called Pep-3D-Search. Given the 3D structure of an antigen and a set of mimotopes (or a motif sequence derived from the set of mimotopes), Pep-3D-Search can be used in two modes: mimotope or motif. To evaluate the performance of Pep-3D-Search to predict epitopes from a set of mimotopes, 10 epitopes defined by crystallography were compared with the predicted results from a Pep-3D-Search: the average Matthews correlation coefficient (MCC), sensitivity and precision were 0.1758, 0.3642 and 0.6948. Compared with other available prediction algorithms, Pep-3D-Search showed comparable MCC, specificity and precision, and could provide novel, rational results. To verify the capability of Pep-3D-Search to align a motif sequence to a 3D structure for predicting epitopes, 6 test cases were used. The predictive performance of Pep-3D-Search was demonstrated to be superior to that of other similar programs
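
    The evaluation metrics quoted above can be computed from a residue-level confusion matrix as shown below; the counts in the example are invented and do not correspond to the paper's ten test epitopes.

    ```python
    import math

    def epitope_metrics(tp, fp, tn, fn):
        """Matthews correlation coefficient, sensitivity (recall) and precision."""
        denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
        mcc = (tp * tn - fp * fn) / denom if denom else 0.0
        sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        return mcc, sensitivity, precision

    # Invented residue-level counts for one antigen (not from the paper).
    mcc, sens, prec = epitope_metrics(tp=12, fp=6, tn=180, fn=22)
    print(f"MCC={mcc:.4f}  sensitivity={sens:.4f}  precision={prec:.4f}")
    ```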

  5. The Professionalization of Intelligence Cooperation

    DEFF Research Database (Denmark)

    Svendsen, Adam David Morgan

    "Providing an in-depth insight into the subject of intelligence cooperation (officially known as liason), this book explores the complexities of this process. Towards facilitating a general understanding of the professionalization of intelligence cooperation, Svendsen's analysis includes risk...... management and encourages the realisation of greater resilience. Svendsen discusses the controversial, mixed and uneven characterisations of the process of the professionalization of intelligence cooperation and argues for a degree of 'fashioning method out of mayhem' through greater operational...

  6. Climate change on the Colorado River: a method to search for robust management strategies

    Science.gov (United States)

    Keefe, R.; Fischbach, J. R.

    2010-12-01

    The Colorado River is a principal source of water for the seven Basin States, providing approximately 16.5 maf per year to users in the southwestern United States and Mexico. Though the dynamics of the river ensure Upper Basin users a reliable supply of water, the three Lower Basin states (California, Nevada, and Arizona) are in danger of delivery interruptions as Upper Basin demand increases and climate change threatens to reduce future streamflows. In light of the recent drought and uncertain effects of climate change on Colorado River flows, we evaluate the performance of a suite of policies modeled after the shortage sharing agreement adopted in December 2007 by the Department of the Interior. We build on the current literature by using a simplified model of the Lower Colorado River to consider future streamflow scenarios given climate change uncertainty. We also generate different scenarios of parametric consumptive use growth in the Upper Basin and evaluate alternate management strategies in light of these uncertainties. Uncertainty associated with climate change is represented with a multi-model ensemble from the literature, using a nearest neighbor perturbation to increase the size of the ensemble. We use Robust Decision Making to compare near-term or long-term management strategies across an ensemble of plausible future scenarios with the goal of identifying one or more approaches that are robust to alternate assumptions about the future. This method entails using search algorithms to quantitatively identify vulnerabilities that may threaten a given strategy (including the current operating policy) and characterize key tradeoffs between strategies under different scenarios.

  7. Best, Useful and Objective Precisions for Information Retrieval of Three Search Methods in PubMed and iPubMed

    Directory of Open Access Journals (Sweden)

    Somayyeh Nadi Ravandi

    2016-10-01

    Full Text Available MEDLINE is a valuable source of medical information on the Internet. Among the different open access sites for MEDLINE, PubMed is the best-known site. In 2010, iPubMed was established with an interaction-fuzzy search method for MEDLINE access. In the present work, we aimed to compare the precision of the retrieved sources (Best, Useful and Objective precision) in PubMed and iPubMed using two search methods in PubMed (simple and MeSH search) and the interaction-fuzzy method in iPubMed. During our semi-empirical study period, we held training workshops for 61 students of higher education to teach them the simple search, MeSH search, and fuzzy-interaction search methods. Then, the precision of 305 searches for each method prepared by the students was calculated on the basis of the Best precision, Useful precision, and Objective precision formulas. Analyses were done in SPSS version 11.5 using the Friedman and Wilcoxon tests, and the three precisions obtained with the three precision formulas were studied for the three search methods. The mean precision of the interaction-fuzzy search method was higher than that of the simple search and MeSH search for all three types of precision, i.e., Best precision, Useful precision, and Objective precision, with the simple search method in the next rank, and their mean precisions were significantly different (P < 0.001). The precision of the interaction-fuzzy search method in iPubMed was investigated for the first time. Also for the first time, three types of precision were evaluated in PubMed and iPubMed. The results showed that the interaction-fuzzy search method is more precise than natural language search (simple search and MeSH search), and users of this method found papers that were more related to their queries; even though searching in PubMed is useful, it is important that users apply new search methods to obtain the best results.

  8. Integration of artificial intelligence methods and life cycle assessment to predict energy output and environmental impacts of paddy production.

    Science.gov (United States)

    Nabavi-Pelesaraei, Ashkan; Rafiee, Shahin; Mohtasebi, Seyed Saeid; Hosseinzadeh-Bandbafha, Homa; Chau, Kwok-Wing

    2018-08-01

    Prediction of agricultural energy output and environmental impacts plays an important role in energy management and conservation of the environment, as it can help us evaluate agricultural energy efficiency, conduct commissioning of crop production systems, and detect and diagnose faults in crop production systems. Agricultural energy output and environmental impacts can be readily predicted by artificial intelligence (AI), owing to its ease of use and adaptability in seeking optimal solutions in a rapid manner, as well as the use of historical data to predict future agricultural energy use patterns under constraints. This paper conducts energy output and environmental impact prediction of paddy production in Guilan province, Iran, based on two AI methods, artificial neural networks (ANNs) and the adaptive neuro-fuzzy inference system (ANFIS). The amounts of energy input and output in paddy production are 51,585.61 MJ kg-1 and 66,112.94 MJ kg-1, respectively. Life Cycle Assessment (LCA) is used to evaluate the environmental impacts of paddy production. Results show that, in paddy production, on-farm emission is a hotspot in the global warming, acidification and eutrophication impact categories. An ANN model with a 12-6-8-1 structure is selected as the best one for predicting energy output. The correlation coefficient (R) varies from 0.524 to 0.999 in training for energy input and environmental impacts in the ANN models. The ANFIS model is developed based on a hybrid learning algorithm, with R for predicting output energy being 0.860 and, for environmental impacts, varying from 0.944 to 0.997. Results indicate that the multi-level ANFIS is a useful tool for managers for large-scale planning in forecasting energy output and environmental indices of agricultural production systems, owing to its higher computation speed compared with the ANN model, despite the ANN's higher accuracy. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. In vitro detection of circulating tumor cells compared by the CytoTrack and CellSearch methods

    DEFF Research Database (Denmark)

    Hillig, T.; Horn, P.; Nygaard, Ann-Britt

    2015-01-01

    Comparison of two methods to detect circulating tumor cells (CTC), CytoTrack and CellSearch, through recovery of MCF-7 breast cancer cells spiked into blood collected from healthy donors. Spiking of a fixed number of EpCAM and pan-cytokeratin positive MCF-7 cells into 7.5 mL donor blood was performed by FACSAria flow sorting. The samples were shipped to either CytoTrack or CellSearch research facilities within 48 h, where evaluation of MCF-7 recovery was performed. CytoTrack and CellSearch analyses were performed simultaneously. Recoveries of MCF-7 single cells, cells in clusters, and clusters ... .23 (p = 0.09). Overall, the recovery of CytoTrack and CellSearch was 68.8 +/- 3.9 %/71.1 +/- 2.9 %, respectively (p = 0.58). In spite of different methodologies, CytoTrack and CellSearch found similar number of CTCs, when spiking was performed with the EpCAM and pan cytokeratin-positive cell line MCF-7...

  10. Intelligent playgrounds

    DEFF Research Database (Denmark)

    Larsen, Lasse Juel

    2009-01-01

    This paper examines play, gaming and learning in regard to intelligent playware developed for outdoor use. The key question is how these novel artefacts influence the concepts of play, gaming and learning. Up until now, play and games have been understood as different activities. This paper examines whether the sharp differentiation between the two can be upheld in regard to intelligent playware for outdoor use. Play and game activities will be analysed and viewed in conjunction with learning contexts. This paper will stipulate that intelligent playware facilitates rapid shifts in contexts...

  11. Artificial intelligence

    CERN Document Server

    Ennals, J R

    1987-01-01

    Artificial Intelligence: State of the Art Report is a two-part report consisting of the invited papers and the analysis. The editor first gives an introduction to the invited papers before presenting each paper and the analysis, and then concludes with the list of references related to the study. The invited papers explore the various aspects of artificial intelligence. The analysis part assesses the major advances in artificial intelligence and provides a balanced analysis of the state of the art in this field. The Bibliography compiles the most important published material on the subject of

  12. Artificial Intelligence

    CERN Document Server

    Warwick, Kevin

    2011-01-01

    'if AI is outside your field, or you know something of the subject and would like to know more then Artificial Intelligence: The Basics is a brilliant primer.' - Nick Smith, Engineering and Technology Magazine, November 2011. Artificial Intelligence: The Basics is a concise and cutting-edge introduction to the fast-moving world of AI. The author, Kevin Warwick, a pioneer in the field, examines issues of what it means to be man or machine and looks at advances in robotics which have blurred the boundaries. Topics covered include: how intelligence can be defined; whether machines can 'think'; sensory

  13. Soft computing in artificial intelligence

    CERN Document Server

    Matson, Eric

    2014-01-01

    This book explores the concept of artificial intelligence based on knowledge-based algorithms. Given current hardware and software technologies and artificial intelligence theories, we can consider how efficiently a solution can be provided, how best to implement a model, and how successfully it can be achieved. This edition provides readers with the most recent progress and novel solutions in artificial intelligence. This book aims to present research results and application solutions relevant to artificial intelligence technologies. We propose to researchers and practitioners some methods to advance intelligent systems and to apply artificial intelligence to specific or general purposes. This book consists of 13 contributions that feature fuzzy (r, s)-minimal pre- and β-open sets, handling big co-occurrence matrices, Xie-Beni-type fuzzy cluster validation, fuzzy c-regression models, combination of genetic algorithms and ant colony optimization, building expert systems, fuzzy logic and neural networks, ind...

  14. A Sequential Mixed Methods Study: An Exploration of the Use of Emotional Intelligence by Senior Student Affairs Officers in Managing Critical Incidents

    Science.gov (United States)

    Johnson, Brian

    2013-01-01

    Emotional intelligence is a relatively new academic discipline that began forming in the early 1990s. Currently, emotional intelligence is used in academia and in business as a new intelligence quotient. This research study investigates how Senior Student Affairs Officers use their emotional intelligence during critical incidents. The…

  15. Artificial Intelligence in Cardiology.

    Science.gov (United States)

    Johnson, Kipp W; Torres Soto, Jessica; Glicksberg, Benjamin S; Shameer, Khader; Miotto, Riccardo; Ali, Mohsin; Ashley, Euan; Dudley, Joel T

    2018-06-12

    Artificial intelligence and machine learning are poised to influence nearly every aspect of the human condition, and cardiology is not an exception to this trend. This paper provides a guide for clinicians on relevant aspects of artificial intelligence and machine learning, reviews selected applications of these methods in cardiology to date, and identifies how cardiovascular medicine could incorporate artificial intelligence in the future. In particular, the paper first reviews predictive modeling concepts relevant to cardiology such as feature selection and frequent pitfalls such as improper dichotomization. Second, it discusses common algorithms used in supervised learning and reviews selected applications in cardiology and related disciplines. Third, it describes the advent of deep learning and related methods collectively called unsupervised learning, provides contextual examples both in general medicine and in cardiovascular medicine, and then explains how these methods could be applied to enable precision cardiology and improve patient outcomes. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  16. Multi-Agent Based Beam Search for Real-Time Production Scheduling and Control Method, Software and Industrial Application

    CERN Document Server

    Kang, Shu Gang

    2013-01-01

    The Multi-Agent Based Beam Search (MABBS) method systematically integrates four major requirements of manufacturing production - representation capability, solution quality, computation efficiency, and implementation difficulty - within a unified framework to deal with the many challenges of complex real-world production planning and scheduling problems. Multi-agent Based Beam Search for Real-time Production Scheduling and Control introduces this method, together with its software implementation and industrial applications.  This book connects academic research with industrial practice, and develops a practical solution to production planning and scheduling problems. To simplify implementation, a reusable software platform is developed to build the MABBS method into a generic computation engine.  This engine is integrated with a script language, called the Embedded Extensible Application Script Language (EXASL), to provide a flexible and straightforward approach to representing complex real-world problems. ...

  17. THE CHANGING LANDSCAPE OF COMPETITIVE INTELLIGENCE: TWO CRITICAL ISSUES INVESTIGATED

    OpenAIRE

    John J. McGonagle; Michael Misner-Elias

    2016-01-01

    Competitive intelligence is evolving. Why? It is the evolving needs of businesses and not the method or technology supporting the gathering and analysis of information that force this continuing evolution. Two changes in competitive intelligence are investigated in this paper: 1) the failure of the competitive intelligence system because of reliance on an outdated understanding of the intelligence cycle and the associated concepts of key intelligence topics (KITs) and key intelligence questio...

  18. Method and electronic database search engine for exposing the content of an electronic database

    NARCIS (Netherlands)

    Stappers, P.J.

    2000-01-01

    The invention relates to an electronic database search engine comprising an electronic memory device suitable for storing and releasing elements from the database, a display unit, a user interface for selecting and displaying at least one element from the database on the display unit, and control

  19. Intelligent Advertising

    OpenAIRE

    Díaz Pinedo, Edilfredo Eliot

    2012-01-01

    Intelligent Advertisement designs and implements an advertising system for mobile devices in a shopping centre, where customers passively receive advertising on their devices while they are inside.

  20. BUSINESS INTELLIGENCE

    OpenAIRE

    Bogdan Mohor Dumitrita

    2011-01-01

    The purpose of this work is to present business intelligence systems. These systems can be extremely complex and important in modern market competition. Their effectiveness is also reflected in their price, so we have to explore their financial potential before investing. The systems have a 20-year history, and during that time many such tools have been developed, but they are rarely still in use. A business intelligence system consists of three main areas: Data Warehouse, ETL tools and tools f...