WorldWideScience

Sample records for search methods discoveries

  1. DISCOVERY OF NINE GAMMA-RAY PULSARS IN FERMI LARGE AREA TELESCOPE DATA USING A NEW BLIND SEARCH METHOD

    Pletsch, H. J.; Allen, B.; Aulbert, C.; Fehrmann, H. [Albert-Einstein-Institut, Max-Planck-Institut fuer Gravitationsphysik, D-30167 Hannover (Germany); Guillemot, L.; Kramer, M.; Barr, E. D.; Champion, D. J.; Eatough, R. P.; Freire, P. C. C. [Max-Planck-Institut fuer Radioastronomie, Auf dem Huegel 69, D-53121 Bonn (Germany); Ray, P. S. [Space Science Division, Naval Research Laboratory, Washington, DC 20375-5352 (United States); Belfiore, A.; Dormody, M. [Santa Cruz Institute for Particle Physics, Department of Physics and Department of Astronomy and Astrophysics, University of California at Santa Cruz, Santa Cruz, CA 95064 (United States); Camilo, F. [Columbia Astrophysics Laboratory, Columbia University, New York, NY 10027 (United States); Caraveo, P. A. [INAF-Istituto di Astrofisica Spaziale e Fisica Cosmica, I-20133 Milano (Italy); Celik, Oe.; Ferrara, E. C. [NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Hessels, J. W. T. [Astronomical Institute ' Anton Pannekoek' , University of Amsterdam, Postbus 94249, 1090 GE Amsterdam (Netherlands); Keith, M. [CSIRO Astronomy and Space Science, Australia Telescope National Facility, Epping NSW 1710 (Australia); Kerr, M., E-mail: holger.pletsch@aei.mpg.de, E-mail: guillemo@mpifr-bonn.mpg.de [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States); and others

    2012-01-10

    We report the discovery of nine previously unknown gamma-ray pulsars in a blind search of data from the Fermi Large Area Telescope (LAT). The pulsars were found with a novel hierarchical search method originally developed for detecting continuous gravitational waves from rapidly rotating neutron stars. Designed to find isolated pulsars spinning at up to kHz frequencies, the new method is computationally efficient and incorporates several advances, including a metric-based gridding of the search parameter space (frequency, frequency derivative, and sky location) and the use of photon probability weights. The nine pulsars have spin frequencies between 3 and 12 Hz, and characteristic ages ranging from 17 kyr to 3 Myr. Two of them, PSRs J1803-2149 and J2111+4606, are young and energetic Galactic-plane pulsars (spin-down power above 6 × 10^35 erg s^-1 and ages below 100 kyr). The seven remaining pulsars, PSRs J0106+4855, J0622+3749, J1620-4927, J1746-3239, J2028+3332, J2030+4415, and J2139+4716, are older and less energetic; two of them are located at higher Galactic latitudes (|b| > 10°). PSR J0106+4855 has the largest characteristic age (3 Myr) and the smallest surface magnetic field (2 × 10^11 G) of all LAT blind-search pulsars. PSR J2139+4716 has the lowest spin-down power (3 × 10^33 erg s^-1) among all non-recycled gamma-ray pulsars ever found. Despite extensive multi-frequency observations, only PSR J0106+4855 has detectable pulsations in the radio band. The other eight pulsars belong to the increasing population of radio-quiet gamma-ray pulsars.
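
    The abstract names two ingredients of the method, a metric-based grid over frequency, frequency derivative, and sky position, and photon probability weights; the hedged Python sketch below illustrates only the weighting idea, computing a probability-weighted power over a small (f, fdot) grid for toy photon arrival times. The toy data, grid spacing, and the weighted_power helper are illustrative assumptions, not the published pipeline.

    ```python
    # Minimal illustration (not the authors' semicoherent pipeline): photon-probability-
    # weighted Fourier power evaluated on a coarse grid in frequency f and spin-down fdot.
    import numpy as np

    def weighted_power(t, w, f, fdot, t_ref=0.0):
        """Weighted power of photon arrival times t (s) with probability weights w
        at trial frequency f (Hz) and frequency derivative fdot (Hz/s)."""
        dt = t - t_ref
        phase = 2.0 * np.pi * (f * dt + 0.5 * fdot * dt ** 2)
        amp = np.sum(w * np.exp(1j * phase))
        return np.abs(amp) ** 2 / np.sum(w ** 2)

    # Toy data: 300 photons whose rotational phases cluster around a pulse peak at 5 Hz.
    rng = np.random.default_rng(0)
    f_true = 5.0
    turns = rng.integers(0, int(f_true * 1e4), 300)        # rotation number of each photon
    phases = np.mod(rng.normal(0.0, 0.05, 300), 1.0)       # pulse concentrated near phase 0
    t = np.sort((turns + phases) / f_true)                 # arrival times in seconds
    w = rng.uniform(0.5, 1.0, t.size)                      # photon probability weights

    f_grid = np.linspace(4.999, 5.001, 201)                # coarse (f, fdot) search grid
    fdot_grid = np.array([-1e-10, 0.0])
    power = np.array([[weighted_power(t, w, f, fd) for f in f_grid] for fd in fdot_grid])
    i, j = np.unravel_index(np.argmax(power), power.shape)
    print(f"strongest candidate: f = {f_grid[j]:.6f} Hz, fdot = {fdot_grid[i]:.1e} Hz/s")
    ```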

  2. Discovery of Nine Gamma-Ray Pulsars in Fermi-Lat Data Using a New Blind Search Method

    Celik-Tinmaz, Ozlem; Ferrara, E. C.; Pletsch, H. J.; Allen, B.; Aulbert, C.; Fehrmann, H.; Kramer, M.; Barr, E. D.; Champion, D. J.; Eatough, R. P.

    2011-01-01

    We report the discovery of nine previously unknown gamma-ray pulsars in a blind search of data from the Fermi Large Area Telescope (LAT). The pulsars were found with a novel hierarchical search method originally developed for detecting continuous gravitational waves from rapidly rotating neutron stars. Designed to find isolated pulsars spinning at up to kHz frequencies, the new method is computationally efficient, and incorporates several advances, including a metric-based gridding of the search parameter space (frequency, frequency derivative and sky location) and the use of photon probability weights. The nine pulsars have spin frequencies between 3 and 12 Hz, and characteristic ages ranging from 17 kyr to 3 Myr. Two of them, PSRs J1803-2149 and J2111+4606, are young and energetic Galactic-plane pulsars (spin-down power above 6 × 10^35 erg per second and ages below 100 kyr). The seven remaining pulsars, PSRs J0106+4855, J0622+3749, J1620-4927, J1746-3239, J2028+3332, J2030+4415, and J2139+4716, are older and less energetic; two of them are located at higher Galactic latitudes (|b| greater than 10 degrees). PSR J0106+4855 has the largest characteristic age (3 Myr) and the smallest surface magnetic field (2 × 10^11 G) of all LAT blind-search pulsars. PSR J2139+4716 has the lowest spin-down power (3 × 10^33 erg per second) among all non-recycled gamma-ray pulsars ever found. Despite extensive multi-frequency observations, only PSR J0106+4855 has detectable pulsations in the radio band. The other eight pulsars belong to the increasing population of radio-quiet gamma-ray pulsars.

  3. Searching the ASRS Database Using QUORUM Keyword Search, Phrase Search, Phrase Generation, and Phrase Discovery

    McGreevy, Michael W.; Connors, Mary M. (Technical Monitor)

    2001-01-01

    To support Search Requests and Quick Responses at the Aviation Safety Reporting System (ASRS), four new QUORUM methods have been developed: keyword search, phrase search, phrase generation, and phrase discovery. These methods build upon the core QUORUM methods of text analysis, modeling, and relevance-ranking. QUORUM keyword search retrieves ASRS incident narratives that contain one or more user-specified keywords in typical or selected contexts, and ranks the narratives on their relevance to the keywords in context. QUORUM phrase search retrieves narratives that contain one or more user-specified phrases, and ranks the narratives on their relevance to the phrases. QUORUM phrase generation produces a list of phrases from the ASRS database that contain a user-specified word or phrase. QUORUM phrase discovery finds phrases that are related to topics of interest. Phrase generation and phrase discovery are particularly useful for finding query phrases for input to QUORUM phrase search. The presentation of the new QUORUM methods includes: a brief review of the underlying core QUORUM methods; an overview of the new methods; numerous, concrete examples of ASRS database searches using the new methods; discussion of related methods; and, in the appendices, detailed descriptions of the new methods.
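
    As a rough illustration of the phrase-generation step described above (a generic n-gram count, not NASA's QUORUM text-modeling method), the sketch below lists the most frequent two- and three-word phrases containing a user-specified word in a handful of invented narratives; the generate_phrases helper and sample texts are made up for the example.

    ```python
    # Generic phrase-generation sketch (not the QUORUM algorithm): list frequent
    # n-grams that contain a user-specified keyword.
    from collections import Counter
    import re

    def generate_phrases(narratives, keyword, n_values=(2, 3), top_k=5):
        counts = Counter()
        for text in narratives:
            tokens = re.findall(r"[a-z']+", text.lower())
            for n in n_values:
                for i in range(len(tokens) - n + 1):
                    gram = tokens[i:i + n]
                    if keyword in gram:
                        counts[" ".join(gram)] += 1
        return counts.most_common(top_k)

    narratives = [
        "aircraft descended below assigned altitude during approach",
        "crew noticed altitude deviation after a late descent clearance",
        "assigned altitude was misheard and the altitude alerter was not set",
    ]
    print(generate_phrases(narratives, "altitude"))
    ```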

  4. Assessment of Metabolome Annotation Quality: A Method for Evaluating the False Discovery Rate of Elemental Composition Searches

    Matsuda, Fumio; Shinbo, Yoko; Oikawa, Akira; Hirai, Masami Yokota; Fiehn, Oliver; Kanaya, Shigehiko; Saito, Kazuki

    2009-01-01

    Background: In metabolomics research using mass spectrometry (MS), systematic searching of high-resolution mass data against compound databases is often the first step of metabolite annotation to determine elemental compositions possessing similar theoretical mass numbers. However, incorrect hits derived from errors in mass analyses will be included in the results of elemental composition searches. To assess the quality of peak annotation information, a novel methodology for false discovery rate (FDR) evaluation is presented in this study. Based on the FDR analyses, several aspects of an elemental composition search, including setting a threshold, estimating the FDR, and the types of elemental composition databases most reliable for searching, are discussed. Methodology/Principal Findings: The FDR can be determined from one measured value (i.e., the hit rate for search queries) and four parameters determined by Monte Carlo simulation. The results indicate that relatively high FDR values (30–50%) were obtained when searching time-of-flight (TOF)/MS data using the KNApSAcK and KEGG databases. In addition, searches against large all-in-one databases (e.g., PubChem) always produced unacceptable results (FDR >70%). The estimated FDRs suggest that the quality of search results can be improved not only by performing more accurate mass analysis but also by modifying the properties of the compound database. A theoretical analysis indicates that the FDR could be improved by using a compound database with fewer entries but higher completeness. Conclusions/Significance: High accuracy mass analysis, such as Fourier transform (FT)-MS, is needed to keep the FDR of metabolome data annotation acceptably low. PMID:19847304
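
    The abstract says the FDR can be obtained from one measured hit rate plus parameters derived by Monte Carlo simulation; the hedged sketch below shows only the general decoy/Monte Carlo idea, comparing the hit rate of measured masses against that of random "decoy" masses queried against the same database. The toy database, tolerance, and helper names are assumptions, and this is not the paper's four-parameter model.

    ```python
    # Generic decoy-style FDR sketch for elemental-composition (exact mass) searches.
    # Not the paper's four-parameter model; just the Monte Carlo hit-rate idea.
    import numpy as np

    def hit_rate(query_masses, db_masses, tol_ppm=5.0):
        """Fraction of query masses that match at least one database mass."""
        db = np.sort(np.asarray(db_masses))
        hits = 0
        for m in query_masses:
            tol = m * tol_ppm * 1e-6
            i = np.searchsorted(db, m - tol)
            hits += i < db.size and db[i] <= m + tol
        return hits / len(query_masses)

    rng = np.random.default_rng(1)
    db = rng.uniform(100, 1000, 20000)          # toy compound database (monoisotopic masses)
    real_queries = rng.choice(db, 300) + rng.normal(0, 0.001, 300)  # "measured" peaks
    decoy_queries = rng.uniform(100, 1000, 300)  # random masses = pure chance hits

    p_real, p_decoy = hit_rate(real_queries, db), hit_rate(decoy_queries, db)
    fdr = min(1.0, p_decoy / p_real) if p_real > 0 else 1.0  # chance hits / observed hits
    print(f"hit rate {p_real:.2f}, decoy hit rate {p_decoy:.2f}, estimated FDR {fdr:.2f}")
    ```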

  5. Discovery and Innovation: Advanced Search

    Search tips: Search terms are case-insensitive; Common words are ignored; By default only articles containing all terms in the query are returned (i.e., AND is implied); Combine multiple words with OR to find articles containing either term; e.g., education OR research; Use parentheses to create more complex queries; e.g., ...

  6. Discovery and Innovation: Advanced Search

    ... containing either term; e.g., education OR research; Use parentheses to create more complex queries; e.g., archive ((journal OR conference) NOT theses); Search for an exact phrase by putting it in quotes; e.g., "open access publishing"; Exclude a word by prefixing it with - or NOT; e.g. online -politics or online NOT politics ...

  7. The Higgs Boson Search and Discovery

    Bernardi, Gregorio

    2016-01-01

    We present a brief account of the search for the Higgs boson at the three major colliders that have operated over the last three decades: LEP, the Tevatron, and the LHC. The experimental challenges encountered stemmed from the distinct event phenomenology as determined by the colliders' energy and the possible values for the Higgs boson mass, and from the capability of these colliders to deliver as much collision data as possible to fully explore the mass spectrum within their reach. Focusing more on the hadron collider searches during the last decade, we discuss how the search for the Higgs boson was advanced through mastering the experimental signatures of standard theory backgrounds, through the comprehensive utilization of the features of the detectors involved in the searches, and by means of advanced data analysis techniques. The search culminated in 2012 with the discovery, by the ATLAS and CMS collaborations, of a Higgs-like particle with mass close to 125 GeV, confirmed more recently to have propertie...

  8. An extended dual search space model of scientific discovery learning

    van Joolingen, Wouter; de Jong, Anthonius J.M.

    1997-01-01

    This article describes a theory of scientific discovery learning which is an extension of Klahr and Dunbar's Scientific Discovery as Dual Search (SDDS) model. We present a model capable of describing and understanding scientific discovery learning in complex domains in terms of the SDDS

  9. Computational methods in drug discovery

    Sumudu P. Leelananda

    2016-12-01

    The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power, have made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. The importance of in silico tools is greater than ever before and has advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein–ligand docking, pharmacophore modeling and QSAR techniques are reviewed.

  10. SpEnD: Linked Data SPARQL Endpoints Discovery Using Search Engines

    Yumusak, Semih; Dogdu, Erdogan; Kodaz, Halife; Kamilaris, Andreas

    2016-01-01

    In this study, a novel metacrawling method is proposed for discovering and monitoring linked data sources on the Web. We implemented the method in a prototype system, named SPARQL Endpoints Discovery (SpEnD). SpEnD starts with a "search keyword" discovery process for finding relevant keywords for the linked data domain and specifically SPARQL endpoints. Then, these search keywords are utilized to find linked data sources via popular search engines (Google, Bing, Yahoo, Yandex). By using this ...

  11. 43 CFR 4.1130 - Discovery methods.

    2010-10-01

    43 Public Lands: Interior; Special Rules Applicable to Surface Coal Mining Hearings and Appeals; Discovery; § 4.1130 Discovery methods. Parties may obtain discovery by one or more of the following methods— (a) Depositions upon oral...

  12. Paths of discovery: Comparing the search effectiveness of EBSCO Discovery Service, Summon, Google Scholar, and conventional library resources.

    Müge Akbulut

    2015-09-01

    It is becoming hard for users to select significant sources among many others as the number of scientific publications increases (Henning and Gunn, 2012). Search engines that use cloud computing methods, such as Google, can list related documents successfully answering user requirements (Johnson, Levine and Smith, 2009). In order to meet users' increasing demands, libraries started to use systems which enable users to access printed and electronic sources through a single interface. This study uses quantitative and qualitative methods to compare search effectiveness between the Serial Solutions Summon and EBSCO Discovery Service (EDS) web discovery tools, Google Scholar (GS) and conventional library databases among users from Bucknell University and Illinois Wesleyan University.

  13. Paths of Discovery: Comparing the Search Effectiveness of EBSCO Discovery Service, Summon, Google Scholar, and Conventional Library Resources

    Asher, Andrew D.; Duke, Lynda M.; Wilson, Suzanne

    2013-01-01

    In 2011, researchers at Bucknell University and Illinois Wesleyan University compared the search efficacy of Serial Solutions Summon, EBSCO Discovery Service, Google Scholar, and conventional library databases. Using a mixed-methods approach, qualitative and quantitative data were gathered on students' usage of these tools. Regardless of the…

  14. 29 CFR 18.13 - Discovery methods.

    2010-07-01

    29 Labor; Office of the ...; ADMINISTRATIVE LAW JUDGES; General; § 18.13 Discovery methods. Parties may obtain discovery by one or more of the following methods: Depositions upon oral examination or written questions; written interrogatories...

  15. Semantic Search in E-Discovery: An Interdisciplinary Approach

    Graus, D.; Ren, Z.; de Rijke, M.; van Dijk, D.; Henseler, H.; van der Knaap, N.

    2013-01-01

    We propose an interdisciplinary approach to applying and evaluating semantic search in the e-discovery setting. By combining expertise from the fields of law and criminology with that of information retrieval and extraction, we move beyond "algorithm-centric" evaluation, towards evaluating the

  16. Computational methods in drug discovery

    Sumudu P. Leelananda; Steffen Lindert

    2016-01-01

    The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery project...

  17. Enhancing discovery in spatial data infrastructures using a search engine

    Paolo Corti

    2018-05-01

    A spatial data infrastructure (SDI) is a framework of geospatial data, metadata, users and tools intended to provide an efficient and flexible way to use spatial information. One of the key software components of an SDI is the catalogue service, which is needed to discover, query and manage the metadata. Catalogue services in an SDI are typically based on the Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) standard, which defines common interfaces for accessing the metadata information. A search engine is a software system capable of supporting fast and reliable search, which may use 'any means necessary' to get users to the resources they need quickly and efficiently. These techniques may include full text search, natural language processing, weighted results, fuzzy tolerance results, faceting, hit highlighting, recommendations and many others. In this paper we present an example of a search engine being added to an SDI to improve search against large collections of geospatial datasets. The Centre for Geographic Analysis (CGA) at Harvard University re-engineered the search component of its public domain SDI (Harvard WorldMap), which is based on the GeoNode platform. A search engine was added to the SDI stack to enhance the CSW catalogue discovery abilities. It is now possible to discover spatial datasets from metadata by using the standard search operations of the catalogue and to take advantage of the new abilities of the search engine, to return relevant and reliable content to SDI users.

  18. SpEnD: Linked Data SPARQL Endpoints Discovery Using Search Engines

    Yumusak, Semih; Dogdu, Erdogan; Kodaz, Halife; Kamilaris, Andreas; Vandenbussche, Pierre-Yves

    In this study, a novel metacrawling method is proposed for discovering and monitoring linked data sources on the Web. We implemented the method in a prototype system, named SPARQL Endpoints Discovery (SpEnD). SpEnD starts with a "search keyword" discovery process for finding relevant keywords for the linked data domain and specifically SPARQL endpoints. Then, these search keywords are utilized to find linked data sources via popular search engines (Google, Bing, Yahoo, Yandex). By using this method, most of the currently listed SPARQL endpoints in existing endpoint repositories, as well as a significant number of new SPARQL endpoints, have been discovered. Finally, we have developed a new SPARQL endpoint crawler (SpEC) for crawling and link analysis.

  19. Applying Hierarchical Task Analysis Method to Discovery Layer Evaluation

    Marlen Promann

    2015-03-01

    Libraries are implementing discovery layers to offer better user experiences. While usability tests have been helpful in evaluating the success or failure of implementing discovery layers in the library context, the focus has remained on their relative interface benefits over the traditional federated search. The informal, site- and context-specific usability tests have offered little to test the rigor of the discovery layers against the user goals, motivations and workflows they have been designed to support. This study proposes hierarchical task analysis (HTA) as an important complementary evaluation method to usability testing of discovery layers. Relevant literature is reviewed for the discovery layers and the HTA method. As no previous application of HTA to the evaluation of discovery layers was found, this paper presents the application of HTA as an expert-based and workflow-centered (e.g., retrieving a relevant book or a journal article) method for evaluating discovery layers. Purdue University's Primo by Ex Libris was used to map eleven use cases as HTA charts. Nielsen's Goal Composition theory was used as an analytical framework to evaluate the goal charts from two perspectives: (a) users' physical interactions (i.e., clicks), and (b) users' cognitive steps (i.e., decision points for what to do next). A brief comparison of HTA and usability test findings is offered as a way of conclusion.

  20. GeoSearch: A lightweight broking middleware for geospatial resources discovery

    Gui, Z.; Yang, C.; Liu, K.; Xia, J.

    2012-12-01

    With petabytes of geodata and thousands of geospatial web services available over the Internet, it is critical to support geoscience research and applications by finding the best-fit geospatial resources from the massive and heterogeneous resources. Past decades' developments witnessed the operation of many service components to facilitate geospatial resource management and discovery. However, efficient and accurate geospatial resource discovery is still a big challenge for the following reasons: 1) The entry barriers (also called "learning curves") hinder the usability of discovery services to end users. Different portals and catalogues always adopt various access protocols, metadata formats and GUI styles to organize, present and publish metadata. It is hard for end users to learn all these technical details and differences. 2) The cost of federating heterogeneous services is high. To provide sufficient resources and facilitate data discovery, many registries adopt a periodic harvesting mechanism to retrieve metadata from other federated catalogues. These time-consuming processes lead to network and storage burdens, data redundancy, and also the overhead of maintaining data consistency. 3) Heterogeneous semantics complicate data discovery. Since keyword matching is still the primary search method in many operational discovery services, the search accuracy (precision and recall) is hard to guarantee. Semantic technologies (such as semantic reasoning and similarity evaluation) offer a solution to these issues. However, integrating semantic technologies with existing services is challenging due to the expandability limitations of the service frameworks and metadata templates. 4) The capabilities to help users make a final selection are inadequate. Most of the existing search portals lack intuitive and diverse information visualization methods and functions (sort, filter) to present, explore and analyze search results. Furthermore, the presentation of the value

  1. mySearch changed my life – a resource discovery journey

    Crowley, Emma J.

    2013-01-01

    mySearch: the federated years; mySearch: choosing a new platform; mySearch: EBSCO Discovery Service (EDS); Implementing a new system; Technical challenges; Has resource discovery enhanced experiences at BU?; Ongoing challenges; Implications for library management systems; Implications for information literacy; Questions

  2. OpenSearch technology for geospatial resources discovery

    Papeschi, Fabrizio; Enrico, Boldrini; Mazzetti, Paolo

    2010-05-01

    In 2005, the term Web 2.0 was coined by Tim O'Reilly to describe a quickly growing set of Web-based applications that share a common philosophy of "mutually maximizing collective intelligence and added value for each participant by formalized and dynamic information sharing". Around this same period, OpenSearch, a new Web 2.0 technology, was developed. More properly, OpenSearch is a collection of technologies that allow publishing of search results in a format suitable for syndication and aggregation. It is a way for websites and search engines to publish search results in a standard and accessible format. Due to its strong impact on the way the Web is perceived by users, and also due to its relevance for businesses, Web 2.0 has attracted the attention of both mass media and the scientific community. This explosive growth in popularity of Web 2.0 technologies like OpenSearch, and practical applications of Service Oriented Architecture (SOA), resulted in an increased interest in similarities, convergence, and a potential synergy of these two concepts. SOA is considered as the philosophy of encapsulating application logic in services with a uniformly defined interface and making these publicly available via discovery mechanisms. Service consumers may then retrieve these services, compose and use them according to their current needs. A great degree of similarity between SOA and Web 2.0 may be leading to a convergence between the two paradigms. They also expose divergent elements, such as the Web 2.0 support for human interaction as opposed to the typical SOA machine-to-machine interaction. According to these considerations, the Geospatial Information (GI) domain is also taking its first steps towards a new approach to data publishing and discovery, in particular taking advantage of the OpenSearch technology. A specific GI niche is represented by the OGC Catalog Service for Web (CSW) that is part of the OGC Web Services (OWS) specifications suite, which provides a

  3. Search of computers for discovery of electronic evidence

    Pisarić Milana M.

    2015-01-01

    In order to address the specific nature of criminal activities committed using computer networks and systems, the efforts of states to adapt or complement the existing criminal law with purposeful provisions are understandable. To create an appropriate legal framework for suppressing cybercrime, it is not enough for substantive criminal law to define certain behaviors as criminal offenses against the confidentiality, integrity and availability of computer data, computer systems and networks; it is also essential that the provisions of criminal procedure law give the competent authorities adequate powers for detecting the sources of illegal activities and for collecting data on the committed criminal offense and the offender that can be used as evidence in criminal proceedings, taking into account the specificities of cybercrime and the environment within which the illegal activity is undertaken. Accordingly, the provisions of criminal procedural law should be designed to overcome certain challenges in discovering and proving high-technology crime, and the provisions governing the search of computers for the discovery of electronic evidence are of special importance.

  4. NEW COMPLETENESS METHODS FOR ESTIMATING EXOPLANET DISCOVERIES BY DIRECT DETECTION

    Brown, Robert A.; Soummer, Remi

    2010-01-01

    We report on new methods for evaluating realistic observing programs that search stars for planets by direct imaging, where observations are selected from an optimized star list and stars can be observed multiple times. We show how these methods bring critical insight into the design of the mission and its instruments. These methods provide an estimate of the outcome of the observing program: the probability distribution of discoveries (detection and/or characterization) and an estimate of the occurrence rate of planets (η). We show that these parameters can be accurately estimated from a single mission simulation, without the need for a complete Monte Carlo mission simulation, and we prove the accuracy of this new approach. Our methods provide tools to define a mission for a particular science goal; for example, a mission can be defined by the expected number of discoveries and its confidence level. We detail how an optimized star list can be built and how successive observations can be selected. Our approach also provides other critical mission attributes, such as the number of stars expected to be searched and the probability of zero discoveries. Because these attributes depend strongly on the mission scale (telescope diameter, observing capabilities and constraints, mission lifetime, etc.), our methods are directly applicable to the design of such future missions and provide guidance to the mission and instrument design based on scientific performance. We illustrate our new methods with practical calculations and exploratory design reference missions for the James Webb Space Telescope (JWST) operating with a distant starshade to reduce scattered and diffracted starlight on the focal plane. We estimate that five habitable Earth-mass planets would be discovered and characterized with spectroscopy, with a probability of zero discoveries of 0.004, assuming a small fraction of JWST observing time (7%), η = 0.3, and 70 observing visits, limited by starshade fuel.
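
    The "expected number of discoveries" and "probability of zero discoveries" quoted above follow from per-star completeness values under simple independence assumptions; the hedged sketch below shows that bookkeeping with invented completeness numbers, eta = 0.3, and 70 visits, and is not the paper's design reference mission calculation.

    ```python
    # Minimal sketch: expected discoveries and probability of zero discoveries
    # from per-star completeness values c_i and planet occurrence rate eta,
    # assuming independent single visits (illustrative numbers only).
    import numpy as np

    rng = np.random.default_rng(2)
    completeness = rng.uniform(0.2, 0.9, 70)   # c_i for 70 observing visits (made up)
    eta = 0.3                                  # assumed occurrence rate

    p_detect = eta * completeness              # per-visit detection probability
    expected = p_detect.sum()                  # expected number of discoveries
    p_zero = np.prod(1.0 - p_detect)           # probability of zero discoveries

    print(f"expected discoveries: {expected:.1f}")
    print(f"P(zero discoveries): {p_zero:.2e}")
    ```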

  5. Efficient searching in meshfree methods

    Olliff, James; Alford, Brad; Simkins, Daniel C.

    2018-04-01

    Meshfree methods such as the Reproducing Kernel Particle Method and the Element Free Galerkin method have proven to be excellent choices for problems involving complex geometry, evolving topology, and large deformation, owing to their ability to model the problem domain without the constraints imposed on Finite Element Method (FEM) meshes. However, meshfree methods have an added computational cost over FEM that comes from at least two sources: increased cost of shape function evaluation and the determination of adjacency or connectivity. The focus of this paper is to formally address the types of adjacency information that arise in various uses of meshfree methods; discuss available techniques for computing the various adjacency graphs; propose a new search algorithm and data structure; and finally compare the memory and run time performance of the methods.
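
    One concrete form of the adjacency problem described above is finding, for each evaluation point, all particles whose kernel support covers it; the hedged sketch below performs that neighbor query with a k-d tree. It is a generic illustration (SciPy's cKDTree, a uniform support radius, random points), not the search algorithm or data structure proposed in the paper.

    ```python
    # Generic neighbor search for meshfree adjacency: for each evaluation point,
    # find all particles within a support radius. Illustrative only.
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(3)
    particles = rng.uniform(0.0, 1.0, size=(2000, 3))    # particle (node) positions
    eval_points = rng.uniform(0.0, 1.0, size=(500, 3))    # quadrature/evaluation points
    support_radius = 0.08                                 # uniform kernel support (assumed)

    tree = cKDTree(particles)
    adjacency = tree.query_ball_point(eval_points, r=support_radius)  # list of index lists

    counts = [len(nbrs) for nbrs in adjacency]
    print(f"average neighbors per evaluation point: {np.mean(counts):.1f}")
    ```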

  6. Search strategy has influenced the discovery rate of human viruses.

    Rosenberg, Ronald; Johansson, Michael A; Powers, Ann M; Miller, Barry R

    2013-08-20

    A widely held concern is that the pace of infectious disease emergence has been increasing. We have analyzed the rate of discovery of pathogenic viruses, the preeminent source of newly discovered causes of human disease, from 1897 through 2010. The rate was highest during 1950-1969, after which it moderated. This general picture masks two distinct trends: for arthropod-borne viruses, which comprised 39% of pathogenic viruses, the discovery rate peaked at three per year during 1960-1969, but subsequently fell nearly to zero by 1980; however, the rate of discovery of nonarboviruses remained stable at about two per year from 1950 through 2010. The period of highest arbovirus discovery coincided with a comprehensive program supported by The Rockefeller Foundation of isolating viruses from humans, animals, and arthropod vectors at field stations in Latin America, Africa, and India. The productivity of this strategy illustrates the importance of location, approach, long-term commitment, and sponsorship in the discovery of emerging pathogens.

  7. The Search for Regularity: Four Aspects of Scientific Discovery.

    1984-09-01

    reactions, and ultimately led to the determination of relative atomic weights. To some extent, Dalton's and Gay-Lussac's laws were motivated by an atomic... Keywords: heuristic search; empirical laws; theory of acids and bases; structural models; theory of phlogiston; qualitative laws; atomic theory. ... systems that address different facets of this process. BACON.6 focuses on discovering empirical laws that summarize numerical data. This program searches a

  8. Statistic methods for searching inundated radioactive entities

    Dubasov, Yu.V.; Krivokhatskij, A.S.; Khramov, N.N.

    1993-01-01

    The problem of searching for a flooded radioactive object in a given area is considered. Various models for plotting the search route are discussed. It is shown that a spiral route through random points from the centre of the examined area is the most efficient one. The conclusion is made that, when searching for flooded radioactive objects, it is advisable to use multidimensional statistical methods of classification

  9. Analytical Methods in Search Theory

    1979-11-01

    ... pick g(x,t;ε) and the companion function, and find the b necessary to satisfy the search equation. SOLUTION: This is an audience participation problem. ...

  10. AM: An Artificial Intelligence Approach to Discovery in Mathematics as Heuristic Search

    1976-07-01

    deficiency. The idea of "Intuitions" facets was a flop. Intuitions were meant to model reality, at least little pieces of it, so that AM could... Check examples of Single-ADD, because many examples have recently been found, but not yet

  11. Interdiscipline: Search and Discovery--Systematization, Application, and Transfer.

    Bonomo de Zago, Maria

    1978-01-01

    Discusses efforts to develop an interdisciplinary cybernetic method, its transfer to different fields of group activities, and results achieved internationally. The 1978 program of activities designed for the promotion of the interdisciplinary cybernetic method by the International Association for Synthesis is also presented. (HM)

  12. Metadata Effectiveness in Internet Discovery: An Analysis of Digital Collection Metadata Elements and Internet Search Engine Keywords

    Yang, Le

    2016-01-01

    This study analyzed digital item metadata and keywords from Internet search engines to learn what metadata elements actually facilitate discovery of digital collections through Internet keyword searching and how significantly each metadata element affects the discovery of items in a digital repository. The study found that keywords from Internet…

  13. Module discovery by exhaustive search for densely connected, co-expressed regions in biomolecular interaction networks.

    Recep Colak

    2010-10-01

    Computational prediction of functionally related groups of genes (functional modules) from large-scale data is an important issue in computational biology. Gene expression experiments and interaction networks are well studied large-scale data sources, available for many not yet exhaustively annotated organisms. It has been well established that, when analyzing these two data sources jointly, modules are often reflected by highly interconnected (dense) regions in the interaction networks whose participating genes are co-expressed. However, the tractability of the problem had remained unclear and methods by which to exhaustively search for such constellations had not been presented. We provide an algorithmic framework, referred to as Densely Connected Biclustering (DECOB), by which the aforementioned search problem becomes tractable. To benchmark the predictive power inherent to the approach, we computed all co-expressed, dense regions in physical protein and genetic interaction networks from human and yeast. An automatized filtering procedure reduces our output, which results in smaller collections of modules, comparable to state-of-the-art approaches. Our results performed favorably in a fair benchmarking competition which adheres to standard criteria. We demonstrate the usefulness of an exhaustive module search, by using the unreduced output to more quickly perform GO term related function prediction tasks. We point out the advantages of our exhaustive output by predicting functional relationships using two examples. We demonstrate that the computation of all densely connected and co-expressed regions in interaction networks is an approach to module discovery of considerable value. Beyond confirming the well settled hypothesis that such co-expressed, densely connected interaction network regions reflect functional modules, we open up novel computational ways to comprehensively analyze the modular organization of an organism based on prevalent and largely

  14. Module discovery by exhaustive search for densely connected, co-expressed regions in biomolecular interaction networks.

    Colak, Recep; Moser, Flavia; Chu, Jeffrey Shih-Chieh; Schönhuth, Alexander; Chen, Nansheng; Ester, Martin

    2010-10-25

    Computational prediction of functionally related groups of genes (functional modules) from large-scale data is an important issue in computational biology. Gene expression experiments and interaction networks are well studied large-scale data sources, available for many not yet exhaustively annotated organisms. It has been well established that, when analyzing these two data sources jointly, modules are often reflected by highly interconnected (dense) regions in the interaction networks whose participating genes are co-expressed. However, the tractability of the problem had remained unclear and methods by which to exhaustively search for such constellations had not been presented. We provide an algorithmic framework, referred to as Densely Connected Biclustering (DECOB), by which the aforementioned search problem becomes tractable. To benchmark the predictive power inherent to the approach, we computed all co-expressed, dense regions in physical protein and genetic interaction networks from human and yeast. An automatized filtering procedure reduces our output, which results in smaller collections of modules, comparable to state-of-the-art approaches. Our results performed favorably in a fair benchmarking competition which adheres to standard criteria. We demonstrate the usefulness of an exhaustive module search, by using the unreduced output to more quickly perform GO term related function prediction tasks. We point out the advantages of our exhaustive output by predicting functional relationships using two examples. We demonstrate that the computation of all densely connected and co-expressed regions in interaction networks is an approach to module discovery of considerable value. Beyond confirming the well settled hypothesis that such co-expressed, densely connected interaction network regions reflect functional modules, we open up novel computational ways to comprehensively analyze the modular organization of an organism based on prevalent and largely available large
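
    As a rough, generic stand-in for the kind of search both abstracts describe, dense subnetworks whose genes are co-expressed, the sketch below enumerates maximal cliques in a small interaction graph and keeps those whose members are strongly correlated. It uses NetworkX cliques and a mean-correlation threshold rather than the DECOB bicluster algorithm; the graph, expression profiles, and thresholds are invented.

    ```python
    # Simplified stand-in for dense, co-expressed module search (not DECOB):
    # enumerate maximal cliques and keep those with high mean pairwise co-expression.
    import itertools
    import networkx as nx
    import numpy as np

    rng = np.random.default_rng(4)
    genes = [f"g{i}" for i in range(12)]
    expr = {g: rng.normal(size=30) for g in genes}        # toy expression profiles
    expr["g1"] = expr["g0"] + rng.normal(0, 0.1, 30)      # make g0, g1, g2 co-expressed
    expr["g2"] = expr["g0"] + rng.normal(0, 0.1, 30)

    G = nx.Graph()
    G.add_edges_from([("g0", "g1"), ("g1", "g2"), ("g0", "g2"),   # dense and co-expressed
                      ("g3", "g4"), ("g4", "g5"), ("g3", "g5"),   # dense but not co-expressed
                      ("g6", "g7"), ("g8", "g9")])

    def mean_corr(members):
        pairs = itertools.combinations(members, 2)
        return np.mean([np.corrcoef(expr[a], expr[b])[0, 1] for a, b in pairs])

    modules = [c for c in nx.find_cliques(G)
               if len(c) >= 3 and mean_corr(c) > 0.7]
    print("candidate modules:", modules)
    ```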

  15. Improving sensitivity in proteome studies by analysis of false discovery rates for multiple search engines.

    Jones, Andrew R; Siepen, Jennifer A; Hubbard, Simon J; Paton, Norman W

    2009-03-01

    LC-MS experiments can generate large quantities of data, for which a variety of database search engines are available to make peptide and protein identifications. Decoy databases are becoming widely used to place statistical confidence in result sets, allowing the false discovery rate (FDR) to be estimated. Different search engines produce different identification sets, so employing more than one search engine could result in an increased number of peptides (and proteins) being identified, if an appropriate mechanism for combining data can be defined. We have developed a search-engine-independent score based on FDR, called the FDR Score, which allows peptide identifications from different search engines to be combined. The results demonstrate that the observed FDR is significantly different when analysing the set of identifications made by all three search engines, by each pair of search engines or by a single search engine. Our algorithm assigns identifications to groups according to the set of search engines that have made the identification, and re-assigns the score (combined FDR Score). The combined FDR Score can differentiate between correct and incorrect peptide identifications with high accuracy, allowing on average 35% more peptide identifications to be made at a fixed FDR than using a single search engine.
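
    To make the decoy-based reasoning above concrete, the hedged sketch below groups peptide identifications by the set of engines that reported them and computes a simple decoy-fraction FDR per group; it is a generic illustration of the grouping idea, not the authors' FDR Score algorithm, and the input tuples are invented.

    ```python
    # Generic target/decoy FDR sketch with grouping by the set of search engines
    # that made each identification. Not the paper's FDR Score algorithm.
    from collections import defaultdict

    # (peptide, engine, is_decoy) tuples; invented example results.
    hits = [
        ("PEPTIDEA", "mascot", False), ("PEPTIDEA", "xtandem", False),
        ("PEPTIDEB", "mascot", False), ("PEPTIDEC", "omssa", True),
        ("PEPTIDED", "mascot", True),  ("PEPTIDEE", "xtandem", False),
        ("PEPTIDEE", "omssa", False),  ("PEPTIDEF", "xtandem", True),
    ]

    # Group identifications by the set of engines that reported them.
    engines_for = defaultdict(set)
    decoy_flag = {}
    for pep, engine, is_decoy in hits:
        engines_for[pep].add(engine)
        decoy_flag[pep] = is_decoy

    groups = defaultdict(lambda: {"target": 0, "decoy": 0})
    for pep, engines in engines_for.items():
        key = frozenset(engines)
        groups[key]["decoy" if decoy_flag[pep] else "target"] += 1

    for key, counts in groups.items():
        total = counts["target"] + counts["decoy"]
        fdr = counts["decoy"] / total if total else 0.0   # simple decoy-fraction estimate
        print(sorted(key), f"FDR ~ {fdr:.2f}")
    ```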

  16. Footprints: A Visual Search Tool that Supports Discovery and Coverage Tracking.

    Isaacs, Ellen; Domico, Kelly; Ahern, Shane; Bart, Eugene; Singhal, Mudita

    2014-12-01

    Searching a large document collection to learn about a broad subject involves the iterative process of figuring out what to ask, filtering the results, identifying useful documents, and deciding when one has covered enough material to stop searching. We are calling this activity "discoverage," discovery of relevant material and tracking coverage of that material. We built a visual analytic tool called Footprints that uses multiple coordinated visualizations to help users navigate through the discoverage process. To support discovery, Footprints displays topics extracted from documents that provide an overview of the search space and are used to construct searches visuospatially. Footprints allows users to triage their search results by assigning a status to each document (To Read, Read, Useful), and those status markings are shown on interactive histograms depicting the user's coverage through the documents across dates, sources, and topics. Coverage histograms help users notice biases in their search and fill any gaps in their analytic process. To create Footprints, we used a highly iterative, user-centered approach in which we conducted many evaluations during both the design and implementation stages and continually modified the design in response to feedback.

  17. Phonetic search methods for large speech databases

    Moyal, Ami; Tetariy, Ella; Gishri, Michal

    2013-01-01

    “Phonetic Search Methods for Large Databases” focuses on Keyword Spotting (KWS) within large speech databases. The brief will begin by outlining the challenges associated with Keyword Spotting within large speech databases using dynamic keyword vocabularies. It will then continue by highlighting the various market segments in need of KWS solutions, as well as, the specific requirements of each market segment. The work also includes a detailed description of the complexity of the task and the different methods that are used, including the advantages and disadvantages of each method and an in-depth comparison. The main focus will be on the Phonetic Search method and its efficient implementation. This will include a literature review of the various methods used for the efficient implementation of Phonetic Search Keyword Spotting, with an emphasis on the authors’ own research which entails a comparative analysis of the Phonetic Search method which includes algorithmic details. This brief is useful for resea...

  18. Discovery of IPV6 Router Interface Addresses via Heuristic Methods

    2015-09-01

    Naval Postgraduate School thesis by Matthew D. Gray, September 2015 (funding: CNS-1111445). ... Internet Assigned Numbers Authority, there is continued pressure for widespread IPv6 adoption. Because the IPv6 address space is orders of magnitude

  19. An introduction to harmony search optimization method

    Wang, Xiaolei; Zenger, Kai

    2014-01-01

    This brief provides a detailed introduction, discussion and bibliographic review of the nature-inspired optimization algorithm called Harmony Search. It uses a large number of simulation results to demonstrate the advantages of Harmony Search and its variants and also their drawbacks. The authors show how weaknesses can be amended by hybridization with other optimization methods. The Harmony Search Method with Applications will be of value to researchers in computational intelligence in demonstrating the state of the art of research on an algorithm of current interest. It also helps researche

  20. Developing a distributed HTML5-based search engine for geospatial resource discovery

    ZHOU, N.; XIA, J.; Nebert, D.; Yang, C.; Gui, Z.; Liu, K.

    2013-12-01

    With explosive growth of data, Geospatial Cyberinfrastructure (GCI) components have been developed to manage geospatial resources, such as data discovery and data publishing. However, efficient geospatial resource discovery is still challenging in that: (1) existing GCIs are usually developed for users of specific domains, so users may have to visit a number of GCIs to find appropriate resources; (2) the complexity of the decentralized network environment usually results in slow responses and poor user experience; (3) users who use different browsers and devices may have very different user experiences because of the diversity of front-end platforms (e.g. Silverlight, Flash or HTML). To address these issues, we developed a distributed and HTML5-based search engine. Specifically, (1) the search engine adopts a brokering approach to retrieve geospatial metadata from various and distributed GCIs; (2) the asynchronous record retrieval mode enhances the search performance and user interactivity; (3) the search engine based on HTML5 is able to provide unified access capabilities for users with different devices (e.g. tablet and smartphone).
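
    The brokering approach with asynchronous record retrieval described above can be pictured as concurrent requests to several catalogue endpoints whose responses are merged as they arrive; the hedged sketch below shows that pattern with asyncio and aiohttp. The endpoint URLs and the "q" query parameter are placeholders, and this is a generic illustration rather than the system's actual implementation.

    ```python
    # Generic asynchronous brokering sketch (not the paper's system): query several
    # catalogue endpoints concurrently and merge whatever responses arrive.
    import asyncio
    import aiohttp

    # Placeholder endpoints; real CSW/OpenSearch catalogue URLs would go here.
    ENDPOINTS = [
        "https://catalogue-a.example.org/search",
        "https://catalogue-b.example.org/search",
        "https://catalogue-c.example.org/search",
    ]

    async def fetch(session, url, keyword):
        timeout = aiohttp.ClientTimeout(total=10)
        try:
            async with session.get(url, params={"q": keyword}, timeout=timeout) as resp:
                return url, resp.status, await resp.text()
        except (aiohttp.ClientError, asyncio.TimeoutError) as exc:
            return url, None, f"error: {exc}"

    async def broker_search(keyword):
        async with aiohttp.ClientSession() as session:
            tasks = [fetch(session, url, keyword) for url in ENDPOINTS]
            for coro in asyncio.as_completed(tasks):      # merge results as they arrive
                url, status, body = await coro
                print(url, status, body[:60])

    if __name__ == "__main__":
        asyncio.run(broker_search("land cover"))
    ```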

  1. THE EINSTEIN@HOME SEARCH FOR RADIO PULSARS AND PSR J2007+2722 DISCOVERY

    Allen, B.; Knispel, B.; Aulbert, C.; Bock, O.; Eggenstein, H. B.; Fehrmann, H.; Machenschalk, B. [Max-Planck-Institut fuer Gravitationsphysik, D-30167 Hannover (Germany); Cordes, J. M.; Brazier, A.; Chatterjee, S. [Department of Astronomy, Cornell University, Ithaca, NY 14853 (United States); Deneva, J. S. [Arecibo Observatory, HC3 Box 53995, Arecibo, PR 00612 (United States); Hessels, J. W. T. [ASTRON, the Netherlands Institute for Radio Astronomy, Postbus 2, 7990 AA, Dwingeloo (Netherlands); Anderson, D. [Space Sciences Laboratory, University of California, Berkeley, CA 94720 (United States); Demorest, P. B. [NRAO (National Radio Astronomy Observatory), Charlottesville, VA 22903 (United States); Gotthelf, E. V. [Columbia Astrophysics Laboratory, Columbia University, New York, NY 10027 (United States); Hammer, D. [Department of Physics, University of Wisconsin-Milwaukee, Milwaukee, WI 53211 (United States); Kaspi, V. M. [Department of Physics, McGill University, Montreal, QC H3A2T8 (Canada); Kramer, M. [Max-Planck-Institut fuer Radioastronomie, D-53121 Bonn (Germany); Lyne, A. G. [Jodrell Bank Centre for Astrophysics, School of Physics and Astronomy, University of Manchester, Manchester, M13 9PL (United Kingdom); McLaughlin, M. A., E-mail: bruce.allen@aei.mpg.de [Department of Physics, West Virginia University, Morgantown, WV 26506 (United States); and others

    2013-08-20

    Einstein@Home aggregates the computer power of hundreds of thousands of volunteers from 193 countries, to search for new neutron stars using data from electromagnetic and gravitational-wave detectors. This paper presents a detailed description of the search for new radio pulsars using Pulsar ALFA survey data from the Arecibo Observatory. The enormous computing power allows this search to cover a new region of parameter space; it can detect pulsars in binary systems with orbital periods as short as 11 minutes. We also describe the first Einstein@Home discovery, the 40.8 Hz isolated pulsar PSR J2007+2722, and provide a full timing model. PSR J2007+2722's pulse profile is remarkably wide with emission over almost the entire spin period. This neutron star is most likely a disrupted recycled pulsar, about as old as its characteristic spin-down age of 404 Myr. However, there is a small chance that it was born recently, with a low magnetic field. If so, upper limits on the X-ray flux suggest but cannot prove that PSR J2007+2722 is at least ~100 kyr old. In the future, we expect that the massive computing power provided by volunteers should enable many additional radio pulsar discoveries.

  2. Automated search method for AFM and profilers

    Ray, Michael; Martin, Yves C.

    2001-08-01

    A new automation software creates a search model as an initial setup and searches for a user-defined target in atomic force microscopes or stylus profilometers used in semiconductor manufacturing. The need for such automation has become critical in manufacturing lines. The new method starts with a survey map of a small area of a chip obtained from a chip-design database or an image of the area. The user interface requires a user to point to and define a precise location to be measured, and to select a macro function for an application such as line width or contact hole. The search algorithm automatically constructs a range of possible scan sequences within the survey, and provides increased speed and functionality compared to the methods used in instruments to date. Each sequence consists of a starting point relative to the target, a scan direction, and a scan length. The search algorithm stops when the location of a target is found and the criteria for certainty in positioning are met. With today's capability in high-speed processing and signal control, the tool can simultaneously scan and search for a target in a robotic and continuous manner. Examples are given that illustrate the key concepts.

  3. Employed and unemployed job search methods: Australian evidence on search duration, wages and job stability

    Colin Green

    2012-01-01

    This paper examines the use and impact of job search methods of both unemployed and employed job seekers. Informal job search methods are associated with relatively high levels of job exit and shorter search durations. Job exits through the public employment agency (PEA) display positive duration dependence for the unemployed. This may suggest that the PEA is used as a job search method of last resort. Informal job search methods have lower associated duration in search and higher wages than th...

  4. The method of search of tendencies

    Reuss, Paul.

    1981-08-01

    The search for tendencies is an application of the least-squares method. Its objective is the best possible evaluation of the basic data used in the calculations, based on the comparison between measurements of integral characteristics and the corresponding theoretical results. This report presents the minimization which allows the estimation of the basic data and, above all, the methods which are necessary for the critical analysis of the obtained results [fr]

  5. A method of searching LDAP directories using XQuery

    Hesselroth, Ted

    2011-01-01

    A method by which an LDAP directory can be searched using XQuery is described. The strategy behind the tool consists of four steps. First the XQuery script is examined and relevant XPath expressions are extracted, determined to be sufficient to define all information needed to perform the query. Then the XPath expressions are converted into their equivalent LDAP search filters by use of the published LDAP schema of the service, and search requests are made to the LDAP host. The search results are then merged and converted to an XML document that conforms to the hierarchy of the LDAP schema. Finally, the XQuery script is executed on the working XML document by conventional means. Examples are given of application of the tool in the Open Science Grid, which for discovery purposes operates an LDAP server that contains Glue schema-based information on site configuration and authorization policies. The XQuery scripts compactly replace hundreds of lines of custom python code that relied on the unix ldapsearch utility. Installation of the tool is available through the Virtual Data Toolkit.

  6. Brief history for the search and discovery of the Higgs particle - A personal perspective

    Wu, Sau Lan

    2014-01-01

    In 1964, a new particle was proposed by several groups to answer the question of where the masses of elementary particles come from; this particle is usually referred to as the Higgs particle or the Higgs boson. In July 2012, this Higgs particle was finally found experimentally, a feat accomplished by the ATLAS Collaboration and the CMS Collaboration using the Large Hadron Collider at CERN. It is the purpose of this review to give my personal perspective on a brief history of the experimental search for this particle since the '80s and finally its discovery in 2012. Besides the early searches, those at the LEP collider at CERN, the Tevatron Collider at Fermilab, and the Large Hadron Collider at CERN are described in some detail. This experimental discovery of the Higgs boson is often considered to be the most important advance in particle physics in the last half a century, and some of the possible implications are briefly discussed. This review is partially based on a talk presented by the author at the ...

  7. Discovery of gigantic molecular nanostructures using a flow reaction array as a search engine.

    Zang, Hong-Ying; de la Oliva, Andreu Ruiz; Miras, Haralampos N; Long, De-Liang; McBurney, Roy T; Cronin, Leroy

    2014-04-28

    The discovery of gigantic molecular nanostructures like coordination and polyoxometalate clusters is extremely time-consuming since a vast combinatorial space needs to be searched, and even a systematic and exhaustive exploration of the available synthetic parameters relies on a great deal of serendipity. Here we present a synthetic methodology that combines a flow reaction array and algorithmic control to give a chemical 'real-space' search engine leading to the discovery and isolation of a range of new molecular nanoclusters built from [Mo(2)O(2)S(2)](2+)-based building blocks with either fourfold (C4) or fivefold (C5) symmetry templates and linkers. This engine leads us to isolate six new nanoscale cluster compounds: 1, {Mo(10)(C5)}; 2, {Mo(14)(C4)4(C5)2}; 3, {Mo(60)(C4)10}; 4, {Mo(48)(C4)6}; 5, {Mo(34)(C4)4}; 6, {Mo(18)(C4)9}; in only 200 automated experiments from a parameter space spanning ~5 million possible combinations.

  8. THE PULSAR SEARCH COLLABORATORY: DISCOVERY AND TIMING OF FIVE NEW PULSARS

    Rosen, R.; Swiggum, J.; McLaughlin, M. A.; Lorimer, D. R.; Yun, M.; Boyles, J. [West Virginia University, White Hall, Morgantown, WV 26506 (United States); Heatherly, S. A.; Scoles, S. [NRAO, P.O. Box 2, Green Bank, WV 24944 (United States); Lynch, R. [McGill University, Rutherford Physics Building, 3600 Rue University, Montreal, QC H3A 2T8 (Canada); Kondratiev, V. I. [ASTRON, the Netherlands Institute for Radio Astronomy, Postbus 2, 7990 AA Dwingeloo (Netherlands); Ransom, S. M. [NRAO, 520 Edgemont Road, Charlottesville, VA 22903 (United States); Moniot, M. L.; Thompson, C. [James River High School, 9906 Springwood Road, Buchanan, VA 24066 (United States); Cottrill, A.; Raycraft, M. [Lincoln High School, 100 Jerry Toth Drive, Shinnston, WV 26431 (United States); Weaver, M. [Broadway High School, 269 Gobbler Drive, Broadway, VA 22815 (United States); Snider, A. [Sherando High School, 185 South Warrior Drive, Stephens City, VA 22655 (United States); Dudenhoefer, J.; Allphin, L. [Hedgesville High School, 109 Ridge Road North, Hedgesville, WV 25427 (United States); Thorley, J., E-mail: Rachel.Rosen@mail.wvu.edu [Strasburg High School, 250 Ram Drive, Strasburg, VA 22657 (United States); and others

    2013-05-01

    We present the discovery and timing solutions of five new pulsars by students involved in the Pulsar Search Collaboratory, an NSF-funded joint program between the National Radio Astronomy Observatory and West Virginia University designed to excite and engage high-school students in Science, Technology, Engineering, and Mathematics (STEM) and related fields. We encourage students to pursue STEM fields by apprenticing them within a professional scientific community doing cutting edge research, specifically by teaching them to search for pulsars. The students are analyzing 300 hr of drift-scan survey data taken with the Green Bank Telescope at 350 MHz. These data cover 2876 deg^2 of the sky. Over the course of five years, more than 700 students have inspected diagnostic plots through a web-based graphical interface designed for this project. The five pulsars discovered in the data have spin periods ranging from 3.1 ms to 4.8 s. Among the new discoveries are PSR J1926-1314, a long period, nulling pulsar; PSR J1821+0155, an isolated, partially recycled 33 ms pulsar; and PSR J1400-1438, a millisecond pulsar in a 9.5 day orbit whose companion is likely a white dwarf star.

  9. THE PULSAR SEARCH COLLABORATORY: DISCOVERY AND TIMING OF FIVE NEW PULSARS

    Rosen, R.; Swiggum, J.; McLaughlin, M. A.; Lorimer, D. R.; Yun, M.; Boyles, J.; Heatherly, S. A.; Scoles, S.; Lynch, R.; Kondratiev, V. I.; Ransom, S. M.; Moniot, M. L.; Thompson, C.; Cottrill, A.; Raycraft, M.; Weaver, M.; Snider, A.; Dudenhoefer, J.; Allphin, L.; Thorley, J.

    2013-01-01

    We present the discovery and timing solutions of five new pulsars by students involved in the Pulsar Search Collaboratory, an NSF-funded joint program between the National Radio Astronomy Observatory and West Virginia University designed to excite and engage high-school students in Science, Technology, Engineering, and Mathematics (STEM) and related fields. We encourage students to pursue STEM fields by apprenticing them within a professional scientific community doing cutting edge research, specifically by teaching them to search for pulsars. The students are analyzing 300 hr of drift-scan survey data taken with the Green Bank Telescope at 350 MHz. These data cover 2876 deg^2 of the sky. Over the course of five years, more than 700 students have inspected diagnostic plots through a web-based graphical interface designed for this project. The five pulsars discovered in the data have spin periods ranging from 3.1 ms to 4.8 s. Among the new discoveries are PSR J1926–1314, a long period, nulling pulsar; PSR J1821+0155, an isolated, partially recycled 33 ms pulsar; and PSR J1400–1438, a millisecond pulsar in a 9.5 day orbit whose companion is likely a white dwarf star.

  10. Data Mining and Knowledge Discovery via Logic-Based Methods

    Triantaphyllou, Evangelos

    2010-01-01

    There are many approaches to data mining and knowledge discovery (DM&KD), including neural networks, closest neighbor methods, and various statistical methods. This monograph, however, focuses on the development and use of a novel approach, based on mathematical logic, that the author and his research associates have worked on over the last 20 years. The methods presented in the book deal with key DM&KD issues in an intuitive manner and in a natural sequence. Compared to other DM&KD methods, those based on mathematical logic offer a direct and often intuitive approach for extracting easily int

  11. Harmony Search Method: Theory and Applications

    X. Z. Gao

    2015-01-01

    Full Text Available The Harmony Search (HS) method is an emerging metaheuristic optimization algorithm, which has been employed to cope with numerous challenging tasks during the past decade. In this paper, the essential theory and applications of the HS algorithm are first described and reviewed. Several typical variants of the original HS are next briefly explained. As a case study, a modified HS method inspired by the idea of Pareto-dominance-based ranking is also presented. It is further applied to handle a practical wind generator optimal design problem.
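
    As a rough illustration of the basic HS loop this record refers to — a harmony memory, memory consideration, pitch adjustment, and random selection — the following minimal Python sketch is one possible reading; the parameter names (hm_size, hmcr, par, bandwidth) and the toy objective are illustrative assumptions, not values taken from the paper.

```python
import random

def harmony_search(objective, bounds, hm_size=20, hmcr=0.9, par=0.3,
                   bandwidth=0.05, iterations=2000):
    """Minimal Harmony Search sketch: minimize `objective` over the box `bounds`."""
    dim = len(bounds)
    # Initialize the harmony memory with random solutions
    memory = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hm_size)]
    scores = [objective(h) for h in memory]

    for _ in range(iterations):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if random.random() < hmcr:                    # memory consideration
                value = random.choice(memory)[d]
                if random.random() < par:                 # pitch adjustment
                    value += random.uniform(-1, 1) * bandwidth * (hi - lo)
            else:                                         # random selection
                value = random.uniform(lo, hi)
            new.append(min(max(value, lo), hi))
        score = objective(new)
        worst = max(range(hm_size), key=lambda i: scores[i])
        if score < scores[worst]:                         # replace the worst harmony
            memory[worst], scores[worst] = new, score

    best = min(range(hm_size), key=lambda i: scores[i])
    return memory[best], scores[best]

# Example: minimize a simple quadratic in three variables
print(harmony_search(lambda x: sum(v * v for v in x), [(-5, 5)] * 3))
```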

  12. The Search for Neutrino-less Double-Beta Decay: A Decade of Discovery or Despair?

    CERN. Geneva

    2011-01-01

    The search for "neutrino-less double-bete decay" decay in candidate nuclear isotopes remains a central focus in contemporary particle physics, with the main goal of establishing whether the neutrino is its own anti-particle. A positive detection would also establish the presence of lepton number violation in this decay, and suggest the existence of processes beyond the Standard Model and reach of terrestrial accelerators. With the discovery and quantitative assessment of neutrino flavor oscillation, guaranteeing the presence of a non-zero neutrino mass – a requirement for "neutrino-less double-bete decay" decay to occur – motivation has surged. In a review of the present diverse and vigorous current experimental situation, I must focus on just a few approaches and candidate isotopes, in particular on 136Xe and a new experimental effort, NEXT, exploiting the unfamiliar phenomenon of electroluminescence. But, even if the neutrino is its own anti-particle, experiments may see no decays! Stil...

  13. Heuristic method for searching global maximum of multimodal unknown function

    Kamei, K; Araki, Y; Inoue, K

    1983-06-01

    The method is composed of three kinds of searches. They are called the g (grasping)-mode search, f (finding)-mode search and c (confirming)-mode search. In the g-mode and c-mode searches, a heuristic method is used which was extracted from the search behaviors of human subjects. In the f-mode search, the simplex method is used, which is well known as a search method for unimodal unknown functions. Each mode search and its transitions are shown in the form of a flowchart. The numerical results for one-dimensional through six-dimensional multimodal functions prove the proposed search method to be an effective one. 11 references.

  14. A feature-based approach for best arm identification in the case of the Monte Carlo search algorithm discovery for one-player games

    Taralla, David

    2013-01-01

    The field of reinforcement learning recently received a contribution from Ernst et al. (2013), "Monte Carlo search algorithm discovery for one player games", which introduced a new way to conceive completely new algorithms. Moreover, it brought an automatic method to find the best algorithm to use in a particular situation using a multi-arm bandit approach. We address here the problem of best arm identification. The main problem is that the generated algorithm space (i.e., the arm space) can be qui...

  15. Macro cell assisted cell discovery method for 5G mobile networks

    Marcano, Andrea; Christiansen, Henrik Lehrmann

    2016-01-01

    , and requires a new system design. The aspects concerning the impact of using mmWave frequencies on the medium access (MAC) layer are one of the topics that need to be further analyzed. In this article we focus on the cell discovery process of the MAC layer for mmWave communications. A new approach assuming...... a joint search of the user equipment (UE) between the mmWave small cell (SC) and the macro cell (MC) is proposed. The performance of this method is analyzed and compared with existing methods. The results show that using the MC as aid during the search process can allow for up to 99% improvement in terms...

  16. Complementary Value of Databases for Discovery of Scholarly Literature: A User Survey of Online Searching for Publications in Art History

    Nemeth, Erik

    2010-01-01

    Discovery of academic literature through Web search engines challenges the traditional role of specialized research databases. Creation of literature outside academic presses and peer-reviewed publications expands the content for scholarly research within a particular field. The resulting body of literature raises the question of whether scholars…

  17. NEW DISCOVERIES FROM THE ARECIBO 327 MHz DRIFT PULSAR SURVEY RADIO TRANSIENT SEARCH

    Deneva, J. S. [National Research Council, resident at the Naval Research Laboratory, Washington, DC 20375 (United States); Stovall, K. [Department of Physics and Astronomy, University of New Mexico, Albuquerque, NM 87131 (United States); McLaughlin, M. A.; Bagchi, M.; Garver-Daniels, N. [Department of Physics and Astronomy, West Virginia University, Morgantown, WV 26506 (United States); Bates, S. D. [The Institute of Mathematical Sciences, Chennai, 600113 (India); Freire, P. C. C.; Martinez, J. G. [Max-Planck-Institut für Radioastronomie, Bonn (Germany); Jenet, F. [Center for Advanced Radio Astronomy, Department of Physics and Astronomy, University of Texas at Brownsville, Brownsville, TX 78520 (United States)

    2016-04-10

    We present Clusterrank, a new algorithm for identifying dispersed astrophysical pulses. Such pulses are commonly detected from Galactic pulsars and rotating radio transients (RRATs), which are neutron stars with sporadic radio emission. More recently, isolated, highly dispersed pulses dubbed fast radio bursts (FRBs) have been identified as the potential signature of an extragalactic cataclysmic radio source distinct from pulsars and RRATs. Clusterrank helped us discover 14 pulsars and 8 RRATs in data from the Arecibo 327 MHz Drift Pulsar Survey (AO327). The new RRATs have DMs in the range 23.5–86.6 pc cm{sup −3} and periods in the range 0.172–3.901 s. The new pulsars have DMs in the range 23.6–133.3 pc cm{sup −3} and periods in the range 1.249–5.012 s, and include two nullers and a mode-switching object. We estimate an upper limit on the all-sky FRB rate of 10{sup 5} day{sup −1} for bursts with a width of 10 ms and flux density ≳83 mJy. The DMs of all new discoveries are consistent with a Galactic origin. In comparing statistics of the new RRATs with sources from the RRATalog, we find that both sets are drawn from the same period distribution. In contrast, we find that the period distribution of the new pulsars is different from the period distributions of canonical pulsars in the ATNF catalog or pulsars found in AO327 data by a periodicity search. This indicates that Clusterrank is a powerful complement to periodicity searches and uncovers a subset of the pulsar population that has so far been underrepresented in survey results and therefore in Galactic pulsar population models.

  18. NEW DISCOVERIES FROM THE ARECIBO 327 MHz DRIFT PULSAR SURVEY RADIO TRANSIENT SEARCH

    Deneva, J. S.; Stovall, K.; McLaughlin, M. A.; Bagchi, M.; Garver-Daniels, N.; Bates, S. D.; Freire, P. C. C.; Martinez, J. G.; Jenet, F.

    2016-01-01

    We present Clusterrank, a new algorithm for identifying dispersed astrophysical pulses. Such pulses are commonly detected from Galactic pulsars and rotating radio transients (RRATs), which are neutron stars with sporadic radio emission. More recently, isolated, highly dispersed pulses dubbed fast radio bursts (FRBs) have been identified as the potential signature of an extragalactic cataclysmic radio source distinct from pulsars and RRATs. Clusterrank helped us discover 14 pulsars and 8 RRATs in data from the Arecibo 327 MHz Drift Pulsar Survey (AO327). The new RRATs have DMs in the range 23.5–86.6 pc cm −3 and periods in the range 0.172–3.901 s. The new pulsars have DMs in the range 23.6–133.3 pc cm −3 and periods in the range 1.249–5.012 s, and include two nullers and a mode-switching object. We estimate an upper limit on the all-sky FRB rate of 10 5  day −1 for bursts with a width of 10 ms and flux density ≳83 mJy. The DMs of all new discoveries are consistent with a Galactic origin. In comparing statistics of the new RRATs with sources from the RRATalog, we find that both sets are drawn from the same period distribution. In contrast, we find that the period distribution of the new pulsars is different from the period distributions of canonical pulsars in the ATNF catalog or pulsars found in AO327 data by a periodicity search. This indicates that Clusterrank is a powerful complement to periodicity searches and uncovers a subset of the pulsar population that has so far been underrepresented in survey results and therefore in Galactic pulsar population models

  19. Augmenting collider searches and enhancing discovery potentials through stochastic jet grooming

    Roy, Tuhin S.; Thalapillil, Arun M.

    2017-04-01

    The jet trimming procedure has been demonstrated to greatly improve event reconstruction in hadron collisions by mitigating contamination due to initial state radiation, multiple interactions, and event pileup. Meanwhile, Qjets—a nondeterministic approach to tree-based jet substructure—has been shown to be a powerful technique in decreasing random statistical fluctuations, yielding significant effective luminosity improvements. This manifests through an improvement in the significance S/δB, relative to conventional methods. Qjets also provides novel observables in many cases, like mass-volatility, that could be used to further discriminate between signal and background events. The statistical robustness and volatility observables, for tagging, are obtained simultaneously. We explore here a combination of the two techniques, and demonstrate that significant enhancements in discovery potentials may be obtained in nontrivial ways. We will illustrate this by considering a diboson resonance analysis as a case study, enabling us to interpolate between scenarios where the gains are purely due to statistical robustness and scenarios where the gains are also reinforced by volatility variable discriminants. The former, for instance, is applicable to digluon/diquark resonances, while the latter will be of relevance to di-W±/di-Z0 resonances, where the boosted vector bosons are decaying hadronically and have an intrinsic mass scale attached to them. We argue that one can enhance signal significance and discovery potentials markedly through stochastic grooming, and help augment studies at the Large Hadron Collider and future hadron colliders.

  20. Comparison tomography relocation hypocenter grid search and guided grid search method in Java island

    Nurdian, S. W.; Adu, N.; Palupi, I. R.; Raharjo, W.

    2016-01-01

    The main data in this research are earthquake data recorded from 1952 to 2012, with 9162 P-wave arrivals and 2426 events recorded by 30 stations located around Java island. Hypocenters were relocated using the grid search and guided grid search methods. The relocated hypocenters then became input for a tomographic pseudo-bending inversion process, which can be used to identify the velocity distribution in the subsurface. The results of hypocenter relocation by the grid search and guided grid search methods after the tomography process are shown both locally and globally. In the local area, the grid search result is better than the guided grid search according to the geology of the research area. But over the global area, the result of the guided grid search method is better for a broad region, because the velocity variation it recovers is more diverse than that of the other method and in accordance with local geological conditions. (paper)
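
    For orientation, a minimal sketch of the plain grid-search relocation idea follows: trial hypocenters on a regular grid are scored by the misfit between observed and predicted P arrival times, and the node with the smallest RMS residual is kept (a guided grid search would then refine the grid around that node). The travel-time function, station list, and grid spacing are placeholder assumptions, not the survey's actual configuration.

```python
import itertools
import math

def grid_search_hypocenter(stations, observed_times, travel_time, grid):
    """Return the trial (x, y, z, t0) minimizing the RMS arrival-time residual.

    stations        -- list of station coordinates (x, y, z)
    observed_times  -- list of observed P arrival times, one per station
    travel_time     -- function(station, source) -> predicted travel time
    grid            -- three iterables of candidate x, y, z values
    """
    best, best_rms = None, math.inf
    for x, y, z in itertools.product(*grid):
        source = (x, y, z)
        predicted = [travel_time(sta, source) for sta in stations]
        # For a fixed location, the best origin time is the mean residual
        t0 = sum(o - p for o, p in zip(observed_times, predicted)) / len(stations)
        rms = math.sqrt(sum((o - (p + t0)) ** 2
                            for o, p in zip(observed_times, predicted)) / len(stations))
        if rms < best_rms:
            best, best_rms = (x, y, z, t0), rms
    return best, best_rms
```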

  1. Representation Methods in AI. Searching by Graphs

    Angel GARRIDO

    2012-12-01

    Full Text Available The historical origin of Artificial Intelligence (AI) is usually established at the Dartmouth Conference of 1956. But we can find many more arcane origins [1]. Also, we can consider, in more recent times, very great thinkers, such as Janos Neumann (later John von Neumann, after arriving in the USA), Norbert Wiener, Alan Mathison Turing, or Lotfi Zadeh, for instance [6, 7]. Frequently AI requires Logic. But its classical version shows too many insufficiencies. So, it was necessary to introduce more sophisticated tools, such as fuzzy logic, modal logic, non-monotonic logic and so on [2]. Among the things that AI needs to represent are: categories, objects, properties, relations between objects, situations, states, time, events, causes and effects, knowledge about knowledge, and so on. The problems in AI can be classified in two general types [3, 4]: search problems and representation problems. In this last "mountain", there exist different ways to reach its summit. So, we have [3]: logics, rules, frames, associative nets, scripts and so on, many times connected among them. We attempt, in this paper, a panoramic vision of the scope of application of such Representation Methods in AI. The two most disputable questions of both modern philosophy of mind and AI are the Turing Test and the Chinese Room Argument. To elucidate these very difficult questions, see both final Appendices.

  2. DES meets Gaia: discovery of strongly lensed quasars from a multiplet search

    Agnello, A.; et al.

    2017-11-10

    We report the discovery, spectroscopic confirmation and first lens models of the first two strongly lensed quasars from a combined search in WISE and Gaia over the DES footprint. The four-image lens WGD2038-4008 (r.a.=20:38:02.65, dec.=-40:08:14.64) has source- and lens-redshifts $z_{s}=0.777 \pm 0.001$ and $z_l = 0.230 \pm 0.002$ respectively. Its deflector has effective radius $R_{\rm eff} \approx 3.4^{\prime\prime}$, stellar mass $\log(M_{\star}/M_{\odot}) = 11.64^{+0.20}_{-0.43}$, and shows extended isophotal shape variation. Simple lens models yield Einstein radii $R_{\rm E}=(1.30\pm0.04)^{\prime\prime},$ axis ratio $q=0.75\pm0.1$ (compatible with that of the starlight) and considerable shear-ellipticity degeneracies. The two-image lens WGD2021-4115 (r.a.=20:21:39.45, dec.=-41:15:57.11) has $z_{s}=1.390\pm0.001$ and $z_l = 0.335 \pm 0.002$, and Einstein radius $R_{\rm E} = (1.1\pm0.1)^{\prime\prime},$ but higher-resolution imaging is needed to accurately separate the deflector and faint quasar image. We also show high-rank candidate doubles selected this way, some of which have been independently identified with different techniques, and discuss a DES+WISE quasar multiplet selection.

  3. Emerging Computational Methods for the Rational Discovery of Allosteric Drugs.

    Wagner, Jeffrey R; Lee, Christopher T; Durrant, Jacob D; Malmstrom, Robert D; Feher, Victoria A; Amaro, Rommie E

    2016-06-08

    Allosteric drug development holds promise for delivering medicines that are more selective and less toxic than those that target orthosteric sites. To date, the discovery of allosteric binding sites and lead compounds has been mostly serendipitous, achieved through high-throughput screening. Over the past decade, structural data has become more readily available for larger protein systems and more membrane protein classes (e.g., GPCRs and ion channels), which are common allosteric drug targets. In parallel, improved simulation methods now provide better atomistic understanding of the protein dynamics and cooperative motions that are critical to allosteric mechanisms. As a result of these advances, the field of predictive allosteric drug development is now on the cusp of a new era of rational structure-based computational methods. Here, we review algorithms that predict allosteric sites based on sequence data and molecular dynamics simulations, describe tools that assess the druggability of these pockets, and discuss how Markov state models and topology analyses provide insight into the relationship between protein dynamics and allosteric drug binding. In each section, we first provide an overview of the various method classes before describing relevant algorithms and software packages.

  4. Efficient protein structure search using indexing methods.

    Kim, Sungchul; Sael, Lee; Yu, Hwanjo

    2013-01-01

    Understanding functions of proteins is one of the most important challenges in many studies of biological processes. The function of a protein can be predicted by analyzing the functions of structurally similar proteins, thus finding structurally similar proteins accurately and efficiently from a large set of proteins is crucial. A protein structure can be represented as a vector by the 3D-Zernike Descriptor (3DZD), which compactly represents the surface shape of the protein tertiary structure. This simplified representation accelerates the searching process. However, computing the similarity of two protein structures is still computationally expensive, thus it is hard to efficiently process many simultaneous requests of structurally similar protein search. This paper proposes indexing techniques which substantially reduce the search time to find structurally similar proteins. In particular, we first exploit two indexing techniques, i.e., iDistance and iKernel, on the 3DZDs. After that, we extend the techniques to further improve the search speed for protein structures. The extended indexing techniques build and utilize a reduced index constructed from the first few attributes of the 3DZDs of protein structures. To retrieve the top-k similar structures, the top-10 × k similar structures are first found using the reduced index, and the top-k structures are selected among them. We also modify the indexing techniques to support θ-based nearest neighbor search, which returns data points whose distance to the query point is less than θ. The results show that both iDistance and iKernel significantly enhance the searching speed. In top-k nearest neighbor search, the searching time is reduced by 69.6%, 77%, 77.4% and 87.9%, respectively, using iDistance, iKernel, the extended iDistance, and the extended iKernel. In θ-based nearest neighbor search, the searching time is reduced by 80%, 81%, 95.6% and 95.6% using iDistance, iKernel, the extended iDistance, and the extended iKernel, respectively.
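
    For orientation only, here is a brute-force version of the two query types mentioned above (top-k and θ-based nearest-neighbor search over descriptor vectors); the real iDistance/iKernel indexes exist precisely to avoid this linear scan, and the random 121-dimensional vectors are placeholders standing in for actual 3DZDs.

```python
import numpy as np

def top_k_neighbors(query, descriptors, k=10):
    """Return indices and distances of the k descriptors closest to `query` (Euclidean)."""
    dists = np.linalg.norm(descriptors - query, axis=1)
    order = np.argsort(dists)[:k]
    return order, dists[order]

def theta_neighbors(query, descriptors, theta):
    """Return indices of all descriptors within distance `theta` of `query`."""
    dists = np.linalg.norm(descriptors - query, axis=1)
    return np.nonzero(dists <= theta)[0]

# Toy usage with random "3DZD-like" vectors (placeholder data)
db = np.random.rand(1000, 121)
q = np.random.rand(121)
print(top_k_neighbors(q, db, k=5))
print(theta_neighbors(q, db, theta=3.0))
```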

  5. A projection and density estimation method for knowledge discovery.

    Adam Stanski

    Full Text Available A key ingredient to modern data analysis is probability density estimation. However, it is well known that the curse of dimensionality prevents a proper estimation of densities in high dimensions. The problem is typically circumvented by using a fixed set of assumptions about the data, e.g., by assuming partial independence of features, data on a manifold or a customized kernel. These fixed assumptions limit the applicability of a method. In this paper we propose a framework that uses a flexible set of assumptions instead. It allows a model to be tailored to various problems by means of 1d-decompositions. The approach achieves a fast runtime and is not limited by the curse of dimensionality as all estimations are performed in 1d-space. The wide range of applications is demonstrated with two very different real-world examples. The first is data mining software that allows the fully automatic discovery of patterns. The software is publicly available for evaluation. As a second example, an image segmentation method is realized. It achieves state-of-the-art performance on a benchmark dataset although it uses only a fraction of the training data and very simple features.

  6. search.bioPreprint: a discovery tool for cutting edge, preprint biomedical research articles [version 2; referees: 2 approved

    Carrie L. Iwema

    2016-07-01

    Full Text Available The time it takes for a completed manuscript to be published traditionally can be extremely lengthy. Article publication delay, which occurs in part due to constraints associated with peer review, can prevent the timely dissemination of critical and actionable data associated with new information on rare diseases or developing health concerns such as Zika virus. Preprint servers are open access online repositories housing preprint research articles that enable authors (1) to make their research immediately and freely available and (2) to receive commentary and peer review prior to journal submission. There is a growing movement of preprint advocates aiming to change the current journal publication and peer review system, proposing that preprints catalyze biomedical discovery, support career advancement, and improve scientific communication. While the number of articles submitted to and hosted by preprint servers is gradually increasing, there has been no simple way to identify biomedical research published in a preprint format, as they are not typically indexed and are only discoverable by directly searching the specific preprint server websites. To address this issue, we created a search engine that quickly compiles preprints from disparate host repositories and provides a one-stop search solution. Additionally, we developed a web application that bolsters the discovery of preprints by enabling each and every word or phrase appearing on any web site to be integrated with articles from preprint servers. This tool, search.bioPreprint, is publicly available at http://www.hsls.pitt.edu/resources/preprint.

  7. The search for faint radio supernova remnants in the outer Galaxy: five new discoveries

    Gerbrandt, Stephanie; Foster, Tyler J.; Kothes, Roland; Geisbüsch, Jörn; Tung, Albert

    2014-06-01

    Context. High resolution and sensitivity large-scale radio surveys of the Milky Way are critical in the discovery of very low surface brightness supernova remnants (SNRs), which may constitute a significant portion of the Galactic SNRs still unaccounted for (ostensibly the "missing SNR problem"). Aims: The overall purpose here is to present the results of a systematic, deep data-mining of the Canadian Galactic plane Survey (CGPS) for faint, extended non-thermal and polarized emission structures that are likely the shells of uncatalogued SNRs. Methods: We examine 5 × 5 degree mosaics from the entire 1420 MHz continuum and polarization dataset of the CGPS after removing unresolved "point" sources and subsequently smoothing them. Newly revealed extended emission objects are compared to similarly prepared CGPS 408 MHz continuum mosaics, as well as to source-removed mosaics from various existing radio surveys at 4.8 GHz, 2.7 GHz, and 327 MHz, to identify candidates with non-thermal emission characteristics. We integrate flux densities at each frequency to characterise the radio spectra behaviour of these candidates. We further look for mid- and high-frequency (1420 MHz, 4.8 GHz) ordered polarized emission from the limb brightened "shell"-like continuum features that the candidates sport. Finally, we use IR and optical maps to provide additional backing evidence. Results: Here we present evidence that five new objects, identified as filling all or some of the criteria above, are strong candidates for new SNRs. These five are designated by their Galactic coordinate names G108.5+11.0, G128.5+2.6, G149.5+3.2, G150.8+3.8, and G160.1-1.1. The radio spectrum of each is presented, highlighting their steepness, which is characteristic of synchrotron radiation. CGPS 1420 MHz polarization data and 4.8 GHz polarization data also provide evidence that these objects are newly discovered SNRs. These discoveries represent a significant increase in the number of SNRs known in the outer

  8. Preference vs. Authority: A Comparison of Student Searching in a Subject-Specific Indexing and Abstracting Database and a Customized Discovery Layer

    Dahlen, Sarah P. C.; Hanson, Kathlene

    2017-01-01

    Discovery layers provide a simplified interface for searching library resources. Libraries with limited finances make decisions about retaining indexing and abstracting databases when similar information is available in discovery layers. These decisions should be informed by student success at finding quality information as well as satisfaction…

  9. Job Search Methods: Consequences for Gender-based Earnings Inequality.

    Huffman, Matt L.; Torres, Lisa

    2001-01-01

    Data from adults in Atlanta, Boston, and Los Angeles (n=1,942) who searched for work using formal (ads, agencies) or informal (networks) methods indicated that type of method used did not contribute to the gender gap in earnings. Results do not support formal job search as a way to reduce gender inequality. (Contains 55 references.) (SK)

  10. An automated full-symmetry Patterson search method

    Rius, J.; Miravitlles, C.

    1987-01-01

    A full-symmetry Patterson search method is presented that performs a molecular coarse rotation search in vector space and orientation refinement using the σ function. The oriented molecule is positioned using the fast translation function τ0, which is based on the automated interpretation of τ projections using the sum function. This strategy reduces the number of Patterson-function values to be stored in the rotation search, and the use of the τ0 function minimizes the required time for the development of all probable rotation search solutions. The application of this method to five representative test examples is shown. (orig.)

  11. Maintaining the momentum of Open Search in Earth Science Data discovery

    Newman, D. J.; Lynnes, C.

    2013-12-01

    Federated Search for Earth Observation data has been a hallmark of EOSDIS (Earth Observing System Data and Information System) for two decades. Originally, the EOSDIS Version 0 system provided both data-collection-level and granule/file-level search in the mid 1990s with EOSDIS-specific socket protocols and message formats. Since that time, the advent of several standards has helped to simplify EOSDIS federated search, beginning with HTTP as the transfer protocol. Most recently, OpenSearch (www.opensearch.org) was employed for the EOS Clearinghouse (ECHO), based on a set of conventions that had been developed within the Earth Science Information Partners (ESIP) Federation. The ECHO OpenSearch API has evolved to encompass the ESIP RFC and the Open Geospatial Consortium (OGC) Open Search standard. Uptake of the ECHO Open Search API has been significant and has made ECHO accessible to client developers that found the previous ECHO SOAP API and current REST API too complex. Client adoption of the OpenSearch API appears to be largely driven by the simplicity of the OpenSearch convention. This simplicity is thus important to retain as the standard and convention evolve. For example, ECHO metrics indicate that the vast majority of ECHO users favor the following search criteria when using the REST API: spatial (bounding box, polygon, line and point), temporal (start and end time), and keywords (free text). Fewer than 10% of searches use additional constraints, particularly those requiring a controlled vocabulary, such as instrument, sensor, etc. This suggests that ongoing standardization efforts around OpenSearch usage for Earth Observation data may be more productive if oriented toward improving support for the spatial, temporal and keyword search aspects. Areas still requiring improvement include support for concrete requirements for keyword constraints, phrasal search for keyword constraints, temporal constraint relations, and terminological symmetry between search URLs
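
    To make the three favored criteria concrete, a hedged sketch of composing a query against a hypothetical OpenSearch endpoint follows; the endpoint URL and the concrete query-parameter names are assumptions (a real provider advertises its own parameter names, mapped to template parameters such as searchTerms, geo:box, time:start and time:end, in its OpenSearch description document), so this is not the actual ECHO API.

```python
from urllib.parse import urlencode

# Hypothetical endpoint and parameter names -- not the actual ECHO/CMR API.
ENDPOINT = "https://example.org/opensearch/granules.atom"

query = {
    "keyword": "sea surface temperature",   # free-text search terms
    "bbox": "-140,20,-100,50",              # spatial: west,south,east,north (degrees)
    "startTime": "2012-01-01T00:00:00Z",    # temporal: start
    "endTime": "2012-12-31T23:59:59Z",      # temporal: end
}
print(ENDPOINT + "?" + urlencode(query))
```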

  12. Fast radio burst search: cross spectrum vs. auto spectrum method

    Liu, Lei; Zheng, Weimin; Yan, Zhen; Zhang, Juan

    2018-06-01

    The search for fast radio bursts (FRBs) is a hot topic in current radio astronomy studies. In this work, we carry out a single pulse search with a very long baseline interferometry (VLBI) pulsar observation data set using both auto spectrum and cross spectrum search methods. The cross spectrum method, first proposed in Liu et al., maximizes the signal power by fully utilizing the fringe phase information of the baseline cross spectrum. The auto spectrum search method is based on the popular pulsar software package PRESTO, which extracts single pulses from the auto spectrum of each station. According to our comparison, the cross spectrum method is able to enhance the signal power and therefore extract single pulses from data contaminated by high levels of radio frequency interference (RFI), which makes it possible to carry out a search for FRBs in regular VLBI observations when RFI is present.

  13. Computational methods for a three-dimensional model of the petroleum-discovery process

    Schuenemeyer, J.H.; Bawiec, W.J.; Drew, L.J.

    1980-01-01

    A discovery-process model devised by Drew, Schuenemeyer, and Root can be used to predict the amount of petroleum to be discovered in a basin from some future level of exploratory effort: the predictions are based on historical drilling and discovery data. Because marginal costs of discovery and production are a function of field size, the model can be used to make estimates of future discoveries within deposit size classes. The modeling approach is a geometric one in which the area searched is a function of the size and shape of the targets being sought. A high correlation is assumed between the surface-projection area of the fields and the volume of petroleum. To predict how much oil remains to be found, the area searched must be computed, and the basin size and discovery efficiency must be estimated. The basin is assumed to be explored randomly rather than by pattern drilling. The model may be used to compute independent estimates of future oil at different depth intervals for a play involving multiple producing horizons. We have written FORTRAN computer programs that are used with Drew, Schuenemeyer, and Root's model to merge the discovery and drilling information and perform the necessary computations to estimate undiscovered petroleum. These programs may be modified easily for the estimation of remaining quantities of commodities other than petroleum. © 1980.

  14. Real-time earthquake monitoring using a search engine method.

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-12-04

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a fast computer search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data.
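
    The abstract does not spell out the indexing scheme, so the sketch below only illustrates the underlying matching idea — compare the observed waveform against a database of precomputed synthetics and return the source parameters of the best-fitting one; the paper's contribution is a fast search that avoids exactly this kind of exhaustive scan. All names here are assumptions.

```python
import numpy as np

def best_match(observed, database, metadata):
    """Brute-force stand-in for the fast database search.

    observed  -- 1-D array, preprocessed observed seismogram
    database  -- 2-D array, one preprocessed synthetic seismogram per row
    metadata  -- list of source-parameter dicts, aligned with the database rows
    Returns the metadata of the stored waveform most similar to the observed one.
    """
    # Normalize each trace so the dot product reduces to a correlation coefficient
    obs = (observed - observed.mean()) / (observed.std() + 1e-12)
    db = database - database.mean(axis=1, keepdims=True)
    db /= db.std(axis=1, keepdims=True) + 1e-12
    scores = db @ obs / obs.size
    i = int(np.argmax(scores))
    return metadata[i], float(scores[i])
```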

  15. Trend analysis of time-series data: A novel method for untargeted metabolite discovery

    Peters, S.; Janssen, H.-G.; Vivó-Truyols, G.

    2010-01-01

    A new strategy for biomarker discovery is presented that uses time-series metabolomics data. Data sets from samples analysed at different time points after an intervention are searched for compounds that show a meaningful trend following the intervention. Obviously, this requires new data-analytical

  16. Building maps to search the web: the method Sewcom

    Corrado Petrucco

    2002-01-01

    Full Text Available Seeking information on the Internet is becoming a necessity at school, at work and in every social sphere. Unfortunately, the difficulties inherent in the use of search engines and the use of inefficient, unconscious cognitive approaches limit their effectiveness. In this respect, we present a method, called SEWCOM, that lets you create conceptual maps through interaction with search engines.

  17. Job Search as Goal-Directed Behavior: Objectives and Methods

    Van Hoye, Greet; Saks, Alan M.

    2008-01-01

    This study investigated the relationship between job search objectives (finding a new job/turnover, staying aware of job alternatives, developing a professional network, and obtaining leverage against an employer) and job search methods (looking at job ads, visiting job sites, networking, contacting employment agencies, contacting employers, and…

  18. Prospects for SUSY discovery based on inclusive searches with the ATLAS detector

    Ventura, Andrea

    2009-01-01

    The search for Supersymmetry (SUSY) among the possible scenarios of new physics is one of the most relevant goals of the ATLAS experiment running at CERN's Large Hadron Collider. In the present work the expected prospects for discovering SUSY with the ATLAS detector are reviewed, in particular for the first fb -1 of collected integrated luminosity. All studies and results reported here are based on inclusive search analyses realized with Monte Carlo signal and background data simulated through the ATLAS apparatus.

  19. SemaTyP: a knowledge graph based literature mining method for drug discovery.

    Sang, Shengtian; Yang, Zhihao; Wang, Lei; Liu, Xiaoxia; Lin, Hongfei; Wang, Jian

    2018-05-30

    Drug discovery is the process through which potential new medicines are identified. High-throughput screening and computer-aided drug discovery/design are currently the two main drug discovery methods, and they have successfully discovered a series of drugs. However, development of new drugs is still an extremely time-consuming and expensive process. Biomedical literature contains important clues for the identification of potential treatments. It could support experts in biomedicine on their way towards new discoveries. Here, we propose a biomedical knowledge graph-based drug discovery method called SemaTyP, which discovers candidate drugs for diseases by mining published biomedical literature. We first construct a biomedical knowledge graph with the relations extracted from biomedical abstracts; then a logistic regression model is trained by learning the semantic types of paths of known drug therapies existing in the biomedical knowledge graph; finally the learned model is used to discover drug therapies for new diseases. The experimental results show that our method could not only effectively discover new drug therapies for new diseases, but also provide the potential mechanism of action of the candidate drugs. In this paper we propose a novel knowledge graph based literature mining method for drug discovery. It could be a supplementary method for current drug discovery methods.

  20. BEST: Next-Generation Biomedical Entity Search Tool for Knowledge Discovery from Biomedical Literature.

    Sunwon Lee

    Full Text Available As the volume of publications rapidly increases, searching for relevant information from the literature becomes more challenging. To complement standard search engines such as PubMed, it is desirable to have an advanced search tool that directly returns relevant biomedical entities such as targets, drugs, and mutations rather than a long list of articles. Some existing tools submit a query to PubMed and process retrieved abstracts to extract information at query time, resulting in a slow response time and limited coverage of only a fraction of the PubMed corpus. Other tools preprocess the PubMed corpus to speed up the response time; however, they are not constantly updated, and thus produce outdated results. Further, most existing tools cannot process sophisticated queries such as searches for mutations that co-occur with query terms in the literature. To address these problems, we introduce BEST, a biomedical entity search tool. BEST returns, as a result, a list of 10 different types of biomedical entities including genes, diseases, drugs, targets, transcription factors, miRNAs, and mutations that are relevant to a user's query. To the best of our knowledge, BEST is the only system that processes free text queries and returns up-to-date results in real time including mutation information in the results. BEST is freely accessible at http://best.korea.ac.kr.

  1. Method of Improving Personal Name Search in Academic Information Service

    Heejun Han

    2012-12-01

    Full Text Available All academic information on the web or elsewhere has its creator, that is, a subject who has created the information. The subject can be an individual, a group, or an institution, and can be a nation depending on the nature of the relevant information. Most information is composed of a title, an author, and contents. An essay which is under the academic information category has metadata including a title, an author, keyword, abstract, data about publication, place of publication, ISSN, and the like. A patent has metadata including the title, an applicant, an inventor, an attorney, IPC, number of application, and claims of the invention. Most web-based academic information services enable users to search the information by processing the meta-information. An important element is to search information by using the author field which corresponds to a personal name. This study suggests a method of efficient indexing and using the adjacent operation result ranking algorithm to which phrase search-based boosting elements are applied, and thus improving the accuracy of the search results of personal names. It also describes a method for providing the results of searching co-authors and related researchers in searching personal names. This method can be effectively applied to providing accurate and additional search results in the academic information services.

  2. Graph-Based Methods for Discovery Browsing with Semantic Predications

    Wilkowski, Bartlomiej; Fiszman, Marcelo; Miller, Christopher M

    2011-01-01

    . Poorly understood relationships may be explored through novel points of view, and potentially interesting relationships need not be known ahead of time. In a process of "cooperative reciprocity" the user iteratively focuses system output, thus controlling the large number of relationships often generated...... in literature-based discovery systems. The underlying technology exploits SemRep semantic predications represented as a graph of interconnected nodes (predication arguments) and edges (predicates). The system suggests paths in this graph, which represent chains of relationships. The methodology is illustrated...

  3. Integrated Proteomic Pipeline Using Multiple Search Engines for a Proteogenomic Study with a Controlled Protein False Discovery Rate.

    Park, Gun Wook; Hwang, Heeyoun; Kim, Kwang Hoe; Lee, Ju Yeon; Lee, Hyun Kyoung; Park, Ji Yeong; Ji, Eun Sun; Park, Sung-Kyu Robin; Yates, John R; Kwon, Kyung-Hoon; Park, Young Mok; Lee, Hyoung-Joo; Paik, Young-Ki; Kim, Jin Young; Yoo, Jong Shin

    2016-11-04

    In the Chromosome-Centric Human Proteome Project (C-HPP), false-positive identification by peptide spectrum matches (PSMs) after database searches is a major issue for proteogenomic studies using liquid-chromatography and mass-spectrometry-based large proteomic profiling. Here we developed a simple strategy for protein identification, with a controlled false discovery rate (FDR) at the protein level, using an integrated proteomic pipeline (IPP) that consists of four engrailed steps as follows. First, using three different search engines, SEQUEST, MASCOT, and MS-GF+, individual proteomic searches were performed against the neXtProt database. Second, the search results from the PSMs were combined using statistical evaluation tools including DTASelect and Percolator. Third, the peptide search scores were converted into E-scores normalized using an in-house program. Last, ProteinInferencer was used to filter the proteins containing two or more peptides with a controlled FDR of 1.0% at the protein level. Finally, we compared the performance of the IPP to a conventional proteomic pipeline (CPP) for protein identification using a controlled FDR of <1% at the protein level. Using the IPP, a total of 5756 proteins (vs 4453 using the CPP) including 477 alternative splicing variants (vs 182 using the CPP) were identified from human hippocampal tissue. In addition, a total of 10 missing proteins (vs 7 using the CPP) were identified with two or more unique peptides, and their tryptic peptides were validated using MS/MS spectral pattern from a repository database or their corresponding synthetic peptides. This study shows that the IPP effectively improved the identification of proteins, including alternative splicing variants and missing proteins, in human hippocampal tissues for the C-HPP. All RAW files used in this study were deposited in ProteomeXchange (PXD000395).
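
    As background for the final FDR-controlled filtering step, here is a generic target-decoy protein-level filter requiring two or more supporting peptides, as in the pipeline described above; it is a textbook sketch under assumed inputs, not the IPP's ProteinInferencer or its E-score normalization.

```python
def filter_proteins(proteins, fdr_threshold=0.01):
    """Generic target-decoy protein-level FDR filter (illustrative only).

    proteins -- list of (protein_id, score, is_decoy, n_peptides) tuples,
                where a higher score means a better identification.
    Returns the accepted target protein IDs at the requested FDR.
    """
    # Require two or more supporting peptides per protein
    candidates = [p for p in proteins if p[3] >= 2]
    candidates.sort(key=lambda p: p[1], reverse=True)

    accepted, targets, decoys = [], 0, 0
    for prot_id, score, is_decoy, n_peptides in candidates:
        if is_decoy:
            decoys += 1
        else:
            targets += 1
        # Estimated FDR among everything accepted so far
        if decoys / max(targets, 1) > fdr_threshold:
            break
        if not is_decoy:
            accepted.append(prot_id)
    return accepted
```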

  4. Remarks on search methods for stable, massive, elementary particles

    Perl, Martin L.

    2001-01-01

    This paper was presented at the 69th birthday celebration of Professor Eugene Commins, honoring his research achievements. These remarks are about the experimental techniques used in the search for new stable, massive particles, particles at least as massive as the electron. A variety of experimental methods such as accelerator experiments, cosmic ray studies, searches for halo particles in the galaxy and searches for exotic particles in bulk matter are described. A summary is presented of the measured limits on the existence of new stable, massive particles.

  5. ARSTEC, Nonlinear Optimization Program Using Random Search Method

    Rasmuson, D. M.; Marshall, N. H.

    1979-01-01

    1 - Description of problem or function: The ARSTEC program was written to solve nonlinear, mixed integer, optimization problems. An example of such a problem in the nuclear industry is the allocation of redundant parts in the design of a nuclear power plant to minimize plant unavailability. 2 - Method of solution: The technique used in ARSTEC is the adaptive random search method. The search is started from an arbitrary point in the search region and every time a point that improves the objective function is found, the search region is centered at that new point. 3 - Restrictions on the complexity of the problem: Presently, the maximum number of independent variables allowed is 10. This can be changed by increasing the dimension of the arrays
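
    A minimal continuous-variable sketch of the adaptive random search idea in point 2 follows — sample around the current center and re-center the search region whenever an improving point is found; ARSTEC itself also handles mixed-integer variables, which this illustration omits. The parameter names and the toy objective are assumptions.

```python
import random

def adaptive_random_search(objective, bounds, radius=0.5, samples=50, rounds=200):
    """Re-center the search region on each improving point (illustrative, not ARSTEC)."""
    center = [random.uniform(lo, hi) for lo, hi in bounds]
    best = objective(center)
    for _ in range(rounds):
        for _ in range(samples):
            trial = [min(max(c + random.uniform(-radius, radius) * (hi - lo), lo), hi)
                     for c, (lo, hi) in zip(center, bounds)]
            value = objective(trial)
            if value < best:                 # improvement: move the search region here
                center, best = trial, value
    return center, best

# Example: minimize a shifted quadratic in three variables
print(adaptive_random_search(lambda x: sum((v - 1.0) ** 2 for v in x), [(-10, 10)] * 3))
```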

  6. A method for untriggered time-dependent searches for multiple flares from neutrino point sources

    Gora, D.; Bernardini, E.; Cruz Silva, A.H.

    2011-04-01

    A method for a time-dependent search for flaring astrophysical sources which can be potentially detected by large neutrino experiments is presented. The method uses a time-clustering algorithm combined with an unbinned likelihood procedure. By including in the likelihood function a signal term which describes the contribution of many small clusters of signal-like events, this method provides an effective way for looking for weak neutrino flares over different time-scales. The method is sensitive to an overall excess of events distributed over several flares which are not individually detectable. For standard cases (one flare) the discovery potential of the method is worse than a standard time-dependent point source analysis with unknown duration of the flare by a factor depending on the signal-to-background level. However, for flares sufficiently shorter than the total observation period, the method is more sensitive than a time-integrated analysis. (orig.)
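
    For context, a common starting point for such analyses is the standard unbinned point-source likelihood for $N$ events, with per-event signal and background probability densities $S_i$ and $B_i$ and $n_s$ fitted signal events; the time-clustering method described above extends the signal term to sums over candidate flare windows (the exact form used in the paper is not reproduced here):

    $$\mathcal{L}(n_s) = \prod_{i=1}^{N}\left[\frac{n_s}{N}\,S_i + \left(1-\frac{n_s}{N}\right)B_i\right]$$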

  7. A method for untriggered time-dependent searches for multiple flares from neutrino point sources

    Gora, D. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Institute of Nuclear Physics PAN, Cracow (Poland); Bernardini, E.; Cruz Silva, A.H. [Institute of Nuclear Physics PAN, Cracow (Poland)

    2011-04-15

    A method for a time-dependent search for flaring astrophysical sources which can be potentially detected by large neutrino experiments is presented. The method uses a time-clustering algorithm combined with an unbinned likelihood procedure. By including in the likelihood function a signal term which describes the contribution of many small clusters of signal-like events, this method provides an effective way for looking for weak neutrino flares over different time-scales. The method is sensitive to an overall excess of events distributed over several flares which are not individually detectable. For standard cases (one flare) the discovery potential of the method is worse than a standard time-dependent point source analysis with unknown duration of the flare by a factor depending on the signal-to-background level. However, for flares sufficiently shorter than the total observation period, the method is more sensitive than a time-integrated analysis. (orig.)

  8. NASA's GeneLab Phase II: Federated Search and Data Discovery

    Berrios, Daniel C.; Costes, Sylvain V.; Tran, Peter B.

    2017-01-01

    GeneLab is currently being developed by NASA to accelerate 'open science' biomedical research in support of the human exploration of space and the improvement of life on earth. Phase I of the four-phase GeneLab Data Systems (GLDS) project emphasized capabilities for submission, curation, search, and retrieval of genomics, transcriptomics and proteomics ('omics') data from biomedical research of space environments. The focus of development of the GLDS for Phase II has been federated data search for and retrieval of these kinds of data across other open-access systems, so that users are able to conduct biological meta-investigations using data from a variety of sources. Such meta-investigations are key to corroborating findings from many kinds of assays and translating them into systems biology knowledge and, eventually, therapeutics.

  9. NASAs GeneLab Phase II: Federated Search and Data Discovery

    Berrios, Daniel C.; Costes, Sylvain; Tran, Peter

    2017-01-01

    GeneLab is currently being developed by NASA to accelerate open science biomedical research in support of the human exploration of space and the improvement of life on earth. Phase I of the four-phase GeneLab Data Systems (GLDS) project emphasized capabilities for submission, curation, search, and retrieval of genomics, transcriptomics and proteomics (omics) data from biomedical research of space environments. The focus of development of the GLDS for Phase II has been federated data search for and retrieval of these kinds of data across other open-access systems, so that users are able to conduct biological meta-investigations using data from a variety of sources. Such meta-investigations are key to corroborating findings from many kinds of assays and translating them into systems biology knowledge and, eventually, therapeutics.

  10. Working with Data: Discovering Knowledge through Mining and Analysis; Systematic Knowledge Management and Knowledge Discovery; Text Mining; Methodological Approach in Discovering User Search Patterns through Web Log Analysis; Knowledge Discovery in Databases Using Formal Concept Analysis; Knowledge Discovery with a Little Perspective.

    Qin, Jian; Jurisica, Igor; Liddy, Elizabeth D.; Jansen, Bernard J; Spink, Amanda; Priss, Uta; Norton, Melanie J.

    2000-01-01

    These six articles discuss knowledge discovery in databases (KDD). Topics include data mining; knowledge management systems; applications of knowledge discovery; text and Web mining; text mining and information retrieval; user search patterns through Web log analysis; concept analysis; data collection; and data structure inconsistency. (LRW)

  11. Distributed Information Search and Retrieval for Astronomical Resource Discovery and Data Mining

    Murtagh, Fionn; Guillaume, Damien

    Information search and retrieval has become by nature a distributed task. We look at tools and techniques which are of importance in this area. Current technological evolution can be summarized as the growing stability and cohesiveness of distributed architectures of searchable objects. The objects themselves are more often than not multimedia, including published articles or grey literature reports, yellow page services, image data, catalogs, presentation and online display materials, and ``operations'' information such as scheduling and publicly accessible proposal information. The evolution towards distributed architectures, protocols and formats, and the direction of our own work, are focussed on in this paper.

  12. Discovery Mondays "Particle collisions - searching for a needle in a haystack"

    2007-01-01

    Simulation of a collision in the ALICE detector. One of the great challenges facing the LHC experiments is how to find an interesting "needle" interaction in a "haystack" of data. The accelerator will generate up to 600 million proton collisions per second. Although the frequency of lead-ion collisions in the ALICE detector will be lower, ten times more data will be generated than in proton-proton collisions since each ion contains 82 protons and 126 neutrons. Each collision will produce, on average, 40,000 particles, so in the space of one month the experiment will potentially accumulate up to one petabyte (1015 bytes) of data! But the key question is how do you go about sorting, selecting and processing such colossal quantities of information? This challenge will be met by a state-of-the-art data acquisition, transmission, storage and processing chain. Come to the next Discovery Monday to find out about all the links in this ground-breaking chain. The event will be conducte...

  13. Discovery of pyridine-based agrochemicals by using Intermediate Derivatization Methods.

    Guan, Ai-Ying; Liu, Chang-Ling; Sun, Xu-Feng; Xie, Yong; Wang, Ming-An

    2016-02-01

    Pyridine-based compounds have been playing a crucial role as agrochemicals or pesticides including fungicides, insecticides/acaricides and herbicides, etc. Since most of the agrochemicals listed in the Pesticide Manual were discovered through screening programs that relied on trial-and-error testing and new agrochemical discovery is not benefiting as much from the in silico new chemical compound identification/discovery techniques used in pharmaceutical research, it has become more important to find new methods to enhance the efficiency of discovering novel lead compounds in the agrochemical field to shorten the time of research phases in order to meet changing market requirements. In this review, we selected 18 representative known agrochemicals containing a pyridine moiety and extrapolate their discovery from the perspective of Intermediate Derivatization Methods in the hope that this approach will have greater appeal to researchers engaged in the discovery of agrochemicals and/or pharmaceuticals. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Developing a Data Discovery Tool for Interdisciplinary Science: Leveraging a Web-based Mapping Application and Geosemantic Searching

    Albeke, S. E.; Perkins, D. G.; Ewers, S. L.; Ewers, B. E.; Holbrook, W. S.; Miller, S. N.

    2015-12-01

    The sharing of data and results is paramount for advancing scientific research. The Wyoming Center for Environmental Hydrology and Geophysics (WyCEHG) is a multidisciplinary group that is driving scientific breakthroughs to help manage water resources in the Western United States. WyCEHG is mandated by the National Science Foundation (NSF) to share their data. However, the infrastructure from which to share such diverse, complex and massive amounts of data did not exist within the University of Wyoming. We developed an innovative framework to meet the data organization, sharing, and discovery requirements of WyCEHG by integrating both open and closed source software, embedded metadata tags, semantic web technologies, and a web-mapping application. The infrastructure uses a Relational Database Management System as the foundation, providing a versatile platform to store, organize, and query myriad datasets, taking advantage of both structured and unstructured formats. Detailed metadata are fundamental to the utility of datasets. We tag data with Uniform Resource Identifiers (URIs) to specify concepts with formal descriptions (i.e. semantic ontologies), thus allowing users to search metadata based on the intended context rather than conventional keyword searches. Additionally, WyCEHG data are geographically referenced. Using the ArcGIS API for Javascript, we developed a web mapping application leveraging database-linked spatial data services, providing a means to visualize and spatially query available data in an intuitive map environment. Using server-side scripting (PHP), the mapping application, in conjunction with semantic search modules, dynamically communicates with the database and file system, providing access to available datasets. Our approach provides a flexible, comprehensive infrastructure from which to store and serve WyCEHG's highly diverse research-based data. This framework has not only allowed WyCEHG to meet its data stewardship

  15. Methodologies of Knowledge Discovery from Data and Data Mining Methods in Mechanical Engineering

    Rogalewicz Michał

    2016-12-01

    Full Text Available The paper contains a review of methodologies for the process of knowledge discovery from data and of methods of data exploration (Data Mining), which are the most frequently used in mechanical engineering. The methodologies comprise various scenarios of data exploration, with DM methods used within their scope. The paper shows premises for the use of DM methods in industry, as well as their advantages and disadvantages. The development of methodologies of knowledge discovery from data is also presented, along with a classification of the most widespread Data Mining methods, divided by the type of task realized. The paper is summarized by a presentation of selected Data Mining applications in mechanical engineering.

  16. Implementation Of Haversine Formula And Best First Search Method In Searching Of Tsunami Evacuation Route

    Anisya; Yoga Swara, Ganda

    2017-12-01

    Padang is one of the cities prone to earthquake and tsunami disasters due to its position at the meeting of two active plates, which is a source of potentially powerful earthquakes and tsunamis. The central government and most offices are located in the red zone (vulnerable areas), which will also affect the evacuation of the population during an earthquake and tsunami disaster. In this study, the researchers produced a system for finding the nearest shelter using the best-first-search method. This method uses a heuristic function combining the cost already incurred and an estimated value based on travel time, path length and population density. To calculate the length of the path, the researchers used the haversine formula. The values obtained from the calculation process are implemented in a web-based system. Some alternative paths and some of the closest shelters will be displayed in the system.
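
    For reference, the haversine great-circle distance and a greedy best-first search are sketched below; the graph structure, node names, and the use of straight-line haversine distance as the heuristic are illustrative assumptions, not the paper's exact weighting of travel time, path length, and population density.

```python
import heapq
import itertools
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two points given in decimal degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi, dlam = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

def best_first_search(start, goal, neighbors, heuristic):
    """Greedy best-first search: always expand the frontier node with the
    lowest heuristic value (e.g. haversine distance to the nearest shelter)."""
    counter = itertools.count()          # tie-breaker so the heap never compares nodes
    frontier = [(heuristic(start), next(counter), start, [start])]
    visited = set()
    while frontier:
        _, _, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for nxt in neighbors(node):
            if nxt not in visited:
                heapq.heappush(frontier, (heuristic(nxt), next(counter), nxt, path + [nxt]))
    return None

# e.g. straight-line distance from a point in central Padang to a hypothetical shelter
print(round(haversine_km(-0.95, 100.35, -0.92, 100.40), 2), "km")
```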

  17. The commission errors search and assessment (CESA) method

    Reer, B.; Dang, V. N

    2007-05-15

    Errors of Commission (EOCs) refer to the performance of inappropriate actions that aggravate a situation. In Probabilistic Safety Assessment (PSA) terms, they are human failure events that result from the performance of an action. This report presents the Commission Errors Search and Assessment (CESA) method and describes the method in the form of user guidance. The purpose of the method is to identify risk-significant situations with a potential for EOCs in a predictive analysis. The main idea underlying the CESA method is to catalog the key actions that are required in the procedural response to plant events and to identify specific scenarios in which these candidate actions could erroneously appear to be required. The catalog of required actions provides a basis for a systematic search of context-action combinations. To focus the search towards risk-significant scenarios, the actions that are examined in the CESA search are prioritized according to the importance of the systems and functions that are affected by these actions. The existing PSA provides this importance information; the Risk Achievement Worth or Risk Increase Factor values indicate the systems/functions for which an EOC contribution would be more significant. In addition, the contexts, i.e. PSA scenarios, for which the EOC opportunities are reviewed are also prioritized according to their importance (top sequences or cut sets). The search through these context-action combinations results in a set of EOC situations to be examined in detail. CESA has been applied in a plant-specific pilot study, which showed the method to be feasible and effective in identifying plausible EOC opportunities. This experience, as well as the experience with other EOC analyses, showed that the quantification of EOCs remains an issue. The quantification difficulties and the outlook for their resolution conclude the report. (author)

  18. The commission errors search and assessment (CESA) method

    Reer, B.; Dang, V. N.

    2007-05-01

    Errors of Commission (EOCs) refer to the performance of inappropriate actions that aggravate a situation. In Probabilistic Safety Assessment (PSA) terms, they are human failure events that result from the performance of an action. This report presents the Commission Errors Search and Assessment (CESA) method and describes the method in the form of user guidance. The purpose of the method is to identify risk-significant situations with a potential for EOCs in a predictive analysis. The main idea underlying the CESA method is to catalog the key actions that are required in the procedural response to plant events and to identify specific scenarios in which these candidate actions could erroneously appear to be required. The catalog of required actions provides a basis for a systematic search of context-action combinations. To focus the search towards risk-significant scenarios, the actions that are examined in the CESA search are prioritized according to the importance of the systems and functions that are affected by these actions. The existing PSA provides this importance information; the Risk Achievement Worth or Risk Increase Factor values indicate the systems/functions for which an EOC contribution would be more significant. In addition, the contexts, i.e. PSA scenarios, for which the EOC opportunities are reviewed are also prioritized according to their importance (top sequences or cut sets). The search through these context-action combinations results in a set of EOC situations to be examined in detail. CESA has been applied in a plant-specific pilot study, which showed the method to be feasible and effective in identifying plausible EOC opportunities. This experience, as well as the experience with other EOC analyses, showed that the quantification of EOCs remains an issue. The quantification difficulties and the outlook for their resolution conclude the report. (author)

  19. Fuzzy Search Method for Hi Education Information Security

    Grigory Grigorevich Novikov

    2016-03-01

    The main purpose of the research is to show how a fuzzy search method can be used for the information security of Hi Education or similar purposes. Many sensitive information leaks occur through the legal publishing of non-classified documents, which is why intelligence services are so fond of the «mosaic» information collection method. This article is about how to prevent it.

  20. Geometrical Fuzzy Search Method for the Business Information Security Systems

    Grigory Grigorievich Novikov

    2014-12-01

    The main purpose of the article is to show how a new fuzzy search method can be used for the information security of businesses and other purposes. Many sensitive information leaks occur through the legal publishing of non-classified documents, which is why intelligence services are so fond of the "mosaic" information collection method. This article is about how to prevent it.

  1. Non-Adiabatic Molecular Dynamics Methods for Materials Discovery

    Furche, Filipp [Univ. of California, Irvine, CA (United States); Parker, Shane M. [Univ. of California, Irvine, CA (United States); Muuronen, Mikko J. [Univ. of California, Irvine, CA (United States); Roy, Saswata [Univ. of California, Irvine, CA (United States)

    2017-04-04

    The flow of radiative energy in light-driven materials such as photosensitizer dyes or photocatalysts is governed by non-adiabatic transitions between electronic states and cannot be described within the Born-Oppenheimer approximation commonly used in electronic structure theory. The non-adiabatic molecular dynamics (NAMD) methods based on Tully surface hopping and time-dependent density functional theory developed in this project have greatly extended the range of molecular materials that can be tackled by NAMD simulations. New algorithms to compute molecular excited state and response properties efficiently were developed. Fundamental limitations of common non-linear response methods were discovered and characterized. Methods for accurate computations of vibronic spectra of materials such as black absorbers were developed and applied. It was shown that open-shell TDDFT methods capture bond breaking in NAMD simulations, a longstanding challenge for single-reference molecular dynamics simulations. The methods developed in this project were applied to study the photodissociation of acetaldehyde and revealed that non-adiabatic effects are experimentally observable in fragment kinetic energy distributions. Finally, the project enabled the first detailed NAMD simulations of photocatalytic water oxidation by titania nanoclusters, uncovering the mechanism of this fundamentally important reaction for fuel generation and storage.

  2. The Discovery of Processing Stages: Extension of Sternberg's Method

    Anderson, John R; Zhang, Qiong; Borst, Jelmer P; Walsh, Matthew M

    2016-01-01

    We introduce a method for measuring the number and durations of processing stages from the electroencephalographic signal and apply it to the study of associative recognition. Using an extension of past research that combines multivariate pattern analysis with hidden semi-Markov models, the approach

  3. The Effect of Discovery Learning Method Application on Increasing Students' Listening Outcome and Social Attitude

    Hanafi

    2016-01-01

    The Curriculum of 2013 has been introduced in schools appointed as its implementers. For the English subject, this curriculum demands that students improve their skills. To reach this goal, one of the suggested methods is discovery learning, since this method is considered appropriate for increasing the students' ability, especially to fulfill minimum…

  4. Study on boundary search method for DFM mesh generation

    Li Ri

    2012-08-01

    The boundary mesh of the casting model was determined by direct calculation on the triangular facets extracted from the STL file of the 3D model. The inner and outer grids of the model were then identified by an algorithm we named the Inner Seed Grid Method. Finally, a program to automatically generate a 3D FDM mesh was compiled. In the paper, a method named the Triangle Contraction Search Method (TCSM) is put forward to ensure that no boundary grids are lost, together with an algorithm that searches for inner seed grids to identify the inner/outer grids of the casting model. The algorithm is simple, clear, and easy to implement in a program. Three examples of casting mesh generation confirmed the validity of the program.

  5. The Use of Resistivity Methods in Terrestrial Forensic Searches

    Wolf, R. C.; Raisuddin, I.; Bank, C.

    2013-12-01

    The increasing use of near-surface geophysical methods in forensic searches has demonstrated the need for further studies to identify the ideal physical, environmental and temporal settings for each geophysical method. Previous studies using resistivity methods have shown promising results, but additional work is required to more accurately interpret and analyze survey findings. The Ontario Provincial Police's UCRT (Urban Search and Rescue; Chemical, Biological, Radiological, Nuclear and Explosives; Response Team) is collaborating with the University of Toronto and two additional universities in a multi-year study investigating the applications of near-surface geophysical methods to terrestrial forensic searches. In the summer of 2012, on a test site near Bolton, Ontario, the OPP buried weapons, drums and pigs (naked, tarped, and clothed) to simulate clandestine graves and caches. Our study aims to conduct repeat surveys using an IRIS Syscal Junior resistivity meter with a 48-electrode switching system. These surveys will monitor changes in resistivity reflecting decomposition of the object since burial, and identify the strengths and weaknesses of resistivity when used in a rural, clandestine burial setting. Our initial findings indicate the usefulness of this method, as prominent resistivity changes have been observed. We anticipate our results will assist law enforcement agencies in determining the type of resistivity results to expect based on time since burial, depth of burial and state of dress of the body.

  6. Exploration of Stellarator Configuration Space with Global Search Methods

    Mynick, H.E.; Pomphrey, N.; Ethier, S.

    2001-01-01

    An exploration of stellarator configuration space z for quasi-axisymmetric stellarator (QAS) designs is discussed, using methods which provide a more global view of that space. To this end, we have implemented a ''differential evolution'' (DE) search algorithm in an existing stellarator optimizer, which is much less prone to become trapped in local, suboptimal minima of the cost function chi than the local search methods used previously. This search algorithm is complemented by mapping studies of chi over z aimed at gaining insight into the results of the automated searches. We find that a wide range of the attractive QAS configurations previously found fall into a small number of classes, with each class corresponding to a basin of chi(z). We develop maps on which these earlier stellarators can be placed, the relations among them seen, and understanding gained into the physics differences between them. It is also found that, while still large, the region of z space containing practically realizable QAS configurations is much smaller than earlier supposed
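    As an illustration of the search strategy mentioned above, the following is a minimal differential evolution (DE/rand/1/bin) loop minimising a generic cost function chi(z); the stellarator figure of merit, parameter bounds, and DE settings of the actual optimizer are not reproduced, so the toy quadratic objective below is purely a stand-in.

```python
import numpy as np

def differential_evolution(chi, bounds, pop_size=20, F=0.8, CR=0.9, generations=200, seed=0):
    """Minimal DE/rand/1/bin loop minimising a scalar cost function chi(z)."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    cost = np.array([chi(z) for z in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # pick three distinct vectors other than the current one
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True        # guarantee at least one mutated component
            trial = np.where(cross, mutant, pop[i])
            c_trial = chi(trial)
            if c_trial <= cost[i]:                 # greedy selection keeps the better vector
                pop[i], cost[i] = trial, c_trial
    best = np.argmin(cost)
    return pop[best], cost[best]

# Toy cost function standing in for the stellarator figure of merit chi(z).
print(differential_evolution(lambda z: np.sum((z - 0.3) ** 2), bounds=[(-1, 1)] * 4))
```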

  7. Assessment of the effectiveness of uranium deposit searching methods

    Suran, J.

    1998-01-01

    The following groups of uranium deposit searching methods are described: radiometric review of foreign work; aerial radiometric survey; automobile radiometric survey; emanation survey up to 1 m; emanation survey up to 2 m; ground radiometric survey; radiometric survey in pits; deep radiometric survey; combination of the above methods; and other methods (drilling survey). For vein-type deposits, the majority of Czech deposits were discovered in 1945-1965 by radiometric review of foreign work, automobile radiometric survey, and emanation survey up to 1 m. The first significant indications of sandstone type uranium deposits were observed in the mid-1960s by aerial radiometric survey and confirmed later by drilling. (P.A.)

  8. Bayesian methods in the search for MH370

    Davey, Sam; Holland, Ian; Rutten, Mark; Williams, Jason

    2016-01-01

    This book demonstrates how nonlinear/non-Gaussian Bayesian time series estimation methods were used to produce a probability distribution of potential MH370 flight paths. It provides details of how the probabilistic models of aircraft flight dynamics, satellite communication system measurements, environmental effects and radar data were constructed and calibrated. The probability distribution was used to define the search zone in the southern Indian Ocean. The book describes particle-filter based numerical calculation of the aircraft flight-path probability distribution and validates the method using data from several of the involved aircraft’s previous flights. Finally it is shown how the Reunion Island flaperon debris find affects the search probability distribution.
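    The flight-path reconstruction described above rests on particle-filter estimation. A minimal bootstrap particle filter for a toy one-dimensional random-walk model is sketched below to illustrate the predict/weight/resample cycle; the aircraft dynamics, satellite measurement models, and priors used in the actual analysis are far more elaborate and are not reproduced here.

```python
import numpy as np

def bootstrap_particle_filter(measurements, n_particles=1000, q=1.0, r=2.0, seed=0):
    """Generic bootstrap particle filter for x_k = x_{k-1} + w_k, z_k = x_k + v_k.
    This toy 1-D model only illustrates the predict/weight/resample cycle."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 10.0, n_particles)                 # diffuse prior
    estimates = []
    for z in measurements:
        particles = particles + rng.normal(0.0, q, n_particles)    # predict
        weights = np.exp(-0.5 * ((z - particles) / r) ** 2)        # weight by likelihood
        weights /= weights.sum()
        idx = rng.choice(n_particles, n_particles, p=weights)      # resample
        particles = particles[idx]
        estimates.append(particles.mean())
    return np.array(estimates)

# Synthetic truth and noisy observations.
truth = np.cumsum(np.random.default_rng(1).normal(0, 1, 50))
obs = truth + np.random.default_rng(2).normal(0, 2, 50)
print(bootstrap_particle_filter(obs)[-5:])
```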

  9. Integration of first-principles methods and crystallographic database searches for new ferroelectrics: Strategies and explorations

    Bennett, Joseph W.; Rabe, Karin M.

    2012-01-01

    In this concept paper, the development of strategies for the integration of first-principles methods with crystallographic database mining for the discovery and design of novel ferroelectric materials is discussed, drawing on the results and experience derived from exploratory investigations on three different systems: (1) the double perovskite Sr(Sb1/2Mn1/2)O3 as a candidate semiconducting ferroelectric; (2) polar derivatives of schafarzikite MSb2O4; and (3) ferroelectric semiconductors with formula M2P2(S,Se)6. A variety of avenues for further research and investigation are suggested, including automated structure type classification, low-symmetry improper ferroelectrics, and high-throughput first-principles searches for additional representatives of structural families with desirable functional properties. Graphical abstract: Integration of first-principles methods with crystallographic database mining, for the discovery and design of novel ferroelectric materials, could potentially lead to new classes of multifunctional materials. Highlights: integration of first-principles methods and database mining; minor structural families with desirable functional properties; survey of polar entries in the Inorganic Crystal Structural Database.

  10. Polyphony: superposition independent methods for ensemble-based drug discovery.

    Pitt, William R; Montalvão, Rinaldo W; Blundell, Tom L

    2014-09-30

    Structure-based drug design is an iterative process, following cycles of structural biology, computer-aided design, synthetic chemistry and bioassay. In favorable circumstances, this process can lead to hundreds of protein-ligand crystal structures. In addition, molecular dynamics simulations are increasingly being used to further explore the conformational landscape of these complexes. Currently, methods capable of the analysis of ensembles of crystal structures and MD trajectories are limited and usually rely upon least squares superposition of coordinates. Novel methodologies are described for the analysis of multiple structures of a protein. Statistical approaches that rely upon residue equivalence, but not superposition, are developed. Tasks that can be performed include the identification of hinge regions, allosteric conformational changes and transient binding sites. The approaches are tested on crystal structures of CDK2 and other CMGC protein kinases and a simulation of p38α. Known interaction-conformational change relationships are highlighted, and new ones are revealed. A transient but druggable allosteric pocket in CDK2 is predicted to occur under the CMGC insert. Furthermore, an evolutionarily-conserved conformational link from the location of this pocket, via the αEF-αF loop, to phosphorylation sites on the activation loop is discovered. New methodologies are described and validated for the superimposition independent conformational analysis of large collections of structures or simulation snapshots of the same protein. The methodologies are encoded in a Python package called Polyphony, which is released as open source to accompany this paper [http://wrpitt.bitbucket.org/polyphony/].

  11. Cumulative query method for influenza surveillance using search engine data.

    Seo, Dong-Woo; Jo, Min-Woo; Sohn, Chang Hwan; Shin, Soo-Yong; Lee, JaeHo; Yu, Maengsoo; Kim, Won Young; Lim, Kyoung Soo; Lee, Sang-Il

    2014-12-16

    Internet search queries have become an important data source in syndromic surveillance systems. However, there is currently no syndromic surveillance system using Internet search query data in South Korea. The objective of this study was to examine correlations between our cumulative query method and national influenza surveillance data. Our study was based on the local search engine, Daum (approximately 25% market share), and influenza-like illness (ILI) data from the Korea Centers for Disease Control and Prevention. A quota sampling survey was conducted with 200 participants to obtain popular queries. We divided the study period into two sets: Set 1 (the 2009/10 epidemiological year for development set 1 and 2010/11 for validation set 1) and Set 2 (2010/11 for development Set 2 and 2011/12 for validation Set 2). Pearson's correlation coefficients were calculated between the Daum data and the ILI data for the development set. We selected the combined queries for which the correlation coefficients were .7 or higher and listed them in descending order. Then, we created cumulative query methods, with n representing the number of combined queries accumulated in descending order of the correlation coefficient. In validation set 1, 13 cumulative query methods were applied, and 8 had higher correlation coefficients (min=.916, max=.943) than that of the highest single combined query. Further, 11 of 13 cumulative query methods had an r value of ≥.7, whereas only 4 of 13 combined queries had an r value of ≥.7. In validation set 2, 8 of 15 cumulative query methods showed higher correlation coefficients (min=.975, max=.987) than that of the highest single combined query. All 15 cumulative query methods had an r value of ≥.7, whereas only 6 of 15 combined queries had an r value of ≥.7. The cumulative query method showed higher correlations with national influenza surveillance data than the combined queries in both the development and validation sets.
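    A rough sketch of the ranking-and-accumulation idea described above follows: single combined queries are ranked by their Pearson correlation with ILI data, and the top-n queries are summed to form each cumulative series. The weekly data, the handling of the 0.7 threshold, and the column layout are hypothetical illustrations, not the study's actual pipeline.

```python
import numpy as np
from scipy.stats import pearsonr

def cumulative_query_correlations(query_matrix, ili, threshold=0.7):
    """query_matrix: weekly volumes, one column per combined query; ili: weekly ILI rates.
    Returns correlations of the cumulative-sum series (top-1, top-2, ...) with ILI."""
    r_single = np.array([pearsonr(query_matrix[:, j], ili)[0] for j in range(query_matrix.shape[1])])
    keep = np.where(r_single >= threshold)[0]
    order = keep[np.argsort(-r_single[keep])]          # descending correlation
    results = []
    for n in range(1, len(order) + 1):
        cumulative = query_matrix[:, order[:n]].sum(axis=1)
        results.append((n, pearsonr(cumulative, ili)[0]))
    return results

# Hypothetical data: 52 weeks, 5 candidate combined queries.
rng = np.random.default_rng(0)
ili = np.abs(np.sin(np.linspace(0, 3, 52))) * 10
queries = ili[:, None] * rng.uniform(0.5, 1.5, (52, 5)) + rng.normal(0, 1, (52, 5))
print(cumulative_query_correlations(queries, ili))
```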

  12. KNODWAT: a scientific framework application for testing knowledge discovery methods for the biomedical domain.

    Holzinger, Andreas; Zupan, Mario

    2013-06-13

    Professionals in the biomedical domain are confronted with an increasing mass of data. Developing methods to assist professional end users in the field of Knowledge Discovery to identify, extract, visualize and understand useful information from these huge amounts of data is a huge challenge. However, there are so many diverse methods and methodologies available, that for biomedical researchers who are inexperienced in the use of even relatively popular knowledge discovery methods, it can be very difficult to select the most appropriate method for their particular research problem. A web application, called KNODWAT (KNOwledge Discovery With Advanced Techniques) has been developed, using Java on Spring framework 3.1. and following a user-centered approach. The software runs on Java 1.6 and above and requires a web server such as Apache Tomcat and a database server such as the MySQL Server. For frontend functionality and styling, Twitter Bootstrap was used as well as jQuery for interactive user interface operations. The framework presented is user-centric, highly extensible and flexible. Since it enables methods for testing using existing data to assess suitability and performance, it is especially suitable for inexperienced biomedical researchers, new to the field of knowledge discovery and data mining. For testing purposes two algorithms, CART and C4.5 were implemented using the WEKA data mining framework.

  13. Virtual screening methods as tools for drug lead discovery from large chemical libraries.

    Ma, X H; Zhu, F; Liu, X; Shi, Z; Zhang, J X; Yang, S Y; Wei, Y Q; Chen, Y Z

    2012-01-01

    Virtual screening methods have been developed and explored as useful tools for searching for drug lead compounds from chemical libraries, including large libraries that have become publicly available. In this review, we discuss the new developments in exploring virtual screening methods for enhanced performance in searching large chemical libraries, their applications in screening libraries of ~ 1 million or more compounds in the last five years, the difficulties in their applications, and the strategies for further improving these methods.

  14. Improving Junior High School Students' Mathematical Analogical Ability Using Discovery Learning Method

    Maarif, Samsul

    2016-01-01

    The aim of this study was to identify the influence of discovery learning method towards the mathematical analogical ability of junior high school's students. This is a research using factorial design 2x2 with ANOVA-Two ways. The population of this research included the entire students of SMPN 13 Jakarta (State Junior High School 13 of Jakarta)…

  15. A Fast Radio Burst Search Method for VLBI Observation

    Liu, Lei; Tong, Fengxian; Zheng, Weimin; Zhang, Juan; Tong, Li

    2018-02-01

    We introduce the cross-spectrum-based fast radio burst (FRB) search method for Very Long Baseline Interferometer (VLBI) observation. This method optimizes the fringe fitting scheme in geodetic VLBI data post-processing, which fully utilizes the cross-spectrum fringe phase information and therefore maximizes the power of single-pulse signals. Working with cross-spectrum greatly reduces the effect of radio frequency interference compared with using auto-power spectrum. Single-pulse detection confidence increases by cross-identifying detections from multiple baselines. By combining the power of multiple baselines, we may improve the detection sensitivity. Our method is similar to that of coherent beam forming, but without the computational expense to form a great number of beams to cover the whole field of view of our telescopes. The data processing pipeline designed for this method is easy to implement and parallelize, which can be deployed in various kinds of VLBI observations. In particular, we point out that VGOS observations are very suitable for FRB search.
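    The core idea above (combining cross-spectrum power over baselines to boost single-pulse significance) can be illustrated with a toy sketch: each station's voltage stream is channelised, baseline cross-spectra are formed per time bin, and their magnitudes are summed into a detection statistic. The channelisation, calibration, and fringe rotation of a real VLBI pipeline are omitted, and all data below are synthetic.

```python
import numpy as np

def cross_spectrum_snr(voltages, nchan=64):
    """voltages: dict station -> complex baseband time series (already aligned/calibrated).
    Returns a robust per-time-bin detection statistic summed over all baselines."""
    stations = list(voltages)
    nbin = len(voltages[stations[0]]) // nchan
    # Channelise each station: shape (nbin, nchan)
    spec = {s: np.fft.fft(voltages[s][:nbin * nchan].reshape(nbin, nchan), axis=1) for s in stations}
    stat = np.zeros(nbin)
    for i in range(len(stations)):
        for j in range(i + 1, len(stations)):
            cross = spec[stations[i]] * np.conj(spec[stations[j]])   # cross-spectrum per bin
            stat += np.abs(cross.sum(axis=1))                        # sum over frequency, then baselines
    mad = np.median(np.abs(stat - np.median(stat)))
    return (stat - np.median(stat)) / (1.4826 * mad)                 # robust z-score

# Toy example: two stations seeing a common pulse buried in independent noise.
rng = np.random.default_rng(0)
common = np.zeros(64 * 256, dtype=complex)
common[64 * 100:64 * 101] = 5.0
v = {s: common + rng.normal(0, 1, common.size) + 1j * rng.normal(0, 1, common.size) for s in ("A", "B")}
print(np.argmax(cross_spectrum_snr(v)))  # time-bin index of the candidate pulse
```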

  16. Bioanalytical methods for food allergy diagnosis, allergen detection and new allergen discovery

    Gasilova, Natalia; Girault, Hubert H

    2015-01-01

    For effective monitoring and prevention of the food allergy, one of the emerging health problems nowadays, existing diagnostic procedures and allergen detection techniques are constantly improved. Meanwhile, new methods are also developed, and more and more putative allergens are discovered. This review describes traditional methods and summarizes recent advances in the fast evolving field of the in vitro food allergy diagnosis, allergen detection in food products and discovery of the new all...

  17. IMPROVING NEAREST NEIGHBOUR SEARCH IN 3D SPATIAL ACCESS METHOD

    A. Suhaibaha

    2016-10-01

    Nearest Neighbour (NN) search is one of the important queries and analyses for spatial applications. In normal practice, a spatial access method structure is used during Nearest Neighbour query execution to retrieve information from the database. However, most spatial access method structures still face unresolved issues such as overlap among nodes and repetitive data entries. This leads to excessive Input/Output (IO) operations, which is inefficient for data retrieval, and the situation becomes more critical when dealing with 3D data, whose size is usually large due to detailed geometry and other attached information. In this research, a clustered 3D hierarchical structure is introduced as a 3D spatial access method structure, expected to improve the retrieval of Nearest Neighbour information for 3D objects. Several tests were performed for single Nearest Neighbour search and k Nearest Neighbour (kNN) search. The tests indicate that the clustered hierarchical structure handles Nearest Neighbour queries more efficiently than its competitor: it reduced repetitive data entries and the number of accessed pages, produced minimal Input/Output operations, and achieved better query response times. Finally, several possible applications of this research are discussed and summarized.
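    The clustered hierarchical structure proposed in the record is specific to that work; as a baseline illustration of a hierarchical 3D spatial index answering kNN queries, the following uses a k-d tree from SciPy over hypothetical 3D object centroids.

```python
import numpy as np
from scipy.spatial import cKDTree

# Hypothetical 3D object centroids (e.g., building centre points) and a query location.
rng = np.random.default_rng(0)
points = rng.uniform(0, 1000, size=(10000, 3))
tree = cKDTree(points)                       # hierarchical spatial index built once

query = np.array([500.0, 500.0, 50.0])
dist, idx = tree.query(query, k=5)           # 5 nearest neighbours in one call
print(list(zip(idx.tolist(), np.round(dist, 2).tolist())))
```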

  18. New procedure for criticality search using coarse mesh nodal methods

    Pereira, Wanderson F.; Silva, Fernando C. da; Martinez, Aquilino S.

    2011-01-01

    The primary goal of coarse mesh nodal methods is to calculate the neutron flux inside the reactor core. Many computer systems use a specific form of calculation known as a nodal method. In classical computing systems, the criticality search is performed only after the iterative process of calculating the neutron flux has fully converged. In this paper, we propose a new method for the criticality calculation, in which the criticality condition is updated during the iterative process of calculating the neutron flux. As a result, the processing time for calculating the neutron flux was reduced by half compared with the procedure developed by the Nuclear Engineering Program of COPPE/UFRJ (PEN/COPPE/UFRJ). (author)
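    A toy sketch of the idea described above, updating the criticality condition during the outer flux iterations rather than after full convergence, is given below for a one-group problem; the operators, the uniform-absorber search variable, and the proportional update step are all illustrative assumptions, not the nodal formulation of the paper.

```python
import numpy as np

def criticality_search(L, F, absorber, target_k=1.0, outer_iters=200, gain=0.5):
    """Toy one-group sketch of a criticality search performed during the outer flux
    iterations: a uniform absorber amount c is adjusted every outer iteration so that
    k_eff is driven towards target_k. L (losses), F (fission production) and the
    absorber operator are illustrative, not an actual nodal formulation."""
    n = L.shape[0]
    phi = np.ones(n)
    k, c = 1.0, 0.0
    for _ in range(outer_iters):
        A = L + c * absorber
        phi_new = np.linalg.solve(A, F @ phi / k)            # one outer flux update
        k = k * (F @ phi_new).sum() / (F @ phi).sum()        # power-iteration estimate of k_eff
        c += gain * (k - target_k)                           # search step taken every iteration
        phi = phi_new / np.linalg.norm(phi_new)
    return k, c

# Small tridiagonal operators standing in for the coarse-mesh nodal matrices.
n = 10
L = np.diag(np.full(n, 2.0)) - np.diag(np.full(n - 1, 0.5), 1) - np.diag(np.full(n - 1, 0.5), -1)
F = 1.2 * np.eye(n)
k_eff, c = criticality_search(L, F, np.eye(n))
print(round(k_eff, 4), round(c, 4))
```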

  19. New procedure for criticality search using coarse mesh nodal methods

    Pereira, Wanderson F.; Silva, Fernando C. da; Martinez, Aquilino S., E-mail: wneto@con.ufrj.b, E-mail: fernando@con.ufrj.b, E-mail: Aquilino@lmp.ufrj.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear

    2011-07-01

    The primary goal of coarse mesh nodal methods is to calculate the neutron flux inside the reactor core. Many computer systems use a specific form of calculation known as a nodal method. In classical computing systems, the criticality search is performed only after the iterative process of calculating the neutron flux has fully converged. In this paper, we propose a new method for the criticality calculation, in which the criticality condition is updated during the iterative process of calculating the neutron flux. As a result, the processing time for calculating the neutron flux was reduced by half compared with the procedure developed by the Nuclear Engineering Program of COPPE/UFRJ (PEN/COPPE/UFRJ). (author)

  20. Work Stress Interventions in Hospital Care: Effectiveness of the DISCovery Method

    Irene Niks

    2018-02-01

    Effective interventions to prevent work stress and to improve health, well-being, and performance of employees are of the utmost importance. This quasi-experimental intervention study presents a specific method for diagnosis of psychosocial risk factors at work and subsequent development and implementation of tailored work stress interventions, the so-called DISCovery method. This method aims at improving employee health, well-being, and performance by optimizing the balance between job demands, job resources, and recovery from work. The aim of the study is to quantitatively assess the effectiveness of the DISCovery method in hospital care. Specifically, we used a three-wave longitudinal, quasi-experimental multiple-case study approach with intervention and comparison groups in health care work. Positive changes were found for members of the intervention groups, relative to members of the corresponding comparison groups, with respect to targeted work-related characteristics and targeted health, well-being, and performance outcomes. Overall, results lend support for the effectiveness of the DISCovery method in hospital care.

  1. Work Stress Interventions in Hospital Care: Effectiveness of the DISCovery Method

    Niks, Irene; Gevers, Josette

    2018-01-01

    Effective interventions to prevent work stress and to improve health, well-being, and performance of employees are of the utmost importance. This quasi-experimental intervention study presents a specific method for diagnosis of psychosocial risk factors at work and subsequent development and implementation of tailored work stress interventions, the so-called DISCovery method. This method aims at improving employee health, well-being, and performance by optimizing the balance between job demands, job resources, and recovery from work. The aim of the study is to quantitatively assess the effectiveness of the DISCovery method in hospital care. Specifically, we used a three-wave longitudinal, quasi-experimental multiple-case study approach with intervention and comparison groups in health care work. Positive changes were found for members of the intervention groups, relative to members of the corresponding comparison groups, with respect to targeted work-related characteristics and targeted health, well-being, and performance outcomes. Overall, results lend support for the effectiveness of the DISCovery method in hospital care. PMID:29438350

  2. Work Stress Interventions in Hospital Care: Effectiveness of the DISCovery Method.

    Niks, Irene; de Jonge, Jan; Gevers, Josette; Houtman, Irene

    2018-02-13

    Effective interventions to prevent work stress and to improve health, well-being, and performance of employees are of the utmost importance. This quasi-experimental intervention study presents a specific method for diagnosis of psychosocial risk factors at work and subsequent development and implementation of tailored work stress interventions, the so-called DISCovery method. This method aims at improving employee health, well-being, and performance by optimizing the balance between job demands, job resources, and recovery from work. The aim of the study is to quantitatively assess the effectiveness of the DISCovery method in hospital care. Specifically, we used a three-wave longitudinal, quasi-experimental multiple-case study approach with intervention and comparison groups in health care work. Positive changes were found for members of the intervention groups, relative to members of the corresponding comparison groups, with respect to targeted work-related characteristics and targeted health, well-being, and performance outcomes. Overall, results lend support for the effectiveness of the DISCovery method in hospital care.

  3. Global OpenSearch

    Newman, D. J.; Mitchell, A. E.

    2015-12-01

    At AGU 2014, NASA EOSDIS demonstrated a case-study of an OpenSearch framework for Earth science data discovery. That framework leverages the IDN and CWIC OpenSearch API implementations to provide seamless discovery of data through the 'two-step' discovery process as outlined by the Federation for Earth Sciences (ESIP) OpenSearch Best Practices. But how would an Earth scientist leverage this framework, and what are the benefits? Using a client that understands the OpenSearch specification and, for further clarity, the various best practices and extensions, a scientist can discover a plethora of data not normally accessible either by traditional methods (NASA Earth Data Search, Reverb, etc.) or direct methods (going to the source of the data). We will demonstrate, via the CWICSmart web client, how an Earth scientist can access regional data on regional phenomena in a uniform and aggregated manner. We will demonstrate how an Earth scientist can 'globalize' their discovery. You want to find local data on 'sea surface temperature of the Indian Ocean'? We can help you with that. 'European meteorological data'? Yes. 'Brazilian rainforest satellite imagery'? That too. CWIC allows you to get Earth science data in a uniform fashion from a large number of disparate, world-wide agencies. This is what we mean by Global OpenSearch.
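    A hedged sketch of issuing a single OpenSearch request of the kind described above follows; the endpoint URL and parameter names follow common OpenSearch Geo/Time extension conventions and are placeholders, not the exact IDN/CWIC templates, and a real two-step search would first query for collections and then for granules.

```python
import requests
import xml.etree.ElementTree as ET

# Hypothetical OpenSearch endpoint; parameter names follow the common OpenSearch
# Geo and Time extension conventions, not the exact IDN/CWIC templates.
ENDPOINT = "https://example.org/opensearch/granules.atom"

def opensearch_query(keyword, bbox, start, end, count=10):
    """Issue a free-text + bounding-box + time-range OpenSearch request and
    return (title, link) pairs parsed from the Atom response."""
    params = {
        "q": keyword,
        "geoBox": bbox,            # "west,south,east,north"
        "startDate": start,
        "endDate": end,
        "count": count,
    }
    resp = requests.get(ENDPOINT, params=params, timeout=30)
    resp.raise_for_status()
    root = ET.fromstring(resp.content)
    ns = {"atom": "http://www.w3.org/2005/Atom"}
    results = []
    for entry in root.findall("atom:entry", ns):
        title = entry.findtext("atom:title", default="(no title)", namespaces=ns)
        link = entry.find("atom:link", ns)
        results.append((title, link.get("href") if link is not None else None))
    return results

# Example: sea surface temperature granules over the Indian Ocean for 2015.
for title, href in opensearch_query("sea surface temperature", "40,-30,110,30",
                                    "2015-01-01", "2015-12-31"):
    print(title, "->", href)
```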

  4. Three-dimensional compound comparison methods and their application in drug discovery.

    Shin, Woong-Hee; Zhu, Xiaolei; Bures, Mark Gregory; Kihara, Daisuke

    2015-07-16

    Virtual screening has been widely used in the drug discovery process. Ligand-based virtual screening (LBVS) methods compare a library of compounds with a known active ligand. Two notable advantages of LBVS methods are that they do not require structural information of a target receptor and that they are faster than structure-based methods. LBVS methods can be classified based on the complexity of ligand structure information utilized: one-dimensional (1D), two-dimensional (2D), and three-dimensional (3D). Unlike 1D and 2D methods, 3D methods can have enhanced performance since they treat the conformational flexibility of compounds. In this paper, a number of 3D methods will be reviewed. In addition, four representative 3D methods were benchmarked to understand their performance in virtual screening. Specifically, we tested overall performance in key aspects including the ability to find dissimilar active compounds, and computational speed.

  5. Three-Dimensional Compound Comparison Methods and Their Application in Drug Discovery

    Woong-Hee Shin

    2015-07-01

    Virtual screening has been widely used in the drug discovery process. Ligand-based virtual screening (LBVS) methods compare a library of compounds with a known active ligand. Two notable advantages of LBVS methods are that they do not require structural information of a target receptor and that they are faster than structure-based methods. LBVS methods can be classified based on the complexity of ligand structure information utilized: one-dimensional (1D), two-dimensional (2D), and three-dimensional (3D). Unlike 1D and 2D methods, 3D methods can have enhanced performance since they treat the conformational flexibility of compounds. In this paper, a number of 3D methods will be reviewed. In addition, four representative 3D methods were benchmarked to understand their performance in virtual screening. Specifically, we tested overall performance in key aspects including the ability to find dissimilar active compounds, and computational speed.

  6. Topology optimization based on the harmony search method

    Lee, Seung-Min; Han, Seog-Young

    2017-01-01

    A new topology optimization scheme based on a Harmony search (HS) as a metaheuristic method was proposed and applied to static stiffness topology optimization problems. To apply the HS to topology optimization, the variables in HS were transformed to those in topology optimization. Compliance was used as an objective function, and harmony memory was defined as the set of the optimized topology. Also, a parametric study for Harmony memory considering rate (HMCR), Pitch adjusting rate (PAR), and Bandwidth (BW) was performed to find the appropriate range for topology optimization. Various techniques were employed such as a filtering scheme, simple average scheme and harmony rate. To provide a robust optimized topology, the concept of the harmony rate update rule was also implemented. Numerical examples are provided to verify the effectiveness of the HS by comparing the optimal layouts of the HS with those of Bidirectional evolutionary structural optimization (BESO) and Artificial bee colony algorithm (ABCA). The following conclusions could be made: (1) The proposed topology scheme is very effective for static stiffness topology optimization problems in terms of stability, robustness and convergence rate. (2) The suggested method provides a symmetric optimized topology despite the fact that the HS is a stochastic method like the ABCA. (3) The proposed scheme is applicable and practical in manufacturing since it produces a solid-void design of the optimized topology. (4) The suggested method appears to be very effective for large scale problems like topology optimization.
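    To make the HMCR/PAR/BW terminology above concrete, a minimal continuous harmony search loop is sketched below on a toy objective; the mapping of harmonies to element densities in an actual topology optimization, the filtering scheme, and the harmony rate update rule are not reproduced.

```python
import numpy as np

def harmony_search(f, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=2000, seed=0):
    """Minimal continuous Harmony Search: HMCR controls memory consideration,
    PAR the pitch adjustment probability, BW the adjustment bandwidth."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(bounds)
    memory = rng.uniform(lo, hi, (hms, dim))
    costs = np.array([f(x) for x in memory])
    for _ in range(iters):
        new = np.empty(dim)
        for d in range(dim):
            if rng.random() < hmcr:                         # take value from harmony memory
                new[d] = memory[rng.integers(hms), d]
                if rng.random() < par:                      # pitch adjustment
                    new[d] += bw * (hi[d] - lo[d]) * rng.uniform(-1, 1)
            else:                                           # random re-initialisation
                new[d] = rng.uniform(lo[d], hi[d])
        new = np.clip(new, lo, hi)
        c = f(new)
        worst = np.argmax(costs)
        if c < costs[worst]:                                # replace the worst harmony
            memory[worst], costs[worst] = new, c
    best = np.argmin(costs)
    return memory[best], costs[best]

# Toy "compliance" stand-in: a smooth bowl with the optimum at 0.5 in each variable.
print(harmony_search(lambda x: np.sum((x - 0.5) ** 2), bounds=[(0, 1)] * 5))
```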

  7. Topology optimization based on the harmony search method

    Lee, Seung-Min; Han, Seog-Young [Hanyang University, Seoul (Korea, Republic of)

    2017-06-15

    A new topology optimization scheme based on a Harmony search (HS) as a metaheuristic method was proposed and applied to static stiffness topology optimization problems. To apply the HS to topology optimization, the variables in HS were transformed to those in topology optimization. Compliance was used as an objective function, and harmony memory was defined as the set of the optimized topology. Also, a parametric study for Harmony memory considering rate (HMCR), Pitch adjusting rate (PAR), and Bandwidth (BW) was performed to find the appropriate range for topology optimization. Various techniques were employed such as a filtering scheme, simple average scheme and harmony rate. To provide a robust optimized topology, the concept of the harmony rate update rule was also implemented. Numerical examples are provided to verify the effectiveness of the HS by comparing the optimal layouts of the HS with those of Bidirectional evolutionary structural optimization (BESO) and Artificial bee colony algorithm (ABCA). The following conclusions could be made: (1) The proposed topology scheme is very effective for static stiffness topology optimization problems in terms of stability, robustness and convergence rate. (2) The suggested method provides a symmetric optimized topology despite the fact that the HS is a stochastic method like the ABCA. (3) The proposed scheme is applicable and practical in manufacturing since it produces a solid-void design of the optimized topology. (4) The suggested method appears to be very effective for large scale problems like topology optimization.

  8. Development of Pulsar Detection Methods for a Galactic Center Search

    Thornton, Stephen; Wharton, Robert; Cordes, James; Chatterjee, Shami

    2018-01-01

    Finding pulsars within the inner parsec of the galactic center would be incredibly beneficial: for pulsars sufficiently close to Sagittarius A*, extremely precise tests of general relativity in the strong field regime could be performed through measurement of post-Keplerian parameters. Binary pulsar systems with sufficiently short orbital periods could provide the same laboratories with which to test existing theories. Fast and efficient methods are needed to parse large sets of time-domain data from different telescopes to search for periodicity in signals and differentiate radio frequency interference (RFI) from pulsar signals. Here we demonstrate several techniques to reduce red noise (low-frequency interference), generate signals from pulsars in binary orbits, and create plots that allow for fast detection of both RFI and pulsars.
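    One common way to suppress red noise before a periodicity search, used here purely as an illustration rather than as the authors' specific technique, is to subtract a running median baseline from the time series (or equivalently to zap the lowest Fourier bins):

```python
import numpy as np
from scipy.ndimage import median_filter

def whiten_red_noise(timeseries, window=1001):
    """Suppress slow baseline drifts by subtracting a running median; one simple
    stand-in for the red-noise removal step in periodicity/single-pulse searches."""
    baseline = median_filter(timeseries, size=window, mode="nearest")
    return timeseries - baseline

# Toy time series: a slow drift plus a weak periodic signal plus white noise.
rng = np.random.default_rng(0)
t = np.arange(30000)
x = 5 * np.sin(2 * np.pi * t / 10000) + 1.0 * np.sin(2 * np.pi * t / 250) + rng.normal(0, 1, t.size)
clean = whiten_red_noise(x)
spectrum = np.abs(np.fft.rfft(clean)) ** 2
print(np.argmax(spectrum[1:]) + 1)   # strongest Fourier bin after whitening (~ t.size / 250)
```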

  9. Evaluation of a new method for librarian-mediated literature searches for systematic reviews

    W.M. Bramer (Wichor); Rethlefsen, M.L. (Melissa L.); F. Mast (Frans); J. Kleijnen (Jos)

    2017-01-01

    textabstractObjective: To evaluate and validate the time of completion and results of a new method of searching for systematic reviews, the exhaustive search method (ESM), using a pragmatic comparison. Methods: Single-line search strategies were prepared in a text document. Term completeness was

  10. A broken promise: microbiome differential abundance methods do not control the false discovery rate.

    Hawinkel, Stijn; Mattiello, Federico; Bijnens, Luc; Thas, Olivier

    2017-08-22

    High-throughput sequencing technologies allow easy characterization of the human microbiome, but the statistical methods to analyze microbiome data are still in their infancy. Differential abundance methods aim at detecting associations between the abundances of bacterial species and subject grouping factors. The results of such methods are important to identify the microbiome as a prognostic or diagnostic biomarker or to demonstrate efficacy of prodrug or antibiotic drugs. Because of a lack of benchmarking studies in the microbiome field, no consensus exists on the performance of the statistical methods. We have compared a large number of popular methods through extensive parametric and nonparametric simulation as well as real data shuffling algorithms. The results are consistent over the different approaches and all point to an alarming excess of false discoveries. This raises great doubts about the reliability of discoveries in past studies and imperils reproducibility of microbiome experiments. To further improve method benchmarking, we introduce a new simulation tool that allows one to generate correlated count data following any univariate count distribution; the correlation structure may be inferred from real data. Most simulation studies discard the correlation between species, but our results indicate that this correlation can negatively affect the performance of statistical methods.
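    The simulation tool mentioned above is described only at a high level; a generic way to generate correlated counts with specified marginals is a Gaussian copula, sketched below with negative binomial marginals. The parameterisation and correlation structure are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy import stats

def correlated_nb_counts(n_samples, mu, size_param, corr, seed=0):
    """Draw correlated negative-binomial counts with a Gaussian copula: sample
    correlated normals, map them to uniforms, then to NB quantiles. mu and
    size_param give each taxon's NB mean and dispersion; corr is the correlation
    matrix of the latent normals."""
    rng = np.random.default_rng(seed)
    mu = np.asarray(mu, dtype=float)
    size_param = np.asarray(size_param, dtype=float)
    z = rng.multivariate_normal(np.zeros(len(mu)), corr, size=n_samples)
    u = stats.norm.cdf(z)                                     # copula step: normals -> uniforms
    p = size_param / (size_param + mu)                        # scipy's nbinom parameterisation
    return stats.nbinom.ppf(u, size_param, p).astype(int)     # uniforms -> NB quantiles

corr = np.array([[1.0, 0.6, 0.0],
                 [0.6, 1.0, 0.0],
                 [0.0, 0.0, 1.0]])
counts = correlated_nb_counts(500, mu=[20, 50, 5], size_param=[2.0, 1.5, 0.8], corr=corr)
print(np.round(np.corrcoef(counts, rowvar=False), 2))
```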

  11. In search of new methods. Qigong in stuttering therapy

    Paweł Półrola

    2013-10-01

    Introduction: Even though stuttering is probably as old a phenomenon as human speech itself, stuttering therapy is still a challenge for the therapist and requires constant searching for new methods. Qigong may prove to be one of them. Aim of the research: The research paper presents the results of an experimental investigation evaluating the usefulness of qigong practice in stuttering therapy. Material and methods: Two groups of stuttering adults underwent 6-month therapy. In group I, the experimental one (n = 11), the therapy consisted of speech fluency training, psychotherapy and qigong practice. In group II, the control one (n = 12), it included speech fluency training and psychotherapy. In both groups, 2-hour sessions of speech fluency training and psychotherapy were conducted twice a week. Two-hour qigong sessions took place once a week. Results: After 6 months the therapy results were compared with regard to the basic stuttering parameters, such as the degree of speech disfluency, the level of logophobia and speech disfluency symptoms. Improvement was observed in both groups, the beneficial effects, however, being more prominent in the qigong-practising group. Conclusions: Qigong exercises used in the therapy of stuttering people along with speech fluency training and psychotherapy give beneficial effects.

  12. The self-organizing fractal theory as a universal discovery method: the phenomenon of life

    Kurakin Alexei

    2011-03-01

    A universal discovery method potentially applicable to all disciplines studying organizational phenomena has been developed. This method takes advantage of a new form of global symmetry, namely, scale-invariance of self-organizational dynamics of energy/matter at all levels of organizational hierarchy, from elementary particles through cells and organisms to the Universe as a whole. The method is based on an alternative conceptualization of physical reality postulating that the energy/matter comprising the Universe is far from equilibrium, that it exists as a flow, and that it develops via self-organization in accordance with the empirical laws of nonequilibrium thermodynamics. It is postulated that the energy/matter flowing through and comprising the Universe evolves as a multiscale, self-similar structure-process, i.e., as a self-organizing fractal. This means that certain organizational structures and processes are scale-invariant and are reproduced at all levels of the organizational hierarchy. Being a form of symmetry, scale-invariance naturally lends itself to a new discovery method that allows for the deduction of missing information by comparing scale-invariant organizational patterns across different levels of the organizational hierarchy. An application of the new discovery method to life sciences reveals that moving electrons represent a keystone physical force (flux) that powers, animates, informs, and binds all living structures-processes into a planetary-wide, multiscale system of electron flow/circulation, and that all living organisms and their larger-scale organizations emerge to function as electron transport networks that are supported by and, at the same time, support the flow of electrons down the Earth's redox gradient maintained along the core-mantle-crust-ocean-atmosphere axis of the planet. The presented findings lead to a radically new perspective on the nature and origin of life, suggesting that living matter

  13. The self-organizing fractal theory as a universal discovery method: the phenomenon of life.

    Kurakin, Alexei

    2011-03-29

    A universal discovery method potentially applicable to all disciplines studying organizational phenomena has been developed. This method takes advantage of a new form of global symmetry, namely, scale-invariance of self-organizational dynamics of energy/matter at all levels of organizational hierarchy, from elementary particles through cells and organisms to the Universe as a whole. The method is based on an alternative conceptualization of physical reality postulating that the energy/matter comprising the Universe is far from equilibrium, that it exists as a flow, and that it develops via self-organization in accordance with the empirical laws of nonequilibrium thermodynamics. It is postulated that the energy/matter flowing through and comprising the Universe evolves as a multiscale, self-similar structure-process, i.e., as a self-organizing fractal. This means that certain organizational structures and processes are scale-invariant and are reproduced at all levels of the organizational hierarchy. Being a form of symmetry, scale-invariance naturally lends itself to a new discovery method that allows for the deduction of missing information by comparing scale-invariant organizational patterns across different levels of the organizational hierarchy. An application of the new discovery method to life sciences reveals that moving electrons represent a keystone physical force (flux) that powers, animates, informs, and binds all living structures-processes into a planetary-wide, multiscale system of electron flow/circulation, and that all living organisms and their larger-scale organizations emerge to function as electron transport networks that are supported by and, at the same time, support the flow of electrons down the Earth's redox gradient maintained along the core-mantle-crust-ocean-atmosphere axis of the planet. The presented findings lead to a radically new perspective on the nature and origin of life, suggesting that living matter is an organizational state

  14. Computational methods for 2D materials: discovery, property characterization, and application design.

    Paul, J T; Singh, A K; Dong, Z; Zhuang, H; Revard, B C; Rijal, B; Ashton, M; Linscheid, A; Blonsky, M; Gluhovic, D; Guo, J; Hennig, R G

    2017-11-29

    The discovery of two-dimensional (2D) materials comes at a time when computational methods are mature and can predict novel 2D materials, characterize their properties, and guide the design of 2D materials for applications. This article reviews the recent progress in computational approaches for 2D materials research. We discuss the computational techniques and provide an overview of the ongoing research in the field. We begin with an overview of known 2D materials, common computational methods, and available cyber infrastructures. We then move onto the discovery of novel 2D materials, discussing the stability criteria for 2D materials, computational methods for structure prediction, and interactions of monolayers with electrochemical and gaseous environments. Next, we describe the computational characterization of the 2D materials' electronic, optical, magnetic, and superconducting properties and the response of the properties under applied mechanical strain and electrical fields. From there, we move on to discuss the structure and properties of defects in 2D materials, and describe methods for 2D materials device simulations. We conclude by providing an outlook on the needs and challenges for future developments in the field of computational research for 2D materials.

  15. Searching for Truth: Internet Search Patterns as a Method of Investigating Online Responses to a Russian Illicit Drug Policy Debate

    Zheluk, Andrey; Gillespie, James A; Quinn, Casey

    2012-01-01

    Background This is a methodological study investigating the online responses to a national debate over an important health and social problem in Russia. Russia is the largest Internet market in Europe, exceeding Germany in the absolute number of users. However, Russia is unusual in that the main search provider is not Google, but Yandex. Objective This study had two main objectives. First, to validate Yandex search patterns against those provided by Google, and second, to test this method's a...

  16. Performance comparison of a new hybrid conjugate gradient method under exact and inexact line searches

    Ghani, N. H. A.; Mohamed, N. S.; Zull, N.; Shoid, S.; Rivaie, M.; Mamat, M.

    2017-09-01

    The conjugate gradient (CG) method is one of the iterative techniques prominently used for solving unconstrained optimization problems due to its simplicity, low memory storage, and good convergence analysis. This paper presents a new hybrid conjugate gradient method, named the NRM1 method. The method is analyzed under exact and inexact line searches under given conditions. Theoretically, proofs show that the NRM1 method satisfies the sufficient descent condition with both line searches. The computational results indicate that the NRM1 method is capable of solving the standard unconstrained optimization problems used. Moreover, the NRM1 method performs better under the inexact line search than under the exact line search.
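    The NRM1 coefficient itself is not given in the record, so the sketch below uses a well-known hybrid choice, beta = max(0, min(beta_PRP, beta_FR)), together with an Armijo backtracking (inexact) line search, to illustrate the general shape of a hybrid CG iteration.

```python
import numpy as np

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=2000):
    """Generic hybrid CG with beta = max(0, min(beta_PRP, beta_FR)) and an Armijo
    backtracking (inexact) line search; the NRM1 coefficient itself is not shown."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= 0.5                                  # Armijo backtracking
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_fr = (g_new @ g_new) / (g @ g)
        beta_prp = g_new @ (g_new - g) / (g @ g)
        beta = max(0.0, min(beta_prp, beta_fr))           # hybrid choice of beta
        d = -g_new + beta * d
        if g_new @ d >= 0:                                # restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Rosenbrock test problem.
f = lambda x: 100 * (x[1] - x[0] ** 2) ** 2 + (1 - x[0]) ** 2
grad = lambda x: np.array([-400 * x[0] * (x[1] - x[0] ** 2) - 2 * (1 - x[0]),
                           200 * (x[1] - x[0] ** 2)])
print(hybrid_cg(f, grad, [-1.2, 1.0]))
```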

  17. Methods for Discovery and Surveillance of Pathogens in Hotspots of Emerging Infectious Diseases

    Jensen, Randi Holm

    Viruses are everywhere, and can infect all living things. They are constantly evolving, and new diseases are emerging as a result. Consequently, they have always been of interest to scientists and people in general. Several outbreaks of emerging infectious diseases transmitting from animals...... to virion enrichment compared to samples with no enrichment. We have used these methods to perform pathogen discovery in faecal samples collected from small mammals in Sierra Leone, to describe the presence of pathogenic viruses and bacteria in this area. From these data we were furthermore able to acquire...

  18. Systems-based biological concordance and predictive reproducibility of gene set discovery methods in cardiovascular disease.

    Azuaje, Francisco; Zheng, Huiru; Camargo, Anyela; Wang, Haiying

    2011-08-01

    The discovery of novel disease biomarkers is a crucial challenge for translational bioinformatics. Demonstration of both their classification power and reproducibility across independent datasets is an essential requirement to assess their potential clinical relevance. Small datasets and multiplicity of putative biomarker sets may explain lack of predictive reproducibility. Studies based on pathway-driven discovery approaches have suggested that, despite such discrepancies, the resulting putative biomarkers tend to be implicated in common biological processes. Investigations of this problem have been mainly focused on datasets derived from cancer research. We investigated the predictive and functional concordance of five methods for discovering putative biomarkers in four independently-generated datasets from the cardiovascular disease domain. A diversity of biosignatures was identified by the different methods. However, we found strong biological process concordance between them, especially in the case of methods based on gene set analysis. With a few exceptions, we observed lack of classification reproducibility using independent datasets. Partial overlaps between our putative sets of biomarkers and the primary studies exist. Despite the observed limitations, pathway-driven or gene set analysis can predict potentially novel biomarkers and can jointly point to biomedically-relevant underlying molecular mechanisms.

  19. A modified harmony search based method for optimal rural radial ...


  20. Bioanalytical methods for food allergy diagnosis, allergen detection and new allergen discovery.

    Gasilova, Natalia; Girault, Hubert H

    2015-01-01

    For effective monitoring and prevention of the food allergy, one of the emerging health problems nowadays, existing diagnostic procedures and allergen detection techniques are constantly improved. Meanwhile, new methods are also developed, and more and more putative allergens are discovered. This review describes traditional methods and summarizes recent advances in the fast evolving field of the in vitro food allergy diagnosis, allergen detection in food products and discovery of the new allergenic molecules. A special attention is paid to the new diagnostic methods under laboratory development like various immuno- and aptamer-based assays, including immunoaffinity capillary electrophoresis. The latter technique shows the importance of MS application not only for the allergen detection but also for the allergy diagnosis.

  1. Searching for Truth: Internet Search Patterns as a Method of Investigating Online Responses to a Russian Illicit Drug Policy Debate

    Gillespie, James A; Quinn, Casey

    2012-01-01

    Background This is a methodological study investigating the online responses to a national debate over an important health and social problem in Russia. Russia is the largest Internet market in Europe, exceeding Germany in the absolute number of users. However, Russia is unusual in that the main search provider is not Google, but Yandex. Objective This study had two main objectives. First, to validate Yandex search patterns against those provided by Google, and second, to test this method's adequacy for investigating online interest in a 2010 national debate over Russian illicit drug policy. We hoped to learn what search patterns and specific search terms could reveal about the relative importance and geographic distribution of interest in this debate. Methods A national drug debate, centering on the anti-drug campaigner Egor Bychkov, was one of the main Russian domestic news events of 2010. Public interest in this episode was accompanied by increased Internet search. First, we measured the search patterns for 13 search terms related to the Bychkov episode and concurrent domestic events by extracting data from Google Insights for Search (GIFS) and Yandex WordStat (YaW). We conducted Spearman Rank Correlation of GIFS and YaW search data series. Second, we coded all 420 primary posts from Bychkov's personal blog between March 2010 and March 2012 to identify the main themes. Third, we compared GIFS and Yandex policies concerning the public release of search volume data. Finally, we established the relationship between salient drug issues and the Bychkov episode. Results We found a consistent pattern of strong to moderate positive correlations between Google and Yandex for the terms "Egor Bychkov" (rs = 0.88, P < .001), “Bychkov” (rs = .78, P < .001) and “Khimki” (rs = 0.92, P < .001). Peak search volumes for the Bychkov episode were comparable to other prominent domestic political events during 2010. Monthly search counts were 146,689 for “Bychkov” and

  2. NMR and pattern recognition methods in metabolomics: From data acquisition to biomarker discovery: A review

    Smolinska, Agnieszka; Blanchet, Lionel; Buydens, Lutgarde M.C.; Wijmenga, Sybren S.

    2012-01-01

    Highlights: Procedures for the acquisition of different biofluids by NMR; recent developments in metabolic profiling of different biofluids by NMR; the crucial steps involved in data preprocessing and multivariate chemometric analysis; emphasis on recent findings on Multiple Sclerosis via NMR and pattern recognition methods. Abstract: Metabolomics is the discipline where endogenous and exogenous metabolites are assessed, identified and quantified in different biological samples. Metabolites are crucial components of a biological system and highly informative about its functional state, due to their closeness to functional endpoints and to the organism's phenotypes. Nuclear Magnetic Resonance (NMR) spectroscopy, next to Mass Spectrometry (MS), is one of the main metabolomics analytical platforms. The technological developments in the field of NMR spectroscopy have enabled the identification and quantitative measurement of the many metabolites in a single sample of biofluids in a non-targeted and non-destructive manner. Combination of NMR spectra of biofluids and pattern recognition methods has driven forward the application of metabolomics in the field of biomarker discovery. The importance of metabolomics in diagnostics, e.g. in identifying biomarkers or defining pathological status, has been growing exponentially as evidenced by the number of published papers. In this review, we describe the developments in data acquisition and multivariate analysis of NMR-based metabolomics data, with particular emphasis on the metabolomics of Cerebrospinal Fluid (CSF) and biomarker discovery in Multiple Sclerosis (MScl).

  3. Searching for truth: internet search patterns as a method of investigating online responses to a Russian illicit drug policy debate.

    Zheluk, Andrey; Gillespie, James A; Quinn, Casey

    2012-12-13

    This is a methodological study investigating the online responses to a national debate over an important health and social problem in Russia. Russia is the largest Internet market in Europe, exceeding Germany in the absolute number of users. However, Russia is unusual in that the main search provider is not Google, but Yandex. This study had two main objectives. First, to validate Yandex search patterns against those provided by Google, and second, to test this method's adequacy for investigating online interest in a 2010 national debate over Russian illicit drug policy. We hoped to learn what search patterns and specific search terms could reveal about the relative importance and geographic distribution of interest in this debate. A national drug debate, centering on the anti-drug campaigner Egor Bychkov, was one of the main Russian domestic news events of 2010. Public interest in this episode was accompanied by increased Internet search. First, we measured the search patterns for 13 search terms related to the Bychkov episode and concurrent domestic events by extracting data from Google Insights for Search (GIFS) and Yandex WordStat (YaW). We conducted Spearman Rank Correlation of GIFS and YaW search data series. Second, we coded all 420 primary posts from Bychkov's personal blog between March 2010 and March 2012 to identify the main themes. Third, we compared GIFS and Yandex policies concerning the public release of search volume data. Finally, we established the relationship between salient drug issues and the Bychkov episode. We found a consistent pattern of strong to moderate positive correlations between Google and Yandex for the terms "Egor Bychkov" (r(s) = 0.88, P < .001), "Bychkov" (r(s) = .78, P < .001) and "Khimki"(r(s) = 0.92, P < .001). Peak search volumes for the Bychkov episode were comparable to other prominent domestic political events during 2010. Monthly search counts were 146,689 for "Bychkov" and 48,084 for "Egor Bychkov", compared to 53
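    The validation step described above reduces to computing Spearman rank correlations between two providers' search-volume series for the same term. A minimal sketch with synthetic weekly data follows; the actual GIFS and YaW series are not reproduced.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical weekly relative search volumes for the same term from two providers.
rng = np.random.default_rng(0)
weeks = 104
interest = np.convolve(rng.poisson(5, weeks), np.ones(4) / 4, mode="same")   # shared latent interest
google = interest + rng.normal(0, 0.5, weeks)
yandex = 0.8 * interest + rng.normal(0, 0.5, weeks)

rho, p = spearmanr(google, yandex)
print(f"r_s = {rho:.2f}, P = {p:.3g}")
```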

  4. Evaluation of gene association methods for coexpression network construction and biological knowledge discovery.

    Sapna Kumari

    Full Text Available BACKGROUND: Constructing coexpression networks and performing network analysis using large-scale gene expression data sets is an effective way to uncover new biological knowledge; however, the methods used for gene association in constructing these coexpression networks have not been thoroughly evaluated. Since different methods lead to structurally different coexpression networks and provide different information, selecting the optimal gene association method is critical. METHODS AND RESULTS: In this study, we compared eight gene association methods - Spearman rank correlation, Weighted Rank Correlation, Kendall, Hoeffding's D measure, Theil-Sen, Rank Theil-Sen, Distance Covariance, and Pearson - and focused on their true knowledge discovery rates in associating pathway genes and in constructing coordination networks of regulatory genes. We also examined the behavior of the different methods on microarray data with different properties, and whether the biological processes involved affect the efficiency of the different methods. CONCLUSIONS: We found that the Spearman, Hoeffding and Kendall methods are effective in identifying coexpressed pathway genes, whereas the Theil-Sen, Rank Theil-Sen, Spearman, and Weighted Rank methods perform well in identifying coordinated transcription factors that control the same biological processes and traits. Surprisingly, the widely used Pearson method is generally less efficient, as is the Distance Covariance method, which can find gene pairs with multiple types of relationships. Our analyses clearly show that the Pearson and Distance Covariance methods behave distinctly from the other six methods. The efficiencies of the different methods vary with the data properties to some degree and are largely contingent on the biological processes involved, which necessitates a pre-analysis to identify the best-performing method for gene association and coexpression network construction.
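
    As an illustration of how several of the association measures named in this record behave differently, the following minimal Python sketch (not from the study; the expression values are invented) computes Pearson, Spearman, and Kendall statistics for a single gene pair with SciPy:

        # Minimal sketch: comparing three of the association measures named above on a
        # single hypothetical pair of expression profiles. Values are invented.
        import numpy as np
        from scipy import stats

        gene_a = np.array([2.1, 3.4, 5.0, 7.8, 9.2, 11.5])   # hypothetical expression profile
        gene_b = np.array([1.0, 1.9, 2.5, 6.0, 6.1, 30.0])   # monotone but non-linear partner

        r_pearson, _ = stats.pearsonr(gene_a, gene_b)
        r_spearman, _ = stats.spearmanr(gene_a, gene_b)
        tau_kendall, _ = stats.kendalltau(gene_a, gene_b)

        # Rank-based measures are insensitive to the outlying last value, which is one
        # reason they can behave differently from Pearson for pathway genes.
        print(f"Pearson  r   = {r_pearson:.2f}")
        print(f"Spearman rho = {r_spearman:.2f}")
        print(f"Kendall  tau = {tau_kendall:.2f}")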

  5. Top Quark Produced Through the Electroweak Force: Discovery Using the Matrix Element Analysis and Search for Heavy Gauge Bosons Using Boosted Decision Trees

    Pangilinan, Monica [Brown Univ., Providence, RI (United States)

    2010-05-01

    The top quark produced through the electroweak channel provides a direct measurement of the Vtb element in the CKM matrix, which can be viewed as a transition rate of a top quark to a bottom quark. This production channel of the top quark is also sensitive to theories beyond the Standard Model, such as heavy charged gauge bosons termed W'. This thesis measures the cross section of the electroweak-produced top quark using a technique based on the matrix elements of the processes under consideration. The technique is applied to 2.3 fb^-1 of data from the D0 detector. From a comparison of the matrix element discriminants between data and the signal and background model using Bayesian statistics, we measure the cross section of the top quark produced through the electroweak mechanism σ(p$\bar{p}$ → tb + X, tqb + X) = 4.30$^{+0.98}_{-1.20}$ pb. The measured result corresponds to a 4.9σ Gaussian-equivalent significance. By combining this analysis with other analyses based on the Bayesian Neural Network (BNN) and Boosted Decision Tree (BDT) methods, the measured cross section is 3.94 ± 0.88 pb with a significance of 5.0σ, resulting in the discovery of electroweak-produced top quarks. Using this measured cross section and constraining |Vtb| < 1, the 95% confidence level (C.L.) lower limit is |Vtb| > 0.78. Additionally, a search is made for the production of W' using the same samples from the electroweak-produced top quark analysis. An analysis based on the BDT method is used to separate the signal from expected backgrounds. No significant excess is found, and 95% C.L. upper limits on the production cross section are set for W' with masses within 600-950 GeV. For four general models of W' boson production using the decay channel W' → t$\bar{b}$, the lower mass limits are the following: M(W'_L with SM couplings) > 840 GeV; M(W'_R) > 880 GeV or 890 GeV if the

  6. A human-machine interface evaluation method: A difficulty evaluation method in information searching (DEMIS)

    Ha, Jun Su; Seong, Poong Hyun

    2009-01-01

    A human-machine interface (HMI) evaluation method, named the 'difficulty evaluation method in information searching (DEMIS)', is proposed and demonstrated with an experimental study. The DEMIS is based on a human performance model and two measures of attentional-resource effectiveness in monitoring and detection tasks in nuclear power plants (NPPs). Operator competence and HMI design are modeled as the most significant factors affecting human performance. One of the two effectiveness measures is the fixation-to-importance ratio (FIR), which represents the attentional resource (eye fixations) spent on an information source relative to the importance of that source. The other measure is selective attention effectiveness (SAE), which incorporates the FIRs for all information sources. The underlying principle of the measures is that an information source should be selectively attended to according to its informational importance. In this study, poor performance in information searching tasks is modeled as being coupled with difficulties caused by poor operator mental models and/or poor HMI design. Human performance in information searching tasks is evaluated by analyzing the FIR and the SAE. Operator mental models are evaluated by a questionnaire-based method. Difficulties caused by a poor HMI design are then evaluated through a focused interview based on the FIR evaluation, and root causes leading to poor performance are identified in a systematic way.
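
    The record describes the FIR as a ratio of fixation share to importance share but does not give the exact SAE formula; the aggregation used below (an importance-weighted closeness of each FIR to the ideal value 1) is an illustrative assumption, not the published definition, and the numbers are invented.

        # Hedged sketch of the FIR/SAE idea described above. The SAE aggregation is an
        # assumed illustration, not the published formula; all values are hypothetical.
        def fixation_to_importance_ratio(fix_share, importance_share):
            """FIR: share of eye fixations on a source divided by its importance share."""
            return fix_share / importance_share

        def selective_attention_effectiveness(fix_shares, importance_shares):
            """Assumed SAE: importance-weighted closeness of each FIR to the ideal value 1."""
            firs = [f / w for f, w in zip(fix_shares, importance_shares)]
            return sum(w * (1.0 - abs(1.0 - fir)) for w, fir in zip(importance_shares, firs))

        fixations  = [0.50, 0.30, 0.20]   # fraction of fixations per information source
        importance = [0.60, 0.30, 0.10]   # normalized informational importance per source

        print([round(fixation_to_importance_ratio(f, w), 2)
               for f, w in zip(fixations, importance)])
        print(round(selective_attention_effectiveness(fixations, importance), 3))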

  7. A MODERN SEARCH FOR WOLF–RAYET STARS IN THE MAGELLANIC CLOUDS. II. A SECOND YEAR OF DISCOVERIES

    Massey, Philip; Neugent, Kathryn F. [Lowell Observatory, 1400 W Mars Hill Road, Flagstaff, AZ 86001 (United States); Morrell, Nidia, E-mail: phil.massey@lowell.edu, E-mail: kneugent@lowell.edu, E-mail: nmorrell@lco.cl [Las Campanas Observatory, Carnegie Observatories, Casilla 601, La Serena (Chile)

    2015-07-01

    The numbers and types of evolved massive stars found in nearby galaxies provide an exacting test of stellar evolution models. Because of their proximity and rich massive star populations, the Magellanic Clouds have long served as the linchpins for such studies. Yet the continued accidental discoveries of Wolf–Rayet (WR) stars in these systems demonstrate that our knowledge is not as complete as usually assumed. Therefore, we undertook a multi-year survey for WRs in the Magellanic Clouds. Our results from our first year (reported previously) confirmed nine new LMC WRs. Of these, six were of a type never before recognized, with WN3-type emission combined with O3-type absorption features. Yet these stars are 2–3 mag too faint to be WN3+O3 V binaries. Here we report on the second year of our survey, including the discovery of four more WRs, two of which are also WN3/O3s, plus two “slash” WRs. This brings the total of known LMC WRs to 152, 13 (8.2%) of which were found by our survey, which is now ∼60% complete. We find that the spatial distribution of the WN3/O3s is similar to that of other WRs in the LMC, suggesting that they are descended from the same progenitors. We call attention to the fact that 5 of the 12 known SMC WRs may in fact be similar WN3/O3s rather than the binaries they have often assumed to be. We also discuss our other discoveries: a newly discovered Onfp-type star, and a peculiar emission-line object. Finally, we consider the completeness limits of our survey.

  8. Search Method Based on Figurative Indexation of Folksonomic Features of Graphic Files

    Oleg V. Bisikalo

    2013-11-01

    Full Text Available In this paper, a search method based on the figurative indexation of folksonomic characteristics of graphic files is described. The method takes extralinguistic information into account and is based on a model of human figurative thinking. The paper describes the construction of a method for searching image files based on their formal features, including folksonomic clues.

  9. Searching methods for biometric identification systems: Fundamental limits

    Willems, F.M.J.

    2009-01-01

    We study two-stage search procedures for biometric identification systems in an information-theoretical setting. Our main conclusion is that clustering based on vector-quantization achieves the optimum trade-off between the number of clusters (cluster rate) and the number of individuals within a
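
    This record (truncated above) concerns information-theoretic limits rather than an implementation, but the two-stage idea it analyzes can be sketched as follows; the data, cluster count, and the use of k-means as the vector quantizer are illustrative assumptions, not the paper's construction.

        # Hedged sketch of a two-stage identification search: vector-quantization
        # clustering (here, k-means as a stand-in) narrows the search to one cluster,
        # then an exhaustive comparison runs within that cluster only. Data are random.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        enrolled = rng.normal(size=(1000, 16))            # hypothetical biometric templates
        probe = enrolled[123] + 0.05 * rng.normal(size=16)

        # Stage 1: quantize the database into clusters and find the probe's cluster.
        km = KMeans(n_clusters=32, n_init=10, random_state=0).fit(enrolled)
        cluster = km.predict(probe.reshape(1, -1))[0]
        candidates = np.where(km.labels_ == cluster)[0]

        # Stage 2: exhaustive nearest-neighbour search inside the selected cluster.
        dists = np.linalg.norm(enrolled[candidates] - probe, axis=1)
        print("identified individual:", candidates[np.argmin(dists)])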

  10. A SEARCH FOR L/T TRANSITION DWARFS WITH PAN-STARRS1 AND WISE. II. L/T TRANSITION ATMOSPHERES AND YOUNG DISCOVERIES

    Best, William M. J.; Liu, Michael C.; Magnier, Eugene A.; Aller, Kimberly M.; Chambers, K. C.; Flewelling, H.; Hodapp, K. W.; Kaiser, N.; Tonry, J. L.; Wainscoat, R. J.; Waters, C.; Deacon, Niall R.; Redstone, Joshua; Burgett, W. S.; Draper, P.; Metcalfe, N.

    2015-01-01

    The evolution of brown dwarfs from L to T spectral types is one of the least understood aspects of the ultracool population, partly for lack of a large, well-defined, and well-characterized sample in the L/T transition. To improve the existing census, we have searched ≈28,000 deg² using the Pan-STARRS1 and Wide-field Infrared Survey Explorer surveys for L/T transition dwarfs within 25 pc. We present 130 ultracool dwarf discoveries with estimated distances ≈9–130 pc, including 21 that were independently discovered by other authors and 3 that were previously identified as photometric candidates. Seventy-nine of our objects have near-IR spectral types of L6–T4.5, the most L/T transition dwarfs from any search to date, and we have increased the census of L9–T1.5 objects within 25 pc by over 50%. The color distribution of our discoveries provides further evidence for the “L/T gap,” a deficit of objects with (J − K)_MKO ≈ 0.0–0.5 mag in the L/T transition, and thus reinforces the idea that the transition from cloudy to clear photospheres occurs rapidly. Among our discoveries are 31 candidate binaries based on their low-resolution spectral features. Two of these candidates are common proper motion companions to nearby main sequence stars; if confirmed as binaries, these would be rare benchmark systems with the potential to stringently test ultracool evolutionary models. Our search also serendipitously identified 23 late-M and L dwarfs with spectroscopic signs of low gravity implying youth, including 10 with vl-g or int-g gravity classifications and another 13 with indications of low gravity whose spectral types or modest spectral signal-to-noise ratio do not allow us to assign formal classifications. Finally, we identify 10 candidate members of nearby young moving groups (YMG) with spectral types L7–T4.5, including three showing spectroscopic signs of low gravity. If confirmed, any of these would be among the coolest known YMG members and would

  11. NMR and pattern recognition methods in metabolomics: From data acquisition to biomarker discovery: A review

    Smolinska, Agnieszka, E-mail: A.Smolinska@science.ru.nl [Institute for Molecules and Materials, Radboud University Nijmegen, Nijmegen (Netherlands); Blanchet, Lionel [Institute for Molecules and Materials, Radboud University Nijmegen, Nijmegen (Netherlands); Department of Biochemistry, Nijmegen Centre for Molecular Life Sciences, Radboud University Nijmegen Medical Centre, Nijmegen (Netherlands); Buydens, Lutgarde M.C.; Wijmenga, Sybren S. [Institute for Molecules and Materials, Radboud University Nijmegen, Nijmegen (Netherlands)

    2012-10-31

    Highlights: ► Procedures for acquisition of different biofluids by NMR. ► Recent developments in metabolic profiling of different biofluids by NMR are presented. ► The crucial steps involved in data preprocessing and multivariate chemometric analysis are reviewed. ► Emphasis is given to recent findings on Multiple Sclerosis via NMR and pattern recognition methods. - Abstract: Metabolomics is the discipline in which endogenous and exogenous metabolites are assessed, identified and quantified in different biological samples. Metabolites are crucial components of a biological system and are highly informative about its functional state, owing to their closeness to functional endpoints and to the organism's phenotypes. Nuclear Magnetic Resonance (NMR) spectroscopy, next to Mass Spectrometry (MS), is one of the main metabolomics analytical platforms. Technological developments in the field of NMR spectroscopy have enabled the identification and quantitative measurement of many metabolites in a single biofluid sample in a non-targeted and non-destructive manner. The combination of NMR spectra of biofluids with pattern recognition methods has driven forward the application of metabolomics in the field of biomarker discovery. The importance of metabolomics in diagnostics, e.g. in identifying biomarkers or defining pathological status, has been growing exponentially, as evidenced by the number of published papers. In this review, we describe the developments in data acquisition and multivariate analysis of NMR-based metabolomics data, with particular emphasis on the metabolomics of Cerebrospinal Fluid (CSF) and biomarker discovery in Multiple Sclerosis (MScl).

  12. SemantGeo: Powering Ecological and Environment Data Discovery and Search with Standards-Based Geospatial Reasoning

    Seyed, P.; Ashby, B.; Khan, I.; Patton, E. W.; McGuinness, D. L.

    2013-12-01

    Recent efforts to create and leverage standards for geospatial data specification and inference include the GeoSPARQL standard, geospatial OWL ontologies (e.g., GAZ, Geonames), and RDF triple stores that support GeoSPARQL (e.g., AllegroGraph, Parliament) and use RDF instance data for geospatial features of interest. However, there remains a gap in how best to fuse software engineering best practices and GeoSPARQL within semantic web applications to enable flexible search driven by geospatial reasoning. In this abstract we introduce the SemantGeo module for the SemantEco framework, which helps fill this gap by enabling scientists to find data using geospatial semantics and reasoning. SemantGeo provides multiple types of geospatial reasoning for SemantEco modules. The server-side implementation uses the Parliament SPARQL endpoint accessed via a Tomcat servlet. SemantGeo uses the Google Maps API for user-specified polygon construction and JsTree for providing containment and categorical hierarchies for search. SemantGeo uses GeoSPARQL for spatial reasoning alone and in concert with RDFS/OWL reasoning capabilities to determine, e.g., which geofeatures are within, partially overlap with, or lie within a certain distance from a given polygon. We also leverage qualitative relationships defined by the Gazetteer ontology that are composites of spatial relationships as well as administrative designations or geophysical phenomena. We provide multiple mechanisms for exploring data, such as polygon (map-based) and named-feature (hierarchy-based) selection, that enable flexible search constraints using Boolean combinations of selections. JsTree-based hierarchical search facets present named features and include a 'part of' hierarchy (e.g., measurement-site-01, Lake George, Adirondack Region, NY State) and type hierarchies (e.g., nodes in the hierarchy for WaterBody, Park, MeasurementSite), depending on the 'axis of choice' option selected. Using GeoSPARQL and aforementioned ontology
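
    The kind of "features within a user-drawn polygon" query described in this record can be sketched as follows; the endpoint URL, polygon coordinates, and result handling are placeholders (only the GeoSPARQL namespaces and the SPARQLWrapper calls are standard), and this is not the module's actual code.

        # Hedged sketch of a polygon-containment GeoSPARQL query issued from Python.
        # Endpoint URL and polygon are placeholders; not the SemantGeo implementation.
        from SPARQLWrapper import SPARQLWrapper, JSON

        query = """
        PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
        PREFIX geof: <http://www.opengis.net/def/function/geosparql/>
        SELECT ?feature WHERE {
          ?feature geo:hasGeometry ?g .
          ?g geo:asWKT ?wkt .
          FILTER(geof:sfWithin(?wkt,
            "POLYGON((-74.2 43.5, -73.4 43.5, -73.4 43.9, -74.2 43.9, -74.2 43.5))"^^geo:wktLiteral))
        }
        """

        sparql = SPARQLWrapper("http://localhost:8080/parliament/sparql")  # placeholder endpoint
        sparql.setQuery(query)
        sparql.setReturnFormat(JSON)
        for row in sparql.query().convert()["results"]["bindings"]:
            print(row["feature"]["value"])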

  13. A study of certain Monte Carlo search and optimisation methods

    Budd, C.

    1984-11-01

    Studies are described which might lead to the development of a search and optimisation facility for the Monte Carlo criticality code MONK. The facility envisaged could be used to maximise a function of k-effective with respect to certain parameters of the system or, alternatively, to find the system (in a given range of systems) for which that function takes a given value. (UK)
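
    The facility envisaged in this record is not specified in detail; the following minimal Python sketch only illustrates the general idea of a search over system parameters that maximizes a function of k-effective. The evaluation function is an invented stand-in; a real facility would run the Monte Carlo criticality code for each parameter set.

        # Hedged sketch: simple random search over system parameters, maximizing a
        # function of k-effective. The k_effective() below is a placeholder, not MONK.
        import random

        def k_effective(params):
            """Placeholder for a (noisy) Monte Carlo estimate of k-effective."""
            radius, enrichment = params
            signal = 1.0 - (radius - 10.0) ** 2 / 400.0 + 0.02 * enrichment
            return signal + random.gauss(0.0, 0.002)   # statistical noise of the MC estimate

        best_params, best_k = None, float("-inf")
        for _ in range(200):
            trial = (random.uniform(5.0, 20.0), random.uniform(1.0, 5.0))
            k = k_effective(trial)
            if k > best_k:
                best_params, best_k = trial, k

        print("best parameters:", best_params, "estimated k-eff:", round(best_k, 4))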

  14. Discovery of abundant cellulose microfibers encased in 250 Ma Permian halite: a macromolecular target in the search for life on other planets.

    Griffith, Jack D; Willcox, Smaranda; Powers, Dennis W; Nelson, Roger; Baxter, Bonnie K

    2008-04-01

    In this study, we utilized transmission electron microscopy to examine the contents of fluid inclusions in halite (NaCl) and solid halite crystals collected 650 m below the surface from the Late Permian Salado Formation in southeastern New Mexico (USA). The halite has been isolated from contaminating groundwater since deposition approximately 250 Ma ago. We show that abundant cellulose microfibers are present in the halite and appear remarkably intact. The cellulose is in the form of 5 nm microfibers as well as composite ropes and mats, and was identified by resistance to 0.5 N NaOH treatment and susceptibility to cellulase enzyme treatment. These cellulose microfibers represent the oldest native biological macromolecules to have been directly isolated, examined biochemically, and visualized (without growth or replication) to date. This discovery points to cellulose as an ideal macromolecular target in the search for life on other planets in our Solar System.

  15. An Evaluation of Active Learning Causal Discovery Methods for Reverse-Engineering Local Causal Pathways of Gene Regulation

    Ma, Sisi; Kemmeren, Patrick; Aliferis, Constantin F.; Statnikov, Alexander

    2016-01-01

    Reverse-engineering of causal pathways that implicate diseases and vital cellular functions is a fundamental problem in biomedicine. Discovery of the local causal pathway of a target variable (that consists of its direct causes and direct effects) is essential for effective intervention and can facilitate accurate diagnosis and prognosis. Recent research has provided several active learning methods that can leverage passively observed high-throughput data to draft causal pathways and then refine the inferred relations with a limited number of experiments. The current study provides a comprehensive evaluation of the performance of active learning methods for local causal pathway discovery in real biological data. Specifically, 54 active learning methods/variants from 3 families of algorithms were applied for local causal pathways reconstruction of gene regulation for 5 transcription factors in S. cerevisiae. Four aspects of the methods’ performance were assessed, including adjacency discovery quality, edge orientation accuracy, complete pathway discovery quality, and experimental cost. The results of this study show that some methods provide significant performance benefits over others and therefore should be routinely used for local causal pathway discovery tasks. This study also demonstrates the feasibility of local causal pathway reconstruction in real biological systems with significant quality and low experimental cost. PMID:26939894

  16. 2011 HM102: DISCOVERY OF A HIGH-INCLINATION L5 NEPTUNE TROJAN IN THE SEARCH FOR A POST-PLUTO NEW HORIZONS TARGET

    Parker, Alex H.; Holman, Matthew J.; McLeod, Brian A. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Buie, Marc W.; Borncamp, David M.; Spencer, John R.; Stern, S. Alan [Southwest Research Institute, 6220 Culebra Road, San Antonio, TX 78238 (United States); Osip, David J. [Carnegie Observatories, Las Campanas Observatory, Casilla 601, La Serena (Chile); Gwyn, Stephen D. J.; Fabbro, Sebastian; Kavelaars, J. J. [Canadian Astronomy Data Centre, National Research Council of Canada, 5071 W. Saanich Road, Victoria, BC V9E 2E7 (Canada); Benecchi, Susan D.; Sheppard, Scott S. [Department of Terrestrial Magnetism, Carnegie Institute of Washington, 5251 Broad Branch Road NW, Washington, DC 20015 (United States); Binzel, Richard P.; DeMeo, Francesca E. [Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, MA 02139 (United States); Fuentes, Cesar I.; Trilling, David E. [Department of Physics and Astronomy, Northern Arizona University, S San Francisco St, Flagstaff, AZ 86011 (United States); Gay, Pamela L. [Center for Science, Technology, Engineering and Mathematics (STEM) Research, Education, and Outreach, Southern Illinois University, 1220 Lincoln Dr, Carbondale, IL 62901 (United States); Petit, Jean-Marc [CNRS, UTINAM, Universite de Franche Comte, Route de Gray, F-25030 Besancon Cedex, (France); Tholen, David J., E-mail: aparker@cfa.harvard.edu [Institute for Astronomy, University of Hawaii, 2680 Woodlawn Dr, Honolulu, HI 96822 (United States); and others

    2013-04-15

    We present the discovery of a long-term stable L5 (trailing) Neptune Trojan in data acquired to search for candidate trans-Neptunian objects for the New Horizons spacecraft to fly by during an extended post-Pluto mission. This Neptune Trojan, 2011 HM102, has the highest inclination (29.4°) of any known member of this population. It is intrinsically brighter than any single L5 Jupiter Trojan at H_V ≈ 8.18. We have determined its gri colors (a first for any L5 Neptune Trojan), which we find to be similar to the moderately red colors of the L4 Neptune Trojans, suggesting similar surface properties for members of both Trojan clouds. We also present colors derived from archival data for two L4 Neptune Trojans (2006 RJ103 and 2007 VL305), better refining the overall color distribution of the population. In this document we describe the discovery circumstances, our physical characterization of 2011 HM102, and this object's implications for the Neptune Trojan population overall. Finally, we discuss the prospects for detecting 2011 HM102 from the New Horizons spacecraft during its close approach in mid- to late-2013.

  17. 2011 HM102: DISCOVERY OF A HIGH-INCLINATION L5 NEPTUNE TROJAN IN THE SEARCH FOR A POST-PLUTO NEW HORIZONS TARGET

    Parker, Alex H.; Holman, Matthew J.; McLeod, Brian A.; Buie, Marc W.; Borncamp, David M.; Spencer, John R.; Stern, S. Alan; Osip, David J.; Gwyn, Stephen D. J.; Fabbro, Sébastian; Kavelaars, J. J.; Benecchi, Susan D.; Sheppard, Scott S.; Binzel, Richard P.; DeMeo, Francesca E.; Fuentes, Cesar I.; Trilling, David E.; Gay, Pamela L.; Petit, Jean-Marc; Tholen, David J.

    2013-01-01

    We present the discovery of a long-term stable L5 (trailing) Neptune Trojan in data acquired to search for candidate trans-Neptunian objects for the New Horizons spacecraft to fly by during an extended post-Pluto mission. This Neptune Trojan, 2011 HM102, has the highest inclination (29.4°) of any known member of this population. It is intrinsically brighter than any single L5 Jupiter Trojan at H_V ∼ 8.18. We have determined its gri colors (a first for any L5 Neptune Trojan), which we find to be similar to the moderately red colors of the L4 Neptune Trojans, suggesting similar surface properties for members of both Trojan clouds. We also present colors derived from archival data for two L4 Neptune Trojans (2006 RJ103 and 2007 VL305), better refining the overall color distribution of the population. In this document we describe the discovery circumstances, our physical characterization of 2011 HM102, and this object's implications for the Neptune Trojan population overall. Finally, we discuss the prospects for detecting 2011 HM102 from the New Horizons spacecraft during its close approach in mid- to late-2013.

  18. NetiNeti: discovery of scientific names from text using machine learning methods

    Akella Lakshmi

    2012-08-01

    Full Text Available Abstract Background A scientific name for an organism can be associated with almost all biological data. Name identification is an important step in many text mining tasks aiming to extract useful information from biological, biomedical and biodiversity text sources. A scientific name acts as an important metadata element to link biological information. Results We present NetiNeti (Name Extraction from Textual Information-Name Extraction for Taxonomic Indexing), a machine learning based approach for recognition of scientific names, including the discovery of new species names, from text that also handles misspellings, OCR errors and other variations in names. The system generates candidate names using rules for scientific names and applies probabilistic machine learning methods to classify names based on structural features of candidate names and features derived from their contexts. NetiNeti can also disambiguate scientific names from other names using contextual information. We evaluated NetiNeti on legacy biodiversity texts and biomedical literature (MEDLINE). NetiNeti performs better (precision = 98.9% and recall = 70.5%) compared to a popular dictionary-based approach (precision = 97.5% and recall = 54.3%) on a 600-page biodiversity book that was manually marked by an annotator. On a small set of PubMed Central’s full-text articles annotated with scientific names, the precision and recall values are 98.5% and 96.2%, respectively. NetiNeti found more than 190,000 unique binomial and trinomial names in more than 1,880,000 PubMed records when used on the full MEDLINE database. NetiNeti also successfully identifies almost all of the new species names mentioned within web pages. Conclusions We present NetiNeti, a machine learning based approach for identification and discovery of scientific names. The system implementing the approach can be accessed at http://namefinding.ubio.org.

  19. Research on Large-Scale Road Network Partition and Route Search Method Combined with Traveler Preferences

    De-Xin Yu

    2013-01-01

    Full Text Available Combined with an improved Pallottino parallel algorithm, this paper proposes a large-scale route search method that considers travelers' route choice preferences, and the urban road network is effectively decomposed into multiple layers. Utilizing generalized travel time as the road impedance function, the method builds a new multilayer, multitasking road network data storage structure with an object-oriented class definition. The proposed path search algorithm is then verified using the real road network of Guangzhou city as an example. Through sensitivity experiments, we compare the proposed path search method with current advanced optimal path algorithms. The results demonstrate that the proposed method increases road network search efficiency by more than 16% under different search proportion requests, node numbers, and computing process numbers. The method therefore represents a substantial advance in route guidance for urban road networks.
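
    The paper's parallel multilayer decomposition is not reproduced here; the sketch below only illustrates the core single-layer search step, a Dijkstra search whose arc impedance is a generalized travel time mixing travel time with a traveler-preference term. The network, weights, and preference weighting are invented.

        # Hedged sketch: Dijkstra search with a generalized travel-time impedance.
        import heapq

        def generalized_travel_time(free_flow_min, toll, pref_toll_weight=2.0):
            """Impedance mixing travel time with a traveler preference term (toll aversion)."""
            return free_flow_min + pref_toll_weight * toll

        arcs = {  # node -> list of (neighbour, free-flow minutes, toll); invented network
            "A": [("B", 5, 0.0), ("C", 3, 1.0)],
            "B": [("D", 4, 0.0)],
            "C": [("B", 1, 0.0), ("D", 7, 0.0)],
            "D": [],
        }

        def shortest_path(source, target):
            dist, prev = {source: 0.0}, {}
            heap = [(0.0, source)]
            while heap:
                d, u = heapq.heappop(heap)
                if u == target:
                    break
                if d > dist.get(u, float("inf")):
                    continue
                for v, minutes, toll in arcs[u]:
                    nd = d + generalized_travel_time(minutes, toll)
                    if nd < dist.get(v, float("inf")):
                        dist[v], prev[v] = nd, u
                        heapq.heappush(heap, (nd, v))
            path, node = [target], target
            while node != source:
                node = prev[node]
                path.append(node)
            return list(reversed(path)), dist[target]

        print(shortest_path("A", "D"))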

  20. Search for transient ultralight dark matter signatures with networks of precision measurement devices using a Bayesian statistics method

    Roberts, B. M.; Blewitt, G.; Dailey, C.; Derevianko, A.

    2018-04-01

    We analyze the prospects of employing a distributed global network of precision measurement devices as a dark matter and exotic physics observatory. In particular, we consider the atomic clocks of the global positioning system (GPS), consisting of a constellation of 32 medium-Earth orbit satellites equipped with either Cs or Rb microwave clocks and a number of Earth-based receiver stations, some of which employ highly stable H-maser atomic clocks. High-accuracy timing data are available for almost two decades. By analyzing the satellite and terrestrial atomic clock data, it is possible to search for transient signatures of exotic physics, such as "clumpy" dark matter and dark energy, effectively transforming the GPS constellation into a 50,000 km aperture sensor array. Here we characterize the noise of the GPS satellite atomic clocks, describe the search method based on Bayesian statistics, and test the method using simulated clock data. We present the projected discovery reach of our method and demonstrate that it can surpass existing constraints by several orders of magnitude for certain models. Our method is not limited in scope to GPS or atomic clock networks, and can also be applied to other networks of precision measurement devices.
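
    The full Bayesian, network-wide analysis described in this record is not reproduced here; the following highly simplified Python sketch only illustrates the idea of scanning simulated clock residuals for a transient pulse with a likelihood ratio against a pure-noise model, with invented noise and signal parameters.

        # Hedged, simplified sketch: scan simulated clock residuals for a transient
        # offset using a likelihood ratio at the best-fit amplitude. Not the paper's
        # full Bayesian treatment (no parameter marginalization, no network correlations).
        import numpy as np

        rng = np.random.default_rng(1)
        n, sigma, width = 2000, 0.1, 20
        data = rng.normal(0.0, sigma, n)              # simulated clock bias residuals
        data[1200:1200 + width] += 0.12               # injected transient offset

        template = np.ones(width)
        best_epoch, best_lnL = None, -np.inf
        for t0 in range(n - width):
            d = data[t0:t0 + width]
            # Log-likelihood ratio, maximized analytically over the pulse amplitude.
            lnL = (d @ template) ** 2 / (2.0 * sigma ** 2 * (template @ template))
            if lnL > best_lnL:
                best_epoch, best_lnL = t0, lnL

        print("candidate transient at epoch", best_epoch,
              "ln(likelihood ratio) =", round(best_lnL, 1))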

  1. A hybrid computational method for the discovery of novel reproduction-related genes.

    Chen, Lei; Chu, Chen; Kong, Xiangyin; Huang, Guohua; Huang, Tao; Cai, Yu-Dong

    2015-01-01

    Uncovering the molecular mechanisms underlying reproduction is of great importance to infertility treatment and to the generation of healthy offspring. In this study, we discovered novel reproduction-related genes with a hybrid computational method, integrating three different types of methods, which offers new clues for further reproduction research. This method was first executed on a weighted graph, constructed based on known protein-protein interactions, to search for the shortest paths connecting any two known reproduction-related genes. Genes occurring in these paths were deemed to have a special relationship with reproduction. These newly discovered genes were filtered with a randomization test. Then, the remaining genes were further selected according to their associations with known reproduction-related genes, as measured by protein-protein interaction scores and alignment scores obtained with BLAST. The in-depth analysis of the high-confidence novel reproduction genes revealed hidden mechanisms of reproduction and provided guidelines for further experimental validation.
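
    Only the first step of the pipeline (collecting genes that lie on shortest paths between known seed genes in a weighted interaction graph) is sketched below; the graph and gene names are invented, and the randomization test and BLAST-based filtering described in the record are not reproduced.

        # Hedged sketch of the shortest-path step only, on a tiny invented network.
        from itertools import combinations
        import networkx as nx

        g = nx.Graph()
        g.add_weighted_edges_from([
            ("GENE_A", "GENE_X", 0.3), ("GENE_X", "GENE_B", 0.4),
            ("GENE_A", "GENE_Y", 1.5), ("GENE_Y", "GENE_C", 0.2),
            ("GENE_B", "GENE_C", 0.9), ("GENE_X", "GENE_C", 1.1),
        ])
        seeds = ["GENE_A", "GENE_B", "GENE_C"]      # known reproduction-related genes

        candidates = set()
        for u, v in combinations(seeds, 2):
            path = nx.shortest_path(g, u, v, weight="weight")
            candidates.update(path[1:-1])           # interior (non-seed) genes only
        print("candidate novel genes:", candidates - set(seeds))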

  2. Impact of the Choice of Normalization Method on Molecular Cancer Class Discovery Using Nonnegative Matrix Factorization.

    Yang, Haixuan; Seoighe, Cathal

    2016-01-01

    Nonnegative Matrix Factorization (NMF) has proved to be an effective method for unsupervised clustering analysis of gene expression data. Through the nonnegativity constraint, NMF provides a decomposition of the data matrix into two matrices that have been used for clustering analysis. However, the decomposition is not unique. This allows different clustering results to be obtained, resulting in different interpretations of the decomposition. To alleviate this problem, some existing methods directly enforce uniqueness to some extent by adding regularization terms to the NMF objective function. Alternatively, various normalization methods have been applied to the factor matrices; however, the effects of the choice of normalization have not been carefully investigated. Here we investigate the performance of NMF for the task of cancer class discovery under a wide range of normalization choices. After extensive evaluations, we observe that the maximum norm showed the best performance, although the maximum norm has not previously been used for NMF. Matlab codes are freely available from: http://maths.nuigalway.ie/~haixuanyang/pNMF/pNMF.htm.
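
    The normalization idea reported as best-performing can be sketched as follows; this is not the authors' Matlab code, the data are random rather than expression data, and the rescaling convention (scaling columns of W by their maxima and compensating in H) is an illustrative reading of "maximum norm" normalization.

        # Hedged sketch: NMF followed by max-norm rescaling of the factor matrices and
        # cluster assignment by dominant component. Random data, not expression data.
        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(0)
        X = rng.random((60, 200))                    # samples x genes, nonnegative

        model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
        W = model.fit_transform(X)                   # samples x components
        H = model.components_                        # components x genes

        scale = W.max(axis=0)                        # maximum norm of each column of W
        W_norm = W / scale
        H_norm = H * scale[:, None]                  # keeps the product W @ H unchanged

        clusters = W_norm.argmax(axis=1)             # class assignment per sample
        print(np.bincount(clusters))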

  3. MSSM Higgs boson searches at the LHC: benchmark scenarios after the discovery of a Higgs-like particle

    Carena, M.; Heinemeyer, S.; Staal, O.; Wagner, C.E.M.; Weiglein, G.

    2013-01-01

    A Higgs-like particle with a mass of about 125.5 GeV has been discovered at the LHC. Within the current experimental uncertainties, this new state is compatible with both the predictions for the Standard Model (SM) Higgs boson and with the Higgs sector in the Minimal Supersymmetric Standard Model (MSSM). We propose new low-energy MSSM benchmark scenarios that, over a wide parameter range, are compatible with the mass and production rates of the observed signal. These scenarios also exhibit interesting phenomenology for the MSSM Higgs sector. We propose a slightly updated version of the well-known m_h^max scenario, and a modified scenario (m_h^mod), where the light CP-even Higgs boson can be interpreted as the LHC signal in large parts of the M_A–tan β plane. Furthermore, we define a light stop scenario that leads to a suppression of the lightest CP-even Higgs gluon fusion rate, and a light stau scenario with an enhanced decay rate of h→γγ at large tan β. We also suggest a τ-phobic Higgs scenario in which the lightest Higgs can have suppressed couplings to down-type fermions. We propose to supplement the specified value of the μ parameter in some of these scenarios with additional values of both signs. This has a significant impact on the interpretation of searches for the non-SM-like MSSM Higgs bosons. We also discuss the sensitivity of the searches to heavy Higgs decays into light charginos and neutralinos, and to decays of the form H→hh. Finally, in addition to all the other scenarios where the lightest CP-even Higgs is interpreted as the LHC signal, we propose a low-M_H scenario, where instead the heavy CP-even Higgs boson corresponds to the new state around 125.5 GeV. (orig.)

  4. The High Time Resolution Universe Pulsar Survey - XII. Galactic plane acceleration search and the discovery of 60 pulsars

    Ng, C.; Champion, D. J.; Bailes, M.; Barr, E. D.; Bates, S. D.; Bhat, N. D. R.; Burgay, M.; Burke-Spolaor, S.; Flynn, C. M. L.; Jameson, A.; Johnston, S.; Keith, M. J.; Kramer, M.; Levin, L.; Petroff, E.; Possenti, A.; Stappers, B. W.; van Straten, W.; Tiburzi, C.; Eatough, R. P.; Lyne, A. G.

    2015-07-01

    We present initial results from the low-latitude Galactic plane region of the High Time Resolution Universe pulsar survey conducted at the Parkes 64-m radio telescope. We discuss the computational challenges arising from the processing of the terabyte-sized survey data. Two new radio interference mitigation techniques are introduced, as well as a partially coherent segmented acceleration search algorithm which aims to increase our chances of discovering highly relativistic short-orbit binary systems, covering a parameter space including potential pulsar-black hole binaries. We show that under a constant acceleration approximation, a ratio of data length over orbital period of ≈0.1 results in the highest effectiveness for this search algorithm. From the 50 per cent of data processed thus far, we have redetected 435 previously known pulsars and discovered a further 60 pulsars, two of which are fast-spinning pulsars with periods less than 30 ms. PSR J1101-6424 is a millisecond pulsar whose heavy white dwarf (WD) companion and short spin period of 5.1 ms indicate a rare example of full-recycling via Case A Roche lobe overflow. PSR J1757-27 appears to be an isolated recycled pulsar with a relatively long spin period of 17 ms. In addition, PSR J1244-6359 is a mildly recycled binary system with a heavy WD companion, PSR J1755-25 has a significant orbital eccentricity of 0.09 and PSR J1759-24 is likely to be a long-orbit eclipsing binary with orbital period of the order of tens of years. Comparison of our newly discovered pulsar sample to the known population suggests that they belong to an older population. Furthermore, we demonstrate that our current pulsar detection yield is as expected from population synthesis.

  5. A SEARCH FOR L/T TRANSITION DWARFS WITH Pan-STARRS1 AND WISE: DISCOVERY OF SEVEN NEARBY OBJECTS INCLUDING TWO CANDIDATE SPECTROSCOPIC VARIABLES

    Best, William M. J.; Liu, Michael C.; Magnier, Eugene A.; Aller, Kimberly M.; Burgett, W. S.; Chambers, K. C.; Hodapp, K. W.; Kaiser, N.; Kudritzki, R.-P.; Morgan, J. S.; Tonry, J. L.; Wainscoat, R. J.; Deacon, Niall R.; Dupuy, Trent J.; Redstone, Joshua; Price, P. A.

    2013-01-01

    We present initial results from a wide-field (30,000 deg²) search for L/T transition brown dwarfs within 25 pc using the Pan-STARRS1 and Wide-field Infrared Survey Explorer (WISE) surveys. Previous large-area searches have been incomplete for L/T transition dwarfs, because these objects are faint in optical bands and have near-infrared (near-IR) colors that are difficult to distinguish from background stars. To overcome these obstacles, we have cross-matched the Pan-STARRS1 (optical) and WISE (mid-IR) catalogs to produce a unique multi-wavelength database for finding ultracool dwarfs. As part of our initial discoveries, we have identified seven brown dwarfs in the L/T transition within 9-15 pc of the Sun. The L9.5 dwarf PSO J140.2308+45.6487 and the T1.5 dwarf PSO J307.6784+07.8263 (both independently discovered by Mace et al.) show possible spectroscopic variability at the Y and J bands. Two more objects in our sample show evidence of photometric J-band variability, and two others are candidate unresolved binaries based on their spectra. We expect our full search to yield a well-defined, volume-limited sample of L/T transition dwarfs that will include many new targets for study of this complex regime. PSO J307.6784+07.8263 in particular may be an excellent candidate for in-depth study of variability, given its brightness (J = 14.2 mag) and proximity (11 pc).

  6. A SEARCH FOR L/T TRANSITION DWARFS WITH Pan-STARRS1 AND WISE: DISCOVERY OF SEVEN NEARBY OBJECTS INCLUDING TWO CANDIDATE SPECTROSCOPIC VARIABLES

    Best, William M. J.; Liu, Michael C.; Magnier, Eugene A.; Aller, Kimberly M.; Burgett, W. S.; Chambers, K. C.; Hodapp, K. W.; Kaiser, N.; Kudritzki, R.-P.; Morgan, J. S.; Tonry, J. L.; Wainscoat, R. J. [Institute for Astronomy, University of Hawaii at Manoa, Honolulu, HI 96822 (United States); Deacon, Niall R. [Max Planck Institute for Astronomy, Koenigstuhl 17, D-69117 Heidelberg (Germany); Dupuy, Trent J. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Redstone, Joshua [Facebook, 335 Madison Ave, New York, NY 10017-4677 (United States); Price, P. A., E-mail: wbest@ifa.hawaii.edu [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States)

    2013-11-10

    We present initial results from a wide-field (30,000 deg²) search for L/T transition brown dwarfs within 25 pc using the Pan-STARRS1 and Wide-field Infrared Survey Explorer (WISE) surveys. Previous large-area searches have been incomplete for L/T transition dwarfs, because these objects are faint in optical bands and have near-infrared (near-IR) colors that are difficult to distinguish from background stars. To overcome these obstacles, we have cross-matched the Pan-STARRS1 (optical) and WISE (mid-IR) catalogs to produce a unique multi-wavelength database for finding ultracool dwarfs. As part of our initial discoveries, we have identified seven brown dwarfs in the L/T transition within 9-15 pc of the Sun. The L9.5 dwarf PSO J140.2308+45.6487 and the T1.5 dwarf PSO J307.6784+07.8263 (both independently discovered by Mace et al.) show possible spectroscopic variability at the Y and J bands. Two more objects in our sample show evidence of photometric J-band variability, and two others are candidate unresolved binaries based on their spectra. We expect our full search to yield a well-defined, volume-limited sample of L/T transition dwarfs that will include many new targets for study of this complex regime. PSO J307.6784+07.8263 in particular may be an excellent candidate for in-depth study of variability, given its brightness (J = 14.2 mag) and proximity (11 pc).

  7. Text mining for search term development in systematic reviewing: A discussion of some methods and challenges.

    Stansfield, Claire; O'Mara-Eves, Alison; Thomas, James

    2017-09-01

    Using text mining to aid the development of database search strings for topics described by diverse terminology has potential benefits for systematic reviews; however, methods and tools for accomplishing this are poorly covered in the research methods literature. We briefly review the literature on applications of text mining for search term development for systematic reviewing. We found that the tools can be used in 5 overarching ways: improving the precision of searches; identifying search terms to improve search sensitivity; aiding the translation of search strategies across databases; searching and screening within an integrated system; and developing objectively derived search strategies. Using a case study and selected examples, we then reflect on the utility of certain technologies (term frequency-inverse document frequency and Termine, term frequency, and clustering) in improving the precision and sensitivity of searches. Challenges in using these tools are discussed. The utility of these tools is influenced by the different capabilities of the tools, the way the tools are used, and the text that is analysed. Increased awareness of how the tools perform facilitates the further development of methods for their use in systematic reviews. Copyright © 2017 John Wiley & Sons, Ltd.
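
    One of the techniques discussed in this record, TF-IDF weighting, can be sketched as follows for suggesting candidate search terms from a small set of records already known to be relevant; the documents are invented one-line stand-ins for abstracts, and this is not the tooling used in the authors' case study.

        # Hedged sketch: rank candidate search terms by summed TF-IDF weight across a
        # set of known-relevant records. Documents are invented stand-ins.
        from sklearn.feature_extraction.text import TfidfVectorizer

        relevant_abstracts = [
            "home visiting programmes for new parents reduce hospital admissions",
            "parenting support home visits and infant health outcomes",
            "effect of nurse home visiting on child development and parenting",
        ]

        vec = TfidfVectorizer(stop_words="english", ngram_range=(1, 2))
        tfidf = vec.fit_transform(relevant_abstracts)

        scores = tfidf.sum(axis=0).A1                 # summed weight per term
        terms = vec.get_feature_names_out()
        for term, score in sorted(zip(terms, scores), key=lambda t: -t[1])[:8]:
            print(f"{term:25s} {score:.2f}")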

  8. New hybrid conjugate gradient methods with the generalized Wolfe line search.

    Xu, Xiao; Kong, Fan-Yu

    2016-01-01

    The conjugate gradient method is an efficient technique for solving unconstrained optimization problems. In this paper, we form a linear combination, with parameter β_k, of the DY method and the HS method, and put forward a hybrid DY-HS method. We also propose a hybrid of FR and PRP by the same means. Additionally, for the two hybrid methods, we generalize the Wolfe line search to compute the step size α_k. With the new Wolfe line search, the two hybrid methods possess the descent property, and their global convergence can also be proved.
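
    The record does not give the paper's exact combination parameter or its generalized Wolfe conditions; the sketch below therefore uses an equal-weight combination of the standard HS and DY formulas and SciPy's built-in strong-Wolfe line search as stand-ins, on an invented test function.

        # Hedged sketch of a hybrid conjugate gradient iteration: beta_k is an assumed
        # equal-weight mix of the HS and DY formulas, and scipy's strong-Wolfe line
        # search stands in for the generalized Wolfe search proposed in the paper.
        import numpy as np
        from scipy.optimize import line_search

        def f(x):                      # simple convex test function
            return 0.5 * x[0] ** 2 + 2.0 * x[1] ** 2
        def grad(x):
            return np.array([x[0], 4.0 * x[1]])

        x = np.array([3.0, -2.0])
        g = grad(x)
        d = -g
        for _ in range(20):
            alpha = line_search(f, grad, x, d)[0]
            if alpha is None:
                break
            x_new = x + alpha * d
            g_new = grad(x_new)
            y = g_new - g
            denom = d @ y
            beta_hs = (g_new @ y) / denom
            beta_dy = (g_new @ g_new) / denom
            beta = 0.5 * beta_hs + 0.5 * beta_dy      # assumed equal-weight hybrid
            d = -g_new + beta * d
            x, g = x_new, g_new
            if np.linalg.norm(g) < 1e-8:
                break
        print("approximate minimizer:", x)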

  9. Search for new and improved radiolabeling methods for monoclonal antibodies

    Hiltunen, J.V.

    1993-01-01

    In this review, the selection of different radioisotopes is discussed, as well as the various traditional and newer methods of introducing the radiolabel into the antibody structure. Labeling methods for radiohalogens, for technetium and rhenium isotopes, and for trivalent cation radiometals are reviewed. Some of the newer methods offer simplified labeling procedures, but usually the new methods are more complicated than the earlier ones. However, new labeling methods are available for almost any radioelement group, and they may result in better preservation of the original nature of the antibody and lead to better clinical results. (orig./MG)

  10. Early detection of pharmacovigilance signals with automated methods based on false discovery rates: a comparative study.

    Ahmed, Ismaïl; Thiessard, Frantz; Miremont-Salamé, Ghada; Haramburu, Françoise; Kreft-Jais, Carmen; Bégaud, Bernard; Tubert-Bitter, Pascale

    2012-06-01

    Improving the detection of drug safety signals has led several pharmacovigilance regulatory agencies to incorporate automated quantitative methods into their spontaneous reporting management systems. The three largest worldwide pharmacovigilance databases are routinely screened by the lower bound of the 95% confidence interval of proportional reporting ratio (PRR₀₂.₅), the 2.5% quantile of the Information Component (IC₀₂.₅) or the 5% quantile of the Gamma Poisson Shrinker (GPS₀₅). More recently, Bayesian and non-Bayesian False Discovery Rate (FDR)-based methods were proposed that address the arbitrariness of thresholds and allow for a built-in estimate of the FDR. These methods were also shown through simulation studies to be interesting alternatives to the currently used methods. The objective of this work was twofold. Based on an extensive retrospective study, we compared PRR₀₂.₅, GPS₀₅ and IC₀₂.₅ with two FDR-based methods derived from the Fisher's exact test and the GPS model (GPS(pH0) [posterior probability of the null hypothesis H₀ calculated from the Gamma Poisson Shrinker model]). Secondly, restricting the analysis to GPS(pH0), we aimed to evaluate the added value of using automated signal detection tools compared with 'traditional' methods, i.e. non-automated surveillance operated by pharmacovigilance experts. The analysis was performed sequentially, i.e. every month, and retrospectively on the whole French pharmacovigilance database over the period 1 January 1996-1 July 2002. Evaluation was based on a list of 243 reference signals (RSs) corresponding to investigations launched by the French Pharmacovigilance Technical Committee (PhVTC) during the same period. The comparison of detection methods was made on the basis of the number of RSs detected as well as the time to detection. Results comparing the five automated quantitative methods were in favour of GPS(pH0) in terms of both number of detections of true signals and
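
    Of the approaches compared in this record, the non-Bayesian FDR-based idea can be sketched as follows: a one-sided Fisher exact test on a 2x2 disproportionality table for each drug-event pair, followed by Benjamini-Hochberg control of the false discovery rate. The counts are invented, and the GPS-based posterior method (GPS(pH0)) is not reproduced.

        # Hedged sketch: Fisher exact tests per drug-event pair + Benjamini-Hochberg FDR.
        from scipy.stats import fisher_exact
        from statsmodels.stats.multitest import multipletests

        # (drug & event, drug & other events, other drugs & event, other drugs & other events)
        pairs = {
            "drugA-rash":     (20,  480,  300, 99200),
            "drugB-nausea":   ( 5,  995,  600, 98400),
            "drugC-headache": ( 2,  498,  400, 99100),
        }

        names, pvals = [], []
        for name, (a, b, c, d) in pairs.items():
            _, p = fisher_exact([[a, b], [c, d]], alternative="greater")
            names.append(name)
            pvals.append(p)

        reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
        for name, p, sig in zip(names, p_adj, reject):
            print(f"{name:16s} adjusted p = {p:.3g}  signal: {sig}")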

  11. Reverse screening methods to search for the protein targets of chemopreventive compounds

    Huang, Hongbin; Zhang, Guigui; Zhou, Yuquan; Lin, Chenru; Chen, Suling; Lin, Yutong; Mai, Shangkang; Huang, Zunnan

    2018-05-01

    This article is a systematic review of reverse screening methods used to search for the protein targets of chemopreventive compounds or drugs. Typical chemopreventive compounds include components of traditional Chinese medicine, natural compounds and Food and Drug Administration (FDA)-approved drugs. Such compounds are somewhat selective but are predisposed to bind multiple protein targets distributed throughout diverse signaling pathways in human cells. In contrast to conventional virtual screening, which identifies the ligands of a targeted protein from a compound database, reverse screening is used to identify the potential targets or unintended targets of a given compound from a large number of receptors by examining their known ligands or crystal structures. This method, also known as in silico or computational target fishing, is highly valuable for discovering the target receptors of query molecules from terrestrial or marine natural products, exploring the molecular mechanisms of chemopreventive compounds, finding alternative indications of existing drugs by drug repositioning, and detecting adverse drug reactions and drug toxicity. Reverse screening can be divided into three major groups: shape screening, pharmacophore screening and reverse docking. Several large software packages, such as Schrödinger and Discovery Studio; typical software/network services such as ChemMapper, PharmMapper, idTarget and INVDOCK; and practical databases of known target ligands and receptor crystal structures, such as ChEMBL, BindingDB and the Protein Data Bank (PDB), are available for use in these computational methods. Different programs, online services and databases have different applications and constraints. Here, we conducted a systematic analysis and multilevel classification of the computational programs, online services and compound libraries available for shape screening, pharmacophore screening and reverse docking to enable non-specialist users to quickly learn and

  12. MSSM Higgs boson searches at the LHC. Benchmark scenarios after the discovery of a Higgs-like particle

    Carena, M. [Fermilab, Batavia, IL (United States). Theoretical Physics Dept.; Chicago Univ., IL (United States). Enrico Fermi Inst.; Chicago Univ., IL (United States). Kavli Inst. for Cosmological Physics; Heinemeyer, S. [Instituto de Fisica de Cantabria (CSIC-UC), Santander (Spain); Staal, O. [Stockholm Univ. (Sweden). Dept. of Physics; Wagner, C.E.M. [Chicago Univ., IL (United States). Enrico Fermi Inst.; Chicago Univ., IL (United States). Kavli Inst. for Cosmological Physics; Argonne National Laboratory, Argonne, IL (United States). HEP Division; Weiglein, G. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2013-02-15

    A Higgs-like particle with a mass of about 125.5 GeV has been discovered at the LHC. Within the current experimental uncertainties, this new state is compatible with both the predictions for the Standard Model (SM) Higgs boson and with the Higgs sector in the Minimal Supersymmetric Standard Model (MSSM). We propose new low-energy MSSM benchmark scenarios that, over a wide parameter range, are compatible with the mass and production rates of the observed signal. These scenarios also exhibit interesting phenomenology for the MSSM Higgs sector. We propose a slightly updated version of the well-known m_h^max scenario, and a modified scenario (m_h^mod), where the light CP-even Higgs boson can be interpreted as the LHC signal in large parts of the M_A–tan β plane. Furthermore, we define a light stop scenario that leads to a suppression of the lightest CP-even Higgs gluon fusion rate, and a light stau scenario with an enhanced decay rate of h→γγ at large tan β. We also suggest a τ-phobic Higgs scenario in which the lightest Higgs can have suppressed couplings to down-type fermions. We propose to supplement the specified value of the μ parameter in some of these scenarios with additional values of both signs. This has a significant impact on the interpretation of searches for the non-SM-like MSSM Higgs bosons. We also discuss the sensitivity of the searches to heavy Higgs decays into light charginos and neutralinos, and to decays of the form H→hh. Finally, in addition to all the other scenarios where the lightest CP-even Higgs is interpreted as the LHC signal, we propose a low-M_H scenario, where instead the heavy CP-even Higgs boson corresponds to the new state around 125.5 GeV.

  13. Modification of the Armijo line search to satisfy the convergence properties of HS method

    Mohammed Belloufi

    2013-07-01

    Full Text Available The Hestenes-Stiefel (HS) conjugate gradient algorithm is a useful tool for unconstrained numerical optimization, with good numerical performance but no global convergence result under traditional line searches. This paper proposes a line search technique that guarantees the global convergence of the Hestenes-Stiefel (HS) conjugate gradient method. Numerical tests are presented to validate the different approaches.

  14. Comparison of sequencing based CNV discovery methods using monozygotic twin quartets.

    Marc-André Legault

    Full Text Available The advent of high-throughput sequencing methods brings a substantial number of technical challenges. Among these is the challenge raised by the discovery of copy-number variations (CNVs) using whole-genome sequencing data. CNVs are genomic structural variations defined as a variation in the number of copies of a large genomic fragment, usually more than one kilobase. Here, we aim to compare different CNV calling methods in order to assess their ability to consistently identify CNVs, by comparison of the calls in 9 quartets of identical twin pairs. The use of monozygotic twins provides a means of estimating the error rate of each algorithm by observing CNVs that are inconsistently called when considering the rules of Mendelian inheritance and the assumption of an identical genome between twins. The similarity between the calls from the different tools and the advantage of combining call sets were also considered. ERDS and CNVnator obtained the best performance when considering the inherited CNV rate, with means of 0.74 and 0.70, respectively. Venn diagrams were generated to show the agreement between the different algorithms, before and after filtering out familial inconsistencies. This filtering revealed a high number of false positives for CNVer and Breakdancer. A low overall agreement between the methods suggested a high complementarity of the different tools when calling CNVs. The breakpoint sensitivity analysis indicated that CNVnator and ERDS achieved better resolution of CNV borders than the other tools. The highest inherited CNV rate was achieved through the intersection of these two tools (81%). This study showed that ERDS and CNVnator provide good performance on whole-genome sequencing data with respect to CNV consistency across families, CNV breakpoint resolution and CNV call specificity. The intersection of the calls from the two tools would be valuable for CNV genotyping pipelines.

  15. How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project

    Butler, Ricky W.; Hagen, George; Maddalon, Jeffrey M.; Munoz, Cesar A.; Narkawicz, Anthony; Dowek, Gilles

    2010-01-01

    In this paper we describe a process of algorithmic discovery that was driven by our goal of achieving complete, mechanically verified algorithms that compute conflict prevention bands for use in en route air traffic management. The algorithms were originally defined in the PVS specification language and subsequently have been implemented in Java and C++. We do not present the proofs in this paper: instead, we describe the process of discovery and the key ideas that enabled the final formal proof of correctness

  16. Searches for new Milky Way satellites from the first two years of data of the Subaru/Hyper Suprime-Cam survey: Discovery of Cetus III

    Homma, Daisuke; Chiba, Masashi; Okamoto, Sakurako; Komiyama, Yutaka; Tanaka, Masayuki; Tanaka, Mikito; Ishigaki, Miho N.; Hayashi, Kohei; Arimoto, Nobuo; Garmilla, José A.; Lupton, Robert H.; Strauss, Michael A.; Miyazaki, Satoshi; Wang, Shiang-Yu; Murayama, Hitoshi

    2018-01-01

    We present the results from a search for new Milky Way (MW) satellites from the first two years of data from the Hyper Suprime-Cam (HSC) Subaru Strategic Program (SSP), covering ∼300 deg², and report the discovery of a highly compelling ultra-faint dwarf galaxy candidate in Cetus. This is the second ultra-faint dwarf we have discovered after Virgo I, reported in our previous paper. This satellite, Cetus III, has been identified as a statistically significant (10.7 σ) spatial overdensity of star-like objects, which are selected from a relevant isochrone filter designed for a metal-poor and old stellar population. This stellar system is located at a heliocentric distance of 251^{+24}_{-11} kpc with a most likely absolute magnitude of M_V = -2.4 ± 0.6 mag estimated from a Monte Carlo analysis. Cetus III is extended with a half-light radius of r_h = 90^{+42}_{-17} pc, suggesting that this is a faint dwarf satellite in the MW located beyond the detection limit of the Sloan Digital Sky Survey. Further spectroscopic studies are needed to assess the nature of this stellar system. We also revisit and update the parameters for Virgo I, finding M_V = -0.33^{+0.75}_{-0.87} mag and r_h = 47^{+19}_{-13} pc. Using simulations of Λ-dominated cold dark matter models, we predict that we should find one or two new MW satellites from ∼300 deg² HSC-SSP data, in rough agreement with the discovery rate so far. The further survey and completion of HSC-SSP over ∼1400 deg² will provide robust insights into the missing satellites problem.

  17. Program for searching for semiempirical parameters by the MNDO method

    Bliznyuk, A.A.; Voityuk, A.A.

    1987-01-01

    The authors describe a program for optimizing atomic models constructed using the MNDO method, which varies not only the parameters but also allows simple changes to the calculation scheme. The target function is constructed from properties such as formation enthalpies, dipole moments, ionization potentials, and geometrical parameters. The software used to minimize the target function is based on the Nelder-Mead simplex algorithm and on the Fletcher variable-metric method. The program is written in FORTRAN IV and implemented on the ES computer.

  18. An effective suggestion method for keyword search of databases

    Huang, Hai; Chen, Zonghai; Liu, Chengfei; Huang, He; Zhang, Xiangliang

    2016-01-01

    This paper solves the problem of providing high-quality suggestions for user keyword queries over databases. With the assumption that the returned suggestions are independent, existing query suggestion methods over databases score candidate

  19. Search and foraging behaviors from movement data: A comparison of methods.

    Bennison, Ashley; Bearhop, Stuart; Bodey, Thomas W; Votier, Stephen C; Grecian, W James; Wakefield, Ewan D; Hamer, Keith C; Jessopp, Mark

    2018-01-01

    Search behavior is often used as a proxy for foraging effort within studies of animal movement, despite it being only one part of the foraging process, which also includes prey capture. While methods for validating prey capture exist, many studies rely solely on behavioral annotation of animal movement data to identify search and infer prey capture attempts. However, the degree to which search correlates with prey capture is largely untested. This study applied seven behavioral annotation methods to identify search behavior from GPS tracks of northern gannets (Morus bassanus), and compared outputs to the occurrence of dives recorded by simultaneously deployed time-depth recorders. We tested how behavioral annotation methods vary in their ability to identify search behavior leading to dive events. There was considerable variation in the number of dives occurring within search areas across methods. Hidden Markov models proved to be the most successful, with 81% of all dives occurring within areas identified as search. k-means clustering and first passage time had the highest rates of dives occurring outside identified search behavior. First passage time and hidden Markov models had the lowest rates of false positives, identifying fewer search areas with no dives. All behavioral annotation methods had advantages and drawbacks in terms of the complexity of analysis and ability to reflect prey capture events while minimizing the number of false positives and false negatives. We used these results, with consideration of analytical difficulty, to provide advice on the most appropriate methods for use where prey capture behavior is not available. This study highlights a need to critically assess and carefully choose a behavioral annotation method suitable for the research question being addressed, or for the resulting species management frameworks being established.
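
    The record does not name its software; the following sketch only illustrates the HMM-style annotation approach it found most successful, using the hmmlearn package (an assumption) on simulated step lengths rather than gannet tracks, with the low-step-length state labelled as putative search.

        # Hedged sketch: two-state Gaussian HMM on a simple movement feature (step
        # length), labelling the short-step state as "search". Simulated data only.
        import numpy as np
        from hmmlearn.hmm import GaussianHMM

        rng = np.random.default_rng(0)
        transit = rng.normal(3.0, 0.3, size=300)     # long steps: directed travel
        search = rng.normal(1.0, 0.4, size=100)      # short steps: area-restricted search
        steps = np.concatenate([transit[:150], search, transit[150:]]).reshape(-1, 1)

        model = GaussianHMM(n_components=2, covariance_type="diag", n_iter=100,
                            random_state=0).fit(steps)
        states = model.predict(steps)
        search_state = int(np.argmin(model.means_.ravel()))
        print("fixes labelled as search:", int((states == search_state).sum()))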

  20. Short Term Gain, Long Term Pain: Informal Job Search Methods and Post-Displacement Outcomes

    Green, Colin

    2012-01-01

    This paper examines the role of informal job search methods on the labour market outcomes of displaced workers. Informal job search methods could alleviate short-term labour market difficulties of displaced workers by providing information on job opportunities, allowing them to signal their productivity and may mitigate wage losses through better post-displacement job matching. However if displacement results from reductions in demand for specific sectors/skills, the use of informal job searc...

  1. Wide Binaries in TGAS: Search Method and First Results

    Andrews, Jeff J.; Chanamé, Julio; Agüeros, Marcel A.

    2018-04-01

    Half of all stars reside in binary systems, many of which have orbital separations in excess of 1000 AU. Such binaries are typically identified in astrometric catalogs by matching the proper motion vectors of close stellar pairs. We present a fully Bayesian method that properly takes into account positions, proper motions, parallaxes, and their correlated uncertainties to identify widely separated stellar binaries. After applying our method to the >2 × 10^6 stars in the Tycho-Gaia astrometric solution from Gaia DR1, we identify over 6000 candidate wide binaries. For those pairs with separations less than 40,000 AU, we determine the contamination rate to be ~5%. This sample has an orbital separation (a) distribution that is roughly flat in log space for separations less than ~5000 AU and follows a power law of a^-1.6 at larger separations.

  2. Searching for Suicide Methods: Accessibility of Information About Helium as a Method of Suicide on the Internet.

    Gunnell, David; Derges, Jane; Chang, Shu-Sen; Biddle, Lucy

    2015-01-01

    Helium gas suicides have increased in England and Wales; easy-to-access descriptions of this method on the Internet may have contributed to this rise. To investigate the availability of information on using helium as a method of suicide and trends in searching about this method on the Internet. We analyzed trends in (a) Google searching (2004-2014) and (b) hits on a Wikipedia article describing helium as a method of suicide (2013-2014). We also investigated the extent to which helium was described as a method of suicide on web pages and discussion forums identified via Google. We found no evidence of rises in Internet searching about suicide using helium. News stories about helium suicides were associated with increased search activity. The Wikipedia article may have been temporarily altered to increase awareness of suicide using helium around the time of a celebrity suicide. Approximately one third of the links retrieved using Google searches for suicide methods mentioned helium. Information about helium as a suicide method is readily available on the Internet; the Wikipedia article describing its use was highly accessed following celebrity suicides. Availability of online information about this method may contribute to rises in helium suicides.

  3. A TARGETED SEARCH FOR PECULIARLY RED L AND T DWARFS IN SDSS, 2MASS, AND WISE: DISCOVERY OF A POSSIBLE L7 MEMBER OF THE TW HYDRAE ASSOCIATION

    Kellogg, Kendra; Metchev, Stanimir [Western University, Centre for Planetary and Space Exploration, 1151 Richmond St, London, ON N6A 3K7 (Canada); Geißler, Kerstin; Hicks, Shannon [Stony Brook University, Stony Brook, NY 11790 (United States); Kirkpatrick, J. Davy [Infrared Processing and Analysis Center, Mail Code 100-22, California Institute of Technology, 1200 E. California Blvd., Pasadena, CA 91125 (United States); Kurtev, Radostin, E-mail: kkellogg@uwo.ca, E-mail: smetchev@uwo.ca [Instituto de Física y Astronomía, Facultad de Ciencias, Universidad de Valparaíso, Ave. Gran Bretaña 1111, Playa Ancha, Casilla 53, Valparaíso (Chile)

    2015-12-15

    We present the first results from a targeted search for brown dwarfs with unusual red colors indicative of peculiar atmospheric characteristics. These include objects with low surface gravities or with unusual dust content or cloud properties. From a positional cross-match of SDSS, 2MASS, and WISE, we have identified 40 candidate peculiar early-L to early-T dwarfs that are either new objects or have not been identified as peculiar through prior spectroscopy. Using low-resolution spectra, we confirm that 10 of the candidates are either peculiar or potential L/T binaries. With a J − K{sub s} color of 2.62 ± 0.15 mag, one of the new objects—the L7 dwarf 2MASS J11193254–1137466—is among the reddest field dwarfs currently known. Its proper motion and photometric parallax indicate that it is a possible member of the TW Hydrae moving group. If confirmed, it would be the lowest-mass (5–6 M{sub Jup}) free-floating member. We also report a new T dwarf, 2MASS J22153705+2110554, that was previously overlooked in the SDSS footprint. These new discoveries demonstrate that despite the considerable scrutiny already devoted to the SDSS and 2MASS surveys, our exploration of these data sets is not yet complete.

  4. A TARGETED SEARCH FOR PECULIARLY RED L AND T DWARFS IN SDSS, 2MASS, AND WISE: DISCOVERY OF A POSSIBLE L7 MEMBER OF THE TW HYDRAE ASSOCIATION

    Kellogg, Kendra; Metchev, Stanimir; Geißler, Kerstin; Hicks, Shannon; Kirkpatrick, J. Davy; Kurtev, Radostin

    2015-01-01

    We present the first results from a targeted search for brown dwarfs with unusual red colors indicative of peculiar atmospheric characteristics. These include objects with low surface gravities or with unusual dust content or cloud properties. From a positional cross-match of SDSS, 2MASS, and WISE, we have identified 40 candidate peculiar early-L to early-T dwarfs that are either new objects or have not been identified as peculiar through prior spectroscopy. Using low-resolution spectra, we confirm that 10 of the candidates are either peculiar or potential L/T binaries. With a J − K s color of 2.62 ± 0.15 mag, one of the new objects—the L7 dwarf 2MASS J11193254–1137466—is among the reddest field dwarfs currently known. Its proper motion and photometric parallax indicate that it is a possible member of the TW Hydrae moving group. If confirmed, it would be the lowest-mass (5–6 M Jup ) free-floating member. We also report a new T dwarf, 2MASS J22153705+2110554, that was previously overlooked in the SDSS footprint. These new discoveries demonstrate that despite the considerable scrutiny already devoted to the SDSS and 2MASS surveys, our exploration of these data sets is not yet complete

  5. COMPUTER-IMPLEMENTED METHOD OF PERFORMING A SEARCH USING SIGNATURES

    2017-01-01

    A computer-implemented method of processing a query vector and a data vector, comprising: generating a set of masks and a first set of multiple signatures and a second set of multiple signatures by applying the set of masks to the query vector and the data vector, respectively, and generating... candidate pairs, of a first signature and a second signature, by identifying matches of a first signature and a second signature. The set of masks comprises a configuration of the elements that is a Hadamard code; a permutation of a Hadamard code; or a code that deviates from a Hadamard code...

  6. An effective suggestion method for keyword search of databases

    Huang, Hai

    2016-09-09

    This paper solves the problem of providing high-quality suggestions for user keyword queries over databases. With the assumption that the returned suggestions are independent, existing query suggestion methods over databases score candidate suggestions individually and return the top-k best of them. However, the top-k suggestions have high redundancy with respect to the topics. To provide informative suggestions, the returned k suggestions are expected to be diverse, i.e., maximizing the relevance to the user query and the diversity with respect to topics that the user might be interested in simultaneously. In this paper, an objective function considering both factors is defined for evaluating a suggestion set. We show that maximizing the objective function is a submodular function maximization problem subject to n matroid constraints, which is an NP-hard problem. A greedy approximate algorithm with an approximation ratio of O((Formula presented.)) is also proposed. Experimental results show that our method outperforms other methods in providing relevant and diverse suggestions. © 2016 Springer Science+Business Media New York
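
    The diversified selection described above can be illustrated with a small greedy sketch. This is not the paper's algorithm or objective function; the relevance scores, topic sets, and the diversity weight below are illustrative assumptions, and the marginal-gain scoring merely mimics the relevance-plus-coverage trade-off that a submodular objective encodes.

```python
# Illustrative sketch (not the paper's code): greedily pick k query suggestions
# that balance relevance to the user query with coverage of new topics.
from typing import Dict, List, Set

def greedy_diverse_suggestions(
    candidates: Dict[str, float],   # suggestion -> relevance score (assumed given)
    topics: Dict[str, Set[str]],    # suggestion -> topics it covers (assumed given)
    k: int,
    diversity_weight: float = 0.5,
) -> List[str]:
    selected: List[str] = []
    covered: Set[str] = set()
    while len(selected) < k and len(selected) < len(candidates):
        best, best_gain = None, float("-inf")
        for s, rel in candidates.items():
            if s in selected:
                continue
            # marginal gain = relevance + weight * number of newly covered topics
            gain = rel + diversity_weight * len(topics.get(s, set()) - covered)
            if gain > best_gain:
                best, best_gain = s, gain
        selected.append(best)
        covered |= topics.get(best, set())
    return selected

if __name__ == "__main__":
    cands = {"jaguar car": 0.9, "jaguar speed": 0.8, "jaguar animal": 0.7}
    tops = {"jaguar car": {"auto"}, "jaguar speed": {"auto"}, "jaguar animal": {"wildlife"}}
    print(greedy_diverse_suggestions(cands, tops, k=2))  # favors covering both topics
```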

  7. Beam angle optimization for intensity-modulated radiation therapy using a guided pattern search method

    Rocha, Humberto; Dias, Joana M; Ferreira, Brígida C; Lopes, Maria C

    2013-01-01

    Generally, the inverse planning of radiation therapy consists mainly of the fluence optimization. The beam angle optimization (BAO) in intensity-modulated radiation therapy (IMRT) consists of selecting appropriate radiation incidence directions and may influence the quality of the IMRT plans, both to enhance better organ sparing and to improve tumor coverage. However, in clinical practice, most of the time, beam directions continue to be manually selected by the treatment planner without objective and rigorous criteria. The goal of this paper is to introduce a novel approach that uses beam’s-eye-view dose ray tracing metrics within a pattern search method framework in the optimization of the highly non-convex BAO problem. Pattern search methods are derivative-free optimization methods that require a few function evaluations to progress and converge and have the ability to better avoid local entrapment. The pattern search method framework is composed of a search step and a poll step at each iteration. The poll step performs a local search in a mesh neighborhood and ensures the convergence to a local minimizer or stationary point. The search step provides the flexibility for a global search since it allows searches away from the neighborhood of the current iterate. Beam’s-eye-view dose metrics assign a score to each radiation beam direction and can be used within the pattern search framework furnishing a priori knowledge of the problem so that directions with larger dosimetric scores are tested first. A set of clinical cases of head-and-neck tumors treated at the Portuguese Institute of Oncology of Coimbra is used to discuss the potential of this approach in the optimization of the BAO problem. (paper)
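
    To make the search/poll structure concrete, the following is a minimal compass-style pattern search on a generic objective. It is a sketch only: the objective, starting point, and step-size schedule are assumptions, and the beam's-eye-view dose scoring that the authors use to guide the search step is not reproduced.

```python
# Minimal compass-style pattern search (a sketch, not the authors' BAO code):
# poll +/- each coordinate direction, accept the first improving point, and
# halve the mesh size when no poll direction improves.
import numpy as np

def pattern_search(f, x0, step=1.0, tol=1e-3, max_iter=500):
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    n = x.size
    directions = np.vstack([np.eye(n), -np.eye(n)])
    for _ in range(max_iter):
        improved = False
        for d in directions:                  # poll step: local mesh neighborhood
            trial = x + step * d
            f_trial = f(trial)
            if f_trial < fx:
                x, fx, improved = trial, f_trial, True
                break
        if not improved:
            step *= 0.5                       # refine the mesh
            if step < tol:
                break
    return x, fx

if __name__ == "__main__":
    rosenbrock = lambda v: (1 - v[0]) ** 2 + 100 * (v[1] - v[0] ** 2) ** 2
    print(pattern_search(rosenbrock, [-1.2, 1.0]))
```

    In the paper's framework, a search step informed by beam's-eye-view dose metrics would additionally propose promising beam directions before each poll step; that part is omitted from this sketch.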

  8. BICLUSTERING METHODS FOR RE-ORDERING DATA MATRICES IN SYSTEMS BIOLOGY, DRUG DISCOVERY AND TOXICOLOGY

    Christodoulos A. Floudas

    2010-12-01

    Full Text Available Biclustering has emerged as an important problem in the analysis of gene expression data since genes may only jointly respond over a subset of conditions. Many of the methods for biclustering, and clustering algorithms in general, utilize simplified models or heuristic strategies for identifying the ``best'' grouping of elements according to some metric and cluster definition and thus result in suboptimal clusters. In the first part of the presentation, we present a rigorous approach to biclustering, OREO, which is based on the Optimal RE-Ordering of the rows and columns of a data matrix so as to globally minimize the dissimilarity metric [1,2]. The physical permutations of the rows and columns of the data matrix can be modeled as either a network flow problem or a traveling salesman problem. The performance of OREO is tested on several important data matrices arising in systems biology to validate the ability of the proposed method and compare it to existing biclustering and clustering methods. In the second part of the talk, we will focus on novel methods for clustering of data matrices that are very sparse [3]. These types of data matrices arise in drug discovery where the x- and y-axis of a data matrix can correspond to different functional groups for two distinct substituent sites on a molecular scaffold. Each possible x and y pair corresponds to a single molecule which can be synthesized and tested for a certain property, such as percent inhibition of a protein function. For even moderate size matrices, synthesizing and testing a small fraction of the molecules is labor intensive and not economically feasible. Thus, it is of paramount importance to have a reliable method for guiding the synthesis process to select molecules that have a high probability of success. In the second part of the presentation, we introduce a new strategy to enable efficient substituent reordering and descriptor-free property estimation. Our approach casts

  9. Evaluation Tool for the Application of Discovery Teaching Method in the Greek Environmental School Projects

    Kalathaki, Maria

    2015-01-01

    Greek school community emphasizes on the discovery direction of teaching methodology in the school Environmental Education (EE) in order to promote Education for the Sustainable Development (ESD). In ESD school projects the used methodology is experiential teamwork for inquiry based learning. The proposed tool checks whether and how a school…

  10. Accidental Discovery of Information on the User-Defined Social Web: A Mixed-Method Study

    Lu, Chi-Jung

    2012-01-01

    Frequently interacting with other people or working in an information-rich environment can foster the "accidental discovery of information" (ADI) (Erdelez, 2000; McCay-Peet & Toms, 2010). With the increasing adoption of social web technologies, online user-participation communities and user-generated content have provided users the…

  11. Machine Learning Methods for Knowledge Discovery in Medical Data on Atherosclerosis

    Serrano, J.I.; Tomečková, Marie; Zvárová, Jana

    2006-01-01

    Roč. 1, - (2006), s. 6-33 ISSN 1801-5603 Institutional research plan: CEZ:AV0Z10300504 Keywords : knowledge discovery * supervised machine learning * biomedical data mining * risk factors of atherosclerosis Subject RIV: BB - Applied Statistics, Operational Research

  12. A review of the scientific rationale and methods used in the search for other planetary systems

    Black, D. C.

    1985-01-01

    Planetary systems appear to be one of the crucial links in the chain leading from simple molecules to living systems, particularly complex (intelligent?) living systems. Although there is currently no observational proof of the existence of any planetary system other than our own, techniques are now being developed which will permit a comprehensive search for other planetary systems. The scientific rationale for and methods used in such a search effort are reviewed here.

  13. Low-Mode Conformational Search Method with Semiempirical Quantum Mechanical Calculations: Application to Enantioselective Organocatalysis.

    Kamachi, Takashi; Yoshizawa, Kazunari

    2016-02-22

    A conformational search program for finding low-energy conformations of large noncovalent complexes has been developed. A quantitatively reliable semiempirical quantum mechanical PM6-DH+ method, which is able to accurately describe noncovalent interactions at a low computational cost, was employed in contrast to conventional conformational search programs in which molecular mechanical methods are usually adopted. Our approach is based on the low-mode method whereby an initial structure is perturbed along one of its low-mode eigenvectors to generate new conformations. This method was applied to determine the most stable conformation of transition state for enantioselective alkylation by the Maruoka and cinchona alkaloid catalysts and Hantzsch ester hydrogenation of imines by chiral phosphoric acid. Besides successfully reproducing the previously reported most stable DFT conformations, the conformational search with the semiempirical quantum mechanical calculations newly discovered a more stable conformation at a low computational cost.
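
    The core low-mode move can be sketched in a few lines, assuming the (mass-weighted) Hessian of the current conformation is already available; the PM6-DH+ energetics, mode-selection heuristics, and acceptance tests of the actual program are not reproduced here.

```python
# Sketch of the low-mode perturbation step, assuming a precomputed Hessian for
# the current conformation (shape 3N x 3N) and coordinates of shape (N, 3).
import numpy as np

def low_mode_perturbation(coords, hessian, mode_index=0, amplitude=0.5):
    """Return new coordinates displaced along a low-frequency eigenvector."""
    eigvals, eigvecs = np.linalg.eigh(hessian)
    # skip the six near-zero translational/rotational modes
    order = np.argsort(np.abs(eigvals))
    low_modes = order[6:]
    vec = eigvecs[:, low_modes[mode_index]]
    return coords + amplitude * vec.reshape(coords.shape)
```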

  14. New particle searches and discoveries

    Trippe, T.G.; Barbaro-Galtieri, A.; Horne, C.P.; Kelly, R.L.; Rittenberg, A.; Rosenfeld, A.H.; Yost, G.P.; Armstrong, B.; Bricman, C.; Hemingway, R.J.; Losty, M.J.; Roos, M.

    1977-01-01

    This supplement to the 1976 edition of 'Review of particle properties', Particle Data Group [Rev. Mod. Phys. 48, No. 2, Part II (1976)], contains tabulations of experimental data bearing on the 'new particles' and related topics; categories covered include charmed particles, psi's and their decay products, and heavy leptons. Errata to the previous edition are also given. (Auth.)

  15. A cross-correlation method to search for gravitational wave bursts with AURIGA and Virgo

    Bignotto, M.; Bonaldi, M.; Camarda, M.; Cerdonio, M.; Conti, L.; Drago, M.; Falferi, P.; Liguori, N.; Longo, S.; Mezzena, R.; Mion, A.; Ortolan, A.; Prodi, G. A.; Re, V.; Salemi, F.; Taffarello, L.; Vedovato, G.; Vinante, A.; Vitale, S.; Zendri, J. -P.; Acernese, F.; Alshourbagy, Mohamed; Amico, Paolo; Antonucci, Federica; Aoudia, S.; Astone, P.; Avino, Saverio; Baggio, L.; Ballardin, G.; Barone, F.; Barsotti, L.; Barsuglia, M.; Bauer, Th. S.; Bigotta, Stefano; Birindelli, Simona; Boccara, Albert-Claude; Bondu, F.; Bosi, Leone; Braccini, Stefano; Bradaschia, C.; Brillet, A.; Brisson, V.; Buskulic, D.; Cagnoli, G.; Calloni, E.; Campagna, Enrico; Carbognani, F.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cesarini, E.; Chassande-Mottin, E.; Clapson, A-C; Cleva, F.; Coccia, E.; Corda, C.; Corsi, A.; Cottone, F.; Coulon, J. -P.; Cuoco, E.; D'Antonio, S.; Dari, A.; Dattilo, V.; Davier, M.; Rosa, R.; Del Prete, M.; Di Fiore, L.; Di Lieto, A.; Emilio, M. Di Paolo; Di Virgilio, A.; Evans, M.; Fafone, V.; Ferrante, I.; Fidecaro, F.; Fiori, I.; Flaminio, R.; Fournier, J. -D.; Frasca, S.; Frasconi, F.; Gammaitoni, L.; Garufi, F.; Genin, E.; Gennai, A.; Giazotto, A.; Giordano, L.; Granata, V.; Greverie, C.; Grosjean, D.; Guidi, G.; Hamdani, S.U.; Hebri, S.; Heitmann, H.; Hello, P.; Huet, D.; Kreckelbergh, S.; La Penna, P.; Laval, M.; Leroy, N.; Letendre, N.; Lopez, B.; Lorenzini, M.; Loriette, V.; Losurdo, G.; Mackowski, J. -M.; Majorana, E.; Man, C. N.; Mantovani, M.; Marchesoni, F.; Marion, F.; Marque, J.; Martelli, F.; Masserot, A.; Menzinger, F.; Milano, L.; Minenkov, Y.; Moins, C.; Moreau, J.; Morgado, N.; Mosca, S.; Mours, B.; Neri, I.; Nocera, F.; Pagliaroli, G.; Palomba, C.; Paoletti, F.; Pardi, S.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Piergiovanni, F.; Pinard, L.; Poggiani, R.; Punturo, M.; Puppo, P.; Rapagnani, P.; Regimbau, T.; Remillieux, A.; Ricci, F.; Ricciardi, I.; Rocchi, A.; Rolland, L.; Romano, R.; Ruggi, P.; Russo, G.; Solimeno, S.; Spallicci, A.; Swinkels, B. L.; Tarallo, M.; Terenzi, R.; Toncelli, A.; Tonelli, M.; Tournefier, E.; Travasso, F.; Vajente, G.; van den Brand, J. F. J.; van der Putten, S.; Verkindt, D.; Vetrano, F.; Vicere, A.; Vinet, J. -Y.; Vocca, H.; Yvert, M.

    2008-01-01

    We present a method to search for transient gravitational waves using a network of detectors with different spectral and directional sensitivities: the interferometer Virgo and the bar detector AURIGA. The data analysis method is based on the measurements of the correlated energy in the network by

  16. Digital One Disc One Compound Method for High Throughput Discovery of Prostate Cancer Targeting Ligands

    2016-12-01

    efficiency of drug discovery and make a potential impact on the modern pharmaceutical industry. Subject terms: ODOC carriers, barcode, split-mix... Array technologies can construct a high density of molecules in an array format on a solid substrate (microchip), from which the chemical... A plug-and-play microfluidic packaging scheme, known as Microflego (3D Microfluidic Assembly), is used to facilely establish complex 3D microfluidic networks...

  17. Applications and methods utilizing the Simple Semantic Web Architecture and Protocol (SSWAP for bioinformatics resource discovery and disparate data and service integration

    Nelson Rex T

    2010-06-01

    Full Text Available Abstract Background Scientific data integration and computational service discovery are challenges for the bioinformatic community. This process is made more difficult by the separate and independent construction of biological databases, which makes the exchange of data between information resources difficult and labor intensive. A recently described semantic web protocol, the Simple Semantic Web Architecture and Protocol (SSWAP; pronounced "swap"), offers the ability to describe data and services in a semantically meaningful way. We report how three major information resources (Gramene, SoyBase and the Legume Information System [LIS]) used SSWAP to semantically describe selected data and web services. Methods We selected high-priority Quantitative Trait Locus (QTL), genomic mapping, trait, phenotypic, and sequence data and associated services such as BLAST for publication, data retrieval, and service invocation via semantic web services. Data and services were mapped to concepts and categories as implemented in legacy and de novo community ontologies. We used SSWAP to express these offerings in OWL Web Ontology Language (OWL), Resource Description Framework (RDF) and eXtensible Markup Language (XML) documents, which are appropriate for their semantic discovery and retrieval. We implemented SSWAP services to respond to web queries and return data. These services are registered with the SSWAP Discovery Server and are available for semantic discovery at http://sswap.info. Results A total of ten services delivering QTL information from Gramene were created. From SoyBase, we created six services delivering information about soybean QTLs, and seven services delivering genetic locus information. For LIS we constructed three services, two of which allow the retrieval of DNA and RNA FASTA sequences with the third service providing nucleic acid sequence comparison capability (BLAST). Conclusions The need for semantic integration technologies has preceded

  18. A three-term conjugate gradient method under the strong-Wolfe line search

    Khadijah, Wan; Rivaie, Mohd; Mamat, Mustafa

    2017-08-01

    Recently, numerous studies have been concerned in conjugate gradient methods for solving large-scale unconstrained optimization method. In this paper, a three-term conjugate gradient method is proposed for unconstrained optimization which always satisfies sufficient descent direction and namely as Three-Term Rivaie-Mustafa-Ismail-Leong (TTRMIL). Under standard conditions, TTRMIL method is proved to be globally convergent under strong-Wolfe line search. Finally, numerical results are provided for the purpose of comparison.
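
    For orientation, a generic nonlinear conjugate gradient loop with a strong-Wolfe line search (SciPy's line_search) is sketched below. It uses a standard Polak-Ribière+ coefficient rather than the TTRMIL three-term update, so it only illustrates the overall structure of such methods, not the proposed one.

```python
# Generic nonlinear CG with a strong-Wolfe line search; the TTRMIL three-term
# direction update from the paper is NOT reproduced here.
import numpy as np
from scipy.optimize import line_search

def cg_strong_wolfe(f, grad, x0, max_iter=200, tol=1e-6):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:                 # line search failed: restart with a small step
            d, alpha = -g, 1e-3
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # Polak-Ribiere+ coefficient
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    f = lambda v: (v[0] - 3) ** 2 + 10 * (v[1] + 1) ** 2
    grad = lambda v: np.array([2 * (v[0] - 3), 20 * (v[1] + 1)])
    print(cg_strong_wolfe(f, grad, np.zeros(2)))
```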

  19. The Implementation of Discovery Learning Method to Increase Learning Outcomes and Motivation of Student in Senior High School

    Nanda Saridewi

    2017-11-01

    Full Text Available Observations of grade XI senior high school students showed low daily test scores, attributable to the limited role of students in the learning process. This classroom action research aims to improve learning outcomes and student motivation through the discovery learning method applied to colloid material. The study uses the approach developed by Lewin, consisting of planning, action, observation, and reflection. Data were collected through questionnaires and final ability tests. The results show that the discovery learning model had a positive influence on learning, increasing the students' average score from 74 in the first cycle to 90.3 in the second cycle and improving student motivation across the basic competence (KD) statement categories from the first to the second cycle. Thus, the results of this study can be used to improve learning outcomes and student motivation.

  20. Dual-mode nested search method for categorical uncertain multi-objective optimization

    Tang, Long; Wang, Hu

    2016-10-01

    Categorical multi-objective optimization is an important issue involved in many matching design problems. Non-numerical variables and their uncertainty are the major challenges of such optimizations. Therefore, this article proposes a dual-mode nested search (DMNS) method. In the outer layer, kriging metamodels are established using standard regular simplex mapping (SRSM) from categorical candidates to numerical values. Assisted by the metamodels, a k-cluster-based intelligent sampling strategy is developed to search Pareto frontier points. The inner layer uses an interval number method to model the uncertainty of categorical candidates. To improve the efficiency, a multi-feature convergent optimization via most-promising-area stochastic search (MFCOMPASS) is proposed to determine the bounds of objectives. Finally, typical numerical examples are employed to demonstrate the effectiveness of the proposed DMNS method.

  1. 14 CFR 406.143 - Discovery.

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 Discovery. 406.143 Section 406.143... Transportation Adjudications § 406.143 Discovery. (a) Initiation of discovery. Any party may initiate discovery... after a complaint has been filed. (b) Methods of discovery. The following methods of discovery are...

  2. New Internet search volume-based weighting method for integrating various environmental impacts

    Ji, Changyoon, E-mail: changyoon@yonsei.ac.kr; Hong, Taehoon, E-mail: hong7@yonsei.ac.kr

    2016-01-15

    Weighting is one of the steps in life cycle impact assessment that integrates various characterized environmental impacts as a single index. Weighting factors should be based on the society's preferences. However, most previous studies consider only the opinion of some people. Thus, this research proposes a new weighting method that determines the weighting factors of environmental impact categories by considering public opinion on environmental impacts using the Internet search volumes for relevant terms. To validate the new weighting method, the weighting factors for six environmental impacts calculated by the new weighting method were compared with the existing weighting factors. The resulting Pearson's correlation coefficient between the new and existing weighting factors was from 0.8743 to 0.9889. It turned out that the new weighting method presents reasonable weighting factors. It also requires less time and lower cost compared to existing methods and likewise meets the main requirements of weighting methods such as simplicity, transparency, and reproducibility. The new weighting method is expected to be a good alternative for determining the weighting factor. - Highlight: • A new weighting method using Internet search volume is proposed in this research. • The new weighting method reflects the public opinion using Internet search volume. • The correlation coefficient between new and existing weighting factors is over 0.87. • The new weighting method can present the reasonable weighting factors. • The proposed method can be a good alternative for determining the weighting factors.
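
    The weighting idea can be sketched very simply: normalize search volumes for category-related terms into weights and apply them to characterized impact scores. The terms and volumes below are made-up placeholders, not values from the paper.

```python
# Illustrative sketch only: hypothetical Internet search volumes are normalized
# into weighting factors and combined into a single weighted index.
search_volume = {            # assumed example numbers, not from the paper
    "global warming": 820, "ozone depletion": 90, "acidification": 60,
    "eutrophication": 55, "photochemical smog": 40, "abiotic depletion": 35,
}
total = sum(search_volume.values())
weights = {category: volume / total for category, volume in search_volume.items()}

def weighted_single_index(characterized_impacts):
    """characterized_impacts: dict of normalized impact scores per category."""
    return sum(weights[c] * characterized_impacts.get(c, 0.0) for c in weights)

print(weights)
```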

  3. New Internet search volume-based weighting method for integrating various environmental impacts

    Ji, Changyoon; Hong, Taehoon

    2016-01-01

    Weighting is one of the steps in life cycle impact assessment that integrates various characterized environmental impacts as a single index. Weighting factors should be based on the society's preferences. However, most previous studies consider only the opinion of some people. Thus, this research proposes a new weighting method that determines the weighting factors of environmental impact categories by considering public opinion on environmental impacts using the Internet search volumes for relevant terms. To validate the new weighting method, the weighting factors for six environmental impacts calculated by the new weighting method were compared with the existing weighting factors. The resulting Pearson's correlation coefficient between the new and existing weighting factors was from 0.8743 to 0.9889. It turned out that the new weighting method presents reasonable weighting factors. It also requires less time and lower cost compared to existing methods and likewise meets the main requirements of weighting methods such as simplicity, transparency, and reproducibility. The new weighting method is expected to be a good alternative for determining the weighting factor. - Highlight: • A new weighting method using Internet search volume is proposed in this research. • The new weighting method reflects the public opinion using Internet search volume. • The correlation coefficient between new and existing weighting factors is over 0.87. • The new weighting method can present the reasonable weighting factors. • The proposed method can be a good alternative for determining the weighting factors.

  4. A conjugate gradient method with descent properties under strong Wolfe line search

    Zull, N.; ‘Aini, N.; Shoid, S.; Ghani, N. H. A.; Mohamed, N. S.; Rivaie, M.; Mamat, M.

    2017-09-01

    The conjugate gradient (CG) method is one of the optimization methods that are often used in practical applications. The continuous and numerous studies conducted on the CG method have led to vast improvements in its convergence properties and efficiency. In this paper, a new CG method possessing the sufficient descent and global convergence properties is proposed. The efficiency of the new CG algorithm relative to the existing CG methods is evaluated by testing them all on a set of test functions using MATLAB. The tests are measured in terms of iteration numbers and CPU time under strong Wolfe line search. Overall, this new method performs efficiently and comparable to the other famous methods.

  5. Surfing for suicide methods and help: content analysis of websites retrieved with search engines in Austria and the United States.

    Till, Benedikt; Niederkrotenthaler, Thomas

    2014-08-01

    The Internet provides a variety of resources for individuals searching for suicide-related information. Structured content-analytic approaches to assess intercultural differences in web contents retrieved with method-related and help-related searches are scarce. We used the 2 most popular search engines (Google and Yahoo/Bing) to retrieve US-American and Austrian search results for the term suicide, method-related search terms (e.g., suicide methods, how to kill yourself, painless suicide, how to hang yourself), and help-related terms (e.g., suicidal thoughts, suicide help) on February 11, 2013. In total, 396 websites retrieved with US search engines and 335 websites from Austrian searches were analyzed with content analysis on the basis of current media guidelines for suicide reporting. We assessed the quality of websites and compared findings across search terms and between the United States and Austria. In both countries, protective outweighed harmful website characteristics by approximately 2:1. Websites retrieved with method-related search terms (e.g., how to hang yourself) contained more harmful characteristics, whereas websites retrieved with help-related terms and with US search engines generally had more protective characteristics than those retrieved with Austrian search engines. Resources with harmful characteristics were better ranked than those with protective characteristics (United States: P < .01, Austria: P < .05). The quality of suicide-related websites obtained depends on the search terms used. Preventive efforts to improve the ranking of preventive web content, particularly regarding method-related search terms, seem necessary. © Copyright 2014 Physicians Postgraduate Press, Inc.

  6. Distributed Cooperative Search Control Method of Multiple UAVs for Moving Target

    Chang-jian Ru

    2015-01-01

    Full Text Available To reduce the impact of uncertainties caused by unknown motion parameters on the search plan for moving targets and to improve the efficiency of UAV searching, a novel distributed multi-UAV cooperative search control method for moving targets is proposed in this paper. Based on the detection results of onboard sensors, the target probability map is updated using Bayesian theory. A Gaussian distribution for the target transition probability density function is introduced to calculate the prediction probability of moving-target existence, and the target probability map can then be further updated in real time. A performance index function combining target cost, environment cost, and cooperative cost is constructed, and the cooperative searching problem is thereby transformed into a central optimization problem. To improve computational efficiency, a distributed model predictive control method is presented, from which the control command of each UAV can be obtained. The simulation results verify that the proposed method reduces the blindness of UAV searching and effectively improves the overall efficiency of the team.
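
    The Bayesian map update at the heart of such schemes can be sketched for a single sensor footprint as follows; the detection and false-alarm probabilities, grid size, and prior are illustrative assumptions, and the prediction step with the Gaussian transition density is omitted.

```python
# Sketch of the Bayesian update of a target probability map for one grid cell
# observed by a UAV sensor; P_D and P_F are assumed sensor characteristics.
import numpy as np

P_D, P_F = 0.9, 0.1          # assumed detection and false-alarm probabilities

def update_cell(prior, detected):
    """Bayes update of the target-existence probability for one grid cell."""
    if detected:
        likelihood_t, likelihood_nt = P_D, P_F
    else:
        likelihood_t, likelihood_nt = 1 - P_D, 1 - P_F
    return likelihood_t * prior / (likelihood_t * prior + likelihood_nt * (1 - prior))

prob_map = np.full((20, 20), 0.05)           # uniform prior over the search area
prob_map[7, 12] = update_cell(prob_map[7, 12], detected=True)
print(prob_map[7, 12])
```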

  7. A semantics-based method for clustering of Chinese web search results

    Zhang, Hui; Wang, Deqing; Wang, Li; Bi, Zhuming; Chen, Yong

    2014-01-01

    Information explosion is a critical challenge to the development of modern information systems. In particular, when the application of an information system is over the Internet, the amount of information over the web has been increasing exponentially and rapidly. Search engines, such as Google and Baidu, are essential tools for people to find the information from the Internet. Valuable information, however, is still likely submerged in the ocean of search results from those tools. By clustering the results into different groups based on subjects automatically, a search engine with the clustering feature allows users to select most relevant results quickly. In this paper, we propose an online semantics-based method to cluster Chinese web search results. First, we employ the generalised suffix tree to extract the longest common substrings (LCSs) from search snippets. Second, we use the HowNet to calculate the similarities of the words derived from the LCSs, and extract the most representative features by constructing the vocabulary chain. Third, we construct a vector of text features and calculate snippets' semantic similarities. Finally, we improve the Chameleon algorithm to cluster snippets. Extensive experimental results have shown that the proposed algorithm has outperformed over the suffix tree clustering method and other traditional clustering methods.

  8. The search conference as a method in planning community health promotion actions

    Eva Magnus

    2016-08-01

    Full Text Available Aims: The aim of this article is to describe and discuss how the search conference can be used as a method for planning health promotion actions in local communities. Design and methods: The article draws on experiences with using the method for an innovative project in health promotion in three Norwegian municipalities. The method is described both in general and how it was specifically adopted for the project. Results and conclusions: The search conference as a method was used to develop evidence-based health promotion action plans. With its use of both bottom-up and top-down approaches, this method is a relevant strategy for involving a community in the planning stages of health promotion actions in line with political expectations of participation, ownership, and evidence-based initiatives.

  9. A new greedy search method for the design of digital IIR filter

    Ranjit Kaur

    2015-07-01

    Full Text Available A new greedy search method is applied in this paper to design the optimal digital infinite impulse response (IIR) filter. The greedy search method is based on binary successive approximation (BSA) and evolutionary search (ES). The suggested greedy search method optimizes the magnitude response and the phase response simultaneously and also finds the lowest order of the filter. The order of the filter is controlled by a control gene whose value is optimized along with the filter coefficients to obtain the optimum order of the designed IIR filter. The stability constraints of the IIR filter are taken care of during the design procedure. To determine the trade-off relationship between conflicting objectives in the non-inferior domain, the weighting method is exploited. The proposed approach is effectively applied to solve the multiobjective optimization problems of designing digital low-pass (LP), high-pass (HP), bandpass (BP), and bandstop (BS) filters. It has been demonstrated that this technique not only fulfills all types of filter performance requirements, but also finds the lowest order of the filter. The computational experiments show that the proposed approach gives better digital IIR filters than the existing evolutionary algorithm (EA) based methods.
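
    The kind of objective such a search evaluates can be sketched with SciPy: score a candidate coefficient set by its magnitude-response error against an ideal low-pass template. The coefficient encoding used by the BSA/ES search, the phase-response term, the order-controlling gene, and the stability handling are not shown; the cutoff and example filter below are assumptions.

```python
# Sketch of a magnitude-response error objective for IIR design (illustrative
# only; not the paper's full multiobjective formulation).
import numpy as np
from scipy.signal import freqz

def magnitude_error(b, a, cutoff=0.3, n_points=128):
    w, h = freqz(b, a, worN=n_points)          # w in rad/sample, 0..pi
    desired = (w <= cutoff * np.pi).astype(float)   # ideal low-pass template
    return np.sum((np.abs(h) - desired) ** 2)

# example: score a simple first-order candidate filter
print(magnitude_error(b=[0.2, 0.2], a=[1.0, -0.6]))
```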

  10. Introducing PALETTE: an iterative method for conducting a literature search for a review in palliative care.

    Zwakman, Marieke; Verberne, Lisa M; Kars, Marijke C; Hooft, Lotty; van Delden, Johannes J M; Spijker, René

    2018-06-02

    In the rapidly developing specialty of palliative care, literature reviews have become increasingly important to inform and improve the field. When applying widely used literature review methods developed for intervention studies to palliative care, challenges are encountered, such as the heterogeneity of palliative care in practice (a wide range of domains in patient characteristics, stages of illness and stakeholders), the explorative character of review questions, and poorly defined keywords and concepts. To overcome these challenges and to provide guidance for researchers conducting a literature search for a review in palliative care, the Palliative cAre Literature rEview iTeraTive mEthod (PALETTE), a pragmatic framework, was developed. We describe PALETTE in detail. PALETTE consists of four phases: developing the review question, building the search strategy, validating the search strategy and performing the search. The framework incorporates different information retrieval techniques: contacting experts, pearl growing, citation tracking and Boolean searching, in a transparent way, to maximize the retrieval of literature relevant to the topic of interest. The different components and techniques are repeated until no new articles qualify for inclusion. The phases within PALETTE are interconnected by a recurrent process of validation on 'golden bullets' (articles that undoubtedly should be part of the review), citation tracking and concept terminology reflecting the review question. To give insight into the value of PALETTE, we compared it with the recommended search method for reviews of intervention studies. By using PALETTE on two palliative care literature reviews, we were able to improve our review questions and search strategies. Moreover, in comparison with the recommended search for intervention reviews, the number of articles that needed to be screened decreased, whereas more relevant articles were retrieved. Overall, PALETTE

  11. The Effects of Presentation Method and Information Density on Visual Search Ability and Working Memory Load

    Chang, Ting-Wen; Kinshuk; Chen, Nian-Shing; Yu, Pao-Ta

    2012-01-01

    This study investigates the effects of successive and simultaneous information presentation methods on learner's visual search ability and working memory load for different information densities. Since the processing of information in the brain depends on the capacity of visual short-term memory (VSTM), the limited information processing capacity…

  12. System and method for improving video recorder performance in a search mode

    2000-01-01

    A method and apparatus wherein video images are recorded on a plurality of tracks of a tape such that, for playback in a search mode at a speed higher than the recording speed, the displayed image will consist of a plurality of contiguous parts, some of the parts being read out from tracks each

  13. System and method for improving video recorder performance in a search mode

    1991-01-01

    A method and apparatus wherein video images are recorded on a plurality of tracks of a tape such that, for playback in a search mode at a speed higher than the recording speed the displayed image will consist of a plurality of contiguous parts, some of the parts being read out from tracks each

  14. A Teaching Approach from the Exhaustive Search Method to the Needleman-Wunsch Algorithm

    Xu, Zhongneng; Yang, Yayun; Huang, Beibei

    2017-01-01

    The Needleman-Wunsch algorithm has become one of the core algorithms in bioinformatics; however, it requires more accessible explanations for students with different major backgrounds. By supposing sample sequences and using a simple storage system, the connection between the exhaustive search method and the Needleman-Wunsch algorithm…
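
    For reference, a compact Python implementation of the Needleman-Wunsch dynamic program (matrix fill plus traceback) is given below; the match/mismatch/gap scores are the usual illustrative choices rather than values taken from the article.

```python
# Compact Needleman-Wunsch global alignment for teaching purposes.
def needleman_wunsch(s1, s2, match=1, mismatch=-1, gap=-2):
    n, m = len(s1), len(s2)
    # score matrix with gap-initialized first row and column
    F = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        F[i][0] = i * gap
    for j in range(1, m + 1):
        F[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = F[i-1][j-1] + (match if s1[i-1] == s2[j-1] else mismatch)
            F[i][j] = max(diag, F[i-1][j] + gap, F[i][j-1] + gap)
    # traceback to recover one optimal alignment
    a1, a2, i, j = "", "", n, m
    while i > 0 or j > 0:
        if i > 0 and j > 0 and F[i][j] == F[i-1][j-1] + (match if s1[i-1] == s2[j-1] else mismatch):
            a1, a2, i, j = s1[i-1] + a1, s2[j-1] + a2, i - 1, j - 1
        elif i > 0 and F[i][j] == F[i-1][j] + gap:
            a1, a2, i = s1[i-1] + a1, "-" + a2, i - 1
        else:
            a1, a2, j = "-" + a1, s2[j-1] + a2, j - 1
    return F[n][m], a1, a2

print(needleman_wunsch("GATTACA", "GCATGCU"))
```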

  15. Automated discovery systems and the inductivist controversy

    Giza, Piotr

    2017-09-01

    The paper explores possible influences that some developments in the field of branches of AI, called automated discovery and machine learning systems, might have upon some aspects of the old debate between Francis Bacon's inductivism and Karl Popper's falsificationism. Donald Gillies facetiously calls this controversy 'the duel of two English knights', and claims, after some analysis of historical cases of discovery, that Baconian induction had been used in science very rarely, or not at all, although he argues that the situation has changed with the advent of machine learning systems. (Some clarification of terms machine learning and automated discovery is required here. The key idea of machine learning is that, given data with associated outcomes, software can be trained to make those associations in future cases which typically amounts to inducing some rules from individual cases classified by the experts. Automated discovery (also called machine discovery) deals with uncovering new knowledge that is valuable for human beings, and its key idea is that discovery is like other intellectual tasks and that the general idea of heuristic search in problem spaces applies also to discovery tasks. However, since machine learning systems discover (very low-level) regularities in data, throughout this paper I use the generic term automated discovery for both kinds of systems. I will elaborate on this later on). Gillies's line of argument can be generalised: thanks to automated discovery systems, philosophers of science have at their disposal a new tool for empirically testing their philosophical hypotheses. Accordingly, in the paper, I will address the question, which of the two philosophical conceptions of scientific method is better vindicated in view of the successes and failures of systems developed within three major research programmes in the field: machine learning systems in the Turing tradition, normative theory of scientific discovery formulated by Herbert Simon

  16. Mathematical programming models for solving unequal-sized facilities layout problems. A genetic search method

    Tavakkoli-Moghaddam, R.

    1999-01-01

    This paper presents unequal-sized facilities layout solutions generated by a genetic search program named Layout Design using a Genetic Algorithm. The generalized quadratic assignment problem, which requires pre-determined distance and material flow matrices as input data, and the continuous plane model, which employs a dynamic distance measure and a material flow matrix, are discussed. Computational results on test problems are reported and compared with layout solutions generated by a branch-and-bound algorithm, a hybrid method merging simulated annealing and local search techniques, and an optimization process of an enveloped block.

  17. An Efficient Hybrid Conjugate Gradient Method with the Strong Wolfe-Powell Line Search

    Ahmad Alhawarat

    2015-01-01

    Full Text Available The conjugate gradient (CG) method is an interesting tool to solve optimization problems in many fields, such as design, economics, physics, and engineering. In this paper, we present a new hybrid CG method related to the famous Polak-Ribière-Polyak (PRP) formula. It provides a remedy for the PRP case, which is not globally convergent with the strong Wolfe-Powell (SWP) line search. The new formula possesses the sufficient descent condition and the global convergence properties. In addition, we further explain the cases in which the PRP method fails with the SWP line search. Furthermore, we provide numerical computations for the new hybrid CG method, which is almost always better than other related PRP formulas in both the number of iterations and the CPU time under some standard test functions.

  18. A peak value searching method of the MCA based on digital logic devices

    Sang Ziru; Huang Shanshan; Chen Lian; Jin Ge

    2010-01-01

    Digital multi-channel analyzers play an increasingly important role in multi-channel pulse-height analysis. The trend toward digitalization is characterized by powerful pulse-processing ability, high throughput, and improved stability and flexibility. This paper introduces a method for searching the peak value of a waveform using digital logic implemented in an FPGA. The method reduces the dead time, and subsequent offline data correction can improve the non-linearity of the MCA. The α energy spectrum of 241Am is given as an example. (authors)
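
    The peak-search logic itself is simple enough to sketch in software: track the running maximum while the digitized pulse stays above a threshold and emit it on the falling edge. The threshold and the simulated pulses below are illustrative; the FPGA pipelining and the offline correction step are not shown.

```python
# Software sketch of digital peak detection on a sampled detector waveform.
import numpy as np

def find_pulse_peaks(samples, threshold):
    peaks, in_pulse, current_max = [], False, 0.0
    for s in samples:
        if s > threshold:
            in_pulse = True
            current_max = max(current_max, s)
        elif in_pulse:                    # falling edge: the pulse has ended
            peaks.append(current_max)
            in_pulse, current_max = False, 0.0
    return peaks

t = np.arange(200)
signal = 50 * np.exp(-((t - 60) / 8.0) ** 2) + 80 * np.exp(-((t - 140) / 8.0) ** 2)
print(find_pulse_peaks(signal, threshold=10))   # expect two peaks near 50 and 80
```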

  19. Local Path Planning of Driverless Car Navigation Based on Jump Point Search Method Under Urban Environment

    Kaijun Zhou

    2017-09-01

    Full Text Available The Jump Point Search (JPS) algorithm, a fast search method for path planning, is adopted for local path planning of a driverless car in an urban environment. Firstly, a vector Geographic Information System (GIS) map, including Global Positioning System (GPS) position, direction, and lane information, is built for global path planning. Secondly, the GIS map database is utilized in global path planning for the driverless car. Then, the JPS algorithm is adopted to avoid the obstacle ahead and to find an optimal local path for the driverless car in the urban environment. Finally, 125 different simulation experiments in the urban environment demonstrate that JPS can successfully find the optimal and safe path and, meanwhile, has a lower time complexity compared with the Vector Field Histogram (VFH), the Rapidly Exploring Random Tree (RRT), A*, and the Probabilistic Roadmaps (PRM) algorithms. Furthermore, JPS is shown to be useful in the structured urban environment.

  20. An adaptive bin framework search method for a beta-sheet protein homopolymer model

    Hoos Holger H

    2007-04-01

    Full Text Available Abstract Background The problem of protein structure prediction consists of predicting the functional or native structure of a protein given its linear sequence of amino acids. This problem has played a prominent role in the fields of biomolecular physics and algorithm design for over 50 years. Additionally, its importance increases continually as a result of an exponential growth over time in the number of known protein sequences in contrast to a linear increase in the number of determined structures. Our work focuses on the problem of searching an exponentially large space of possible conformations as efficiently as possible, with the goal of finding a global optimum with respect to a given energy function. This problem plays an important role in the analysis of systems with complex search landscapes, and particularly in the context of ab initio protein structure prediction. Results In this work, we introduce a novel approach for solving this conformation search problem based on the use of a bin framework for adaptively storing and retrieving promising locally optimal solutions. Our approach provides a rich and general framework within which a broad range of adaptive or reactive search strategies can be realized. Here, we introduce adaptive mechanisms for choosing which conformations should be stored, based on the set of conformations already stored in memory, and for biasing choices when retrieving conformations from memory in order to overcome search stagnation. Conclusion We show that our bin framework combined with a widely used optimization method, Monte Carlo search, achieves significantly better performance than state-of-the-art generalized ensemble methods for a well-known protein-like homopolymer model on the face-centered cubic lattice.
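
    A toy version of the idea, Metropolis Monte Carlo search augmented with a small memory of good states that the search falls back to on stagnation, is sketched below. The one-dimensional energy function, temperature, and bin size are stand-ins; the lattice homopolymer model and the adaptive storage/retrieval rules of the paper are not reproduced.

```python
# Toy sketch of Monte Carlo search with a small "bin" of promising states,
# restarting from a stored state when the search stagnates.
import math, random

def energy(x):                       # assumed 1-D multi-modal test energy
    return math.sin(3 * x) + 0.1 * x * x

def mc_with_bin(steps=5000, temp=0.5, stagnation=200, bin_size=5):
    x = random.uniform(-5, 5)
    e = energy(x)
    archive = [(e, x)]               # bin of promising states found so far
    best_e, best_x = e, x
    since_improvement = 0
    for _ in range(steps):
        trial = x + random.gauss(0, 0.3)
        e_trial = energy(trial)
        if e_trial < e or random.random() < math.exp((e - e_trial) / temp):
            x, e = trial, e_trial                        # Metropolis acceptance
        if e < best_e:
            best_e, best_x = e, x
            archive = sorted(archive + [(e, x)])[:bin_size]
            since_improvement = 0
        else:
            since_improvement += 1
        if since_improvement > stagnation:               # retrieve from the bin
            e, x = random.choice(archive)
            since_improvement = 0
    return best_x, best_e

print(mc_with_bin())
```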

  1. A comparison of two search methods for determining the scope of systematic reviews and health technology assessments.

    Forsetlund, Louise; Kirkehei, Ingvild; Harboe, Ingrid; Odgaard-Jensen, Jan

    2012-01-01

    This study aims to compare two different search methods for determining the scope of a requested systematic review or health technology assessment. The first method (called the Direct Search Method) included performing direct searches in the Cochrane Database of Systematic Reviews (CDSR), Database of Abstracts of Reviews of Effects (DARE) and the Health Technology Assessments (HTA). Using the comparison method (called the NHS Search Engine) we performed searches by means of the search engine of the British National Health Service, NHS Evidence. We used an adapted cross-over design with a random allocation of fifty-five requests for systematic reviews. The main analyses were based on repeated measurements adjusted for the order in which the searches were conducted. The Direct Search Method generated on average fewer hits (48 percent [95 percent confidence interval {CI} 6 percent to 72 percent], had a higher precision (0.22 [95 percent CI, 0.13 to 0.30]) and more unique hits than when searching by means of the NHS Search Engine (50 percent [95 percent CI, 7 percent to 110 percent]). On the other hand, the Direct Search Method took longer (14.58 minutes [95 percent CI, 7.20 to 21.97]) and was perceived as somewhat less user-friendly than the NHS Search Engine (-0.60 [95 percent CI, -1.11 to -0.09]). Although the Direct Search Method had some drawbacks such as being more time-consuming and less user-friendly, it generated more unique hits than the NHS Search Engine, retrieved on average fewer references and fewer irrelevant results.

  2. Computational neuropharmacology: dynamical approaches in drug discovery.

    Aradi, Ildiko; Erdi, Péter

    2006-05-01

    Computational approaches that adopt dynamical models are widely accepted in basic and clinical neuroscience research as indispensable tools with which to understand normal and pathological neuronal mechanisms. Although computer-aided techniques have been used in pharmaceutical research (e.g. in structure- and ligand-based drug design), the power of dynamical models has not yet been exploited in drug discovery. We suggest that dynamical system theory and computational neuroscience--integrated with well-established, conventional molecular and electrophysiological methods--offer a broad perspective in drug discovery and in the search for novel targets and strategies for the treatment of neurological and psychiatric diseases.

  3. Study of Fuze Structure and Reliability Design Based on the Direct Search Method

    Lin, Zhang; Ning, Wang

    2017-03-01

    Redundant design is one of the important methods for improving system reliability, but it often involves the mutual coupling of multiple factors. In this study, the Direct Search Method is introduced into the optimal redundancy configuration for design optimization, in which reliability, cost, structural weight, and other factors can be taken into account simultaneously, and the redundancy allocation and reliability design of an aircraft critical system are computed. The results show that this method is convenient and workable, and applicable to the redundancy configurations and optimization of various designs upon appropriate modifications. The method therefore has good practical value.

  4. A fast tomographic method for searching the minimum free energy path

    Chen, Changjun; Huang, Yanzhao; Xiao, Yi; Jiang, Xuewei

    2014-01-01

    Minimum Free Energy Path (MFEP) provides a lot of important information about the chemical reactions, like the free energy barrier, the location of the transition state, and the relative stability between reactant and product. With MFEP, one can study the mechanisms of the reaction in an efficient way. Due to a large number of degrees of freedom, searching the MFEP is a very time-consuming process. Here, we present a fast tomographic method to perform the search. Our approach first calculates the free energy surfaces in a sequence of hyperplanes perpendicular to a transition path. Based on an objective function and the free energy gradient, the transition path is optimized in the collective variable space iteratively. Applications of the present method to model systems show that our method is practical. It can be an alternative approach for finding the state-to-state MFEP

  5. Active Search on Carcasses versus Pitfall Traps: a Comparison of Sampling Methods.

    Zanetti, N I; Camina, R; Visciarelli, E C; Centeno, N D

    2016-04-01

    The study of insect succession in cadavers and the classification of arthropods have mostly been done by placing a carcass in a cage, protected from vertebrate scavengers, which is then visited periodically. An alternative is to use specific traps. Few studies on carrion ecology and forensic entomology involving the carcasses of large vertebrates have employed pitfall traps. The aims of this study were to compare both sampling methods (active search on a carcass and pitfall trapping) for each coleopteran family, and to establish whether there is a discrepancy (underestimation and/or overestimation) in the presence of each family by either method. A great discrepancy was found for almost all families, with some of them being more abundant in samples obtained through active search on carcasses and others in samples from traps, whereas two families did not show any bias towards a given sampling method. The fact that families may be underestimated or overestimated by the type of sampling technique highlights the importance of combining both methods, active search on carcasses and pitfall traps, in order to obtain more complete information on decomposition, carrion habitat and cadaveric families or species. Furthermore, a hypothesis is advanced on the reasons why either sampling method underestimates, and thus shows bias against, certain families. Information is also provided on which sampling technique would be more appropriate for detecting particular families.

  6. Novel citation-based search method for scientific literature: application to meta-analyses.

    Janssens, A Cecile J W; Gwinn, M

    2015-10-13

    Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of co-citation with one or more "known" articles before reviewing their eligibility. In two independent studies, we aimed to reproduce the results of literature searches for sets of published meta-analyses (n = 10 and n = 42). For each meta-analysis, we extracted co-citations for the randomly selected 'known' articles from the Web of Science database, counted their frequencies and screened all articles with a score above a selection threshold. In the second study, we extended the method by retrieving direct citations for all selected articles. In the first study, we retrieved 82% of the studies included in the meta-analyses while screening only 11% as many articles as were screened for the original publications. Articles that we missed were published in non-English languages, published before 1975, published very recently, or available only as conference abstracts. In the second study, we retrieved 79% of included studies while screening half the original number of articles. Citation searching appears to be an efficient and reasonably accurate method for finding articles similar to one or more articles of interest for meta-analysis and reviews.
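
    The ranking step can be sketched as a simple co-citation count: articles that are frequently cited together with the "known" articles are screened first. The citation data below are hypothetical; in the study, co-citations were extracted from the Web of Science.

```python
# Sketch of co-citation ranking against one or more known eligible articles.
from collections import Counter

def cocitation_scores(citing_to_cited, known_articles):
    """citing_to_cited: dict mapping a citing paper to the set of papers it cites."""
    known = set(known_articles)
    scores = Counter()
    for cited_set in citing_to_cited.values():
        if cited_set & known:                     # cites at least one known article
            for article in cited_set - known:
                scores[article] += 1
    return scores.most_common()

citations = {                                     # hypothetical citation data
    "P1": {"known_A", "X", "Y"},
    "P2": {"known_A", "X"},
    "P3": {"Y", "Z"},
}
print(cocitation_scores(citations, ["known_A"]))  # X scores 2, Y scores 1
```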

  7. Fast optimization of binary clusters using a novel dynamic lattice searching method

    Wu, Xia; Cheng, Wen

    2014-01-01

    Global optimization of binary clusters has been a difficult task despite much effort and many efficient methods. To address the two types of elements (i.e., the homotop problem) in binary clusters, two classes of virtual dynamic lattices are constructed and a modified dynamic lattice searching (DLS) method, i.e., the binary DLS (BDLS) method, is developed. However, it was found that the BDLS method can only be utilized for the optimization of binary clusters of small sizes, because the homotop problem is hard to solve without an atomic exchange operation. Therefore, the iterated local search (ILS) method is adopted to solve the homotop problem, and an efficient method based on BDLS and ILS, named BDLS-ILS, is presented for the global optimization of binary clusters. In order to assess the efficiency of the proposed method, binary Lennard-Jones clusters with up to 100 atoms are investigated. Results show that the method is efficient. Furthermore, the BDLS-ILS method is also adopted to study the geometrical structures of (AuPd)79 clusters with DFT-fitted parameters of the Gupta potential.

  8. Comparison of Deep Learning With Multiple Machine Learning Methods and Metrics Using Diverse Drug Discovery Data Sets.

    Korotcov, Alexandru; Tkachenko, Valery; Russo, Daniel P; Ekins, Sean

    2017-12-04

    Machine learning methods have been applied to many data sets in pharmaceutical research for several decades. The relative ease and availability of fingerprint type molecular descriptors paired with Bayesian methods resulted in the widespread use of this approach for a diverse array of end points relevant to drug discovery. Deep learning is the latest machine learning algorithm attracting attention for many pharmaceutical applications, from docking to virtual screening. Deep learning is based on an artificial neural network with multiple hidden layers and has found considerable traction for many artificial intelligence applications. We have previously suggested the need for a comparison of different machine learning methods with deep learning across an array of varying data sets that is applicable to pharmaceutical research. End points relevant to pharmaceutical research include absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox) properties, as well as activity against pathogens and drug discovery data sets. In this study, we have used data sets for solubility, probe-likeness, hERG, KCNQ1, bubonic plague, Chagas, tuberculosis, and malaria to compare different machine learning methods using FCFP6 fingerprints. These data sets represent whole cell screens, individual proteins, and physicochemical properties, as well as a data set with a complex end point. Our aim was to assess whether deep learning offered any improvement in testing when assessed using an array of metrics including AUC, F1 score, Cohen's kappa, Matthews correlation coefficient and others. Based on ranked normalized scores for the metrics or data sets, Deep Neural Networks (DNN) ranked higher than SVM, which in turn was ranked higher than all the other machine learning methods. Visualizing these properties for training and test sets using radar type plots indicates when models are inferior or perhaps overtrained. These results also suggest the need for assessing deep learning further.

  9. Frequency domain optical tomography using a conjugate gradient method without line search

    Kim, Hyun Keol; Charette, Andre

    2007-01-01

    A conjugate gradient method without line search (CGMWLS) is presented. This method is used to retrieve the local maps of absorption and scattering coefficients inside a tissue-like test medium, using synthetic data. The forward problem is solved with a discrete-ordinates finite-difference method based on the frequency domain formulation of the radiative transfer equation. The inversion results demonstrate that the CGMWLS can simultaneously retrieve the spatial distributions of optical properties inside the medium with reasonable accuracy, while reducing cross-talk between absorption and scattering coefficients.

  10. Non-contact method of search and analysis of pulsating vessels

    Avtomonov, Yuri N.; Tsoy, Maria O.; Postnov, Dmitry E.

    2018-04-01

    Despite the variety of existing methods of recording the human pulse and a solid history of their development, there is still considerable interest in this topic. The development of new non-contact methods, based on advanced image processing, caused a new wave of interest in this issue. We present a simple but quite effective method for analyzing the mechanical pulsations of blood vessels lying close to the surface of the skin. Our technique is a modification of imaging (or remote) photoplethysmography (i-PPG). We supplemented this method with the addition of a laser light source, which made it possible to use other methods of searching for the proposed pulsation zone. During the testing of the method, several series of experiments were carried out with both artificial oscillating objects as well as with the target signal source (human wrist). The obtained results show that our method allows correct interpretation of complex data. To summarize, we proposed and tested an alternative method for the search and analysis of pulsating vessels.
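    A minimal sketch of the basic imaging-photoplethysmography signal chain assumed above: average a skin region's pixel intensity over time and pick the dominant frequency in the physiological band. The region selection, laser illumination and pulsation-zone search used by the authors are not modelled here, and the frame array is synthetic.

        import numpy as np

        def pulse_rate_bpm(frames, fps, roi=(slice(None), slice(None))):
            """frames: (time, height, width) array of grey-level images."""
            signal = frames[:, roi[0], roi[1]].mean(axis=(1, 2))  # mean ROI intensity per frame
            signal = signal - signal.mean()                       # remove the DC component
            spectrum = np.abs(np.fft.rfft(signal))
            freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
            band = (freqs > 0.7) & (freqs < 4.0)                  # roughly 42-240 beats per minute
            peak_freq = freqs[band][np.argmax(spectrum[band])]
            return 60.0 * peak_freq

        # Synthetic check: an 8x8 patch whose brightness oscillates at 1.2 Hz (72 bpm).
        fps = 30.0
        t = np.arange(300) / fps
        frames = 100 + 2 * np.sin(2 * np.pi * 1.2 * t)[:, None, None] * np.ones((1, 8, 8))
        print(round(pulse_rate_bpm(frames, fps)))  # ~72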

  11. An Efficient Method to Search Real-Time Bulk Data for an Information Processing System

    Kim, Seong Jin; Kim, Jong Myung; Suh, Yong Suk; Keum, Jong Yong; Park, Heui Youn

    2005-01-01

    The Man Machine Interface System (MMIS) of the System-integrated Modular Advanced ReacTor (SMART) is designed with fully digitalized features. The Information Processing System (IPS) of the MMIS acquires and processes plant data from other systems. In addition, the IPS provides plant operation information to operators in the control room. The IPS is required to process bulky data in real time, so a special processing method must be considered with regard to flexibility and performance, because several thousand plant information points converge on the IPS. Among other things, the time spent searching the bulk data far exceeds the other processing times. Thus, this paper explores an efficient search method and examines its feasibility.

  12. Neural Based Tabu Search method for solving unit commitment problem with cooling-banking constraints

    Rajan Asir Christober Gnanakkan Charles

    2009-01-01

    Full Text Available This paper presents a new approach to solving the short-term unit commitment problem (UCP) using Neural Based Tabu Search (NBTS) with cooling and banking constraints. The objective of this paper is to find a generation schedule such that the total operating cost is minimized, subject to a variety of constraints. This also means that it is desirable to find the optimal generating unit commitment in the power system for the next H hours. A 7-unit utility power system in India demonstrates the effectiveness of the proposed approach; extensive studies have also been performed for different IEEE test systems consisting of 10, 26 and 34 units. Numerical results demonstrate the superiority of the cost solutions obtained using the Tabu Search (TS) method over those from the Dynamic Programming (DP) and Lagrangian Relaxation (LR) methods in reaching a proper unit commitment.

  13. The Search Conference as a Method in Planning Community Health Promotion Actions

    Magnus, Eva; Knudtsen, Margunn Skjei; Wist, Guri; Weiss, Daniel; Lillefjell, Monica

    2016-01-01

    Aims: The aim of this article is to describe and discuss how the search conference can be used as a method for planning health promotion actions in local communities. Design and methods: The article draws on experiences with using the method for an innovative project in health promotion in three Norwegian municipalities. The method is described both in general and as it was specifically adopted for the project. Results and conclusions: The search conference as a method was used to develop evidence-based health promotion action plans. With its use of both bottom-up and top-down approaches, this method is a relevant strategy for involving a community in the planning stages of health promotion actions in line with political expectations of participation, ownership, and evidence-based initiatives. Significance for public health: This article describes and discusses how the search conference can be used as a method when working with knowledge-based health promotion actions in local communities. The article describes the sequence of the conference and shows how it was adapted when planning and prioritizing health promotion actions in three Norwegian municipalities. Its significance is that it shows how central elements in the planning of health promotion actions, such as participation and involvement as well as evidence, were fundamental to how the conference was carried out. The article goes on to discuss how the method functions as both a top-down and a bottom-up strategy, and in what way working evidence-based can conflict with a bottom-up strategy. The experiences described can be used as guidance when planning knowledge-based health promotion actions in communities. PMID:27747199

  14. SWATHtoMRM: Development of High-Coverage Targeted Metabolomics Method Using SWATH Technology for Biomarker Discovery.

    Zha, Haihong; Cai, Yuping; Yin, Yandong; Wang, Zhuozhong; Li, Kang; Zhu, Zheng-Jiang

    2018-03-20

    The complexity of the metabolome presents a great analytical challenge for quantitative metabolite profiling and restricts the application of metabolomics in biomarker discovery. Targeted metabolomics using the multiple-reaction monitoring (MRM) technique has excellent capability for quantitative analysis but suffers from limited metabolite coverage. To address this challenge, we developed a new strategy, namely, SWATHtoMRM, which utilizes the broad coverage of SWATH-MS technology to develop a high-coverage targeted metabolomics method. Specifically, the SWATH-MS technique was first used for untargeted profiling of one pooled biological sample and to acquire the MS2 spectra of all metabolites. Then, SWATHtoMRM was used to extract large-scale MRM transitions for targeted analysis with coverage as high as 1000-2000 metabolites. Next, we demonstrated the advantages of the SWATHtoMRM method in quantitative analysis, such as coverage, reproducibility, sensitivity, and dynamic range. Finally, we applied our SWATHtoMRM approach to discover potential metabolite biomarkers for colorectal cancer (CRC) diagnosis. A high-coverage targeted metabolomics method with 1303 metabolites in one injection was developed to profile colorectal cancer tissues from CRC patients. A total of 20 potential metabolite biomarkers were discovered and validated for CRC diagnosis. In plasma samples from CRC patients, 17 out of 20 potential biomarkers were further validated to be associated with tumor resection, which may have a great potential in assessing the prognosis of CRC patients after tumor resection. Together, the SWATHtoMRM strategy provides a new way to develop high-coverage targeted metabolomics methods and facilitates the application of targeted metabolomics in disease biomarker discovery. The SWATHtoMRM program is freely available on the Internet ( http://www.zhulab.cn/software.php ).

  15. Engineering Bacteria to Search for Specific Concentrations of Molecules by a Systematic Synthetic Biology Design Method.

    Tien, Shin-Ming; Hsu, Chih-Yuan; Chen, Bor-Sen

    2016-01-01

    Bacteria navigate environments full of various chemicals to seek favorable places for survival by controlling the flagella's rotation using a complicated signal transduction pathway. By influencing the pathway, bacteria can be engineered to search for specific molecules, which has great potential for application to biomedicine and bioremediation. In this study, genetic circuits were constructed to make bacteria search for a specific molecule at particular concentrations in their environment through a synthetic biology method. In addition, by replacing the "brake component" in the synthetic circuit with some specific sensitivities, the bacteria can be engineered to locate areas containing specific concentrations of the molecule. Measured by the swarm assay qualitatively and microfluidic techniques quantitatively, the characteristics of each "brake component" were identified and represented by a mathematical model. Furthermore, we established another mathematical model to anticipate the characteristics of the "brake component". Based on this model, an abundant component library can be established to provide adequate component selection for different searching conditions without identifying all components individually. Finally, a systematic design procedure was proposed. Following this systematic procedure, one can design a genetic circuit for bacteria to rapidly search for and locate different concentrations of particular molecules by selecting the most adequate "brake component" in the library. Moreover, following simple procedures, one can also establish an exclusive component library suitable for other cultivated environments, promoter systems, or bacterial strains.

  16. Engineering Bacteria to Search for Specific Concentrations of Molecules by a Systematic Synthetic Biology Design Method.

    Shin-Ming Tien

    Full Text Available Bacteria navigate environments full of various chemicals to seek favorable places for survival by controlling the flagella's rotation using a complicated signal transduction pathway. By influencing the pathway, bacteria can be engineered to search for specific molecules, which has great potential for application to biomedicine and bioremediation. In this study, genetic circuits were constructed to make bacteria search for a specific molecule at particular concentrations in their environment through a synthetic biology method. In addition, by replacing the "brake component" in the synthetic circuit with some specific sensitivities, the bacteria can be engineered to locate areas containing specific concentrations of the molecule. Measured by the swarm assay qualitatively and microfluidic techniques quantitatively, the characteristics of each "brake component" were identified and represented by a mathematical model. Furthermore, we established another mathematical model to anticipate the characteristics of the "brake component". Based on this model, an abundant component library can be established to provide adequate component selection for different searching conditions without identifying all components individually. Finally, a systematic design procedure was proposed. Following this systematic procedure, one can design a genetic circuit for bacteria to rapidly search for and locate different concentrations of particular molecules by selecting the most adequate "brake component" in the library. Moreover, following simple procedures, one can also establish an exclusive component library suitable for other cultivated environments, promoter systems, or bacterial strains.

  17. Efficient and accurate Greedy Search Methods for mining functional modules in protein interaction networks.

    He, Jieyue; Li, Chaojun; Ye, Baoliu; Zhong, Wei

    2012-06-25

    Most computational algorithms mainly focus on detecting highly connected subgraphs in PPI networks as protein complexes but ignore their inherent organization. Furthermore, many of these algorithms are computationally expensive. However, recent analysis indicates that experimentally detected protein complexes generally contain core/attachment structures. In this paper, a Greedy Search Method based on Core-Attachment structure (GSM-CA) is proposed. The GSM-CA method detects densely connected regions in large protein-protein interaction networks based on the edge weight and two criteria for determining core nodes and attachment nodes. The GSM-CA method improves the prediction accuracy compared to other similar module detection approaches; however, it is computationally expensive. Many module detection approaches are based on traditional hierarchical methods, which are also computationally inefficient because the hierarchical tree structure produced by these approaches cannot provide adequate information to identify whether a network belongs to a module structure or not. In order to speed up the computational process, the Greedy Search Method based on Fast Clustering (GSM-FC) is proposed in this work. The edge weight based GSM-FC method uses a greedy procedure to traverse all edges just once to separate the network into the suitable set of modules. The proposed methods are applied to the protein interaction network of S. cerevisiae. Experimental results indicate that many significant functional modules are detected, most of which match the known complexes. Results also demonstrate that the GSM-FC algorithm is faster and more accurate as compared to other competing algorithms. Based on the new edge weight definition, the proposed algorithm takes advantage of the greedy search procedure to separate the network into the suitable set of modules. Experimental analysis shows that the identified modules are statistically significant. The algorithm can reduce the

  18. A new essential protein discovery method based on the integration of protein-protein interaction and gene expression data

    Li Min

    2012-03-01

    Full Text Available Abstract Background Identification of essential proteins is always a challenging task since it requires experimental approaches that are time-consuming and laborious. With the advances in high throughput technologies, a large number of protein-protein interactions are available, which have produced unprecedented opportunities for detecting proteins' essentialities from the network level. There have been a series of computational approaches proposed for predicting essential proteins based on network topologies. However, the network topology-based centrality measures are very sensitive to the robustness of network. Therefore, a new robust essential protein discovery method would be of great value. Results In this paper, we propose a new centrality measure, named PeC, based on the integration of protein-protein interaction and gene expression data. The performance of PeC is validated based on the protein-protein interaction network of Saccharomyces cerevisiae. The experimental results show that the predicted precision of PeC clearly exceeds that of the other fifteen previously proposed centrality measures: Degree Centrality (DC), Betweenness Centrality (BC), Closeness Centrality (CC), Subgraph Centrality (SC), Eigenvector Centrality (EC), Information Centrality (IC), Bottle Neck (BN), Density of Maximum Neighborhood Component (DMNC), Local Average Connectivity-based method (LAC), Sum of ECC (SoECC), Range-Limited Centrality (RL), L-index (LI), Leader Rank (LR), Normalized α-Centrality (NC), and Moduland-Centrality (MC). Especially, the improvement of PeC over the classic centrality measures (BC, CC, SC, EC, and BN) is more than 50% when predicting no more than 500 proteins. Conclusions We demonstrate that the integration of protein-protein interaction network and gene expression data can help improve the precision of predicting essential proteins. The new centrality measure, PeC, is an effective essential protein discovery method.
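    The integration idea can be illustrated with a hedged, PeC-like score in which each neighbour contributes the product of a topological term (an edge clustering coefficient computed from shared neighbours) and an expression term (the Pearson correlation of the two genes' expression profiles). The toy graph and random expression profiles below are illustrative only, and the exact weighting is an assumption rather than the authors' published formula.

        import networkx as nx
        import numpy as np

        def edge_clustering(g, u, v):
            """Fraction of possible triangles actually closed around edge (u, v)."""
            common = len(set(g[u]) & set(g[v]))
            denom = min(g.degree(u) - 1, g.degree(v) - 1)
            return common / denom if denom > 0 else 0.0

        def pec_like(g, expr):
            """expr: dict mapping each node to a 1-D expression profile (numpy array)."""
            score = {}
            for v in g:
                score[v] = sum(
                    edge_clustering(g, u, v) * np.corrcoef(expr[v], expr[u])[0, 1]
                    for u in g[v]
                )
            return score

        g = nx.Graph([("a", "b"), ("a", "c"), ("b", "c"), ("c", "d")])
        rng = np.random.default_rng(0)
        expr = {n: rng.normal(size=10) for n in g}
        scores = pec_like(g, expr)
        print(sorted(scores, key=scores.get, reverse=True))  # nodes ranked by the toy score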

  19. Combination of Multiple Spectral Libraries Improves the Current Search Methods Used to Identify Missing Proteins in the Chromosome-Centric Human Proteome Project.

    Cho, Jin-Young; Lee, Hyoung-Joo; Jeong, Seul-Ki; Kim, Kwang-Youl; Kwon, Kyung-Hoon; Yoo, Jong Shin; Omenn, Gilbert S; Baker, Mark S; Hancock, William S; Paik, Young-Ki

    2015-12-04

    The approximately 2.9 billion base pair human reference genome sequence is known to encode some 20 000 representative proteins. However, 3000 proteins, that is, ~15% of all proteins, have no or very weak proteomic evidence and are still missing. Missing proteins may be present in rare samples in very low abundance or be only temporarily expressed, causing problems in their detection and protein profiling. In particular, some technical limitations cause missing proteins to remain unassigned. For example, current mass spectrometry techniques have high limits and error rates for the detection of complex biological samples. An insufficient proteome coverage in a reference sequence database and spectral library also raises major issues. Thus, the development of a better strategy that results in greater sensitivity and accuracy in the search for missing proteins is necessary. To this end, we used a new strategy, which combines a reference spectral library search and a simulated spectral library search, to identify missing proteins. We built the human iRefSPL, which contains the original human reference spectral library and additional peptide sequence-spectrum match entries from other species. We also constructed the human simSPL, which contains the simulated spectra of 173 907 human tryptic peptides determined by MassAnalyzer (version 2.3.1). To prove the enhanced analytical performance of the combination of the human iRefSPL and simSPL methods for the identification of missing proteins, we attempted to reanalyze the placental tissue data set (PXD000754). The data from each experiment were analyzed using PeptideProphet, and the results were combined using iProphet. For the quality control, we applied the class-specific false-discovery rate filtering method. All of the results were filtered at a fixed false-discovery rate threshold. The two spectral libraries, iRefSPL and simSPL, were designed to ensure no overlap of the proteome coverage. They were shown to be complementary to spectral library

  20. A Search for Lyα Emission from Galaxies AT 6 < z < 8 Using Deep HST Grism Observations: Discovery of a z = 7.5 Galaxy

    Larson, Rebecca L.; Finkelstein, Steven; Pirzkal, Nor; Ryan, Russell; Tilvi, Vithal; Malhotra, Sangeeta; Rhoads, James; Finkelstein, Keely; Jung, Intae; Christensen, Lise; Cimatti, Andrea; Ferreras, Ignacio; Grogin, Norman; Koekemoer, Anton; Hathi, Nimish; O'Connell, Robert; Östlin, Göran; Pasquali, Anna; Rothberg, Barry; Windhorst, Rogier; FIGS Team

    2018-01-01

    We have built an automated detection method to find Lyα emission lines in HST grism data from galaxies at 6 < z < 8, which can be used to probe the state of the intergalactic medium (IGM) during the epoch of reionization. We use 160 orbits of G102 slitless spectroscopy obtained from HST/WFC3 for the Faint Infrared Grism Survey (FIGS; PI: Malhotra) that were optimized to sample previously-identified high-redshift galaxy candidates. This dataset has already been used to identify one of these candidates, at redshift z = 7.51, which has been observed to have Lyα emission detectable with the HST Grism (Finkelstein et al. 2013; Tilvi et al. 2016). The FIGS data use five separate roll-angles of HST in an effort to mitigate the overall contamination effects of nearby galaxies, and we have created a method that accounts for and removes the contamination from surrounding galaxies, while also removing any dispersed continuum light from each individual spectrum (Pirzkal et al. 2017). Using our new automated process we searched for significant (> 3σ) emission lines via two different methods. First, we compared the results for each galaxy across all roll angles and identified significant lines detected in more than one roll angle. Second, we performed a fit to all five roll angles simultaneously, accounting for the total flux of the emission line across all of our spectra. We have examined the spectra for 64 z > 7 candidates in our sample and found one new candidate Lyα emission line at a (> 5σ) level at 1.03µm (FIGS ID: GS2 1406, also named CANDELS ID: z7 PAR2 2909). After comparing this emission line with the broadband photometric colors, we conclude that this line is Lyα at z = 7.542 ± 0.003. This galaxy has the highest Lyα rest-frame equivalent width (EWLyα) yet published at z > 7 (110 ± 14 Å).
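    The first of the two detection methods, the cross-roll-angle coincidence test, can be sketched as follows: flag wavelength bins that exceed a 3σ threshold in more than one independently extracted spectrum. The real FIGS contamination modelling, continuum removal and flux fitting are far more involved; the arrays and function below are hypothetical illustrations only.

        import numpy as np

        def coincident_lines(spectra, n_sigma=3.0, min_rolls=2):
            """spectra: (n_roll_angles, n_wavelength_bins) continuum-subtracted fluxes."""
            hits = np.zeros(spectra.shape[1], dtype=int)
            for spec in spectra:
                sigma = 1.4826 * np.median(np.abs(spec - np.median(spec)))  # robust noise estimate (MAD)
                hits += spec > n_sigma * sigma
            return np.where(hits >= min_rolls)[0]

        rng = np.random.default_rng(1)
        spectra = rng.normal(size=(5, 200))
        spectra[:4, 120] += 6.0            # an emission line seen in four of the five roll angles
        print(coincident_lines(spectra))   # typically [120]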

  1. Comparison of seven methods for producing Affymetrix expression scores based on False Discovery Rates in disease profiling data

    Gruber Stephen B

    2005-02-01

    Full Text Available Abstract Background A critical step in processing oligonucleotide microarray data is combining the information in multiple probes to produce a single number that best captures the expression level of an RNA transcript. Several systematic studies comparing multiple methods for array processing have used tightly controlled calibration data sets as the basis for comparison. Here we compare performances for seven processing methods using two data sets originally collected for disease profiling studies. An emphasis is placed on understanding sensitivity for detecting differentially expressed genes in terms of two key statistical determinants: test statistic variability for non-differentially expressed genes, and test statistic size for truly differentially expressed genes. Results In the two data sets considered here, up to seven-fold variation across the processing methods was found in the number of genes detected at a given false discovery rate (FDR). The best performing methods called up to 90% of the same genes differentially expressed, had less variable test statistics under randomization, and had a greater number of large test statistics in the experimental data. Poor performance of one method was directly tied to a tendency to produce highly variable test statistic values under randomization. Based on an overall measure of performance, two of the seven methods (Dchip and a trimmed mean approach) are superior in the two data sets considered here. Two other methods (MAS5 and GCRMA-EB) are inferior, while results for the other three methods are mixed. Conclusions Choice of processing method has a major impact on differential expression analysis of microarray data. Previously reported performance analyses using tightly controlled calibration data sets are not highly consistent with results reported here using data from human tissue samples. Performance of array processing methods in disease profiling and other realistic biological studies should be
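    The false discovery rate threshold referred to above is usually applied with the Benjamini-Hochberg step-up rule; the sketch below shows that standard procedure on synthetic p-values (the values are illustrative, not from the study).

        import numpy as np

        def benjamini_hochberg(pvals, alpha=0.05):
            """Return a boolean mask of p-values declared significant at FDR level alpha."""
            p = np.asarray(pvals, dtype=float)
            order = np.argsort(p)
            m = len(p)
            below = p[order] <= alpha * np.arange(1, m + 1) / m
            significant = np.zeros(m, dtype=bool)
            if below.any():
                k = np.max(np.where(below)[0])        # largest rank passing the step-up rule
                significant[order[: k + 1]] = True    # reject all hypotheses up to that rank
            return significant

        pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205]
        print(benjamini_hochberg(pvals, alpha=0.05))  # only the first two are declared significant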

  2. Materials Screening for the Discovery of New Half-Heuslers: Machine Learning versus ab Initio Methods.

    Legrain, Fleur; Carrete, Jesús; van Roekeghem, Ambroise; Madsen, Georg K H; Mingo, Natalio

    2018-01-18

    Machine learning (ML) is increasingly becoming a helpful tool in the search for novel functional compounds. Here we use classification via random forests to predict the stability of half-Heusler (HH) compounds, using only experimentally reported compounds as a training set. Cross-validation yields an excellent agreement between the fraction of compounds classified as stable and the actual fraction of truly stable compounds in the ICSD. The ML model is then employed to screen 71 178 different 1:1:1 compositions, yielding 481 likely stable candidates. The predicted stability of HH compounds from three previous high-throughput ab initio studies is critically analyzed from the perspective of the alternative ML approach. The incomplete consistency among the three separate ab initio studies and between them and the ML predictions suggests that additional factors beyond those considered by ab initio phase stability calculations might be determinant to the stability of the compounds. Such factors can include configurational entropies and quasiharmonic contributions.
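    A minimal sketch of the classification setup, assuming composition-derived descriptors and stable/unstable labels are already in hand: a random forest is cross-validated and then used to score unlabelled candidate compositions. The synthetic features below stand in for the real descriptor set, which is not specified in this record.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 12))                 # stand-in descriptors (radii, electronegativities, ...)
        y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=500) > 0).astype(int)

        clf = RandomForestClassifier(n_estimators=300, random_state=0)
        print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

        clf.fit(X, y)
        candidates = rng.normal(size=(10, 12))         # screened, unlabelled compositions
        print("P(stable):", clf.predict_proba(candidates)[:, 1])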

  3. An efficient search method for finding the critical slip surface using the compositional Monte Carlo technique

    Goshtasbi, K.; Ahmadi, M; Naeimi, Y.

    2008-01-01

    Locating the critical slip surface and the associated minimum factor of safety are two complementary parts of a slope stability analysis. A large number of computer programs exist to solve slope stability problems. Most of these programs, however, have used inefficient and unreliable search procedures to locate the global minimum factor of safety. This paper presents an efficient and reliable method to determine the global minimum factor of safety, coupled with a modified version of the Monte Carlo technique. Examples are presented to illustrate the reliability of the proposed method.

  4. Generalized Pattern Search methods for a class of nonsmooth optimization problems with structure

    Bogani, C.; Gasparo, M. G.; Papini, A.

    2009-07-01

    We propose a Generalized Pattern Search (GPS) method to solve a class of nonsmooth minimization problems, where the set of nondifferentiability is included in the union of known hyperplanes and, therefore, is highly structured. Both unconstrained and linearly constrained problems are considered. At each iteration the set of poll directions is enforced to conform to the geometry of both the nondifferentiability set and the boundary of the feasible region, near the current iterate. This is the key issue to guarantee the convergence of certain subsequences of iterates to points which satisfy first-order optimality conditions. Numerical experiments on some classical problems validate the method.

  5. Searching in the Context of a Task: A Review of Methods and Tools

    Ana Maguitman

    2018-04-01

    Full Text Available Contextual information extracted from the user task can help to better target retrieval to task-relevant content. In particular, topical context can be exploited to identify the subject of the information needs, contributing to reduce the information overload problem. A great number of methods exist to extract raw context data and contextual interaction patterns from the user task and to model this information using higher-level representations. Context can then be used as a source for automatic query generation, or as a means to refine or disambiguate user-generated queries. It can also be used to filter and rank results as well as to select domain-specific search engines with better capabilities to satisfy specific information requests. This article reviews methods that have been applied to deal with the problem of reflecting the current and long-term interests of a user in the search process. It discusses major difficulties encountered in the research area of context-based information retrieval and presents an overview of tools proposed since the mid-nineties to deal with the problem of context-based search.

  6. Search method for long-duration gravitational-wave transients from neutron stars

    Prix, R.; Giampanis, S.; Messenger, C.

    2011-01-01

    We introduce a search method for a new class of gravitational-wave signals, namely, long-duration O(hours-weeks) transients from spinning neutron stars. We discuss the astrophysical motivation from glitch relaxation models and we derive a rough estimate for the maximal expected signal strength based on the superfluid excess rotational energy. The transient signal model considered here extends the traditional class of infinite-duration continuous-wave signals by a finite start-time and duration. We derive a multidetector Bayes factor for these signals in Gaussian noise using F-statistic amplitude priors, which simplifies the detection statistic and allows for an efficient implementation. We consider both a fully coherent statistic, which is computationally limited to directed searches for known pulsars, and a cheaper semicoherent variant, suitable for wide parameter-space searches for transients from unknown neutron stars. We have tested our method by Monte-Carlo simulation, and we find that it outperforms orthodox maximum-likelihood approaches both in sensitivity and in parameter-estimation quality.

  7. Utilizing mixed methods research in analyzing Iranian researchers’ information search behaviour on the Web and presenting the current pattern

    Maryam Asadi

    2015-12-01

    Full Text Available Using a mixed methods research design, the current study analyzed Iranian researchers’ information searching behaviour on the Web and, based on the extracted concepts, revealed a model of their information searching behaviour. Forty-four participants, including academic staff from universities and research centers, were recruited for this study by purposive sampling. Data were gathered from a questionnaire including ten questions and from semi-structured interviews. Each participant’s memos were analyzed using grounded theory methods adapted from Strauss & Corbin (1998). Results showed that the main objectives of subjects in using the Web were doing research, writing a paper, studying, doing assignments, downloading files and acquiring public information. The most important ways of learning how to search and retrieve information were trial and error and getting help from friends. Information resources are identified by searching in information resources (e.g. search engines, references in papers, and online databases), communication facilities and tools (e.g. contact with colleagues, seminars and workshops, social networking), and information services (e.g. RSS, alerting, and SDI). Findings also indicated that searching with search engines, reviewing references, searching in online databases, contacting colleagues and studying the latest issues of electronic journals were the most important approaches to searching. The most important strategies were using search engines and scientific tools such as Google Scholar. In addition, using the simple (quick) search method was the most common among subjects. Topic, keywords and paper title were the most important elements for retrieving information. Analysis of the interviews showed that there were nine stages in the researchers’ information searching behaviour: topic selection, initiating search, formulating the search query, information retrieval, access to information

  8. (Re)interpreting LHC New Physics Search Results : Tools and Methods, 3rd Workshop

    The quest for new physics beyond the SM is arguably the driving topic for LHC Run2. LHC collaborations are pursuing searches for new physics in a vast variety of channels. Although collaborations provide various interpretations for their search results, the full understanding of these results requires a much wider interpretation scope involving all kinds of theoretical models. This is a very active field, with close theory-experiment interaction. In particular, development of dedicated methodologies and tools is crucial for such scale of interpretation. Recently, a Forum was initiated to host discussions among LHC experimentalists and theorists on topics related to the BSM (re)interpretation of LHC data, and especially on the development of relevant interpretation tools and infrastructure: https://twiki.cern.ch/twiki/bin/view/LHCPhysics/InterpretingLHCresults Two meetings were held at CERN, where active discussions and concrete work on (re)interpretation methods and tools took place, with valuable cont...

  9. Electricity price forecast using Combinatorial Neural Network trained by a new stochastic search method

    Abedinia, O.; Amjady, N.; Shafie-khah, M.; Catalão, J.P.S.

    2015-01-01

    Highlights: • Presenting a Combinatorial Neural Network. • Suggesting a new stochastic search method. • Adapting the suggested method as a training mechanism. • Proposing a new forecast strategy. • Testing the proposed strategy on real-world electricity markets. - Abstract: Electricity price forecast is key information for successful operation of electricity market participants. However, the time series of electricity price has nonlinear, non-stationary and volatile behaviour and so its forecast method should have high learning capability to extract the complex input/output mapping function of electricity price. In this paper, a Combinatorial Neural Network (CNN) based forecasting engine is proposed to predict the future values of price data. The CNN-based forecasting engine is equipped with a new training mechanism for optimizing the weights of the CNN. This training mechanism is based on an efficient stochastic search method, which is a modified version of chemical reaction optimization algorithm, giving high learning ability to the CNN. The proposed price forecast strategy is tested on the real-world electricity markets of Pennsylvania–New Jersey–Maryland (PJM) and mainland Spain and its obtained results are extensively compared with the results obtained from several other forecast methods. These comparisons illustrate effectiveness of the proposed strategy.

  10. A Method for Estimating View Transformations from Image Correspondences Based on the Harmony Search Algorithm

    Erik Cuevas

    2015-01-01

    Full Text Available In this paper, a new method for robustly estimating multiple view relations from point correspondences is presented. The approach combines the popular random sampling consensus (RANSAC) algorithm and the evolutionary method harmony search (HS). With this combination, the proposed method adopts a different sampling strategy than RANSAC to generate putative solutions. Under the new mechanism, at each iteration, new candidate solutions are built taking into account the quality of the models generated by previous candidate solutions, rather than purely at random as is the case in RANSAC. The rules for the generation of candidate solutions (samples) are motivated by the improvisation process that occurs when a musician searches for a better state of harmony. As a result, the proposed approach can substantially reduce the number of iterations while still preserving the robust capabilities of RANSAC. The method is generic and its use is illustrated by the estimation of homographies, considering synthetic and real images. Additionally, in order to demonstrate the performance of the proposed approach within a real engineering application, it is employed to solve the problem of position estimation in a humanoid robot. Experimental results validate the efficiency of the proposed method in terms of accuracy, speed, and robustness.
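    The sampling strategy can be illustrated on a much simpler model than a homography: the sketch below fits a 2-D line, keeping a small memory of good minimal samples and, with some probability, building new candidates by perturbing a remembered sample in the spirit of harmony-memory improvisation, rather than drawing every sample uniformly at random as in plain RANSAC. All parameter values and the toy data are hypothetical.

        import numpy as np

        def fit_line(two_points):                       # minimal model: a line through two points
            (x1, y1), (x2, y2) = two_points
            a, b = y2 - y1, x1 - x2
            c = -(a * x1 + b * y1)
            n = np.hypot(a, b)
            return np.array([a, b, c]) / (n if n else 1.0)

        def count_inliers(line, pts, tol=0.05):
            return int((np.abs(pts @ line[:2] + line[2]) < tol).sum())

        def guided_ransac(pts, iters=200, memory_size=5, p_memory=0.7, seed=2):
            rng = np.random.default_rng(seed)
            memory, best, best_count = [], None, -1
            for _ in range(iters):
                if memory and rng.random() < p_memory:
                    idx = memory[rng.integers(len(memory))][1].copy()
                    idx[rng.integers(2)] = rng.integers(len(pts))   # perturb one index ("pitch adjustment")
                else:
                    idx = rng.choice(len(pts), size=2, replace=False)
                if idx[0] == idx[1]:
                    continue                                        # degenerate minimal sample
                line = fit_line(pts[idx])
                count = count_inliers(line, pts)
                memory = sorted(memory + [(count, idx)], key=lambda t: -t[0])[:memory_size]
                if count > best_count:
                    best, best_count = line, count
            return best, best_count

        rng = np.random.default_rng(0)
        x = rng.uniform(-1, 1, 80)
        pts = np.c_[x, 0.5 * x + 0.2 + rng.normal(scale=0.01, size=80)]
        pts[:20] = rng.uniform(-1, 1, (20, 2))          # gross outliers
        print(guided_ransac(pts))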

  11. Search method optimization technique for thermal design of high power RFQ structure

    Sharma, N.K.; Joshi, S.C.

    2009-01-01

    RRCAT has taken up the development of a 3 MeV RFQ structure for the low energy part of a 100 MeV H- ion injector linac. The RFQ is a precision-machined resonating structure designed for a high rf duty factor. RFQ structural stability during high rf power operation is an important design issue. The thermal analysis of the RFQ has been performed using the ANSYS finite element analysis software, and optimization of various parameters is attempted using the Search Method optimization technique. It is an effective optimization technique for systems governed by a large number of independent variables. The method involves examining a number of combinations of values of the independent variables and drawing conclusions from the magnitude of the objective function at these combinations. In these methods there is a continuous improvement in the objective function throughout the course of the search, and hence these methods are very efficient. The method has been employed in the optimization of various parameters (called independent variables) of the RFQ involved in its thermal design, such as cooling water flow rate, cooling water inlet temperatures, cavity thickness, etc. The temperature rise within the RFQ structure is the objective function during the thermal design. Using the ANSYS Parametric Design Language (APDL), various iterative programs are written and the analyses are performed to minimize the objective function. The dependency of the objective function on the various independent variables is established and the optimum values of the parameters are evaluated. The results of the analysis are presented in the paper. (author)

  12. Sliding surface searching method for slopes containing a potential weak structural surface

    Aijun Yao

    2014-06-01

    Full Text Available A weak structural surface is one of the key factors controlling the stability of slopes. The stability of rock slopes is in general governed by sets of discontinuities. However, in soft rocks, failure can occur along surfaces approaching a circular failure surface. To better determine the position of the potential sliding surface, a new method called the simplex-finite stochastic tracking method is proposed. This method divides the sliding surface into two parts: one is described by a smooth curve obtained by random searching; the other is a polyline formed by the weak structural surface. Single or multiple sliding surfaces can be considered, and consequently several types of combined sliding surfaces can be simulated. The paper adopts the arc-polyline to simulate the potential sliding surface and analyzes the searching process. Accordingly, software for slope stability analysis using this method was developed and applied to real cases. The results show that, using the simplex-finite stochastic tracking method, it is possible to locate the position of a potential sliding surface in the slope.

  13. Interactive knowledge discovery from marketing questionnaire using simulated breeding and inductive learning methods

    Terano, Takao [Univ. of Tsukuba, Tokyo (Japan); Ishino, Yoko [Univ. of Tokyo (Japan)

    1996-12-31

    This paper describes a novel method to acquire efficient decision rules from questionnaire data using both simulated breeding and inductive learning techniques. The basic ideas of the method are that simulated breeding is used to extract the effective features from the questionnaire data and that inductive learning is used to acquire simple decision rules from the data. Simulated breeding is one of the Genetic Algorithm (GA) based techniques, in which the qualities of offspring generated by genetic operations are evaluated subjectively or interactively. In this paper, we show a basic interactive version of the method and two variations: one with semi-automated GA phases and one with a relative evaluation phase via the Analytic Hierarchy Process (AHP). The proposed method has been qualitatively and quantitatively validated by a case study on consumer product questionnaire data.

  14. Volatility Discovery

    Dias, Gustavo Fruet; Scherrer, Cristina; Papailias, Fotis

    The price discovery literature investigates how homogenous securities traded on different markets incorporate information into prices. We take this literature one step further and investigate how these markets contribute to stochastic volatility (volatility discovery). We formally show that the realized measures from homogenous securities share a fractional stochastic trend, which is a combination of the price and volatility discovery measures. Furthermore, we show that volatility discovery is associated with the way that market participants process information arrival (market sensitivity). Finally, we compute volatility discovery for 30 actively traded stocks in the U.S. and report that NYSE and Arca dominate Nasdaq.

  15. Searching for rigour in the reporting of mixed methods population health research: a methodological review.

    Brown, K M; Elliott, S J; Leatherdale, S T; Robertson-Wilson, J

    2015-12-01

    The environments in which population health interventions occur shape both their implementation and outcomes. Hence, when evaluating these interventions, we must explore both intervention content and context. Mixed methods (integrating quantitative and qualitative methods) provide this opportunity. However, although criteria exist for establishing rigour in quantitative and qualitative research, there is poor consensus regarding rigour in mixed methods. Using the empirical example of school-based obesity interventions, this methodological review examined how mixed methods have been used and reported, and how rigour has been addressed. Twenty-three peer-reviewed mixed methods studies were identified through a systematic search of five databases and appraised using the guidelines for Good Reporting of a Mixed Methods Study. In general, more detailed description of data collection and analysis, integration, inferences and justifying the use of mixed methods is needed. Additionally, improved reporting of methodological rigour is required. This review calls for increased discussion of practical techniques for establishing rigour in mixed methods research, beyond those for quantitative and qualitative criteria individually. A guide for reporting mixed methods research in population health should be developed to improve the reporting quality of mixed methods studies. Through improved reporting, mixed methods can provide strong evidence to inform policy and practice. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  16. Thermodynamic equilibrium solubility measurements in simulated fluids by 96-well plate method in early drug discovery.

    Bharate, Sonali S; Vishwakarma, Ram A

    2015-04-01

    An early prediction of solubility in physiological media (PBS, SGF and SIF) is useful for qualitatively predicting the bioavailability and absorption of lead candidates. Despite the availability of multiple solubility estimation methods, none of the reported methods involves a simplified fixed protocol for diverse sets of compounds. Therefore, a simple, medium-throughput solubility estimation protocol is highly desirable during the lead optimization stage. The present work introduces a rapid method for the assessment of the thermodynamic equilibrium solubility of compounds in aqueous media using a 96-well microplate. The developed protocol is straightforward to set up and takes advantage of the sensitivity of UV spectroscopy. The compound, as a stock solution in methanol, is introduced in microgram quantities into microplate wells, followed by drying at ambient temperature. Microplates were shaken upon addition of the test media, and the supernatant was analyzed by the UV method. A plot of absorbance versus concentration of a sample provides the saturation point, which is the thermodynamic equilibrium solubility of the sample. The established protocol was validated using a large panel of commercially available drugs and against the conventional miniaturized shake-flask method (r(2)>0.84). Additionally, statistically significant QSPR models were established using the experimental solubility values of 52 compounds. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. An image segmentation method based on fuzzy C-means clustering and Cuckoo search algorithm

    Wang, Mingwei; Wan, Youchuan; Gao, Xianjun; Ye, Zhiwei; Chen, Maolin

    2018-04-01

    Image segmentation is a significant step in image analysis and machine vision. Many approaches have been presented on this topic; among them, fuzzy C-means (FCM) clustering is one of the most widely used methods because of its high efficiency and its ability to handle the ambiguity of images. However, the success of FCM is not guaranteed because it is easily trapped in a local optimum. Cuckoo search (CS) is a novel evolutionary algorithm, which has been tested on several optimization problems and proved to be highly efficient. Therefore, a new segmentation technique blending FCM with the CS algorithm is put forward in this paper. Further, the proposed method has been evaluated on several images and compared with other existing FCM techniques, such as genetic algorithm (GA) based FCM and particle swarm optimization (PSO) based FCM, in terms of fitness value. Experimental results indicate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper.
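    For reference, the plain fuzzy C-means update loop that the hybrid method builds on is sketched below on synthetic grey-level samples; in the proposed hybrid, the initial cluster centres would come from the cuckoo search stage rather than from random initialisation, which is all that is assumed here.

        import numpy as np

        def fcm(data, c=3, m=2.0, iters=100, seed=0):
            rng = np.random.default_rng(seed)
            centers = rng.choice(data, size=c, replace=False).astype(float)
            for _ in range(iters):
                dist = np.abs(data[:, None] - centers[None, :]) + 1e-12   # (n_samples, c)
                u = dist ** (-2 / (m - 1))
                u /= u.sum(axis=1, keepdims=True)                         # fuzzy memberships
                centers = (u.T ** m @ data) / (u ** m).sum(axis=0)        # weighted centre update
            return centers, u

        rng = np.random.default_rng(1)
        pixels = np.concatenate([rng.normal(mu, 5, 300) for mu in (40, 120, 200)])
        centers, _ = fcm(pixels)
        print(np.sort(centers))   # roughly [40, 120, 200]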

  18. Work stress interventions in hospital care : Effectiveness of the DISCovery method

    Niks, I.M.W.; de Jonge, J.; Gevers, J.M.P.; Houtman, I.L.D.

    2018-01-01

    Effective interventions to prevent work stress and to improve health, well-being, and performance of employees are of the utmost importance. This quasi-experimental intervention study presents a specific method for diagnosis of psychosocial risk factors at work and subsequent development and

  19. Robust statistical methods for significance evaluation and applications in cancer driver detection and biomarker discovery

    Madsen, Tobias

    2017-01-01

    In the present thesis I develop, implement and apply statistical methods for detecting genomic elements implicated in cancer development and progression. This is done in two separate bodies of work. The first uses the somatic mutation burden to distinguish cancer driver mutations from passenger m...

  20. Work Stress Interventions in Hospital Care: Effectiveness of the DISCovery Method

    Niks, I.M.W.; Gevers, J.M.P.; Jonge, J. de; Houtman, I.L.D.

    2018-01-01

    Effective interventions to prevent work stress and to improve health, well-being, and performance of employees are of the utmost importance. This quasi-experimental intervention study presents a specific method for diagnosis of psychosocial risk factors at work and subsequent development and

  1. Topology Discovery Using Cisco Discovery Protocol

    Rodriguez, Sergio R.

    2009-01-01

    In this paper we address the problem of discovering network topology in proprietary networks. Namely, we investigate topology discovery in Cisco-based networks. Cisco devices run Cisco Discovery Protocol (CDP) which holds information about these devices. We first compare properties of topologies that can be obtained from networks deploying CDP versus Spanning Tree Protocol (STP) and Management Information Base (MIB) Forwarding Database (FDB). Then we describe a method of discovering topology ...

  2. AFLP fragment isolation technique as a method to produce random sequences for single nucleotide polymorphism discovery in the green turtle, Chelonia mydas.

    Roden, Suzanne E; Dutton, Peter H; Morin, Phillip A

    2009-01-01

    The green sea turtle, Chelonia mydas, was used as a case study for single nucleotide polymorphism (SNP) discovery in a species that has little genetic sequence information available. As green turtles have a complex population structure, additional nuclear markers other than microsatellites could add to our understanding of their complex life history. Amplified fragment length polymorphism technique was used to generate sets of random fragments of genomic DNA, which were then electrophoretically separated with precast gels, stained with SYBR green, excised, and directly sequenced. It was possible to perform this method without the use of polyacrylamide gels, radioactive or fluorescent labeled primers, or hybridization methods, reducing the time, expense, and safety hazards of SNP discovery. Within 13 loci, 2547 base pairs were screened, resulting in the discovery of 35 SNPs. Using this method, it was possible to yield a sufficient number of loci to screen for SNP markers without the availability of prior sequence information.

  3. 41. DISCOVERY, SEARCH, AND COMMUNICATION OF TEXTUAL KNOWLEDGE RESOURCES IN DISTRIBUTED SYSTEMS a. Discovering and Utilizing Knowledge Sources for Metasearch Knowledge Systems

    Zamora, Antonio

    2008-03-18

    Advanced Natural Language Processing Tools for Web Information Retrieval, Content Analysis, and Synthesis. The goal of this SBIR was to implement and evaluate several advanced Natural Language Processing (NLP) tools and techniques to enhance the precision and relevance of search results by analyzing and augmenting search queries and by helping to organize the search output obtained from heterogeneous databases and web pages containing textual information of interest to DOE and the scientific-technical user communities in general. The SBIR investigated 1) the incorporation of spelling checkers in search applications, 2) identification of significant phrases and concepts using a combination of linguistic and statistical techniques, and 3) enhancement of the query interface and search retrieval results through the use of semantic resources, such as thesauri. A search program with a flexible query interface was developed to search reference databases with the objective of enhancing search results from web queries or queries of specialized search systems such as DOE's Information Bridge. The DOE ETDE/INIS Joint Thesaurus was processed to create a searchable database. Term frequencies and term co-occurrences were used to enhance the web information retrieval by providing algorithmically-derived objective criteria to organize relevant documents into clusters containing significant terms. A thesaurus provides an authoritative overview and classification of a field of knowledge. By organizing the results of a search using the thesaurus terminology, the output is more meaningful than when the results are just organized based on the terms that co-occur in the retrieved documents, some of which may not be significant. An attempt was made to take advantage of the hierarchy provided by broader and narrower terms, as well as other field-specific information in the thesauri. The search program uses linguistic morphological routines to find relevant entries regardless of

  4. A comparison of methods for gravitational wave burst searches from LIGO and Virgo

    Beauville, F; Buskulic, D; Grosjean, D; Bizouard, M-A; Cavalier, F; Clapson, A-C; Hello, P; Blackburn, L; Katsavounidis, E; Bosi, L; Brocco, L; Brown, D A; Chatterji, S; Christensen, N; Knight, M; Fairhurst, S; Guidi, G; Heng, S; Hewitson, M; Klimenko, S

    2008-01-01

    The search procedure for burst gravitational waves has been studied using 24 h of simulated data in a network of three interferometers (Hanford 4 km, Livingston 4 km and Virgo 3 km are the example interferometers). Several methods to detect burst events developed in the LIGO Scientific Collaboration (LSC) and Virgo Collaboration have been studied and compared. We have performed coincidence analysis of the triggers obtained in the different interferometers with and without simulated signals added to the data. The benefits of having multiple interferometers of similar sensitivity are demonstrated by comparing the detection performance of the joint coincidence analysis with LSC and Virgo only burst searches. Adding Virgo to the LIGO detector network can increase by 50% the detection efficiency for this search. Another advantage of a joint LIGO-Virgo network is the ability to reconstruct the source sky position. The reconstruction accuracy depends on the timing measurement accuracy of the events in each interferometer, and is displayed in this paper with a fixed source position example

  5. A comparison of methods for gravitational wave burst searches from LIGO and Virgo

    Beauville, F; Buskulic, D; Grosjean, D [Laboratoire d' Annecy-le-Vieux de Physique des Particules, Chemin de Bellevue, BP 110, 74941 Annecy-le-Vieux Cedex (France); Bizouard, M-A; Cavalier, F; Clapson, A-C; Hello, P [Laboratoire de l' Accelerateur Lineaire, IN2P3/CNRS-Universite de Paris XI, BP 34, 91898 Orsay Cedex (France); Blackburn, L; Katsavounidis, E [LIGO-Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Bosi, L [INFN Sezione di Perugia and/or Universita di Perugia, Via A Pascoli, I-06123 Perugia (Italy); Brocco, L [INFN Sezione di Roma and/or Universita ' La Sapienza' , P le A Moro 2, I-00185 Roma (Italy); Brown, D A; Chatterji, S [LIGO-California Institute of Technology, Pasadena, CA 91125 (United States); Christensen, N; Knight, M [Carleton College, Northfield, MN 55057 (United States); Fairhurst, S [University of Wisconsin-Milwaukee, Milwaukee, WI 53201 (United States); Guidi, G [INFN Sezione Firenze/Urbino Via G Sansone 1, I-50019 Sesto Fiorentino (Italy); and/or Universita di Firenze, Largo E Fermi 2, I-50125 Firenze and/or Universita di Urbino, Via S Chiara 27, I-61029 Urbino (Italy); Heng, S; Hewitson, M [University of Glasgow, Glasgow, G12 8QQ (United Kingdom); Klimenko, S [University of Florida-Gainesville, FL 32611 (United States)] (and others)

    2008-02-21

    The search procedure for burst gravitational waves has been studied using 24 h of simulated data in a network of three interferometers (Hanford 4 km, Livingston 4 km and Virgo 3 km are the example interferometers). Several methods to detect burst events developed in the LIGO Scientific Collaboration (LSC) and Virgo Collaboration have been studied and compared. We have performed coincidence analysis of the triggers obtained in the different interferometers with and without simulated signals added to the data. The benefits of having multiple interferometers of similar sensitivity are demonstrated by comparing the detection performance of the joint coincidence analysis with LSC and Virgo only burst searches. Adding Virgo to the LIGO detector network can increase by 50% the detection efficiency for this search. Another advantage of a joint LIGO-Virgo network is the ability to reconstruct the source sky position. The reconstruction accuracy depends on the timing measurement accuracy of the events in each interferometer, and is displayed in this paper with a fixed source position example.

  6. How Users Search the Library from a Single Search Box

    Lown, Cory; Sierra, Tito; Boyer, Josh

    2013-01-01

    Academic libraries are turning increasingly to unified search solutions to simplify search and discovery of library resources. Unfortunately, very little research has been published on library user search behavior in single search box environments. This study examines how users search a large public university library using a prominent, single…

  7. Phase boundary estimation in electrical impedance tomography using the Hooke and Jeeves pattern search method

    Khambampati, Anil Kumar; Kim, Kyung Youn; Ijaz, Umer Zeeshan; Lee, Jeong Seong; Kim, Sin

    2010-01-01

    In industrial processes, monitoring of heterogeneous phases is crucial to the safety and operation of the engineering structures. Particularly, the visualization of voids and air bubbles is advantageous. As a result, many studies have appeared in the literature that offer varying degrees of functionality. Electrical impedance tomography (EIT) has already been proved to be a hallmark for process monitoring: it offers not only the visualization of the resistivity profile for a given flow mixture but is also used for the detection of phase boundaries. Iterative image reconstruction algorithms, such as the modified Newton–Raphson (mNR) method, are commonly used as inverse solvers. However, their utility is problematic in the sense that they require an initial solution in close proximity to the ground truth. Furthermore, they also rely on the gradient information of the objective function to be minimized. Therefore, in this paper, we address all these issues by employing a direct search algorithm, namely the Hooke and Jeeves pattern search method, which directly minimizes the cost function and does not require gradient information, to estimate the phase boundaries. It is assumed that the resistivity profile is known a priori, and therefore the unknown information is the size and location of the object. The boundary coefficients are parameterized using truncated Fourier series and are estimated using the relationship between the measured voltages and injected currents. Through extensive simulations and experimental results, and by comparison with mNR, we show that the Hooke and Jeeves pattern search method offers a promising prospect for process monitoring.
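    The exploratory-move/pattern-move structure of the Hooke and Jeeves method is easy to show on a toy objective. The sketch below uses a simple quadratic stand-in rather than the EIT forward model and Fourier boundary parametrisation, and the step, shrink and tolerance values are arbitrary choices.

        import numpy as np

        def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=1000):
            x = np.asarray(x0, dtype=float)
            fx = f(x)
            for _ in range(max_iter):
                # Exploratory move: probe each coordinate in both directions.
                base, fbase = x.copy(), fx
                for i in range(len(x)):
                    for d in (step, -step):
                        trial = base.copy()
                        trial[i] += d
                        ft = f(trial)
                        if ft < fbase:
                            base, fbase = trial, ft
                            break
                if fbase < fx:
                    # Pattern move: jump further along the successful direction.
                    pattern = base + (base - x)
                    fp = f(pattern)
                    x, fx = (pattern, fp) if fp < fbase else (base, fbase)
                else:
                    step *= shrink            # no improvement: refine the mesh
                    if step < tol:
                        break
            return x, fx

        f = lambda p: (p[0] - 1.3) ** 2 + 4.0 * (p[1] + 0.7) ** 2
        print(hooke_jeeves(f, [0.0, 0.0]))    # approaches (1.3, -0.7)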

  8. 29 CFR 2700.56 - Discovery; general.

    2010-07-01

    ...(c) or 111 of the Act has been filed. 30 U.S.C. 815(c) and 821. (e) Completion of discovery... 29 Labor 9 2010-07-01 2010-07-01 false Discovery; general. 2700.56 Section 2700.56 Labor... Hearings § 2700.56 Discovery; general. (a) Discovery methods. Parties may obtain discovery by one or more...

  9. 19 CFR 207.109 - Discovery.

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Discovery. 207.109 Section 207.109 Customs Duties... and Committee Proceedings § 207.109 Discovery. (a) Discovery methods. All parties may obtain discovery under such terms and limitations as the administrative law judge may order. Discovery may be by one or...

  10. All roads lead to Rome - New search methods for the optimal triangulation problem

    Ottosen, T. J.; Vomlel, Jiří

    2012-01-01

    Roč. 53, č. 9 (2012), s. 1350-1366 ISSN 0888-613X R&D Projects: GA MŠk 1M0572; GA ČR GEICC/08/E010; GA ČR GA201/09/1891 Grant - others:GA MŠk(CZ) 2C06019 Institutional support: RVO:67985556 Keywords : Bayesian networks * Optimal triangulation * Probabilistic inference * Cliques in a graph Subject RIV: BD - Theory of Information Impact factor: 1.729, year: 2012 http://library.utia.cas.cz/separaty/2012/MTR/vomlel-all roads lead to rome - new search methods for the optimal triangulation problem.pdf

  11. Hybrid Genetic Algorithm - Local Search Method for Ground-Water Management

    Chiu, Y.; Nishikawa, T.; Martin, P.

    2008-12-01

    Ground-water management problems commonly are formulated as a mixed-integer, non-linear programming problem (MINLP). Relying only on conventional gradient-search methods to solve the management problem is computationally fast; however, the methods may become trapped in a local optimum. Global-optimization schemes can identify the global optimum, but the convergence is very slow when the optimal solution approaches the global optimum. In this study, we developed a hybrid optimization scheme, which includes a genetic algorithm and a gradient-search method, to solve the MINLP. The genetic algorithm identifies a near-optimal solution, and the gradient search uses the near-optimum to identify the global optimum. Our methodology is applied to a conjunctive-use project in the Warren ground-water basin, California. Hi-Desert Water District (HDWD), the primary water-manager in the basin, plans to construct a wastewater treatment plant to reduce future septic-tank effluent from reaching the ground-water system. The treated wastewater instead will recharge the ground-water basin via percolation ponds as part of a larger conjunctive-use strategy, subject to State regulations (e.g. minimum distances and travel times). HDWD wishes to identify the least-cost conjunctive-use strategies that control ground-water levels, meet regulations, and identify new production-well locations. As formulated, the MINLP objective is to minimize water-delivery costs subject to constraints including pump capacities, available recharge water, water-supply demand, water-level constraints, and potential new-well locations. The methodology was demonstrated by an enumerative search of the entire feasible solution space and by comparing the optimum solution with results from the branch-and-bound algorithm. The results also indicate that the hybrid method identifies the global optimum within an affordable computation time. Sensitivity analyses, which include testing different recharge-rate scenarios, pond
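
    As a hedged illustration of the hybrid idea described above, and not of HDWD's actual management model, the sketch below runs a small real-coded genetic algorithm to locate a near-optimal point and then polishes it with a gradient-based local search; the objective function, bounds, and all GA settings are placeholders.

    import numpy as np
    from scipy.optimize import minimize

    def simple_ga(f, bounds, pop_size=40, generations=60, mut_sigma=0.05, seed=0):
        """Tiny real-coded GA: tournament selection, blend crossover, Gaussian mutation."""
        rng = np.random.default_rng(seed)
        lo = np.array([b[0] for b in bounds])
        hi = np.array([b[1] for b in bounds])
        pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
        fit = np.apply_along_axis(f, 1, pop)
        for _ in range(generations):
            children = []
            for _ in range(pop_size):
                i, j, k, l = rng.integers(0, pop_size, 4)       # two binary tournaments
                p1 = pop[i] if fit[i] < fit[j] else pop[j]
                p2 = pop[k] if fit[k] < fit[l] else pop[l]
                alpha = rng.uniform(0, 1, len(bounds))          # blend crossover
                child = alpha * p1 + (1 - alpha) * p2
                child += rng.normal(0, mut_sigma, len(bounds)) * (hi - lo)  # Gaussian mutation
                children.append(np.clip(child, lo, hi))
            pop = np.array(children)
            fit = np.apply_along_axis(f, 1, pop)
        return pop[np.argmin(fit)]

    # Placeholder cost standing in for water-delivery cost plus constraint penalties.
    cost = lambda x: (x[0] - 3.0) ** 2 + (x[1] - 1.0) ** 2 + 5.0 * np.sin(x[0]) ** 2
    bounds = [(0.0, 10.0), (0.0, 10.0)]

    x_ga = simple_ga(cost, bounds)                                 # global stage: near-optimum
    res = minimize(cost, x_ga, method="L-BFGS-B", bounds=bounds)   # local gradient polish
    print(x_ga, res.x, res.fun)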

  12. PMSVM: An Optimized Support Vector Machine Classification Algorithm Based on PCA and Multilevel Grid Search Methods

    Yukai Yao

    2015-01-01

    Full Text Available We propose an optimized Support Vector Machine classifier, named PMSVM, in which System Normalization, PCA, and Multilevel Grid Search methods are comprehensively considered for data preprocessing and parameter optimization, respectively. The main goals of this study are to improve the classification efficiency and accuracy of SVM. Sensitivity, Specificity, Precision, the ROC curve, and so forth are adopted to appraise the performance of PMSVM. Experimental results show that PMSVM has relatively better accuracy and remarkably higher efficiency compared with traditional SVM algorithms.
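
    A minimal scikit-learn sketch of the general recipe (normalization, PCA, then a coarse grid search refined by a finer grid around the best point) is given below; it is not the authors' PMSVM code, and the dataset, number of components, and grid ranges are illustrative assumptions.

    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.decomposition import PCA
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)           # placeholder dataset
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    pipe = Pipeline([
        ("scale", StandardScaler()),                      # system normalization
        ("pca", PCA(n_components=10)),                    # dimensionality reduction
        ("svm", SVC(kernel="rbf")),
    ])

    # Level 1: coarse logarithmic grid over C and gamma.
    coarse = {"svm__C": 10.0 ** np.arange(-2, 4), "svm__gamma": 10.0 ** np.arange(-4, 1)}
    search = GridSearchCV(pipe, coarse, cv=5).fit(X_tr, y_tr)

    # Level 2: finer grid centred on the best coarse point.
    C0, g0 = search.best_params_["svm__C"], search.best_params_["svm__gamma"]
    fine = {"svm__C": C0 * 2.0 ** np.arange(-2, 3), "svm__gamma": g0 * 2.0 ** np.arange(-2, 3)}
    search = GridSearchCV(pipe, fine, cv=5).fit(X_tr, y_tr)

    print(search.best_params_, search.score(X_te, y_te))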

  13. A novel optimization method, Gravitational Search Algorithm (GSA), for PWR core optimization

    Mahmoudi, S.M.; Aghaie, M.; Bahonar, M.; Poursalehi, N.

    2016-01-01

    Highlights: • The Gravitational Search Algorithm (GSA) is introduced. • The advantage of GSA is verified on Shekel’s Foxholes. • Reload optimizations for WWER-1000 and WWER-440 cases are performed. • Maximizing K eff, minimizing PPFs, and flattening the power density are considered. - Abstract: In-core fuel management optimization (ICFMO) is one of the most challenging concepts of nuclear engineering. In recent decades several meta-heuristic algorithms or computational intelligence methods have been applied to optimize the reactor core loading pattern. This paper presents a new method of using the Gravitational Search Algorithm (GSA) for in-core fuel management optimization. The GSA is constructed based on the law of gravity and the notion of mass interactions. It uses the theory of Newtonian physics, and its searcher agents are a collection of masses. In this work, in the first step, the GSA method is compared with other meta-heuristic algorithms on Shekel’s Foxholes problem. In the second step, to find the best core, the GSA algorithm is applied to three PWR test cases including WWER-1000 and WWER-440 reactors. In these cases, multi-objective optimizations with the following goals are considered: increasing the multiplication factor (K eff), decreasing the power peaking factor (PPF), and flattening the power density. It is notable that for the neutronic calculations, the PARCS (Purdue Advanced Reactor Core Simulator) code is used. The results demonstrate that the GSA algorithm has promising performance and could be proposed for other optimization problems in the nuclear engineering field.
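
    The sketch below is a generic, minimal Gravitational Search Algorithm applied to a toy two-dimensional multimodal function; it is not the reactor loading-pattern code described above, and the population size, iteration count, gravitational-constant schedule (G0, alpha), and test function are all illustrative assumptions.

    import numpy as np

    def gsa(f, bounds, n_agents=30, iters=200, G0=100.0, alpha=20.0, seed=0):
        """Generic Gravitational Search Algorithm (minimization)."""
        rng = np.random.default_rng(seed)
        lo = np.array([b[0] for b in bounds])
        hi = np.array([b[1] for b in bounds])
        dim = len(bounds)
        X = rng.uniform(lo, hi, (n_agents, dim))
        V = np.zeros_like(X)
        best_x, best_f = None, np.inf
        for t in range(iters):
            fit = np.apply_along_axis(f, 1, X)
            if fit.min() < best_f:
                best_f, best_x = fit.min(), X[fit.argmin()].copy()
            # Masses: better (lower) fitness gives a larger normalized mass.
            worst, best = fit.max(), fit.min()
            m = (worst - fit) / (worst - best + 1e-12)
            M = m / (m.sum() + 1e-12)
            G = G0 * np.exp(-alpha * t / iters)           # decaying gravitational constant
            # Acceleration on each agent from the (randomly weighted) pull of every other agent.
            acc = np.zeros_like(X)
            for i in range(n_agents):
                for j in range(n_agents):
                    if i == j:
                        continue
                    diff = X[j] - X[i]
                    dist = np.linalg.norm(diff) + 1e-12
                    acc[i] += rng.uniform() * G * M[j] * diff / dist
            V = rng.uniform(size=X.shape) * V + acc       # velocity update
            X = np.clip(X + V, lo, hi)                    # position update
        return best_x, best_f

    # Toy multimodal test function (placeholder; not Shekel's Foxholes).
    toy = lambda x: (x[0] ** 2 + x[1] ** 2) / 4000 - np.cos(x[0]) * np.cos(x[1] / np.sqrt(2)) + 1
    print(gsa(toy, [(-10, 10), (-10, 10)]))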

  14. Comparing the Precision of Information Retrieval of MeSH-Controlled Vocabulary Search Method and a Visual Method in the Medline Medical Database.

    Hariri, Nadjla; Ravandi, Somayyeh Nadi

    2014-01-01

    Medline is one of the most important databases in the biomedical field. One of the most important hosts for Medline is Elton B. Stephens CO. (EBSCO), which has presented different search methods that can be used based on the needs of the users. Visual search and MeSH-controlled search methods are among the most common methods. The goal of this research was to compare the precision of the retrieved sources in the EBSCO Medline base using MeSH-controlled and visual search methods. This research was a semi-empirical study. By holding training workshops, 70 students of higher education in different educational departments of Kashan University of Medical Sciences were taught MeSH-Controlled and visual search methods in 2012. Then, the precision of 300 searches made by these students was calculated based on Best Precision, Useful Precision, and Objective Precision formulas and analyzed in SPSS software using the independent sample T Test, and three precisions obtained with the three precision formulas were studied for the two search methods. The mean precision of the visual method was greater than that of the MeSH-Controlled search for all three types of precision, i.e. Best Precision, Useful Precision, and Objective Precision, and their mean precisions were significantly different (P searches. Fifty-three percent of the participants in the research also mentioned that the use of the combination of the two methods produced better results. For users, it is more appropriate to use a natural, language-based method, such as the visual method, in the EBSCO Medline host than to use the controlled method, which requires users to use special keywords. The potential reason for their preference was that the visual method allowed them more freedom of action.

  15. A dynamic lattice searching method with rotation operation for optimization of large clusters

    Wu Xia; Cai Wensheng; Shao Xueguang

    2009-01-01

    Global optimization of large clusters has been a difficult task, though much effort has been devoted to it and many efficient methods have been proposed. In our work, a rotation operation (RO) was designed to realize the structural transformation from decahedra to icosahedra for the optimization of large clusters, by rotating the atoms below the center atom through a definite angle around the fivefold axis. Based on the RO, a development of the previous dynamic lattice searching with constructed core (DLSc), named DLSc-RO, is presented. With an investigation of the method for the optimization of Lennard-Jones (LJ) clusters, i.e., LJ 500, LJ 561, LJ 600, LJ 665-667, LJ 670, LJ 685, and LJ 923, Morse clusters, silver clusters by the Gupta potential, and aluminum clusters by the NP-B potential, it was found that global minima with both icosahedral and decahedral motifs can be obtained, and the method is proved to be efficient and universal.

  16. MRS algorithm: a new method for searching myocardial region in SPECT myocardial perfusion images.

    He, Yuan-Lie; Tian, Lian-Fang; Chen, Ping; Li, Bin; Mao, Zhong-Yuan

    2005-10-01

    First, the necessity of automatically segmenting the myocardium from myocardial SPECT images is discussed in Section 1. To eliminate the influence of the background, the optimal threshold segmentation method modified for the MRS algorithm is explained in Section 2. Then, the image erosion structure is applied to identify the myocardium region and the liver region. The contour tracing method is introduced to extract the myocardial contour. To locate the centroid of the myocardium, the myocardial centroid searching method is developed. The protocol of the MRS algorithm is summarized in Section 6. The performance of the MRS algorithm is investigated and the conclusion is drawn in Section 7. Finally, the importance of the MRS algorithm and the improvement of the MRS algorithm are discussed.

  17. An R-peak detection method that uses an SVD filter and a search back system.

    Jung, Woo-Hyuk; Lee, Sang-Goog

    2012-12-01

    In this paper, we present a method for detecting the R-peak of an ECG signal by using a singular value decomposition (SVD) filter and a search back system. The ECG signal was processed in two phases: the pre-processing phase and the decision phase. The pre-processing phase consisted of the stages for the SVD filter, Butterworth High Pass Filter (HPF), moving average (MA), and squaring, whereas the decision phase consisted of a single stage that detected the R-peak. In the pre-processing phase, the SVD filter removed noise while the Butterworth HPF eliminated baseline wander. The MA removed the remaining noise of the signal that had gone through the SVD filter to make the signal smooth, and squaring played a role in strengthening the signal. In the decision phase, the threshold was used to set the interval before detecting the R-peak. When the latest R-R interval (RRI), suggested by Hamilton et al., was greater than 150% of the previous RRI, the method of detecting the R-peak in such an interval was modified to be 150% or greater of the smallest of the two most recent RRIs. When the modified search back system was used, the error rate of the peak detection decreased to 0.29%, compared to 1.34% when the modified search back system was not used. Consequently, the sensitivity was 99.47%, the positive predictivity was 99.47%, and the detection error was 1.05%. Furthermore, the quality of the signal in data with a substantial amount of noise was improved, and thus, the R-peak was detected effectively. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
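
    The pre-processing chain described above (high-pass filtering, moving average, squaring, thresholding) can be sketched with SciPy as below; this is only a rough illustration on a synthetic signal, not the authors' method: the SVD denoising stage and the search back rule are omitted, and the sampling rate, cutoff frequency, window length, and thresholds are placeholder assumptions.

    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks

    fs = 360.0                                            # sampling rate in Hz (assumed)
    t = np.arange(0, 10, 1 / fs)
    # Synthetic "ECG": periodic spikes plus baseline wander and noise (placeholder data).
    ecg = ((t % 0.8) < 0.02).astype(float) + 0.3 * np.sin(2 * np.pi * 0.3 * t) + 0.05 * np.random.randn(t.size)

    # High-pass Butterworth filter removes baseline wander (SVD denoising not shown).
    b, a = butter(2, 0.5 / (fs / 2), btype="highpass")
    x = filtfilt(b, a, ecg)

    # Moving average smooths residual noise; squaring emphasizes large deflections.
    win = int(0.05 * fs)
    x = np.convolve(x, np.ones(win) / win, mode="same")
    x = x ** 2

    # A fixed threshold plus a refractory period stands in for the decision phase.
    peaks, _ = find_peaks(x, height=0.3 * x.max(), distance=int(0.25 * fs))
    rri = np.diff(peaks) / fs                             # R-R intervals in seconds
    print(len(peaks), rri[:5])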

  18. A novel bioinformatics method for efficient knowledge discovery by BLSOM from big genomic sequence data.

    Bai, Yu; Iwasaki, Yuki; Kanaya, Shigehiko; Zhao, Yue; Ikemura, Toshimichi

    2014-01-01

    With remarkable increase of genomic sequence data of a wide range of species, novel tools are needed for comprehensive analyses of the big sequence data. Self-Organizing Map (SOM) is an effective tool for clustering and visualizing high-dimensional data such as oligonucleotide composition on one map. By modifying the conventional SOM, we have previously developed Batch-Learning SOM (BLSOM), which allows classification of sequence fragments according to species, solely depending on the oligonucleotide composition. In the present study, we introduce the oligonucleotide BLSOM used for characterization of vertebrate genome sequences. We first analyzed pentanucleotide compositions in 100 kb sequences derived from a wide range of vertebrate genomes and then the compositions in the human and mouse genomes in order to investigate an efficient method for detecting differences between the closely related genomes. BLSOM can recognize the species-specific key combination of oligonucleotide frequencies in each genome, which is called a "genome signature," and the specific regions specifically enriched in transcription-factor-binding sequences. Because the classification and visualization power is very high, BLSOM is an efficient powerful tool for extracting a wide range of information from massive amounts of genomic sequences (i.e., big sequence data).
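
    BLSOM itself is specialized software, so the sketch below only illustrates the kind of input vector it clusters: a pentanucleotide (k = 5) composition vector computed for a sequence fragment. The fragment length and example sequence are assumptions; real BLSOM inputs are, e.g., 100 kb genomic windows.

    from itertools import product
    import numpy as np

    def kmer_composition(seq, k=5):
        """Frequency vector of all 4**k oligonucleotides in a DNA fragment."""
        kmers = ["".join(p) for p in product("ACGT", repeat=k)]
        index = {km: i for i, km in enumerate(kmers)}
        counts = np.zeros(len(kmers))
        seq = seq.upper()
        for i in range(len(seq) - k + 1):
            j = index.get(seq[i:i + k])
            if j is not None:                             # skip windows containing N or other symbols
                counts[j] += 1
        total = counts.sum()
        return counts / total if total else counts

    # Toy 100-base fragment (real BLSOM inputs are much longer windows).
    frag = "ACGT" * 25
    vec = kmer_composition(frag, k=5)
    print(vec.shape, vec.sum())                           # (1024,), sums to ~1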

  19. Final Report Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    O'Leary, Patrick [Kitware, Inc., Clifton Park, NY (United States)

    2017-09-13

    The primary challenge motivating this project is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who can perform analysis only on a small fraction of the data they calculate, resulting in the substantial likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing on data while it is still resident in memory, which is known as in situ processing. The idea of in situ processing was not new at the time of the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community that aimed to foster production-quality software tools suitable for use by Department of Energy (DOE) science projects. Our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE high-performance computing (HPC) facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve this objective, we engaged in software technology research and development (R&D), in close partnerships with DOE science code teams, to produce software technologies that were shown to run efficiently at scale on DOE HPC platforms.

  20. Hooke–Jeeves Method-used Local Search in a Hybrid Global Optimization Algorithm

    V. D. Sulimov

    2014-01-01

    Full Text Available Modern methods for the optimization-based investigation of complex systems rely on developing and updating mathematical models of the systems by solving the appropriate inverse problems. The input data required for the solution are obtained from the analysis of experimentally determined characteristics of a system or a process. The sought causal characteristics include the equation coefficients of the mathematical models of the object, the boundary conditions, etc. The optimization approach is one of the main ways to solve such inverse problems. In the general case it is necessary to find a global extremum of a criterion function that is not everywhere differentiable. Global optimization methods are widely used in problems of identification and computational diagnosis as well as in optimal control, computed tomography, image restoration, training of neural networks, and other intelligent technologies. The increasingly complicated systems to be optimized, observed during the last decades, lead to more complicated mathematical models, thereby making the solution of the corresponding extreme problems significantly more difficult. In many practical applications the problem conditions can restrict modeling. As a consequence, in inverse problems the criterion functions can be noisy and not everywhere differentiable. The presence of noise means that calculating derivatives is difficult and unreliable, which motivates the use of optimization methods that do not require derivatives. The efficiency of deterministic global optimization algorithms is significantly restricted by their dependence on the dimension of the extreme problem. When the number of variables is large, stochastic global optimization algorithms are used. As stochastic algorithms yield expensive solutions, this drawback restricts their applications. Developing hybrid algorithms that combine a stochastic algorithm for scanning the variable space with deterministic local search

  1. Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    Bethel, Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-07-24

    The primary challenge motivating this team’s work is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who are able to perform analysis only on a small fraction of the data they compute, resulting in the very real likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing on data while it is still resident in memory, an approach that is known as in situ processing. The idea of in situ processing was not new at the time of the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community that aimed to foster production-quality software tools suitable for use by DOE science projects. In large part, our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE HPC facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve that objective, we assembled a unique team of researchers consisting of representatives from DOE national laboratories, academia, and industry, engaged in software technology R&D, and worked in close partnerships with DOE science code teams to produce software technologies that were shown to run effectively at scale on DOE HPC platforms.

  2. The Pan-STARRS1 Proper-motion Survey for Young Brown Dwarfs in Nearby Star-forming Regions. I. Taurus Discoveries and a Reddening-free Classification Method for Ultracool Dwarfs

    Zhang, Zhoujian; Liu, Michael C.; Best, William M. J.; Magnier, Eugene A.; Aller, Kimberly M.; Chambers, K. C.; Draper, P. W.; Flewelling, H.; Hodapp, K. W.; Kaiser, N.; Kudritzki, R.-P.; Metcalfe, N.; Wainscoat, R. J.; Waters, C.

    2018-05-01

    We are conducting a proper-motion survey for young brown dwarfs in the Taurus-Auriga molecular cloud based on the Pan-STARRS1 3π Survey. Our search uses multi-band photometry and astrometry to select candidates, and is wider (370 deg2) and deeper (down to ≈3 M Jup) than previous searches. We present here our search methods and spectroscopic follow-up of our high-priority candidates. Since extinction complicates spectral classification, we have developed a new approach using low-resolution (R ≈ 100) near-infrared spectra to quantify reddening-free spectral types, extinctions, and gravity classifications for mid-M to late-L ultracool dwarfs (≲100–3 M Jup in Taurus). We have discovered 25 low-gravity (VL-G) and the first 11 intermediate-gravity (INT-G) substellar (M6–L1) members of Taurus, constituting the largest single increase of Taurus brown dwarfs to date. We have also discovered 1 new Pleiades member and 13 new members of the Perseus OB2 association, including a candidate very wide separation (58 kau) binary. We homogeneously reclassify the spectral types and extinctions of all previously known Taurus brown dwarfs. Altogether our discoveries have thus far increased the substellar census in Taurus by ≈40% and added three more L-type members (≲5–10 M Jup). Most notably, our discoveries reveal an older (>10 Myr) low-mass population in Taurus, in accord with recent studies of the higher-mass stellar members. The mass function appears to differ between the younger and older Taurus populations, possibly due to incompleteness of the older stellar members or different star formation processes.

  3. Methods to filter out spurious disturbances in continuous-wave searches from gravitational-wave detectors

    Leaci, Paola

    2015-01-01

    Semicoherent all-sky searches over year-long observation times for continuous gravitational wave signals produce many thousands of potential periodic source candidates. Efficient methods able to discard false candidate events are crucial in order to concentrate effort on a computationally intensive follow-up analysis of the remaining, most promising candidates (Shaltev et al 2014 Phys. Rev. D 89 124030). In this paper we present a set of techniques able to fulfill such requirements, identifying and eliminating false candidate events and thus reducing the bulk of the candidate sets that need to be further investigated. Some of these techniques were also used to streamline the candidate sets returned by the Einstein@Home hierarchical searches presented in (Aasi J et al (The LIGO Scientific Collaboration and the Virgo Collaboration) 2013 Phys. Rev. D 87 042001). These powerful methods and the benefits originating from their application to both simulated data and detector data from the fifth LIGO science run are illustrated and discussed. (paper)

  4. Reporting Quality of Search Methods in Systematic Reviews of HIV Behavioral Interventions (2000–2010): Are the Searches Clearly Explained, Systematic and Reproducible?

    Mullins, Mary M.; DeLuca, Julia B.; Crepaz, Nicole; Lyles, Cynthia M.

    2018-01-01

    Systematic reviews are an essential tool for researchers, prevention providers and policy makers who want to remain current with the evidence in the field. Systematic reviews must adhere to strict standards, as the results can provide a more objective appraisal of evidence for making scientific decisions than traditional narrative reviews. An integral component of a systematic review is the development and execution of a comprehensive systematic search to collect available and relevant information. A number of reporting guidelines have been developed to ensure quality publications of systematic reviews. These guidelines provide the essential elements to include in the review process and report in the final publication for complete transparency. We identified the common elements of reporting guidelines and examined the reporting quality of search methods in the HIV behavioral intervention literature. Consistent with the findings from previous evaluations of the reporting of search methods in systematic reviews in other fields, our review shows a lack of full and transparent reporting within systematic reviews even though a plethora of guidelines exist. This review underscores the need for promoting the completeness of and adherence to transparent systematic search reporting within systematic reviews. PMID:26052651

  5. LITERATURE SEARCH FOR METHODS FOR HAZARD ANALYSES OF AIR CARRIER OPERATIONS.

    Martinez-Guridi, G.; Samanta, P.

    2002-07-01

    Representatives of the Federal Aviation Administration (FAA) and several air carriers under Title 14 of the Code of Federal Regulations (CFR) Part 121 developed a system-engineering model of the functions of air-carrier operations. Their analyses form the foundation or basic architecture upon which other task areas are based: hazard analyses, performance measures, and risk indicator design. To carry out these other tasks, models may need to be developed using the basic architecture of the Air Carrier Operations System Model (ACOSM). Since ACOSM encompasses various areas of air-carrier operations and can be used to address different task areas with differing but interrelated objectives, the modeling needs are broad. A literature search was conducted to identify and analyze the existing models that may be applicable for pursuing the task areas in ACOSM. The intent of the literature search was not necessarily to identify a specific model that can be directly used, but rather to identify relevant ones that have similarities with the processes and activities defined within ACOSM. Such models may provide useful inputs and insights in structuring ACOSM models. ACOSM simulates processes and activities in air-carrier operation, but, in a general framework, it has similarities with other industries where attention also has been paid to hazard analyses, emphasizing risk management, and in designing risk indicators. To assure that efforts in other industries are adequately considered, the literature search includes publications from other industries, e.g., chemical, nuclear, and process industries. This report discusses the literature search, the relevant methods identified and provides a preliminary assessment of their use in developing the models needed for the ACOSM task areas. A detailed assessment of the models has not been made. Defining those applicable for ACOSM will need further analyses of both the models and tools identified. The report is organized in four chapters

  6. Discovery of Novel Complex Metal Hydrides for Hydrogen Storage through Molecular Modeling and Combinatorial Methods

    Lesch, David A; Adriaan Sachtler, J.W. J.; Low, John J; Jensen, Craig M; Ozolins, Vidvuds; Siegel, Don; Harmon, Laurel

    2011-02-14

    UOP LLC, a Honeywell Company, Ford Motor Company, and Striatus, Inc., collaborated with Professor Craig Jensen of the University of Hawaii and Professor Vidvuds Ozolins of the University of California, Los Angeles on a multi-year cost-shared program to discover novel complex metal hydrides for hydrogen storage. This innovative program combined sophisticated molecular modeling with high throughput combinatorial experiments to maximize the probability of identifying commercially relevant, economical hydrogen storage materials with broad application. A set of tools was developed to pursue the medium throughput (MT) and high throughput (HT) combinatorial exploratory investigation of novel complex metal hydrides for hydrogen storage. The assay programs consisted of monitoring hydrogen evolution as a function of temperature. This project also incorporated theoretical methods to help select candidate material families for testing. The Virtual High Throughput Screening served as a virtual laboratory, calculating structures and their properties. First Principles calculations were applied to various systems to examine hydrogen storage reaction pathways and the associated thermodynamics. The experimental program began with the validation of the MT assay tool with NaAlH4/0.02 mole Ti, the state-of-the-art hydrogen storage system given by decomposition of sodium alanate to sodium hydride, aluminum metal, and hydrogen. Once certified, a combinatorial 21-point study of the NaAlH4-LiAlH4-Mg(AlH4)2 phase diagram was investigated with the MT assay. Stability proved to be a problem as many of the materials decomposed during synthesis, altering the expected assay results. This resulted in repeating the entire experiment with a mild milling approach, which only temporarily increased capacity. NaAlH4 was the best performer in both studies and no new mixed alanates were observed, a result consistent with the VHTS. Powder XRD suggested that the reverse reaction, the regeneration of the

  7. Beyond information retrieval: information discovery and multimedia information retrieval

    Roberto Raieli

    2017-01-01

    The paper compares the current methodologies for the search and discovery of information and information resources: terminological search and term-based language, characteristic of information retrieval (IR); semantic search and information discovery, being developed mainly through the language of linked data; and semiotic search and content-based language, as practiced in multimedia information retrieval (MIR). The MIR semiotic methodology is then detailed.

  8. Beyond Discovery

    Korsgaard, Steffen; Sassmannshausen, Sean Patrick

    2017-01-01

    In this chapter we explore four alternatives to the dominant discovery view of entrepreneurship: the development view, the construction view, the evolutionary view, and the Neo-Austrian view. We outline the main critique points of the discovery view presented in these four alternatives, as well

  9. Chemical Discovery

    Brown, Herbert C.

    1974-01-01

    The role of discovery in the advance of the science of chemistry and the factors that are currently operating to handicap that function are considered. Examples are drawn from the author's work with boranes. The thesis that exploratory research and discovery should be encouraged is stressed. (DT)

  10. Application of Combination High-Throughput Phenotypic Screening and Target Identification Methods for the Discovery of Natural Product-Based Combination Drugs.

    Isgut, Monica; Rao, Mukkavilli; Yang, Chunhua; Subrahmanyam, Vangala; Rida, Padmashree C G; Aneja, Ritu

    2018-03-01

    Modern drug discovery efforts have had mediocre success rates with increasing developmental costs, and this has encouraged pharmaceutical scientists to seek innovative approaches. Recently with the rise of the fields of systems biology and metabolomics, network pharmacology (NP) has begun to emerge as a new paradigm in drug discovery, with a focus on multiple targets and drug combinations for treating disease. Studies on the benefits of drug combinations lay the groundwork for a renewed focus on natural products in drug discovery. Natural products consist of a multitude of constituents that can act on a variety of targets in the body to induce pharmacodynamic responses that may together culminate in an additive or synergistic therapeutic effect. Although natural products cannot be patented, they can be used as starting points in the discovery of potent combination therapeutics. The optimal mix of bioactive ingredients in natural products can be determined via phenotypic screening. The targets and molecular mechanisms of action of these active ingredients can then be determined using chemical proteomics, and by implementing a reverse pharmacokinetics approach. This review article provides evidence supporting the potential benefits of natural product-based combination drugs, and summarizes drug discovery methods that can be applied to this class of drugs. © 2017 Wiley Periodicals, Inc.

  11. Search methods that people use to find owners of lost pets.

    Lord, Linda K; Wittum, Thomas E; Ferketich, Amy K; Funk, Julie A; Rajala-Schultz, Päivi J

    2007-06-15

    To characterize the process by which people who find lost pets search for the owners. Cross-sectional study. Sample Population: 188 individuals who found a lost pet in Dayton, Ohio, between March 1 and June 30, 2006. Procedures: Potential participants were identified as a result of contact with a local animal agency or placement of an advertisement in the local newspaper. A telephone survey was conducted to identify methods participants used to find the pets' owners. 156 of 188 (83%) individuals completed the survey. Fifty-nine of the 156 (38%) pets were reunited with their owners; median time to reunification was 2 days (range, 0.5 to 45 days). Only 1 (3%) cat owner was found, compared with 58 (46%) dog owners. Pet owners were found as a result of information provided by an animal agency (25%), placement of a newspaper advertisement (24%), walking the neighborhood (19%), signs in the neighborhood (15%), information on a pet tag (10%), and other methods (7%). Most finders (87%) considered it extremely important to find the owner, yet only 13 (8%) initially surrendered the found pet to an animal agency. The primary reason people did not surrender found pets was fear of euthanasia (57%). Only 97 (62%) individuals were aware they could run a found-pet advertisement in the newspaper at no charge, and only 1 person who was unaware of the no-charge policy placed an advertisement. Veterinarians and shelters can help educate people who find lost pets about methods to search for the pets' owners.

  12. Application of an automated natural language processing (NLP) workflow to enable federated search of external biomedical content in drug discovery and development.

    McEntire, Robin; Szalkowski, Debbie; Butler, James; Kuo, Michelle S; Chang, Meiping; Chang, Man; Freeman, Darren; McQuay, Sarah; Patel, Jagruti; McGlashen, Michael; Cornell, Wendy D; Xu, Jinghai James

    2016-05-01

    External content sources such as MEDLINE(®), National Institutes of Health (NIH) grants and conference websites provide access to the latest breaking biomedical information, which can inform pharmaceutical and biotechnology company pipeline decisions. The value of the sites for industry, however, is limited by the use of the public internet, the limited synonyms, the rarity of batch searching capability and the disconnected nature of the sites. Fortunately, many sites now offer their content for download and we have developed an automated internal workflow that uses text mining and tailored ontologies for programmatic search and knowledge extraction. We believe such an efficient and secure approach provides a competitive advantage to companies needing access to the latest information for a range of use cases and complements manually curated commercial sources. Copyright © 2016. Published by Elsevier Ltd.

  13. Exploring genomic dark matter: A critical assessment of the performance of homology search methods on noncoding RNA

    Freyhult, E.; Bollback, J. P.; Gardner, P. P.

    2006-01-01

    Homology search is one of the most ubiquitous bioinformatic tasks, yet it is unknown how effective the currently available tools are for identifying noncoding RNAs (ncRNAs). In this work, we use reliable ncRNA data sets to assess the effectiveness of methods such as BLAST, FASTA, HMMer, and Infernal. Surprisingly, the most popular homology search methods are often the least accurate. As a result, many studies have used inappropriate tools for their analyses. On the basis of our results, we suggest homology search strategies using the currently available tools and some directions for future...

  14. Optimal correction and design parameter search by modern methods of rigorous global optimization

    Makino, K.; Berz, M.

    2011-01-01

    Frequently the design of schemes for correction of aberrations or the determination of possible operating ranges for beamlines and cells in synchrotrons exhibits multitudes of possibilities for their correction, usually appearing in disconnected regions of parameter space which cannot be directly qualified by analytical means. In such cases, frequently an abundance of optimization runs is carried out, each of which determines a local minimum depending on the specific chosen initial conditions. Practical solutions are then obtained through an often extended interplay of experienced manual adjustment of certain suitable parameters and local searches by varying other parameters. However, in a formal sense this problem can be viewed as a global optimization problem, i.e. the determination of all solutions within a certain range of parameters that lead to a specific optimum. For example, it may be of interest to find all possible settings of multiple quadrupoles that can achieve imaging; or to find ahead of time all possible settings that achieve a particular tune; or to find all possible manners to adjust nonlinear parameters to achieve correction of high order aberrations. These tasks can easily be phrased in terms of such an optimization problem; but while mathematically this formulation is often straightforward, it has been a common belief that it is of limited practical value since the resulting optimization problem cannot usually be solved. However, recent significant advances in modern methods of rigorous global optimization make these methods feasible for optics design for the first time. The key ideas of the method lie in an interplay of rigorous local underestimators of the objective functions, and in using the underestimators to rigorously and iteratively eliminate regions that lie above already known upper bounds of the minima, in what is commonly known as a branch-and-bound approach. Recent enhancements of the Differential Algebraic methods used in particle
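
    The branch-and-bound interplay of lower bounds and known upper bounds described above can be illustrated with a deliberately crude one-dimensional sketch; real rigorous global optimizers use interval or Taylor-model underestimators, whereas the sampling-plus-slack bound, the assumed Lipschitz-style constant, the toy objective, and the tolerances below are all illustrative assumptions.

    import heapq

    def f(x):
        return (x - 1.0) ** 2 * (x + 2.0) ** 2 - 0.5 * x     # toy objective

    def f_lower_bound(a, b):
        """Crude lower bound of f on [a, b]: sampled minimum minus a Lipschitz-style slack.
        A rigorous method would use interval or Taylor-model arithmetic instead."""
        xs = [a + (b - a) * i / 10 for i in range(11)]
        L = 50.0                                              # assumed slope bound on the search box
        return min(f(x) for x in xs) - L * (b - a) / 10

    def branch_and_bound(a, b, tol=1e-4):
        best = min(f(a), f(b))                                # known upper bound on the minimum
        heap = [(f_lower_bound(a, b), a, b)]
        while heap:
            lb, a, b = heapq.heappop(heap)
            if lb > best or (b - a) < tol:                    # prune boxes that cannot beat the best value
                continue
            mid = 0.5 * (a + b)
            best = min(best, f(mid))                          # improve the upper bound
            for lo, hi in ((a, mid), (mid, b)):               # branch into two sub-boxes
                heapq.heappush(heap, (f_lower_bound(lo, hi), lo, hi))
        return best

    print(branch_and_bound(-5.0, 5.0))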

  15. A meta-heuristic method for solving scheduling problem: crow search algorithm

    Adhi, Antono; Santosa, Budi; Siswanto, Nurhadi

    2018-04-01

    Scheduling is one of the most important processes in industry, both in manufacturing and in services. The scheduling process is the process of selecting resources to perform operations on tasks. Resources can be machines, people, tasks, jobs or operations. The selection of the optimum sequence of jobs from a permutation is an essential issue in scheduling research. The optimum sequence constitutes the optimum solution of the scheduling problem. The scheduling problem becomes NP-hard once the number of jobs in the sequence exceeds what exact algorithms can handle. In order to obtain optimum results, a method is needed that is capable of solving complex scheduling problems in an acceptable time. Meta-heuristics are the methods usually used to solve scheduling problems. The recently published method called the Crow Search Algorithm (CSA) is adopted in this research to solve the scheduling problem. CSA is an evolutionary meta-heuristic method based on the behavior of flocks of crows. The calculation results of CSA for solving the scheduling problem are compared with other algorithms. From the comparison, it is found that CSA has better performance in terms of solution quality and computation time than the other algorithms.
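
    A generic, continuous-variable Crow Search Algorithm is sketched below to show the mechanics (memory of hiding places, awareness probability, flight length); it is not the authors' scheduling code, since a real job-scheduling application would need a permutation encoding, and the flock size, parameters, and toy cost are assumptions.

    import numpy as np

    def crow_search(f, bounds, n_crows=20, iters=300, fl=2.0, ap=0.1, seed=0):
        """Generic Crow Search Algorithm (minimization of f over box bounds)."""
        rng = np.random.default_rng(seed)
        lo = np.array([b[0] for b in bounds])
        hi = np.array([b[1] for b in bounds])
        X = rng.uniform(lo, hi, (n_crows, len(bounds)))   # current positions
        mem = X.copy()                                    # each crow's remembered best position
        mem_fit = np.apply_along_axis(f, 1, mem)
        for _ in range(iters):
            for i in range(n_crows):
                j = rng.integers(n_crows)                 # crow i follows a random crow j
                if rng.uniform() >= ap:
                    # Crow j is unaware: i moves toward j's remembered hiding place.
                    new = X[i] + fl * rng.uniform() * (mem[j] - X[i])
                else:
                    # Crow j is aware: i is fooled and moves to a random position.
                    new = rng.uniform(lo, hi)
                new = np.clip(new, lo, hi)
                fx = f(new)
                X[i] = new
                if fx < mem_fit[i]:                       # update memory if improved
                    mem[i], mem_fit[i] = new, fx
        best = mem_fit.argmin()
        return mem[best], mem_fit[best]

    # Toy stand-in for a scheduling cost (continuous relaxation only).
    cost = lambda x: np.sum((x - np.arange(len(x))) ** 2)
    print(crow_search(cost, [(-5, 10)] * 4))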

  16. Inverse atmospheric radiative transfer problems - A nonlinear minimization search method of solution. [aerosol pollution monitoring

    Fymat, A. L.

    1976-01-01

    The paper studies the inversion of the radiative transfer equation describing the interaction of electromagnetic radiation with atmospheric aerosols. The interaction can be considered as the propagation in the aerosol medium of two light beams: the direct beam in the line-of-sight attenuated by absorption and scattering, and the diffuse beam arising from scattering into the viewing direction, which propagates more or less in random fashion. The latter beam has single scattering and multiple scattering contributions. In the former case and for single scattering, the problem is reducible to first-kind Fredholm equations, while for multiple scattering it is necessary to invert partial integrodifferential equations. A nonlinear minimization search method, applicable to the solution of both types of problems has been developed, and is applied here to the problem of monitoring aerosol pollution, namely the complex refractive index and size distribution of aerosol particles.

  17. Adjusting the Parameters of Metal Oxide Gapless Surge Arresters’ Equivalent Circuits Using the Harmony Search Method

    Christos A. Christodoulou

    2017-12-01

    Full Text Available The appropriate circuit modeling of metal oxide gapless surge arresters is critical for insulation coordination studies. Metal oxide arresters present a dynamic behavior for fast front surges; namely, their residual voltage is dependent on the peak value, as well as the duration, of the injected impulse current, and they should therefore not be represented only by non-linear elements. The aim of the current work is to adjust the parameters of the most frequently used surge arrester circuit models by considering the magnitude of the residual voltage, as well as the dissipated energy, for given pulses. To this end, the harmony search method is implemented to adjust the parameter values of the arrester equivalent circuit models. It functions by minimizing a defined objective function that compares the simulation outcomes with the manufacturer's data and with the results obtained from previous methodologies.
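
    A generic harmony search sketch is given below; it shows the usual memory consideration, pitch adjustment, and worst-harmony replacement steps but is not the authors' arrester-fitting code, and the harmony memory size, HMCR, PAR, bandwidth, and the toy least-squares objective are placeholder assumptions.

    import numpy as np

    def harmony_search(f, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=2000, seed=0):
        """Generic Harmony Search minimizing f over box bounds."""
        rng = np.random.default_rng(seed)
        lo = np.array([b[0] for b in bounds])
        hi = np.array([b[1] for b in bounds])
        dim = len(bounds)
        HM = rng.uniform(lo, hi, (hms, dim))              # harmony memory
        fit = np.apply_along_axis(f, 1, HM)
        for _ in range(iters):
            new = np.empty(dim)
            for d in range(dim):
                if rng.uniform() < hmcr:                  # take a value from memory...
                    new[d] = HM[rng.integers(hms), d]
                    if rng.uniform() < par:               # ...with optional pitch adjustment
                        new[d] += bw * (hi[d] - lo[d]) * rng.uniform(-1, 1)
                else:                                     # or improvise a random value
                    new[d] = rng.uniform(lo[d], hi[d])
            new = np.clip(new, lo, hi)
            fx = f(new)
            worst = fit.argmax()
            if fx < fit[worst]:                           # replace the worst harmony if improved
                HM[worst], fit[worst] = new, fx
        best = fit.argmin()
        return HM[best], fit[best]

    # Placeholder objective: least-squares mismatch against a "reference" parameter vector.
    target = np.array([2.0, -1.0, 0.5])
    obj = lambda p: np.sum((p - target) ** 2)
    print(harmony_search(obj, [(-5, 5)] * 3))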

  18. A method in search of a theory: peer education and health promotion.

    Turner, G; Shepherd, J

    1999-04-01

    Peer education has grown in popularity and practice in recent years in the field of health promotion. However, advocates of peer education rarely make reference to theories in their rationale for particular projects. In this paper the authors review a selection of commonly cited theories, and examine to what extent they have value and relevance to peer education in health promotion. Beginning from an identification of 10 claims made for peer education, each theory is examined in terms of the scope of the theory and evidence to support it in practice. The authors conclude that, whilst most theories have something to offer towards an explanation of why peer education might be effective, most theories are limited in scope and there is little empirical evidence in health promotion practice to support them. Peer education would seem to be a method in search of a theory rather than the application of theory to practice.

  19. Application of a high throughput method of biomarker discovery to improvement of the EarlyCDT®-Lung Test.

    Isabel K Macdonald

    Full Text Available BACKGROUND: The National Lung Screening Trial showed that CT screening for lung cancer led to a 20% reduction in mortality. However, CT screening has a number of disadvantages, including low specificity. A validated autoantibody assay is available commercially (EarlyCDT®-Lung) to aid in the early detection of lung cancer and risk stratification in patients with pulmonary nodules detected by CT. Recent advances in high throughput (HTP) cloning and expression methods have been developed into a discovery pipeline to identify biomarkers that detect autoantibodies. The aim of this study was to demonstrate the successful clinical application of this strategy to add to the EarlyCDT-Lung panel in order to improve its sensitivity and specificity (and hence positive predictive value, PPV). METHODS AND FINDINGS: Serum from two matched independent cohorts of lung cancer patients was used (n = 100 and n = 165). Sixty-nine proteins were initially screened on an abridged HTP version of the autoantibody ELISA using protein prepared on a small scale by an HTP expression and purification screen. Promising leads were produced in shake-flask culture and tested on the full assay. These results were analyzed in combination with those from the EarlyCDT-Lung panel in order to provide a set of re-optimized cut-offs. Five proteins that still displayed cancer/normal differentiation were tested for reproducibility and validation on a second batch of protein and a separate patient cohort. Addition of these proteins resulted in an improvement in the sensitivity and specificity of the test from 38% and 86% to 49% and 93%, respectively (PPV improvement from 1 in 16 to 1 in 7). CONCLUSION: This is a practical example of the value of investing resources to develop a HTP technology. Such technology may lead to improvement in the clinical utility of the EarlyCDT-Lung test, and so further aid the early detection of lung cancer.

  20. A Search for Lost Planets in the Kepler Multi-Planet Systems and the Discovery of the Long-Period, Neptune-Sized Exoplanet Kepler-150 f

    Schmitt, Joseph R.; Jenkins, Jon M.; Fischer, Debra A.

    2017-01-01

    The vast majority of the 4700 confirmed planets and planet candidates discovered by the Kepler space telescope were first found by the Kepler pipeline. In the pipeline, after a transit signal is found, all data points associated with those transits are removed, creating a Swiss cheese-like light curve full of holes, which is then used for subsequent transit searches. These holes could render an additional planet undetectable (or lost). We examine a sample of 114 stars with 3+ confirmed planets to see the effect that this Swiss cheesing may have. A simulation determined that the probability that a transiting planet is lost due to the transit masking is low, but non-negligible, reaching a plateau at approximately 3.3% lost in the period range of P = 400 - 500 days. We then model the transits in all quarters of each star and subtract out the transit signals, restoring the in-transit data points, and use the Kepler pipeline to search the transit-subtracted (i.e., transit-cleaned) light curves. However, the pipeline did not discover any credible new transit signals. This demonstrates the validity and robustness of the Kepler pipeline's choice to use transit masking over transit subtraction. However, a follow-up visual search through all the transit-subtracted data, which allows for easier visual identification of new transits, revealed the existence of a new, Neptune-sized exoplanet. Kepler-150 f (P = 637.2 days, RP = 3.86 R earth) is confirmed using a combination of false positive probability analysis, transit duration analysis, and the planet multiplicity argument.

  1. A SEARCH FOR LOST PLANETS IN THE KEPLER MULTI-PLANET SYSTEMS AND THE DISCOVERY OF A LONG PERIOD, NEPTUNE-SIZED EXOPLANET KEPLER-150 F.

    Schmitt, Joseph R; Jenkins, Jon M; Fischer, Debra A

    2017-04-01

    The vast majority of the 4700 confirmed planets and planet candidates discovered by the Kepler space telescope were first found by the Kepler pipeline. In the pipeline, after a transit signal is found, all data points associated with those transits are removed, creating a "Swiss cheese"-like light curve full of holes, which is then used for subsequent transit searches. These holes could render an additional planet undetectable (or "lost"). We examine a sample of 114 stars with 3+ confirmed planets to see the effect that this "Swiss cheesing" may have. A simulation determined that the probability that a transiting planet is lost due to the transit masking is low, but non-negligible, reaching a plateau at ~3.3% lost in the period range of P = 400 - 500 days. We then model the transits in all quarters of each star and subtract out the transit signals, restoring the in-transit data points, and use the Kepler pipeline to search the transit-subtracted (i.e., transit-cleaned) light curves. However, the pipeline did not discover any credible new transit signals. This demonstrates the validity and robustness of the Kepler pipeline's choice to use transit masking over transit subtraction. However, a follow-up visual search through all the transit-subtracted data, which allows for easier visual identification of new transits, revealed the existence of a new, Neptune-sized exoplanet. Kepler-150 f (P = 637.2 days, R P = 3.86 R ⊕) is confirmed using a combination of false positive probability analysis, transit duration analysis, and the planet multiplicity argument.
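
    A toy numpy illustration of the two alternatives discussed above (masking in-transit points versus subtracting a transit model) is given below; it is in no way the Kepler pipeline, and the cadence, period, epoch, duration, depth, and box-shaped transit model are all invented placeholders.

    import numpy as np

    rng = np.random.default_rng(1)
    t = np.arange(0.0, 90.0, 0.02)                        # time in days (placeholder cadence)
    flux = 1.0 + 1e-4 * rng.standard_normal(t.size)

    # Inject a box-shaped transit with period P, epoch t0, duration dur, and depth.
    P, t0, dur, depth = 10.0, 2.3, 0.25, 5e-4
    phase = (t - t0 + 0.5 * P) % P - 0.5 * P
    in_transit = np.abs(phase) < 0.5 * dur
    flux[in_transit] -= depth

    # "Swiss cheese" masking: drop the in-transit points entirely before the next search.
    t_masked, flux_masked = t[~in_transit], flux[~in_transit]

    # Transit subtraction: keep all points but remove the (here, known) transit model.
    flux_cleaned = flux.copy()
    flux_cleaned[in_transit] += depth

    print(t.size, t_masked.size, flux_cleaned.std())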

  2. Comparison the Effect of Teaching by Group Guided Discovery Learning, Questions & Answers and Lecturing Methods on the Level of Learning and Information Durability of Students

    Mardanparvar H.

    2016-02-01

    Full Text Available Aims: The requirement to revise traditional education methods and to utilize new, active, student-oriented learning methods entered the scope of educational systems long ago. Therefore, the new methods are becoming popular in different sciences, including the medical sciences. The aim of this study was to compare the effectiveness of teaching through three methods (group guided discovery, questions and answers, and lecture methods) on the learning level and information durability of nursing students. Instrument & Methods: In this semi-experimental study, 62 fourth-semester nursing students of the Nursing and Midwifery Faculty of Isfahan University of Medical Sciences, who were taking the infectious course for the first time in the first semester of the academic year 2015-16, were studied. The subjects were selected via the census method and randomly divided into three groups: group guided discovery, questions and answers, and lecture groups. The test was conducted before, immediately after, and one month after the training program using a researcher-made questionnaire. Data were analyzed in SPSS 19 software using the Chi-square test, one-way ANOVA, ANOVA with repeated observations, and the LSD post-hoc test. Findings: The mean score of the test conducted immediately after the training program in the lecture group was significantly lower than in the guided discovery and question and answer groups (p<0.001). In addition, the mean score of the test conducted one month after the training program in the guided discovery group was significantly higher than in both the question and answer (p=0.004) and lecture (p=0.001) groups. Conclusion: Active educational methods lead to a higher level of student participation in educational issues and provide a basis for enhanced learning and better information durability.

  3. A new family of Polak-Ribiere-Polyak conjugate gradient method with the strong-Wolfe line search

    Ghani, Nur Hamizah Abdul; Mamat, Mustafa; Rivaie, Mohd

    2017-08-01

    Conjugate gradient (CG) method is an important technique in unconstrained optimization, due to its effectiveness and low memory requirements. The focus of this paper is to introduce a new CG method for solving large scale unconstrained optimization. Theoretical proofs show that the new method fulfills the sufficient descent condition if the strong Wolfe-Powell inexact line search is used. Besides, computational results show that our proposed method outperforms other existing CG methods.
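
    For illustration, a standard Polak-Ribiere-Polyak (PRP+) conjugate gradient loop with a Wolfe line search is sketched below using SciPy; this is the classical formula, not the new family proposed in the paper, and the test function, line-search constants, and restart rule are assumptions.

    import numpy as np
    from scipy.optimize import line_search, rosen, rosen_der

    def prp_cg(f, grad, x0, tol=1e-6, max_iter=500):
        """Polak-Ribiere-Polyak nonlinear CG with a Wolfe line search (minimization)."""
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                                            # initial direction: steepest descent
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
            if alpha is None:                             # line search failed: restart with a small steepest-descent step
                d, alpha = -g, 1e-3
            x_new = x + alpha * d
            g_new = grad(x_new)
            beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # PRP+ coefficient (non-negative restart)
            d = -g_new + beta * d
            x, g = x_new, g_new
        return x

    # Classical Rosenbrock test problem as a placeholder objective.
    print(prp_cg(rosen, rosen_der, np.array([-1.2, 1.0])))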

  4. Higgs Discovery

    Sannino, Francesco

    2013-01-01

    I discuss the impact of the discovery of a Higgs-like state on composite dynamics, starting by critically examining the reasons in favour of either an elementary or composite nature of this state. Accepting the standard model interpretation I re-address the standard model vacuum stability within ... has been challenged by the discovery of a not-so-heavy Higgs-like state. I will therefore review the recent discovery \cite{Foadi:2012bb} that the standard model top-induced radiative corrections naturally reduce the intrinsic non-perturbative mass of the composite Higgs state towards the desired ... via first principle lattice simulations with encouraging results. The new findings show that the recent naive claims made about new strong dynamics at the electroweak scale being disfavoured by the discovery of a not-so-heavy composite Higgs are unwarranted. I will then introduce the more speculative ...

  5. Evolutionary Policy Transfer and Search Methods for Boosting Behavior Quality: RoboCup Keep-Away Case Study

    Geoff Nitschke

    2017-11-01

    Full Text Available This study evaluates various evolutionary search methods to direct neural controller evolution in company with policy (behavior) transfer across increasingly complex collective robotic (RoboCup keep-away) tasks. Robot behaviors are first evolved in a source task and then transferred for further evolution to more complex target tasks. Evolutionary search methods tested include objective-based search (fitness function), behavioral and genotypic diversity maintenance, and hybrids of such diversity maintenance and objective-based search. Evolved behavior quality is evaluated according to effectiveness and efficiency. Effectiveness is the average task performance of transferred and evolved behaviors, where task performance is the average time the ball is controlled by a keeper team. Efficiency is the average number of generations taken for the fittest evolved behaviors to reach a minimum task performance threshold given policy transfer. Results indicate that policy transfer coupled with hybridized evolution (behavioral diversity maintenance and objective-based search) addresses the bootstrapping problem for increasingly complex keep-away tasks. That is, this hybrid method (coupled with policy transfer) evolves behaviors that could not otherwise be evolved. Also, this hybrid evolutionary search was demonstrated as consistently evolving topologically simple neural controllers that elicited high-quality behaviors.

  6. Computational Methods Used in Hit-to-Lead and Lead Optimization Stages of Structure-Based Drug Discovery.

    Heifetz, Alexander; Southey, Michelle; Morao, Inaki; Townsend-Nicholson, Andrea; Bodkin, Mike J

    2018-01-01

    GPCR modeling approaches are widely used in the hit-to-lead (H2L) and lead optimization (LO) stages of drug discovery. The aims of these modeling approaches are to predict the 3D structures of the receptor-ligand complexes, to explore the key interactions between the receptor and the ligand and to utilize these insights in the design of new molecules with improved binding, selectivity or other pharmacological properties. In this book chapter, we present a brief survey of key computational approaches integrated with hierarchical GPCR modeling protocol (HGMP) used in hit-to-lead (H2L) and in lead optimization (LO) stages of structure-based drug discovery (SBDD). We outline the differences in modeling strategies used in H2L and LO of SBDD and illustrate how these tools have been applied in three drug discovery projects.

  7. Search strategies

    Oliver, B. M.

    Attention is given to the approaches which would provide the greatest chance of success in attempts related to the discovery of extraterrestrial advanced cultures in the Galaxy, taking into account the principle of least energy expenditure. The energetics of interstellar contact are explored, giving attention to the use of manned spacecraft, automatic probes, and beacons. The least expensive approach to a search for other civilizations involves a listening program which attempts to detect signals emitted by such civilizations. The optimum part of the spectrum for the considered search is found to be in the range from 1 to 2 GHz. Antenna and transmission formulas are discussed along with the employment of matched gates and filters, the probable characteristics of the signals to be detected, the filter-signal mismatch loss, surveys of the radio sky, the conduction of targeted searches.

  8. On the antiproton discovery

    Piccioni, O.

    1989-01-01

    The author of this article describes his own role in the discovery of the antiproton. Although Segre and Chamberlain received the Nobel Prize in 1959 for its discovery, the author claims that their experimental method was his idea which he communicated to them informally in December 1954. He describes how his application for citizenship (he was Italian), and other scientists' manipulation, prevented him from being at Berkeley to work on the experiment himself. (UK)

  9. Investigations on search methods for speech recognition using weighted finite state transducers

    Rybach, David

    2014-01-01

    The search problem in the statistical approach to speech recognition is to find the most likely word sequence for an observed speech signal using a combination of knowledge sources, i.e. the language model, the pronunciation model, and the acoustic models of phones. The resulting search space is enormous. Therefore, an efficient search strategy is required to compute the result with a feasible amount of time and memory. The structured statistical models as well as their combination, the searc...

  10. A Tale of Two Discoveries: Comparing the Usability of Summon and EBSCO Discovery Service

    Foster, Anita K.; MacDonald, Jean B.

    2013-01-01

    Web-scale discovery systems are gaining momentum among academic libraries as libraries seek a means to provide their users with a one-stop searching experience. Illinois State University's Milner Library found itself in the unique position of having access to two distinct discovery products, EBSCO Discovery Service and Serials Solutions' Summon.…

  11. Elliptical tiling method to generate a 2-dimensional set of templates for gravitational wave search

    Arnaud, Nicolas; Barsuglia, Matteo; Bizouard, Marie-Anne; Brisson, Violette; Cavalier, Fabien; Davier, Michel; Hello, Patrice; Kreckelbergh, Stephane; Porter, Edward K.

    2003-01-01

    Searching for a signal depending on unknown parameters in a noisy background with matched filtering techniques always requires an analysis of the data with several templates in parallel in order to ensure a proper match between the filter and the real waveform. The key feature of such an implementation is the design of the filter bank which must be small to limit the computational cost while keeping the detection efficiency as high as possible. This paper presents a geometrical method that allows one to cover the corresponding physical parameter space by a set of ellipses, each of them being associated with a given template. After the description of the main characteristics of the algorithm, the method is applied in the field of gravitational wave (GW) data analysis, for the search of damped sine signals. Such waveforms are expected to be produced during the deexcitation phase of black holes - the so-called 'ringdown' signals - and are also encountered in some numerically computed supernova signals. First, the number of templates N computed by the method is similar to its analytical estimation, despite the overlaps between neighbor templates and the border effects. Moreover, N is small enough to test for the first time the performances of the set of templates for different choices of the minimal match MM, the parameter used to define the maximal allowed loss of signal-to-noise ratio (SNR) due to the mismatch between real signals and templates. The main result of this analysis is that the fraction of SNR recovered is on average much higher than MM, which dramatically decreases the mean percentage of false dismissals. Indeed, it goes well below its estimated value of 1 − MM³ used as input of the algorithm. Thus, as this feature should be common to any tiling algorithm, it seems possible to reduce the constraint on the value of MM - and indeed the number of templates and the computing power - without losing as many events as expected on average. This should be of great

  12. Application of a heuristic search method for generation of fuel reload configurations

    Galperin, A.; Nissan, E.

    1988-01-01

    A computerized heuristic search method for the generation and optimization of fuel reload configurations is proposed and investigated. The heuristic knowledge is expressed modularly in the form of "IF-THEN" production rules. The method was implemented in a program coded in the Franz LISP programming language and executed under the UNIX operating system. A test problem was formulated, based on a typical light water reactor reload problem with a few simplifications assumed, in order to allow formulation of the reload strategy into a relatively small number of rules. A computer run of the problem was performed with a VAX-780 machine. A set of 312 solutions was generated in ~20 min of execution time. Testing of a few arbitrarily chosen configurations demonstrated reasonably good performance for the computer-generated solutions. A computerized generator of reload configurations may be used for the fast generation or modification of reload patterns and as a tool for the formulation, tuning, and testing of the heuristic knowledge rules used by an "expert" fuel manager.
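
    A minimal Python sketch of the production-rule idea follows; the two rules and the five-assembly inventory are hypothetical stand-ins for the kind of "IF-THEN" knowledge described above, not the rules used in the study.

      from itertools import permutations

      def rule_fresh_not_adjacent(pattern):
          # IF two fresh assemblies sit in adjacent positions THEN reject the configuration.
          return all(not (a == "fresh" and b == "fresh") for a, b in zip(pattern, pattern[1:]))

      def rule_burned_at_center(pattern):
          # IF the central position does not hold a burned (high-burnup) assembly THEN reject.
          return pattern[len(pattern) // 2] == "burned"

      RULES = [rule_fresh_not_adjacent, rule_burned_at_center]

      def generate_configurations(inventory):
          # Enumerate candidate loading patterns and keep those satisfying every production rule.
          for pattern in set(permutations(inventory)):
              if all(rule(pattern) for rule in RULES):
                  yield pattern

      inventory = ["fresh", "fresh", "burned", "burned", "burned"]
      print(len(list(generate_configurations(inventory))), "rule-satisfying configurations")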

  13. Gravity Search Algorithm hybridized Recursive Least Square method for power system harmonic estimation

    Santosh Kumar Singh

    2017-06-01

    Full Text Available This paper presents a new hybrid method based on the Gravity Search Algorithm (GSA) and Recursive Least Square (RLS), known as GSA-RLS, to solve harmonic estimation problems for time-varying power signals in the presence of different noises. GSA is based on Newton's law of gravity and mass interactions. In the proposed method, the searcher agents are a collection of masses that interact with each other using Newton's laws of gravity and motion. The basic GSA strategy is combined with the RLS algorithm sequentially in an adaptive way to update the unknown parameters (weights) of the harmonic signal. Simulation and practical validation are made through experimentation of the proposed algorithm with real-time data obtained from a heavy paper industry. A comparative performance of the proposed algorithm is evaluated against other recently reported algorithms such as Differential Evolution (DE), Particle Swarm Optimization (PSO), Bacteria Foraging Optimization (BFO), Fuzzy-BFO (F-BFO) hybridized with Least Square (LS), and BFO hybridized with RLS, which reveals that the proposed GSA-RLS algorithm is the best in terms of accuracy, convergence and computational time.
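
    The abstract gives no implementation details, but the gravitational part of a GSA-style optimizer can be sketched along the lines of the commonly cited Rashedi et al. formulation; the parameter values and the toy objective below are illustrative assumptions only.

      import numpy as np

      def gsa_minimize(f, lo, hi, n_agents=20, n_iter=100, g0=100.0, alpha=20.0, seed=0):
          # Minimal Gravitational Search Algorithm sketch: agents are masses attracting each other.
          rng = np.random.default_rng(seed)
          lo, hi = np.asarray(lo, float), np.asarray(hi, float)
          x = rng.uniform(lo, hi, size=(n_agents, lo.size))
          v = np.zeros_like(x)
          best_x, best_f = None, np.inf
          for t in range(n_iter):
              fit = np.array([f(xi) for xi in x])
              if fit.min() < best_f:
                  best_f, best_x = float(fit.min()), x[fit.argmin()].copy()
              m = (fit.max() - fit) / (fit.max() - fit.min() + 1e-12)   # better fitness -> larger mass
              M = m / (m.sum() + 1e-12)
              G = g0 * np.exp(-alpha * t / n_iter)                      # gravitational constant decays
              acc = np.zeros_like(x)
              for i in range(n_agents):
                  for j in range(n_agents):
                      if i != j:
                          diff = x[j] - x[i]
                          acc[i] += rng.random() * G * M[j] * diff / (np.linalg.norm(diff) + 1e-12)
              v = rng.random(x.shape) * v + acc                         # velocity and position updates
              x = np.clip(x + v, lo, hi)
          return best_x, best_f

      sphere = lambda z: float(np.sum(z ** 2))
      print(gsa_minimize(sphere, lo=[-5.0] * 3, hi=[5.0] * 3))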

  14. Search for the top quark at D0 using multivariate methods

    Bhat, P.C.

    1995-07-01

    We report on the search for the top quark in pp̄ collisions at the Fermilab Tevatron (√s = 1.8 TeV) in the di-lepton and lepton+jets channels using multivariate methods. An H-matrix analysis of the eμ data, corresponding to an integrated luminosity of 13.5 ± 1.6 pb⁻¹, yields one event whose likelihood to be a top quark event, assuming m_top = 180 GeV/c², is ten times more than that of WW and eighteen times more than that of Z → ττ. A neural network analysis of the e+jets channel using a data sample corresponding to an integrated luminosity of 47.9 ± 5.7 pb⁻¹ shows an excess of events in the signal region and yields a cross section for tt̄ production of 6.7 ± 2.3 (stat.) pb, assuming a top mass of 200 GeV/c². An analysis of the e+jets data using the probability density estimation method yields a cross section that is consistent with the above result.

  15. Validation of a search strategy to identify nutrition trials in PubMed using the relative recall method.

    Durão, Solange; Kredo, Tamara; Volmink, Jimmy

    2015-06-01

    To develop, assess, and maximize the sensitivity of a search strategy to identify diet and nutrition trials in PubMed using relative recall. We developed a search strategy to identify diet and nutrition trials in PubMed. We then constructed a gold standard reference set to validate the identified trials using the relative recall method. Relative recall was calculated by dividing the number of references from the gold standard our search strategy identified by the total number of references in the gold standard. Our gold standard comprised 298 trials, derived from 16 included systematic reviews. The initial search strategy identified 242 of 298 references, with a relative recall of 81.2% [95% confidence interval (CI): 76.3%, 85.5%]. We analyzed titles and abstracts of the 56 missed references for possible additional terms. We then modified the search strategy accordingly. The relative recall of the final search strategy was 88.6% (95% CI: 84.4%, 91.9%). We developed a search strategy to identify diet and nutrition trials in PubMed with a high relative recall (sensitivity). This could be useful for establishing a nutrition trials register to support the conduct of future research, including systematic reviews. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
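
    The relative recall figure above is straightforward to reproduce; a short sketch using the counts quoted in the abstract (the Wilson score interval is one standard choice for the 95% CI and may differ marginally from the interval method the authors used):

      import math

      def relative_recall(hits, gold_total):
          # relative recall = gold-standard references retrieved / all gold-standard references
          return hits / gold_total

      def wilson_ci(hits, n, z=1.96):
          # Wilson score interval for a binomial proportion (z = 1.96 for a 95% CI)
          p = hits / n
          centre = (p + z * z / (2 * n)) / (1 + z * z / n)
          half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / (1 + z * z / n)
          return centre - half, centre + half

      lo, hi = wilson_ci(242, 298)
      print(f"initial strategy: {relative_recall(242, 298):.1%} (95% CI {lo:.1%}, {hi:.1%})")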

  16. Automated Supernova Discovery (Abstract)

    Post, R. S.

    2015-12-01

    (Abstract only) We are developing a system of robotic telescopes for automatic recognition of Supernovas as well as other transient events in collaboration with the Puckett Supernova Search Team. At the SAS2014 meeting, the discovery program, SNARE, was first described. Since then, it has been continuously improved to handle searches under a wide variety of atmospheric conditions. Currently, two telescopes are used to build a reference library while searching for PSN with a partial library. Since data is taken every night without clouds, we must deal with varying atmospheric and high background illumination from the moon. Software is configured to identify a PSN, reshoot for verification with options to change the run plan to acquire photometric or spectrographic data. The telescopes are 24-inch CDK24, with Alta U230 cameras, one in CA and one in NM. Images and run plans are sent between sites so the CA telescope can search while photometry is done in NM. Our goal is to find bright PSNs with magnitude 17.5 or less which is the limit of our planned spectroscopy. We present results from our first automated PSN discoveries and plans for PSN data acquisition.

  17. An Exploration of Retrieval-Enhancing Methods for Integrated Search in a Digital Library

    Sørensen, Diana Ransgaard; Bogers, Toine; Larsen, Birger

    2012-01-01

    Integrated search is defined as searching across different document types and representations simultaneously, with the goal of presenting the user with a single ranked result list containing the optimal mix of document types. In this paper, we compare various approaches to integrating three diffe...

  18. Novel citation-based search method for scientific literature: application to meta-analyses

    Janssens, A.C.J.W.; Gwinn, M.

    2015-01-01

    Background: Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of

  19. Guidelines for Biomarker of Food Intake Reviews (BFIRev): how to conduct an extensive literature search for biomarker of food intake discovery.

    Praticò, Giulia; Gao, Qian; Scalbert, Augustin; Vergères, Guy; Kolehmainen, Marjukka; Manach, Claudine; Brennan, Lorraine; Pedapati, Sri Harsha; Afman, Lydia A; Wishart, David S; Vázquez-Fresno, Rosa; Lacueva, Cristina Andres; Garcia-Aloy, Mar; Verhagen, Hans; Feskens, Edith J M; Dragsted, Lars O

    2018-01-01

    Identification of new biomarkers of food and nutrient intake has developed fast over the past two decades and could potentially provide important new tools for compliance monitoring and dietary intake assessment in nutrition and health science. In recent years, metabolomics has played an important role in identifying a large number of putative biomarkers of food intake (BFIs). However, the large body of scientific literature on potential BFIs outside the metabolomics area should also be taken into account. In particular, we believe that extensive literature reviews should be conducted and that the quality of all suggested biomarkers should be systematically evaluated. In order to cover the literature on BFIs in the most appropriate and consistent manner, there is a need for appropriate guidelines on this topic. These guidelines should build upon guidelines in related areas of science while targeting the special needs of biomarker methodology. This document provides a guideline for conducting an extensive literature search on BFIs, which will provide the basis to systematically validate BFIs. This procedure will help to prioritize future work on the identification of new potential biomarkers and on validating these as well as other biomarker candidates, thereby providing better tools for future studies in nutrition and health.

  20. Guidelines for Biomarker of Food Intake Reviews (BFIRev): how to conduct an extensive literature search for biomarker of food intake discovery

    Giulia Praticò

    2018-02-01

    Full Text Available Identification of new biomarkers of food and nutrient intake has developed fast over the past two decades and could potentially provide important new tools for compliance monitoring and dietary intake assessment in nutrition and health science. In recent years, metabolomics has played an important role in identifying a large number of putative biomarkers of food intake (BFIs). However, the large body of scientific literature on potential BFIs outside the metabolomics area should also be taken into account. In particular, we believe that extensive literature reviews should be conducted and that the quality of all suggested biomarkers should be systematically evaluated. In order to cover the literature on BFIs in the most appropriate and consistent manner, there is a need for appropriate guidelines on this topic. These guidelines should build upon guidelines in related areas of science while targeting the special needs of biomarker methodology. This document provides a guideline for conducting an extensive literature search on BFIs, which will provide the basis to systematically validate BFIs. This procedure will help to prioritize future work on the identification of new potential biomarkers and on validating these as well as other biomarker candidates, thereby providing better tools for future studies in nutrition and health.

  1. Custom database development and biomarker discovery methods for MALDI-TOF mass spectrometry-based identification of high-consequence bacterial pathogens.

    Tracz, Dobryan M; Tyler, Andrea D; Cunningham, Ian; Antonation, Kym S; Corbett, Cindi R

    2017-03-01

    A high-quality custom database of MALDI-TOF mass spectral profiles was developed with the goal of improving clinical diagnostic identification of high-consequence bacterial pathogens. A biomarker discovery method is presented for identifying and evaluating MALDI-TOF MS spectra to potentially differentiate biothreat bacteria from less-pathogenic near-neighbour species. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.

  2. A Tabu Search WSN Deployment Method for Monitoring Geographically Irregular Distributed Events

    2009-03-01

    Full Text Available In this paper, we address the Wireless Sensor Network (WSN) deployment issue. We assume that the observed area is characterized by the geographical irregularity of the sensed events. Formally, we consider that each point in the deployment area is associated with a differentiated detection probability threshold, which must be satisfied by our deployment method. Our resulting WSN deployment problem is formulated as a Multi-Objective Optimization problem, which seeks to reduce the gap between the generated events' detection probabilities and the required thresholds while minimizing the number of deployed sensors. To overcome the computational complexity of an exact resolution, we propose an original pseudo-random approach based on the Tabu Search heuristic. Simulations show that our proposal achieves better performance than several other approaches proposed in the literature. In the last part of this paper, we generalize the deployment problem by including the wireless communication network connectivity constraint. Thus, we extend our proposal to ensure that the resulting WSN topology is connected even if a sensor communication range takes small values.

  3. A Tabu Search WSN Deployment Method for Monitoring Geographically Irregular Distributed Events.

    Aitsaadi, Nadjib; Achir, Nadjib; Boussetta, Khaled; Pujolle, Guy

    2009-01-01

    In this paper, we address the Wireless Sensor Network (WSN) deployment issue. We assume that the observed area is characterized by the geographical irregularity of the sensed events. Formally, we consider that each point in the deployment area is associated with a differentiated detection probability threshold, which must be satisfied by our deployment method. Our resulting WSN deployment problem is formulated as a Multi-Objective Optimization problem, which seeks to reduce the gap between the generated events' detection probabilities and the required thresholds while minimizing the number of deployed sensors. To overcome the computational complexity of an exact resolution, we propose an original pseudo-random approach based on the Tabu Search heuristic. Simulations show that our proposal achieves better performance than several other approaches proposed in the literature. In the last part of this paper, we generalize the deployment problem by including the wireless communication network connectivity constraint. Thus, we extend our proposal to ensure that the resulting WSN topology is connected even if a sensor communication range takes small values.
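
    A generic tabu-search skeleton of the kind built on above can be sketched as follows; the swap-move neighbourhood and the toy coverage cost are assumptions for illustration, not the paper's multi-objective formulation.

      import random

      def tabu_search(sites, evaluate, n_select, n_iter=60, tenure=10, seed=1):
          # Minimize evaluate(set_of_sites) by swapping one selected site for an unselected one.
          rng = random.Random(seed)
          current = set(rng.sample(sites, n_select))
          best, best_cost = set(current), evaluate(current)
          tabu = {}                                      # move -> iteration until which it is forbidden
          for it in range(n_iter):
              moves = []
              for out in current:
                  for inn in sites:
                      if inn not in current:
                          neighbour = (current - {out}) | {inn}
                          moves.append((evaluate(neighbour), out, inn, neighbour))
              for cost, out, inn, neighbour in sorted(moves, key=lambda m: m[0]):
                  if tabu.get((out, inn), -1) < it or cost < best_cost:   # aspiration criterion
                      current = neighbour
                      tabu[(inn, out)] = it + tenure     # forbid the reverse move for a while
                      if cost < best_cost:
                          best, best_cost = set(neighbour), cost
                      break
          return best, best_cost

      points = [(x, y) for x in range(5) for y in range(5)]   # demand points to monitor
      cover = lambda chosen: sum(min(abs(px - sx) + abs(py - sy) for sx, sy in chosen)
                                 for px, py in points)
      print(tabu_search(points, cover, n_select=3))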

  4. Searching for beyond the Standard Model physics using direct and indirect methods at LHCb

    Hall, Samuel C P; Golutvin, Andrey

    It is known that the Standard Model of particle physics is incomplete in its description of nature at a fundamental level. For example, the Standard Model can neither incorporate dark matter nor explain the matter dominated nature of the Universe. This thesis presents three analyses undertaken using data collected by the LHCb detector. Each analysis searches for indications of physics beyond the Standard Model in different decays of B mesons, using different techniques. Notably, two analyses look for indications of new physics using indirect methods, and one uses a direct approach. The first analysis shows evidence for the rare decay $B^{+} \rightarrow D^{+}_{s}\phi$ with greater than 3 $\sigma$ significance; this also constitutes the first evidence for a fully hadronic annihilation-type decay of a $B^{+}$ meson. A measurement of the branching fraction of the decay $B^{+} \rightarrow D^{+}_{s}\phi$ is seen to be higher than, but still compatible with, Standard Model predictions. The CP-asymmetry of the decay is also ...

  5. Minimization of municipal solid waste transportation route in West Jakarta using Tabu Search method

    Chaerul, M.; Mulananda, A. M.

    2018-04-01

    Indonesia still adopts the collect-haul-dispose concept for municipal solid waste handling, which leads to queues of waste trucks at the final disposal site (TPA). This study aims to minimize the total distance of the waste transportation system by applying a transshipment model. In this case, the analogue of a transshipment point is a compaction facility (SPA). Small-capacity trucks collect the waste from temporary collection points (TPS) and bring it to the compaction facility, which is located near the waste generators. After compaction, the waste is transported in large-capacity trucks to the final disposal site, which is located far from the city. The waste transportation problem can be treated as a Vehicle Routing Problem (VRP). In this study, the shortest routes from the truck pool to the TPS, from TPS to SPA, and from SPA to TPA were determined using a meta-heuristic method, namely two-phase Tabu Search. The TPS studied are of the container type, 43 units in total throughout West Jakarta, served by 38 Armroll trucks with a capacity of 10 m3 each. The result assigns each truck from the pool to the selected TPS, SPA and TPA with a total minimum distance of 2,675.3 km. Minimizing the distance also minimizes the total waste transportation cost to be borne by the government.

  6. Pathway Detection from Protein Interaction Networks and Gene Expression Data Using Color-Coding Methods and A* Search Algorithms

    Cheng-Yu Yeh

    2012-01-01

    Full Text Available With the wide availability of protein interaction networks and supporting microarray data, identifying linear paths of biological significance in the search for potential pathways is a challenging issue. We propose a color-coding method based on the characteristics of biological network topology and apply heuristic search to speed up the color-coding method. In the experiments, we tested our method on two datasets: yeast and human prostate cancer networks together with gene expression data. Comparisons of our method with other existing methods on known yeast MAPK pathways, in terms of precision and recall, show that we can find the maximum number of proteins and perform comparably well. On the other hand, our method is more efficient than previous ones and detects paths of length 10 within 40 seconds using an Intel 1.73 GHz CPU and 1 GB of main memory under the Windows operating system.
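
    The color-coding idea can be conveyed with a small randomized sketch: vertices receive one of k random colours and only "colourful" paths (all colours distinct) are extended, with repeated trials raising the probability of recovering a true length-k path. The toy graph and node weights below are assumptions, and the A*-style heuristic speed-up of the paper is not reproduced here.

      import random

      def colorful_path(graph, k, weight, trials=200, seed=0):
          # graph: dict vertex -> neighbours; weight: dict vertex -> score (higher total is better).
          rng = random.Random(seed)
          best = None
          for _ in range(trials):
              colour = {v: rng.randrange(k) for v in graph}         # random k-colouring of vertices
              for start in graph:
                  stack = [(start, (start,), {colour[start]})]
                  while stack:
                      v, path, used = stack.pop()
                      if len(path) == k:
                          score = sum(weight[u] for u in path)
                          if best is None or score > best[0]:
                              best = (score, path)
                          continue
                      for w in graph[v]:
                          if colour[w] not in used:                 # extend only with unused colours
                              stack.append((w, path + (w,), used | {colour[w]}))
          return best

      g = {"a": ["b", "c"], "b": ["a", "c", "d"], "c": ["a", "b"], "d": ["b"]}
      w = {"a": 0.9, "b": 0.4, "c": 0.7, "d": 0.2}
      print(colorful_path(g, k=3, weight=w))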

  7. Application of pattern search method to power system security constrained economic dispatch with non-smooth cost function

    Al-Othman, A.K.; El-Naggar, K.M.

    2008-01-01

    Direct search (DS) methods are evolutionary algorithms used to solve optimization problems; they do not require any information about the gradient of the objective function while searching for an optimum solution. One such method is the Pattern Search (PS) algorithm. This paper presents a new approach based on a constrained pattern search algorithm to solve a security constrained power system economic dispatch (SCED) problem with a non-smooth cost function. Operation of power systems demands a high degree of security to keep the system operating satisfactorily when subjected to disturbances, while at the same time paying attention to economic aspects. A pattern recognition technique is used first to assess dynamic security. Linear classifiers that determine the stability of the electric power system are presented and added to other system stability and operational constraints. The problem is formulated as a constrained optimization problem in a way that ensures secure and economic system operation. The pattern search method is then applied to solve the constrained optimization formulation. In particular, the method is tested on three different test systems. Simulation results of the proposed approach are compared with those reported in the literature. The outcome is very encouraging and proves that pattern search (PS) is very applicable for solving the security constrained power system economic dispatch (SCED) problem. In addition, valve-point effect loading and total system losses are considered to further investigate the potential of the PS technique. Based on the results, it can be concluded that the PS has demonstrated ability in handling the highly nonlinear, discontinuous, non-smooth cost function of the SCED. (author)
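
    Pattern search itself is simple to state; a minimal compass-search sketch is shown below, with a toy cost that only mimics a non-smooth, valve-point-style objective rather than the full SCED formulation of the paper.

      import numpy as np

      def pattern_search(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=1000):
          # Poll along +/- coordinate directions; accept the first improvement, shrink the step on failure.
          x = np.asarray(x0, dtype=float)
          fx = f(x)
          for _ in range(max_iter):
              improved = False
              for i in range(x.size):
                  for sign in (1.0, -1.0):
                      trial = x.copy()
                      trial[i] += sign * step
                      ft = f(trial)
                      if ft < fx:
                          x, fx, improved = trial, ft, True
                          break
                  if improved:
                      break
              if not improved:
                  step *= shrink
                  if step < tol:
                      break
          return x, fx

      cost = lambda p: 0.05 * (p[0] - 40) ** 2 + 0.03 * (p[1] - 60) ** 2 + abs(np.sin(0.3 * p[0]))
      print(pattern_search(cost, x0=[20.0, 20.0]))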

  8. Searching for Innovations and Methods of Using the Cultural Heritage on the Example of Upper Silesia

    Wagner, Tomasz

    2017-10-01

    The basic subject of this paper is the historical and cultural heritage of parts of Upper Silesia that are bound by a common history and face similar problems today. The paper presents selected historical phenomena that have shaped this contemporary space, together with contemporary issues of heritage protection in Upper Silesia. Since 1989, the interpretation of Silesian architecture has been strongly coloured by ideological and national ideas. The last 25 years mark the next stage of development, characterized by rapid transformation of space driven by further economic change. In this period we can observe landscape transformations, the demolition of buildings and historical structures, loss of regional features, spontaneous adaptation of buildings, and many ways of implementing forms of protection and of using cultural resources. The upheavals linked to changes of state borders, political system, economy and ethnic composition mean that the former Upper Silesian border area concentrates phenomena also found in other, similar European regions lying at the meeting points of cultures and traditions. The latest period in the history of Upper Silesia allows us to reflect on the character of changes in the architecture and urban planning of the area and to appraise the effectiveness of practices connected with cultural heritage preservation. The phenomena of the last decades are: loss of regional features; elimination of buildings that were key elements of the regional cultural heritage; deformation of forms shaped through history; and attempts to use those elements of cultural heritage that are widely recognized as cultural values. In this situation it is important to seek creative solutions that will neutralize harmful processes resulting from poor law and practice. The most important task for contemporary space is the search for innovative fields and methods of using cultural resources. An important part of the article is

  9. Identifying complications of interventional procedures from UK routine healthcare databases: a systematic search for methods using clinical codes.

    Keltie, Kim; Cole, Helen; Arber, Mick; Patrick, Hannah; Powell, John; Campbell, Bruce; Sims, Andrew

    2014-11-28

    Several authors have developed and applied methods to routine data sets to identify the nature and rate of complications following interventional procedures. But, to date, there has been no systematic search for such methods. The objective of this article was to find, classify and appraise published methods, based on analysis of clinical codes, which used routine healthcare databases in a United Kingdom setting to identify complications resulting from interventional procedures. A literature search strategy was developed to identify published studies that referred, in the title or abstract, to the name or acronym of a known routine healthcare database and to complications from procedures or devices. The following data sources were searched in February and March 2013: Cochrane Methods Register, Conference Proceedings Citation Index - Science, Econlit, EMBASE, Health Management Information Consortium, Health Technology Assessment database, MathSciNet, MEDLINE, MEDLINE in-process, OAIster, OpenGrey, Science Citation Index Expanded and ScienceDirect. Of the eligible papers, those which reported methods using clinical coding were classified and summarised in tabular form using the following headings: routine healthcare database; medical speciality; method for identifying complications; length of follow-up; method of recording comorbidity. The benefits and limitations of each approach were assessed. From 3688 papers identified from the literature search, 44 reported the use of clinical codes to identify complications, from which four distinct methods were identified: 1) searching the index admission for specified clinical codes, 2) searching a sequence of admissions for specified clinical codes, 3) searching for specified clinical codes for complications from procedures and devices within the International Classification of Diseases 10th revision (ICD-10) coding scheme which is the methodology recommended by NHS Classification Service, and 4) conducting manual clinical

  10. Assessment of methods for computing the closest point projection, penetration, and gap functions in contact searching problems

    Kopačka, Ján; Gabriel, Dušan; Plešek, Jiří; Ulbin, M.

    2016-01-01

    Roč. 105, č. 11 (2016), s. 803-833 ISSN 0029-5981 R&D Projects: GA ČR(CZ) GAP101/12/2315; GA MŠk(CZ) ME10114 Institutional support: RVO:61388998 Keywords : closest point projection * local contact search * quadratic elements * Newtons methods * geometric iteration methods * simplex method Subject RIV: JC - Computer Hardware ; Software Impact factor: 2.162, year: 2016 http://onlinelibrary.wiley.com/doi/10.1002/nme.4994/abstract

  11. Rapid Automatic Lighting Control of a Mixed Light Source for Image Acquisition using Derivative Optimum Search Methods

    Kim HyungTae

    2015-01-01

    Full Text Available Automatic lighting (auto-lighting) is a function that maximizes the image quality of a vision inspection system by adjusting the light intensity and color. In most inspection systems, a single color light source is used, and an equal-step search is employed to determine the maximum image quality. However, when a mixed light source is used, the number of iterations becomes large, and therefore a rapid search method must be applied to reduce their number. Derivative optimum search methods follow the tangential direction of a function and are usually faster than other methods. In this study, multi-dimensional forms of derivative optimum search methods are applied to obtain the maximum image quality considering a mixed light source. The auto-lighting algorithms were derived from the steepest descent and conjugate gradient methods, which have N-size inputs of driving voltage and one output of image quality. Experiments in which the proposed algorithm was applied to semiconductor patterns showed that a reduced number of iterations is required to determine the locally maximized image quality.
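
    A minimal finite-difference steepest-ascent loop over N channel voltages conveys the idea; the toy quality surface and the assumed 0-10 V driver range below stand in for a real camera image-quality metric.

      import numpy as np

      def auto_light(quality, v0, lr=0.5, eps=1e-2, n_iter=50):
          # quality(v) -> scalar image-quality score for an N-channel voltage vector (higher is better).
          v = np.asarray(v0, dtype=float)
          for _ in range(n_iter):
              grad = np.zeros_like(v)
              for i in range(v.size):                   # forward-difference gradient estimate
                  dv = np.zeros_like(v)
                  dv[i] = eps
                  grad[i] = (quality(v + dv) - quality(v)) / eps
              if np.linalg.norm(grad) < 1e-6:
                  break
              v = np.clip(v + lr * grad, 0.0, 10.0)     # keep voltages inside the assumed driver range
          return v

      q = lambda v: -((v[0] - 3.2) ** 2 + (v[1] - 6.5) ** 2 + (v[2] - 4.1) ** 2)
      print(auto_light(q, v0=[1.0, 1.0, 1.0]).round(2))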

  12. An automated and efficient conformation search of L-cysteine and L,L-cystine using the scaled hypersphere search method

    Kishimoto, Naoki; Waizumi, Hiroki

    2017-10-01

    Stable conformers of L-cysteine and L,L-cystine were explored using an automated and efficient conformational searching method. The Gibbs energies of the stable conformers of L-cysteine and L,L-cystine were calculated with G4 and MP2 methods, respectively, at 450, 298.15, and 150 K. By assuming thermodynamic equilibrium and the barrier energies for the conformational isomerization pathways, the estimated ratios of the stable conformers of L-cysteine were compared with those determined by microwave spectroscopy in a previous study. Equilibrium structures of 1:1 and 2:1 cystine-Fe complexes were also calculated, and the energy of insertion of Fe into the disulfide bond was obtained.
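
    Under the thermodynamic-equilibrium assumption used above, conformer ratios follow from relative Gibbs energies through Boltzmann weighting; the ΔG values in this sketch are placeholders, not the computed G4/MP2 energies.

      import numpy as np

      R = 8.31446e-3  # gas constant in kJ mol^-1 K^-1

      def boltzmann_populations(delta_g_kj_mol, temperature_k):
          # p_i proportional to exp(-dG_i / RT), normalized over all conformers
          dg = np.asarray(delta_g_kj_mol, dtype=float)
          w = np.exp(-(dg - dg.min()) / (R * temperature_k))
          return w / w.sum()

      for T in (450.0, 298.15, 150.0):
          print(T, boltzmann_populations([0.0, 1.2, 2.5], T).round(3))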

  13. Decentralized cooperative unmanned aerial vehicles conflict resolution by neural network-based tree search method

    Jian Yang

    2016-09-01

    Full Text Available In this article, a tree search algorithm is proposed to find near-optimal conflict avoidance solutions for unmanned aerial vehicles. In a dynamic environment, unmodeled elements such as wind make UAVs deviate from their nominal traces, which creates difficulties for conflict detection and resolution. Back propagation neural networks are utilized to approximate the unmodeled dynamics of the environment. To satisfy the online planning requirement, the search length of the tree search algorithm is limited, so the algorithm may not be able to reach the goal states during the search. A midterm reward function for assessing each node is therefore devised, with consideration given to two factors, namely the safe separation requirement and the mission of each unmanned aerial vehicle. Simulation examples and comparisons with previous approaches are provided to illustrate the smooth and convincing behaviour of the proposed algorithm.
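
    The decision loop can be illustrated with a deliberately small, exhaustive depth-limited search; the one-dimensional state, dynamics and reward used here are toy assumptions standing in for the learned dynamics model and the midterm reward of the article.

      import itertools

      def tree_search(state, step, reward, actions, depth):
          # Enumerate all action sequences up to `depth` and keep the one with the best summed reward.
          best_plan, best_value = None, float("-inf")
          for plan in itertools.product(actions, repeat=depth):
              s, value = state, 0.0
              for a in plan:
                  s = step(s, a)          # a learned model would replace this hand-written dynamics
                  value += reward(s)
              if value > best_value:
                  best_plan, best_value = plan, value
          return best_plan, best_value

      GOAL, SEP = 10.0, 2.0
      step = lambda s, a: (s[0] + a, s[1] + 1.0)    # own UAV advances by `a`; intruder drifts forward
      reward = lambda s: -abs(GOAL - s[0]) - (100.0 if abs(s[0] - s[1]) < SEP else 0.0)
      print(tree_search((0.0, 4.0), step, reward, actions=(0.0, 1.0, 2.0), depth=4))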

  14. Transcriptomic SNP discovery for custom genotyping arrays: impacts of sequence data, SNP calling method and genotyping technology on the probability of validation success.

    Humble, Emily; Thorne, Michael A S; Forcada, Jaume; Hoffman, Joseph I

    2016-08-26

    Single nucleotide polymorphism (SNP) discovery is an important goal of many studies. However, the number of 'putative' SNPs discovered from a sequence resource may not provide a reliable indication of the number that will successfully validate with a given genotyping technology. For this it may be necessary to account for factors such as the method used for SNP discovery and the type of sequence data from which it originates, suitability of the SNP flanking sequences for probe design, and genomic context. To explore the relative importance of these and other factors, we used Illumina sequencing to augment an existing Roche 454 transcriptome assembly for the Antarctic fur seal (Arctocephalus gazella). We then mapped the raw Illumina reads to the new hybrid transcriptome using BWA and BOWTIE2 before calling SNPs with GATK. The resulting markers were pooled with two existing sets of SNPs called from the original 454 assembly using NEWBLER and SWAP454. Finally, we explored the extent to which SNPs discovered using these four methods overlapped and predicted the corresponding validation outcomes for both Illumina Infinium iSelect HD and Affymetrix Axiom arrays. Collating markers across all discovery methods resulted in a global list of 34,718 SNPs. However, concordance between the methods was surprisingly poor, with only 51.0 % of SNPs being discovered by more than one method and 13.5 % being called from both the 454 and Illumina datasets. Using a predictive modeling approach, we could also show that SNPs called from the Illumina data were on average more likely to successfully validate, as were SNPs called by more than one method. Above and beyond this pattern, predicted validation outcomes were also consistently better for Affymetrix Axiom arrays. Our results suggest that focusing on SNPs called by more than one method could potentially improve validation outcomes. They also highlight possible differences between alternative genotyping technologies that could be

  15. Pep-3D-Search: a method for B-cell epitope prediction based on mimotope analysis.

    Huang, Yan Xin; Bao, Yong Li; Guo, Shu Yan; Wang, Yan; Zhou, Chun Guang; Li, Yu Xin

    2008-12-16

    The prediction of conformational B-cell epitopes is one of the most important goals in immunoinformatics. The solution to this problem, even if approximate, would help in designing experiments to precisely map the residues of interaction between an antigen and an antibody. Consequently, this area of research has received considerable attention from immunologists, structural biologists and computational biologists. Phage-displayed random peptide libraries are powerful tools used to obtain mimotopes that are selected by binding to a given monoclonal antibody (mAb) in a similar way to the native epitope. These mimotopes can be considered as functional epitope mimics. Mimotope analysis based methods can predict not only linear but also conformational epitopes and this has been the focus of much research in recent years. Though some algorithms based on mimotope analysis have been proposed, the precise localization of the interaction site mimicked by the mimotopes is still a challenging task. In this study, we propose a method for B-cell epitope prediction based on mimotope analysis called Pep-3D-Search. Given the 3D structure of an antigen and a set of mimotopes (or a motif sequence derived from the set of mimotopes), Pep-3D-Search can be used in two modes: mimotope or motif. To evaluate the performance of Pep-3D-Search to predict epitopes from a set of mimotopes, 10 epitopes defined by crystallography were compared with the predicted results from a Pep-3D-Search: the average Matthews correlation coefficient (MCC), sensitivity and precision were 0.1758, 0.3642 and 0.6948. Compared with other available prediction algorithms, Pep-3D-Search showed comparable MCC, specificity and precision, and could provide novel, rational results. To verify the capability of Pep-3D-Search to align a motif sequence to a 3D structure for predicting epitopes, 6 test cases were used. The predictive performance of Pep-3D-Search was demonstrated to be superior to that of other similar programs
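
    The reported numbers are residue-level classification metrics; for reference, a small helper that computes them from a confusion matrix (the counts in the example are hypothetical, not taken from the ten crystallography test cases):

      import math

      def epitope_metrics(tp, fp, tn, fn):
          # Matthews correlation coefficient, sensitivity (recall) and precision from raw counts.
          sens = tp / (tp + fn) if tp + fn else 0.0
          prec = tp / (tp + fp) if tp + fp else 0.0
          denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
          mcc = (tp * tn - fp * fn) / denom if denom else 0.0
          return mcc, sens, prec

      print(epitope_metrics(tp=12, fp=5, tn=180, fn=20))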

  16. Climate change on the Colorado River: a method to search for robust management strategies

    Keefe, R.; Fischbach, J. R.

    2010-12-01

    The Colorado River is a principal source of water for the seven Basin States, providing approximately 16.5 maf per year to users in the southwestern United States and Mexico. Though the dynamics of the river ensure Upper Basin users a reliable supply of water, the three Lower Basin states (California, Nevada, and Arizona) are in danger of delivery interruptions as Upper Basin demand increases and climate change threatens to reduce future streamflows. In light of the recent drought and uncertain effects of climate change on Colorado River flows, we evaluate the performance of a suite of policies modeled after the shortage sharing agreement adopted in December 2007 by the Department of the Interior. We build on the current literature by using a simplified model of the Lower Colorado River to consider future streamflow scenarios given climate change uncertainty. We also generate different scenarios of parametric consumptive use growth in the Upper Basin and evaluate alternate management strategies in light of these uncertainties. Uncertainty associated with climate change is represented with a multi-model ensemble from the literature, using a nearest neighbor perturbation to increase the size of the ensemble. We use Robust Decision Making to compare near-term or long-term management strategies across an ensemble of plausible future scenarios with the goal of identifying one or more approaches that are robust to alternate assumptions about the future. This method entails using search algorithms to quantitatively identify vulnerabilities that may threaten a given strategy (including the current operating policy) and characterize key tradeoffs between strategies under different scenarios.

  17. Best, Useful and Objective Precisions for Information Retrieval of Three Search Methods in PubMed and iPubMed

    Somayyeh Nadi Ravandi

    2016-10-01

    Full Text Available MEDLINE is one of the valuable sources of medical information on the Internet. Among the different open access sites of MEDLINE, PubMed is the best-known site. In 2010, iPubMed was established with an interaction-fuzzy search method for MEDLINE access. In the present work, we aimed to compare the precision of the retrieved sources (Best, Useful and Objective precision) using two search methods (simple and MeSH search) in PubMed and the interaction-fuzzy method in iPubMed. During our semi-empirical study period, we held training workshops for 61 students of higher education to teach them the Simple Search, MeSH Search, and Fuzzy-Interaction Search methods. Then, the precision of 305 searches for each method prepared by the students was calculated on the basis of the Best precision, Useful precision, and Objective precision formulas. Analyses were done in SPSS version 11.5 using the Friedman and Wilcoxon tests, and the three precisions obtained with the three precision formulas were studied for the three search methods. The mean precision of the interaction-fuzzy search method was higher than that of the simple search and MeSH search for all three types of precision, i.e., Best precision, Useful precision, and Objective precision, with the simple search method in the next rank, and their mean precisions were significantly different (P < 0.001). The precision of the interaction-fuzzy search method in iPubMed was investigated for the first time. Also for the first time, three types of precision were evaluated in PubMed and iPubMed. The results showed that the interaction-fuzzy search method is more precise than the natural language search (simple search) and MeSH search, and users of this method found papers that were more related to their queries; even though searching in PubMed is useful, it is important that users apply new search methods to obtain the best results.

  18. Application Methods Guided Discovery in the Effort Improving Skills Observing Student Learning IPA in the Fourth Grades in Primary School

    Septikasari, Zela

    2015-01-01

    The purpose of this research was to improve the skills of observing in science learning by using guided discovery. This type of research is collaborative classroom action research with teachers, and the research subjects were fourth grade students of SD Lempuyangan 1 Elementary School, Yogyakarta. The results showed that the percentage of students who achieved a score of B was 23.53% before the action, increased to 38.24% in the first cycle, and reached 91.18% in the second cycle. Thus in the first cycle an increa...

  19. In vitro detection of circulating tumor cells compared by the CytoTrack and CellSearch methods

    Hillig, T.; Horn, P.; Nygaard, Ann-Britt

    2015-01-01

    Comparison of two methods to detect circulating tumor cells (CTC), CytoTrack and CellSearch, through recovery of MCF-7 breast cancer cells spiked into blood collected from healthy donors. Spiking of a fixed number of EpCAM and pan-cytokeratin positive MCF-7 cells into 7.5 mL donor blood was performed by FACSAria flow sorting. The samples were shipped to either CytoTrack or CellSearch research facilities within 48 h, where evaluation of MCF-7 recovery was performed. CytoTrack and CellSearch analyses were performed simultaneously. Recoveries of MCF-7 single cells, cells in clusters, and clusters … (p = 0.23/p = 0.09). Overall, the recovery of CytoTrack and CellSearch was 68.8 ± 3.9 %/71.1 ± 2.9 %, respectively (p = 0.58). In spite of different methodologies, CytoTrack and CellSearch found similar numbers of CTCs when spiking was performed with the EpCAM and pan-cytokeratin-positive cell line MCF-7.

  20. Discovery of the Higgs boson

    Sharma, Vivek

    2016-01-01

    The recent observation of the Higgs boson has been hailed as the scientific discovery of the century and led to the 2013 Nobel Prize in physics. This book describes the detailed science behind the decades-long search for this elusive particle at the Large Electron Positron Collider at CERN and at the Tevatron at Fermilab and its subsequent discovery and characterization at the Large Hadron Collider at CERN. Written by physicists who played leading roles in this epic search and discovery, this book is an authoritative and pedagogical exposition of the portrait of the Higgs boson that has emerged from a large number of experimental measurements. As the first of its kind, this book should be of interest to graduate students and researchers in particle physics.

  1. Development of a universal metabolome-standard method for long-term LC-MS metabolome profiling and its application for bladder cancer urine-metabolite-biomarker discovery.

    Peng, Jun; Chen, Yi-Ting; Chen, Chien-Lun; Li, Liang

    2014-07-01

    Large-scale metabolomics study requires a quantitative method to generate metabolome data over an extended period with high technical reproducibility. We report a universal metabolome-standard (UMS) method, in conjunction with chemical isotope labeling liquid chromatography-mass spectrometry (LC-MS), to provide long-term analytical reproducibility and facilitate metabolome comparison among different data sets. In this method, UMS of a specific type of sample labeled by an isotope reagent is prepared a priori. The UMS is spiked into any individual samples labeled by another form of the isotope reagent in a metabolomics study. The resultant mixture is analyzed by LC-MS to provide relative quantification of the individual sample metabolome to UMS. UMS is independent of a study undertaking as well as the time of analysis and useful for profiling the same type of samples in multiple studies. In this work, the UMS method was developed and applied for a urine metabolomics study of bladder cancer. UMS of human urine was prepared by (13)C2-dansyl labeling of a pooled sample from 20 healthy individuals. This method was first used to profile the discovery samples to generate a list of putative biomarkers potentially useful for bladder cancer detection and then used to analyze the verification samples about one year later. Within the discovery sample set, three-month technical reproducibility was examined using a quality control sample and found a mean CV of 13.9% and median CV of 9.4% for all the quantified metabolites. Statistical analysis of the urine metabolome data showed a clear separation between the bladder cancer group and the control group from the discovery samples, which was confirmed by the verification samples. Receiver operating characteristic (ROC) test showed that the area under the curve (AUC) was 0.956 in the discovery data set and 0.935 in the verification data set. These results demonstrated the utility of the UMS method for long-term metabolomics and

  2. Multi-Agent Based Beam Search for Real-Time Production Scheduling and Control Method, Software and Industrial Application

    Kang, Shu Gang

    2013-01-01

    The Multi-Agent Based Beam Search (MABBS) method systematically integrates four major requirements of manufacturing production - representation capability, solution quality, computation efficiency, and implementation difficulty - within a unified framework to deal with the many challenges of complex real-world production planning and scheduling problems. Multi-agent Based Beam Search for Real-time Production Scheduling and Control introduces this method, together with its software implementation and industrial applications.  This book connects academic research with industrial practice, and develops a practical solution to production planning and scheduling problems. To simplify implementation, a reusable software platform is developed to build the MABBS method into a generic computation engine.  This engine is integrated with a script language, called the Embedded Extensible Application Script Language (EXASL), to provide a flexible and straightforward approach to representing complex real-world problems. ...
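
    The beam-search mechanism at the core of MABBS is easy to sketch in isolation; the job-sequencing toy below illustrates only the beam pruning step, not the multi-agent framework or the EXASL scripting layer.

      def beam_search(initial, expand, score, beam_width, depth):
          # Keep only the `beam_width` best partial solutions at each level of the search tree.
          beam = [initial]
          for _ in range(depth):
              candidates = [child for partial in beam for child in expand(partial)]
              if not candidates:
                  break
              beam = sorted(candidates, key=score)[:beam_width]
          return min(beam, key=score)

      jobs = {"a": 3, "b": 1, "c": 2}                   # hypothetical job -> processing time

      def expand(seq):
          return [seq + (j,) for j in jobs if j not in seq]

      def score(seq):
          t, total = 0, 0
          for j in seq:                                 # total completion time of the partial schedule
              t += jobs[j]
              total += t
          return total

      print(beam_search((), expand, score, beam_width=2, depth=len(jobs)))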

  3. Method and electronic database search engine for exposing the content of an electronic database

    Stappers, P.J.

    2000-01-01

    The invention relates to an electronic database search engine comprising an electronic memory device suitable for storing and releasing elements from the database, a display unit, a user interface for selecting and displaying at least one element from the database on the display unit, and control

  4. Search Engines for Tomorrow's Scholars

    Fagan, Jody Condit

    2011-01-01

    Today's scholars face an outstanding array of choices when choosing search tools: Google Scholar, discipline-specific abstracts and index databases, library discovery tools, and more recently, Microsoft's re-launch of their academic search tool, now dubbed Microsoft Academic Search. What are these tools' strengths for the emerging needs of…

  5. The web server of IBM's Bioinformatics and Pattern Discovery group.

    Huynh, Tien; Rigoutsos, Isidore; Parida, Laxmi; Platt, Daniel; Shibuya, Tetsuo

    2003-07-01

    We herein present and discuss the services and content which are available on the web server of IBM's Bioinformatics and Pattern Discovery group. The server is operational around the clock and provides access to a variety of methods that have been published by the group's members and collaborators. The available tools correspond to applications ranging from the discovery of patterns in streams of events and the computation of multiple sequence alignments, to the discovery of genes in nucleic acid sequences and the interactive annotation of amino acid sequences. Additionally, annotations for more than 70 archaeal, bacterial, eukaryotic and viral genomes are available on-line and can be searched interactively. The tools and code bundles can be accessed beginning at http://cbcsrv.watson.ibm.com/Tspd.html whereas the genomics annotations are available at http://cbcsrv.watson.ibm.com/Annotations/.

  6. Classical algorithms for automated parameter-search methods in compartmental neural models - A critical survey based on simulations using neuron

    Mutihac, R.; Mutihac, R.C.; Cicuttin, A.

    2001-09-01

    Parameter-search methods are problem-sensitive. All methods depend on some meta-parameters of their own, which must be determined experimentally in advance. A better choice of these intrinsic parameters for a certain parameter-search method may improve its performance. Moreover, there are various implementations of the same method, which may also affect its performance. The choice of the matching (error) function has a great impact on the search process in terms of finding the optimal parameter set and minimizing the computational cost. An initial assessment of the matching function ability to distinguish between good and bad models is recommended, before launching exhaustive computations. However, different runs of a parameter search method may result in the same optimal parameter set or in different parameter sets (the model is insufficiently constrained to accurately characterize the real system). Robustness of the parameter set is expressed by the extent to which small perturbations in the parameter values are not affecting the best solution. A parameter set that is not robust is unlikely to be physiologically relevant. Robustness can also be defined as the stability of the optimal parameter set to small variations of the inputs. When trying to estimate things like the minimum, or the least-squares optimal parameters of a nonlinear system, the existence of multiple local minima can cause problems with the determination of the global optimum. Techniques such as Newton's method, the Simplex method and Least-squares Linear Taylor Differential correction technique can be useful provided that one is lucky enough to start sufficiently close to the global minimum. All these methods suffer from the inability to distinguish a local minimum from a global one because they follow the local gradients towards the minimum, even if some methods are resetting the search direction when it is likely to get stuck in presumably a local minimum. Deterministic methods based on

  7. Natural Language Search Interfaces: Health Data Needs Single-Field Variable Search

    Smith, Sam; Sufi, Shoaib; Goble, Carole; Buchan, Iain

    2016-01-01

    Background: Data discovery, particularly the discovery of key variables and their inter-relationships, is key to secondary data analysis, and in turn to the evolving field of data science. Interface designers have presumed that their users are domain experts, and so they have provided complex interfaces to support these “experts.” Such interfaces hark back to a time when searches needed to be accurate the first time, as there was a high computational cost associated with each search. Our work is part of a governmental research initiative between the medical and social research funding bodies to improve the use of social data in medical research. Objective: The cross-disciplinary nature of data science means that no assumptions can be made regarding the domain expertise of a particular scientist, whose interests may intersect multiple domains. Here we consider the common requirement for scientists to seek archived data for secondary analysis. This has more in common with the search needs of the “Google generation” than with their single-domain, single-tool forebears. Our study compares a Google-like interface with traditional ways of searching for noncomplex health data in a data archive. Methods: Two user interfaces are evaluated for the same set of tasks in extracting data from surveys stored in the UK Data Archive (UKDA). One interface, Web search, is “Google-like,” enabling users to browse, search for, and view metadata about study variables, whereas the other, traditional search, has a standard multi-option user interface. Results: Using a comprehensive set of tasks with 20 volunteers, we found that the Web search interface met data discovery needs and expectations better than the traditional search. A task × interface repeated measures analysis showed a main effect indicating that answers found through the Web search interface were more likely to be correct (F1,19 = 37.3, P …), together with an effect of task (F3,57 = 6.3, P …), an effect of interface (F1,19 = 18.0, P …), and an effect of task (F2,38 = 4.1, P = .025, Greenhouse

  8. CodeRAnts: A recommendation method based on collaborative searching and ant colonies, applied to reusing of open source code

    Isaac Caicedo-Castro

    2014-01-01

    Full Text Available This paper presents CodeRAnts, a new recommendation method based on a collaborative searching technique and inspired by the ant colony metaphor. The method aims to fill a gap in the current state of the art regarding recommender systems for software reuse, in which prior works present two problems. The first is that recommender systems based on these works cannot learn from the collaboration of programmers; the second is that assessments of these systems report low precision and recall, and in some systems these metrics have not been evaluated at all. The work presented in this paper contributes a recommendation method that addresses these problems.

  9. Hybridization of Sensing Methods of the Search Domain and Adaptive Weighted Sum in the Pareto Approximation Problem

    A. P. Karpenko

    2015-01-01

    Full Text Available We consider the relatively new and rapidly developing class of methods for solving multi-objective optimization problems based on a preliminarily built finite-dimensional approximation of the Pareto set, and thereby of the Pareto front, of the problem. The work investigates the efficiency of several modifications of the adaptive weighted sum (AWS) method. This method, proposed in the paper by Ryu, Kim and Wan (J.-H. Ryu, S. Kim, H. Wan), is intended to build a Pareto approximation of the multi-objective optimization problem. The AWS method uses a quadratic approximation of the objective functions in the current sub-domain of the search space (the trust region), based on the gradient and Hessian matrix of the objective functions. To build the (quadratic) meta objective functions, this work uses methods of the theory of experimental design, which involve calculating the values of these functions at the grid nodes covering the trust region (a sensing method of the search domain). Two groups of sensing methods are under consideration: hypercube-based and hypersphere-based methods. For each of these groups, a number of test multi-objective optimization tasks have been used to study the efficiency of the following grids: the Latin hypercube; a grid that is uniformly random in each dimension; and a grid based on the LPτ sequences.
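
    For the hypercube-based group, a Latin hypercube "sensing" of a trust region can be generated in a few lines; this is a generic sketch, not one of the specific designs benchmarked in the paper.

      import numpy as np

      def latin_hypercube(n_points, lo, hi, seed=0):
          # One stratified sample per interval in every dimension, randomly paired across dimensions.
          rng = np.random.default_rng(seed)
          lo, hi = np.asarray(lo, float), np.asarray(hi, float)
          strata = np.column_stack([rng.permutation(n_points) for _ in range(lo.size)])
          u = (strata + rng.random((n_points, lo.size))) / n_points
          return lo + u * (hi - lo)

      print(latin_hypercube(8, lo=[0.0, -1.0], hi=[1.0, 1.0]).round(3))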

  10. New method for the discovery of adulterated cognacs and brandies based on solid-phase microextraction and gas chromatography - mass spectrometry

    Darya Mozhayeva

    2014-10-01

    Full Text Available The article presents a new method for the discovery of adulterated cognacs and brandies based on solid-phase microextraction (SPME) in combination with gas chromatography–mass spectrometry (GC-MS). The work comprised optimization of the SPME parameters (extraction temperature and time, concentration of added salt) with subsequent analysis of authentic samples and comparison of the obtained chromatograms using principal component analysis (PCA). According to the obtained results, an increase of the extraction temperature resulted in an increase of the response of the most volatile brandy constituents. To avoid chemical transformations and/or degradation of the samples, the extraction temperature must be limited to 30 °C. Increasing the extraction time led to a higher total peak area, but longer extraction times (>10 min for 100 µm polydimethylsiloxane and >2 min for divinylbenzene/Carboxen/polydimethylsiloxane fibers) caused displacement of analytes. Salt addition increased the total response of analytes, but caused problems with reproducibility. The developed method was successfully applied to the discovery of adulterated samples of brandy, cognac, whisky and whiskey sold in Kazakhstan. The obtained data were analyzed applying principal component analysis (PCA). Five adulterated brandy and whisky samples were discovered and confirmed. The developed method is recommended for application in forensic laboratories.

  11. Coherent search of continuous gravitational wave signals: extension of the 5-vectors method to a network of detectors

    Astone, P; Colla, A; Frasca, S; Palomba, C; D'Antonio, S

    2012-01-01

    We describe the extension to multiple datasets of a coherent method for the search of continuous gravitational wave signals, based on the computation of 5-vectors. In particular, we show how to coherently combine different datasets belonging to the same detector or to different detectors. In the latter case the coherent combination is the way to have the maximum increase in signal-to-noise ratio. If the datasets belong to the same detector the advantage comes mainly from the properties of a quantity called coherence which is helpful (in both cases, in fact) in rejecting false candidates. The method has been tested searching for simulated signals injected in Gaussian noise and the results of the simulations are discussed.

  12. A Simple Time Domain Collocation Method to Precisely Search for the Periodic Orbits of Satellite Relative Motion

    Xiaokui Yue

    2014-01-01

    Full Text Available A numerical approach for obtaining periodic orbits of satellite relative motion is proposed, based on using the time domain collocation (TDC) method to search for the periodic solutions of an exact J2 nonlinear relative model. The initial conditions for periodic relative orbits of the Clohessy-Wiltshire (C-W) equations or Tschauner-Hempel (T-H) equations can be refined with this approach to generate nearly bounded orbits. With these orbits, a method based on the least-squares principle is then proposed to generate a projected closed orbit (PCO), which is a reference for the relative motion control. Numerical simulations reveal that the presented TDC searching scheme is effective and simple, and the projected closed orbit is very fuel saving.

  13. Methods and pitfalls in searching drug safety databases utilising the Medical Dictionary for Regulatory Activities (MedDRA).

    Brown, Elliot G

    2003-01-01

    The Medical Dictionary for Regulatory Activities (MedDRA) is a unified standard terminology for recording and reporting adverse drug event data. Its introduction is widely seen as a significant improvement on the previous situation, where a multitude of terminologies of widely varying scope and quality were in use. However, there are some complexities that may cause difficulties, and these will form the focus for this paper. Two methods of searching MedDRA-coded databases are described: searching based on term selection from all of MedDRA and searching based on terms in the safety database. There are several potential traps for the unwary in safety searches. There may be multiple locations of relevant terms within a system organ class (SOC) and lack of recognition of appropriate group terms; the user may think that group terms are more inclusive than is the case. MedDRA may distribute terms relevant to one medical condition across several primary SOCs. If the database supports the MedDRA model, it is possible to perform multiaxial searching: while this may help find terms that might have been missed, it is still necessary to consider the entire contents of the SOCs to find all relevant terms and there are many instances of incomplete secondary linkages. It is important to adjust for multiaxiality if data are presented using primary and secondary locations. Other sources for errors in searching are non-intuitive placement and the selection of terms as preferred terms (PTs) that may not be widely recognised. Some MedDRA rules could also result in errors in data retrieval if the individual is unaware of these: in particular, the lack of multiaxial linkages for the Investigations SOC, Social circumstances SOC and Surgical and medical procedures SOC and the requirement that a PT may only be present under one High Level Term (HLT) and one High Level Group Term (HLGT) within any single SOC. Special Search Categories (collections of PTs assembled from various SOCs by

  14. Parallel metaheuristics in computational biology: an asynchronous cooperative enhanced scatter search method

    Penas, David R.; González, Patricia; Egea, José A.; Banga, Julio R.; Doallo, Ramón

    2015-01-01

    Metaheuristics are gaining increased attention as efficient solvers for hard global optimization problems arising in bioinformatics and computational systems biology. Scatter Search (SS) is one of the recent outstanding algorithms in that class. However, its application to very hard problems, like those considering parameter estimation in dynamic models of systems biology, still results in excessive computation times. In order to reduce the computational cost of the SS and improve its success...

  15. NEW METHOD FOR REACHING CONSUMERS OVER THE INTERNET: "SEARCH ENGINE MARKETING"

    Ergezer, Çağrı

    2018-01-01

    The Internet has become a platform that reaches millions of users instantly and, with its increased use, has also become a place where people spend most of their time during the day, acquiring a consumer and potential-customer identity in addition to being ordinary Internet users. Search engines have earned the distinction of being the preferred reference for users in the Internet sea, drawing attention with their usage rates and allowing one to easily reach the sought-after content where millions of content...

  16. A Comparison of Local Search Methods for the Multicriteria Police Districting Problem on Graph

    F. Liberatore

    2016-01-01

    Full Text Available In the current economic climate, law enforcement agencies are facing resource shortages. The effective and efficient use of scarce resources is therefore of the utmost importance to provide a high standard public safety service. Optimization models specifically tailored to the necessity of police agencies can help to ameliorate their use. The Multicriteria Police Districting Problem (MC-PDP on a graph concerns the definition of sound patrolling sectors in a police district. The objective of this problem is to partition a graph into convex and continuous subsets, while ensuring efficiency and workload balance among the subsets. The model was originally formulated in collaboration with the Spanish National Police Corps. We propose for its solution three local search algorithms: a Simple Hill Climbing, a Steepest Descent Hill Climbing, and a Tabu Search. To improve their diversification capabilities, all the algorithms implement a multistart procedure, initialized by randomized greedy solutions. The algorithms are empirically tested on a case study on the Central District of Madrid. Our experiments show that the solutions identified by the novel Tabu Search outperform the other algorithms. Finally, research guidelines for future developments on the MC-PDP are given.
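
    For readers unfamiliar with the three local search schemes named in this record, the sketch below shows generic first-improvement hill climbing, steepest-descent (best-improvement) hill climbing, and a basic tabu search over an abstract neighborhood function. The objective and neighborhood callables are placeholders and do not encode the MC-PDP model itself.

        def simple_hill_climbing(x0, objective, neighbors, max_iters=1000):
            """Accept the first improving neighbour found (first-improvement)."""
            x, fx = x0, objective(x0)
            for _ in range(max_iters):
                improved = False
                for y in neighbors(x):
                    fy = objective(y)
                    if fy < fx:
                        x, fx, improved = y, fy, True
                        break
                if not improved:
                    break
            return x, fx

        def steepest_descent(x0, objective, neighbors, max_iters=1000):
            """Move to the best neighbour each iteration (best-improvement)."""
            x, fx = x0, objective(x0)
            for _ in range(max_iters):
                best = min(neighbors(x), key=objective, default=None)
                if best is None or objective(best) >= fx:
                    break
                x, fx = best, objective(best)
            return x, fx

        def tabu_search(x0, objective, neighbors, tenure=10, max_iters=1000):
            """Best non-tabu neighbour, even if worse; keeps a short-term memory."""
            x, fx = x0, objective(x0)
            best, fbest, tabu = x, fx, []
            for _ in range(max_iters):
                candidates = [y for y in neighbors(x) if y not in tabu]
                if not candidates:
                    break
                x = min(candidates, key=objective)
                fx = objective(x)
                tabu.append(x)
                if len(tabu) > tenure:
                    tabu.pop(0)
                if fx < fbest:
                    best, fbest = x, fx
            return best, fbest

        # Toy usage: minimise |x - 7| over the integers with +/-1 moves.
        obj = lambda x: abs(x - 7)
        nbrs = lambda x: [x - 1, x + 1]
        print(tabu_search(0, obj, nbrs))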

  17. Optimal generation and reserve dispatch in a multi-area competitive market using a hybrid direct search method

    Chen, C.-L.

    2005-01-01

    With restructuring of the power industry, competitive bidding for energy and ancillary services are increasingly recognized as an important part of electricity markets. It is desirable to optimize not only the generator's bid prices for energy and for providing minimized ancillary services but also the transmission congestion costs. In this paper, a hybrid approach of combining sequential dispatch with a direct search method is developed to deal with the multi-product and multi-area electricity market dispatch problem. The hybrid direct search method (HDSM) incorporates sequential dispatch into the direct search method to facilitate economic sharing of generation and reserve across areas and to minimize the total market cost in a multi-area competitive electricity market. The effects of tie line congestion and area spinning reserve requirement are also consistently reflected in the marginal price in each area. Numerical experiments are included to understand the various constraints in the market cost analysis and to provide valuable information for market participants in a pool oriented electricity market

  18. Optimal generation and reserve dispatch in a multi-area competitive market using a hybrid direct search method

    Chun Lung Chen

    2005-01-01

    With restructuring of the power industry, competitive bidding for energy and ancillary services are increasingly recognized as an important part of electricity markets. It is desirable to optimize not only the generator's bid prices for energy and for providing minimized ancillary services but also the transmission congestion costs. In this paper, a hybrid approach of combining sequential dispatch with a direct search method is developed to deal with the multi-product and multi-area electricity market dispatch problem. The hybrid direct search method (HDSM) incorporates sequential dispatch into the direct search method to facilitate economic sharing of generation and reserve across areas and to minimize the total market cost in a multi-area competitive electricity market. The effects of tie line congestion and area spinning reserve requirement are also consistently reflected in the marginal price in each area. Numerical experiments are included to understand the various constraints in the market cost analysis and to provide valuable information for market participants in a pool oriented electricity market. (author)

  19. Comparison of three web-scale discovery services for health sciences research*

    Rosie Hanneke, MLS

    2016-11-01

    Full Text Available Objective: The purpose of this study was to investigate the relative effectiveness of three web-scale discovery (WSD) tools in answering health sciences search queries. Methods: Simple keyword searches, based on topics from six health sciences disciplines, were run at multiple real-world implementations of EBSCO Discovery Service (EDS), Ex Libris’s Primo, and ProQuest’s Summon. Each WSD tool was evaluated in its ability to retrieve relevant results and in its coverage of MEDLINE content. Results: All WSD tools returned between 50%–60% relevant results. Primo returned a higher number of duplicate results than the other 2 WSD products. Summon results were more relevant when search terms were automatically mapped to controlled vocabulary. EDS indexed the largest number of MEDLINE citations, followed closely by Summon. Additionally, keyword searches in all 3 WSD tools retrieved relevant material that was not found with precision (Medical Subject Headings) searches in MEDLINE. Conclusions: None of the 3 WSD products studied was overwhelmingly more effective in returning relevant results. While difficult to place the figure of 50%–60% relevance in context, it implies a strong likelihood that the average user would be able to find satisfactory sources on the first page of search results using a rudimentary keyword search. The discovery of additional relevant material beyond that retrieved from MEDLINE indicates WSD tools’ value as a supplement to traditional resources for health sciences researchers.

  20. Comparison of three web-scale discovery services for health sciences research*

    Hanneke, Rosie; O'Brien, Kelly K.

    2016-01-01

    Objective The purpose of this study was to investigate the relative effectiveness of three web-scale discovery (WSD) tools in answering health sciences search queries. Methods Simple keyword searches, based on topics from six health sciences disciplines, were run at multiple real-world implementations of EBSCO Discovery Service (EDS), Ex Libris's Primo, and ProQuest's Summon. Each WSD tool was evaluated in its ability to retrieve relevant results and in its coverage of MEDLINE content. Results All WSD tools returned between 50%–60% relevant results. Primo returned a higher number of duplicate results than the other 2 WSD products. Summon results were more relevant when search terms were automatically mapped to controlled vocabulary. EDS indexed the largest number of MEDLINE citations, followed closely by Summon. Additionally, keyword searches in all 3 WSD tools retrieved relevant material that was not found with precision (Medical Subject Headings) searches in MEDLINE. Conclusions None of the 3 WSD products studied was overwhelmingly more effective in returning relevant results. While difficult to place the figure of 50%–60% relevance in context, it implies a strong likelihood that the average user would be able to find satisfactory sources on the first page of search results using a rudimentary keyword search. The discovery of additional relevant material beyond that retrieved from MEDLINE indicates WSD tools' value as a supplement to traditional resources for health sciences researchers. PMID:27076797

  1. Applying ligands profiling using multiple extended electron distribution based field templates and feature trees similarity searching in the discovery of new generation of urea-based antineoplastic kinase inhibitors.

    Eman M Dokla

    Full Text Available This study provides a comprehensive computational procedure for the discovery of novel urea-based antineoplastic kinase inhibitors while focusing on diversification of both chemotype and selectivity pattern. It presents a systematic structural analysis of the different binding motifs of urea-based kinase inhibitors and the corresponding configurations of the kinase enzymes. The computational model depends on simultaneous application of two protocols. The first protocol applies multiple consecutive validated virtual screening filters, including SMARTS, a support vector machine model (ROC = 0.98), a Bayesian model (ROC = 0.86) and structure-based pharmacophore filters based on urea-based kinase inhibitor complexes retrieved from the literature. This is followed by profiling of hits against different extended electron distribution (XED)-based field templates representing different kinase targets. The second protocol enables verification of cancericidal activity by using the feature trees (Ftrees) similarity searching algorithm against the NCI database. Being a proof-of-concept study, this combined procedure was experimentally validated by its utilization in developing a novel series of urea-based derivatives of strong anticancer activity. This new series is based on a 3-benzylbenzo[d]thiazol-2(3H)-one scaffold, which has interesting chemical feasibility and wide diversification capability. Antineoplastic activity of this series was assayed in vitro against the NCI 60 tumor-cell lines, showing very strong inhibition with GI50 values as low as 0.9 µM. Additionally, its mechanism was elucidated using the KINEX™ protein kinase microarray-based small-molecule inhibitor profiling platform and cell cycle analysis, showing a peculiar selectivity pattern against Zap70, c-src, Mink1, csk and MeKK2 kinases. Interestingly, it showed activity on syk kinase, confirming the recent finding of the high activity of diphenyl urea-containing compounds against this kinase. Overall, the new series

  2. Retrieval of Legal Information Through Discovery Layers: A Case Study Related to Indian Law Libraries

    Kushwah, Shivpal Singh

    2016-09-01

    Full Text Available Purpose. The purpose of this paper is to analyze and evaluate discovery layer search tools for retrieval of legal information in Indian law libraries. This paper covers current practices in legal information retrieval with special reference to Indian academic law libraries, and analyses its importance in the domain of law. Design/Methodology/Approach. A web survey and observational study method are used to collect the data. Data related to the discovery tools were collected using email, and further discussion was held with the discovery layer/tool/product developers and their representatives. Findings. Results show that most of the Indian law libraries subscribe to bundles of legal information resources such as HeinOnline, JSTOR, LexisNexis Academic, Manupatra, Westlaw India, SCC web, AIR Online (CD-ROM), and so on. International legal and academic resources are compatible with discovery tools because they support various standards related to online publishing and dissemination such as OAI-PMH, OpenURL, MARC21, and Z39.50, but Indian legal resources such as Manupatra, AIR, and SCC are not compatible with the discovery layers. The central index is one of the important components of a discovery search interface, and discovery layer services/tools could be useful for Indian law libraries as well if they can include multiple legal and academic resources in their central index. But present practices and observations reveal that discovery layers do not provide the facility to cover legal information resources. Therefore, in their present form, discovery tools are not very useful; they are an incomplete and partial solution for Indian libraries because not all of the Indian legal resources available in the law libraries are covered. Originality/Value. Very limited research or published literature is available in the area of discovery layers and their compatibility with legal information resources.

  3. Discovery Mondays

    2003-01-01

    Many people don't realise quite how much is going on at CERN. Would you like to gain first-hand knowledge of CERN's scientific and technological activities and their many applications? Try out some experiments for yourself, or pick the brains of the people in charge? If so, then the «Lundis Découverte» or Discovery Mondays, will be right up your street. Starting on May 5th, on every first Monday of the month you will be introduced to a different facet of the Laboratory. CERN staff, non-scientists, and members of the general public, everyone is welcome. So tell your friends and neighbours and make sure you don't miss this opportunity to satisfy your curiosity and enjoy yourself at the same time. You won't have to listen to a lecture, as the idea is to have open exchange with the expert in question and for each subject to be illustrated with experiments and demonstrations. There's no need to book, as Microcosm, CERN's interactive museum, will be open non-stop from 7.30 p.m. to 9 p.m. On the first Discovery M...

  4. Assessing the search for information on three Rs methods, and their subsequent implementation: a national survey among scientists in the Netherlands

    Luijk, J. van; Cuijpers, Y.M.; Vaart, L. van der; Leenaars, M; Ritskes-Hoitinga, M.

    2011-01-01

    A local survey conducted among scientists into the current practice of searching for information on Three Rs (i.e. Replacement, Reduction and Refinement) methods has highlighted the gap between the statutory requirement to apply Three Rs methods and the lack of criteria to search for them. To verify

  5. Assessing the Search for Information on Three Rs Methods, and their Subsequent Implementation: A National Survey among Scientists in The Netherlands.

    Luijk, J. van; Cuijpers, Y.M.; Vaart, L. van der; Leenaars, M.; Ritskes-Hoitinga, M.

    2011-01-01

    A local survey conducted among scientists into the current practice of searching for information on Three Rs (i.e. Replacement, Reduction and Refinement) methods has highlighted the gap between the statutory requirement to apply Three Rs methods and the lack of criteria to search for them. To verify

  6. Development and experimental test of support vector machines virtual screening method for searching Src inhibitors from large compound libraries

    Han Bucong

    2012-11-01

    Full Text Available Abstract Background Src plays various roles in tumour progression, invasion, metastasis, angiogenesis and survival. It is one of the multiple targets of multi-target kinase inhibitors in clinical uses and trials for the treatment of leukemia and other cancers. These successes and appearances of drug resistance in some patients have raised significant interest and efforts in discovering new Src inhibitors. Various in-silico methods have been used in some of these efforts. It is desirable to explore additional in-silico methods, particularly those capable of searching large compound libraries at high yields and reduced false-hit rates. Results We evaluated support vector machines (SVM) as virtual screening tools for searching Src inhibitors from large compound libraries. SVM trained and tested by 1,703 inhibitors and 63,318 putative non-inhibitors correctly identified 93.53%–95.01% inhibitors and 99.81%–99.90% non-inhibitors in 5-fold cross validation studies. SVM trained by 1,703 inhibitors reported before 2011 and 63,318 putative non-inhibitors correctly identified 70.45% of the 44 inhibitors reported since 2011, and predicted as inhibitors 44,843 (0.33%) of 13.56M PubChem, 1,496 (0.89%) of 168K MDDR, and 719 (7.73%) of 9,305 MDDR compounds similar to the known inhibitors. Conclusions SVM showed comparable yield and reduced false hit rates in searching large compound libraries compared to the similarity-based and other machine-learning VS methods developed from the same set of training compounds and molecular descriptors. We tested three virtual hits of the same novel scaffold from in-house chemical libraries not reported as Src inhibitor, one of which showed moderate activity. SVM may be potentially explored for searching Src inhibitors from large compound libraries at low false-hit rates.

  7. Development and experimental test of support vector machines virtual screening method for searching Src inhibitors from large compound libraries.

    Han, Bucong; Ma, Xiaohua; Zhao, Ruiying; Zhang, Jingxian; Wei, Xiaona; Liu, Xianghui; Liu, Xin; Zhang, Cunlong; Tan, Chunyan; Jiang, Yuyang; Chen, Yuzong

    2012-11-23

    Src plays various roles in tumour progression, invasion, metastasis, angiogenesis and survival. It is one of the multiple targets of multi-target kinase inhibitors in clinical uses and trials for the treatment of leukemia and other cancers. These successes and appearances of drug resistance in some patients have raised significant interest and efforts in discovering new Src inhibitors. Various in-silico methods have been used in some of these efforts. It is desirable to explore additional in-silico methods, particularly those capable of searching large compound libraries at high yields and reduced false-hit rates. We evaluated support vector machines (SVM) as virtual screening tools for searching Src inhibitors from large compound libraries. SVM trained and tested by 1,703 inhibitors and 63,318 putative non-inhibitors correctly identified 93.53%–95.01% inhibitors and 99.81%–99.90% non-inhibitors in 5-fold cross validation studies. SVM trained by 1,703 inhibitors reported before 2011 and 63,318 putative non-inhibitors correctly identified 70.45% of the 44 inhibitors reported since 2011, and predicted as inhibitors 44,843 (0.33%) of 13.56M PubChem, 1,496 (0.89%) of 168K MDDR, and 719 (7.73%) of 9,305 MDDR compounds similar to the known inhibitors. SVM showed comparable yield and reduced false hit rates in searching large compound libraries compared to the similarity-based and other machine-learning VS methods developed from the same set of training compounds and molecular descriptors. We tested three virtual hits of the same novel scaffold from in-house chemical libraries not reported as Src inhibitor, one of which showed moderate activity. SVM may be potentially explored for searching Src inhibitors from large compound libraries at low false-hit rates.
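
    A minimal sketch of descriptor-based SVM virtual screening with 5-fold cross-validation, in the spirit of the workflow described above; the descriptor matrix, labels and SVM hyperparameters are synthetic assumptions, not the authors' actual training set or settings.

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # Synthetic stand-ins for molecular descriptors: rows are compounds,
        # columns are descriptor values; y = 1 for inhibitors, 0 for non-inhibitors.
        rng = np.random.default_rng(0)
        X = rng.random((500, 100))
        y = rng.integers(0, 2, size=500)

        # RBF-kernel SVM, as commonly used for descriptor-based virtual screening.
        model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))

        # 5-fold cross-validation, mirroring the validation scheme described above.
        scores = cross_val_score(model, X, y, cv=5)
        print("mean CV accuracy:", scores.mean())

        # Screening a new library: fit on all training data, then predict.
        model.fit(X, y)
        library = rng.random((10000, 100))
        hits = library[model.predict(library) == 1]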

  8. PROPOSAL OF METHOD FOR AN AUTOMATIC COMPLEMENTARITIES SEARCH BETWEEN COMPANIES' R&D

    PAULO VINÍCIUS MARCONDES CORDEIRO; DARIO EDUARDO AMARAL DERGINT; KAZUO HATAKEYAMA

    2014-01-01

    The open innovation model is the best choice for firms that cannot afford R&D costs but intend to continue playing the innovation game. This model offers any firm the possibility of having companies spread worldwide and across all research fields as partners in R&D. However, the possible partnerships can be restricted to the manager's know-who. Patent documents can be a source of rich information about technical development and innovation from a huge number of firms. Search through all these da...

  9. PEDF as an anticancer drug and new treatment methods following the discovery of its receptors: A patent perspective

    Manalo, Katrina B.; Choong, Peter F.M.; Becerra, S. Patricia; Dass, Crispin R.

    2014-01-01

    Background Traditional forms of cancer therapy, which includes chemotherapy, have largely been overhauled due to the significant degree of toxicity they pose to normal, otherwise healthy tissue. It is hoped that use of biological agents, most of which are endogenously present in the body, will lead to safer treatment outcomes, without sacrificing efficacy. Objective The finding that PEDF, a naturally-occurring protein, was a potent angiogenesis inhibitor became the basis for studying the role of PEDF in tumours that are highly resistant to chemotherapy. The determination of the direct role of PEDF against cancer paved the way for understanding and developing PEDF as a novel drug. This review focuses on the patent applications behind testing the anticancer therapeutic effect of PEDF via its receptors as an antiangiogenic agent and as a direct anticancer agent. Conclusions The majority of the PEDF patents describe its and/or its fragments’ antiangiogenic ability and the usage of recombinant vectors as the mode of treatment delivery. PEDF’s therapeutic potential against different diseases and the discovery of its receptors opens possibilities for improving PEDF-based peptide design and drug delivery modes. PMID:21204726

  10. Multivariate methods and the search for single top-quark production in association with a $W$ boson in ATLAS

    Kovesarki, Peter; Dingfelder, Jochen

    This thesis describes three machine learning algorithms that can be used for physics analyses. The first is a density estimator that was derived from the Green’s function identity of the Laplace operator and is capable of tagging data samples according to the signal purity. This latter task can also be performed with regression methods, and such an algorithm was implemented based on fast multi-dimensional polynomial regression. The accuracy was improved with a decision tree using smooth boundaries. Both methods apply rigorous checks against overtraining to make sure the results are drawn from statistically significant features. These two methods were applied in the search for single top-quark production in association with a $W$ boson. Their separation powers differ strongly in favour of the regression method, mainly because it can exploit the extra information available during training. The third method is an unsupervised learning algorithm that finds an optimal coordinate system for a sample in the sense of m...

  11. Search for a transport method for the calculation of the PWR control and safety clusters

    Bruna, G.B.; Van Frank, C.; Vergain, M.L.; Chauvin, J.P.; Palmiotti, G.; Nobile, M.

    1990-01-01

    The project studies of power reactors rely mainly on diffusion calculations, but transport ones are often needed for assessing fine effects, intimately linked to geometry and spectrum heterogeneities. Accurate transport computations are necessary, in particular, for shielded cross section generation, and when homogenization and dishomogenization processes are involved. The transport codes, generally, offer the user a variety of computational options, related to different approximation levels. In every case, it is obviously desirable to be able to choose the reliable degree of approximation to be accepted in any particular computational circumstance of the project. The search for such adapted procedures is to be made on the basis of critical experiments. In our studies, this task was made possible by the availability of suitable results of the CAMELEON critical experiment, carried out in the EOLE facility at CEA's Center of Cadarache. In this paper, we summarize some of the work in progress at FRAMATOME on the definition of an assembly-based transport calculation scheme to be used for PWR control and safety cluster computations. Two main items, devoted to the search for the optimum computational procedures, are presented here: a parametric study on computational options, made in an infinite-medium assembly geometry, and a series of comparisons between calculated and experimental values of pin power distribution.

  12. Global Optimization Based on the Hybridization of Harmony Search and Particle Swarm Optimization Methods

    A. P. Karpenko

    2014-01-01

    Full Text Available We consider a class of stochastic global optimization search algorithms which in various publications are called behavioural, intellectual, metaheuristic, nature-inspired, swarm, multi-agent, population, etc. We use the last term. Experience in using population algorithms to solve challenging global optimization problems shows that the application of a single such algorithm may not always be effective. Therefore, great attention is now paid to the hybridization of population algorithms for global optimization. Hybrid algorithms unite various algorithms, or identical algorithms with different values of their free parameters, so that the strength of one algorithm can compensate for the weakness of another. The purposes of the work are the development of a hybrid global optimization algorithm based on the known harmony search (HS) and particle swarm optimization (PSO) algorithms, its software implementation, and the study of its efficiency on a number of known benchmark problems and on a problem of dimensional optimization of a truss structure. We state the global optimization problem, consider the basic HS and PSO algorithms, give a flow chart of the proposed hybrid algorithm, called PSO-HS, present the results of computational experiments with the developed algorithm and software, and formulate the main results of the work and prospects for its development.
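
    The sketch below shows only the PSO component of the hybrid discussed above, as a minimal global-best particle swarm loop with conventional parameter values; the PSO-HS hybridization itself (interleaving harmony-search improvisation with these updates) is not reproduced here.

        import numpy as np

        def pso(objective, lower, upper, n_particles=30, n_iters=200,
                w=0.7, c1=1.5, c2=1.5, seed=0):
            """Minimal global-best particle swarm optimizer (minimization)."""
            rng = np.random.default_rng(seed)
            lower, upper = np.asarray(lower, float), np.asarray(upper, float)
            dim = lower.size
            x = lower + rng.random((n_particles, dim)) * (upper - lower)
            v = np.zeros_like(x)
            pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
            gbest = pbest[np.argmin(pbest_f)].copy()
            for _ in range(n_iters):
                r1, r2 = rng.random((2, n_particles, dim))
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
                x = np.clip(x + v, lower, upper)
                f = np.array([objective(p) for p in x])
                better = f < pbest_f
                pbest[better], pbest_f[better] = x[better], f[better]
                gbest = pbest[np.argmin(pbest_f)].copy()
            return gbest, pbest_f.min()

        # Example on the sphere benchmark; in the hybrid of the record, HS-style
        # improvisation would be interleaved with these velocity/position updates.
        best_x, best_f = pso(lambda z: float(np.sum(z ** 2)), [-5, -5, -5], [5, 5, 5])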

  13. Dai-Kou type conjugate gradient methods with a line search only using gradient.

    Huang, Yuanyuan; Liu, Changhe

    2017-01-01

    In this paper, Dai-Kou type conjugate gradient methods are developed to solve the optimality condition of an unconstrained optimization problem; they utilize only gradient information and therefore have a broader application scope. Under suitable conditions, the developed methods are globally convergent. Numerical tests and comparisons with the PRP+ conjugate gradient method, also using only gradient information, show that the methods are efficient.
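
    For context, the generic nonlinear conjugate gradient iteration and the PRP+ update mentioned in the record for comparison are shown below (the Dai-Kou formula itself is not reproduced here):

        \[
          x_{k+1} = x_k + \alpha_k d_k, \qquad
          d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad d_0 = -g_0,
        \]
        \[
          \beta_k^{\mathrm{PRP+}} = \max\!\left(0,\;
            \frac{g_{k+1}^{\top}\,(g_{k+1}-g_k)}{\|g_k\|^2}\right),
        \]
        where $g_k = \nabla f(x_k)$ and the step length $\alpha_k$ is chosen by the line search.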

  14. Performance for the hybrid method using stochastic and deterministic searching for shape optimization of electromagnetic devices

    Yokose, Yoshio; Noguchi, So; Yamashita, Hideo

    2002-01-01

    Stochastic methods and deterministic methods are used for the optimization of electromagnetic devices. Genetic algorithms (GAs) are used as the stochastic method in multivariable designs, while the deterministic method uses the gradient method, which exploits the sensitivity of the objective function. These two techniques have benefits and drawbacks. In this paper, the characteristics of those techniques are described. The research then evaluates a technique in which the two methods are used together. Finally, the results of the comparison obtained by applying each method to electromagnetic devices are described. (Author)

  15. Comparisons of peak-search and photopeak-integration methods in the computer analysis of gamma-ray spectra

    Baedecker, P.A.

    1980-01-01

    Myriad methods have been devised for extracting quantitative information from gamma-ray spectra by means of a computer, and a critical evaluation of the relative merits of the various programs that have been written would represent a Herculean, if not an impossible, task. The results from the International Atomic Energy Agency (IAEA) intercomparison, which may represent the most straightforward approach to making such an evaluation, showed a wide range in the quality of the results - even among laboratories where similar methods were used. The most clear-cut way of differentiating between programs is by the method used to evaluate peak areas: by the iterative fitting of the spectral features to an often complex model, or by a simple summation procedure. Previous comparisons have shown that relatively simple algorithms can compete favorably with fitting procedures, although fitting holds the greatest promise for the detection and measurement of complex peaks. However, fitting algorithms, which are generally complex and time consuming, are often ruled out by practical limitations based on the type of computing equipment available, cost limitations, the number of spectra to be processed in a given time period, and the ultimate goal of the analysis. Comparisons of methods can be useful, however, in helping to illustrate the limitations of the various algorithms that have been devised. This paper presents a limited review of some of the more common peak-search and peak-integration methods, along with Peak-search procedures
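
    A minimal sketch of the simple summation procedure mentioned above: sum the counts in a peak window and subtract a baseline estimated from flanking channels. The window width, number of background channels and the uncertainty estimate are illustrative assumptions.

        import numpy as np

        def net_peak_area(counts, lo, hi, n_bkg=3):
            """Net area of a photopeak by simple summation.

            counts : 1-D array of channel counts
            lo, hi : first and last channel of the peak window (inclusive)
            n_bkg  : number of channels on each side used to estimate the baseline
            """
            counts = np.asarray(counts, float)
            gross = counts[lo:hi + 1].sum()
            left = counts[lo - n_bkg:lo].mean()
            right = counts[hi + 1:hi + 1 + n_bkg].mean()
            width = hi - lo + 1
            background = 0.5 * (left + right) * width   # trapezoidal baseline
            net = gross - background
            # Poisson-style uncertainty estimate for the summation method
            sigma = np.sqrt(gross + background * width / (2 * n_bkg))
            return net, sigma

        # Synthetic example: Gaussian peak on a flat background of 50 counts/channel.
        chans = np.arange(200)
        spectrum = 50 + 400 * np.exp(-0.5 * ((chans - 100) / 3.0) ** 2)
        area, sigma = net_peak_area(spectrum, lo=91, hi=109)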

  16. Comparison of Decisions Quality of Heuristic Methods with Limited Depth-First Search Techniques in the Graph Shortest Path Problem

    Vatutin Eduard

    2017-12-01

    Full Text Available The article deals with analyzing the effectiveness of heuristic methods with limited depth-first search techniques for obtaining decisions in the test problem of finding the shortest path in a graph. The article briefly describes the group of methods based on limiting the number of branches of the combinatorial search tree and limiting the depth of the analyzed subtree used to solve the problem. The methodology for comparing experimental data to estimate the quality of solutions, based on computational experiments with samples of graphs with pseudo-random structure and selected numbers of vertices and arcs using the BOINC platform, is considered. The article also describes the obtained experimental results, which identify the areas of preferable usage of the selected subset of heuristic methods depending on the size of the problem and the strength of the constraints. It is shown that the considered pair of methods is ineffective for the selected problem and significantly inferior, in the quality of solutions, to the ant colony optimization method and its modification with combinatorial returns.

  17. Comparison of Decisions Quality of Heuristic Methods with Limited Depth-First Search Techniques in the Graph Shortest Path Problem

    Vatutin, Eduard

    2017-12-01

    The article deals with analyzing the effectiveness of heuristic methods with limited depth-first search techniques for obtaining decisions in the test problem of finding the shortest path in a graph. The article briefly describes the group of methods based on limiting the number of branches of the combinatorial search tree and limiting the depth of the analyzed subtree used to solve the problem. The methodology for comparing experimental data to estimate the quality of solutions, based on computational experiments with samples of graphs with pseudo-random structure and selected numbers of vertices and arcs using the BOINC platform, is considered. The article also describes the obtained experimental results, which identify the areas of preferable usage of the selected subset of heuristic methods depending on the size of the problem and the strength of the constraints. It is shown that the considered pair of methods is ineffective for the selected problem and significantly inferior, in the quality of solutions, to the ant colony optimization method and its modification with combinatorial returns.

  18. “Time for Some Traffic Problems": Enhancing E-Discovery and Big Data Processing Tools with Linguistic Methods for Deception Detection

    Erin Smith Crabb

    2014-09-01

    Full Text Available Linguistic deception theory provides methods to discover potentially deceptive texts to make them accessible to clerical review. This paper proposes the integration of these linguistic methods with traditional e-discovery techniques to identify deceptive texts within a given author’s larger body of written work, such as their sent email box. First, a set of linguistic features associated with deception are identified and a prototype classifier is constructed to analyze texts and describe the features’ distributions, while avoiding topic-specific features to improve recall of relevant documents. The tool is then applied to a portion of the Enron Email Dataset to illustrate how these strategies identify records, providing an example of its advantages and capability to stratify the large data set at hand.

  19. Serendipity in dark photon searches

    Ilten, Philip; Soreq, Yotam; Williams, Mike; Xue, Wei

    2018-06-01

    Searches for dark photons provide serendipitous discovery potential for other types of vector particles. We develop a framework for recasting dark photon searches to obtain constraints on more general theories, which includes a data-driven method for determining hadronic decay rates. We demonstrate our approach by deriving constraints on a vector that couples to the B-L current, a leptophobic B boson that couples directly to baryon number and to leptons via B-γ kinetic mixing, and on a vector that mediates a protophobic force. Our approach can easily be generalized to any massive gauge boson with vector couplings to the Standard Model fermions, and software to perform any such recasting is provided at https://gitlab.com/philten/darkcast.

  20. Discovery of the iron isotopes

    Schuh, A.; Fritsch, A.; Heim, M.; Shore, A.; Thoennessen, M.

    2010-01-01

    Twenty-eight iron isotopes have been observed so far and the discovery of these isotopes is discussed here. For each isotope a brief summary of the first refereed publication, including the production and identification method, is presented.

  1. Discovery of the silver isotopes

    Schuh, A.; Fritsch, A.; Ginepro, J.Q.; Heim, M.; Shore, A.; Thoennessen, M.

    2010-01-01

    Thirty-eight silver isotopes have been observed so far and the discovery of these isotopes is discussed here. For each isotope a brief summary of the first refereed publication, including the production and identification method, is presented.

  2. Discovery of the cadmium isotopes

    Amos, S.; Thoennessen, M.

    2010-01-01

    Thirty-seven cadmium isotopes have been observed so far and the discovery of these isotopes is discussed here. For each isotope a brief summary of the first refereed publication, including the production and identification method, is presented.

  3. Choosing Discovery: A Literature Review on the Selection and Evaluation of Discovery Layers

    Moore, Kate B.; Greene, Courtney

    2012-01-01

    Within the next few years, traditional online public access catalogs will be replaced by more robust and interconnected discovery layers that can serve as primary public interfaces to simultaneously search many separate collections of resources. Librarians have envisioned this type of discovery tool since the 1980s, and research shows that…

  4. Hybrid Multistarting GA-Tabu Search Method for the Placement of BtB Converters for Korean Metropolitan Ring Grid

    Remund J. Labios

    2016-01-01

    Full Text Available This paper presents a method to determine the optimal locations for installing back-to-back (BtB) converters in a power grid as a countermeasure to reduce fault current levels. The installation of BtB converters can be regarded as network reconfiguration. For this purpose, a hybrid multistarting GA-tabu search method was used to determine the best locations from a preselected list of candidate locations. The constraints used in determining the best locations include circuit breaker fault current limits, proximity of proposed locations, and the capability of the solution to reach power flow convergence. A simple power injection model, applied after line opening on selected branches, was used to represent power flows with BtB converters. Kron reduction was also applied as a network reduction method for fast evaluation of fault currents with a given topology. Simulations of the search method were performed on the Korean power system, particularly the Seoul metropolitan area.

  5. Earthquake effect on volcano and the geological structure in central java using tomography travel time method and relocation hypocenter by grid search method

    Suharsono; Nurdian, S. W; Palupi, I. R.

    2016-01-01

    Relocating hypocenters is a way to improve the velocity model of the subsurface. One such method is grid search. To map the velocity distribution in the subsurface by the tomography method, the relocated hypocenters are used as a reference for subsurface analysis of volcanic and major structural patterns, such as in Central Java. The main data of this study are earthquake records from 1952 to 2012, comprising 9162 P-wave arrivals from 2426 events recorded by 30 stations located in the vicinity of Central Java. The grid search method has the advantage that it can relocate hypocenters more accurately, because it divides the model space into blocks and each grid block can be occupied by only one hypocenter. The tomography is performed on the travel-time data of the relocated events using the pseudo-bending inversion method. The grid search relocation shows that the hypocenters are shallower than before and shifted to the south; the hypocenter distribution maps the subduction zone between the Eurasian and Indo-Australian plates with an average dip angle of 14°. The tomography results show low velocity values of -8% to -10% beneath the volcanoes, while the pattern of the main fault structures in Central Java is described by the high velocity results of 8% to 10%, trending northwest and northeast-southwest. (paper)
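
    A minimal sketch of a grid search hypocenter relocation in the spirit of the record: every node of a trial lattice is scored by the RMS misfit between observed and computed P arrival times, here under an assumed uniform-velocity model; the station list, picks and velocity are placeholders.

        import numpy as np

        def grid_search_hypocenter(stations, t_obs, origin_time, grid, v_p=6.0):
            """Pick the grid node minimising the RMS P travel-time residual.

            stations : (n, 3) station coordinates in km
            t_obs    : (n,) observed P arrival times in s
            grid     : (m, 3) trial hypocenter coordinates in km
            v_p      : assumed uniform P velocity in km/s (placeholder model)
            """
            stations = np.asarray(stations, float)
            best, best_rms = None, np.inf
            for node in np.asarray(grid, float):
                t_calc = origin_time + np.linalg.norm(stations - node, axis=1) / v_p
                rms = np.sqrt(np.mean((t_obs - t_calc) ** 2))
                if rms < best_rms:
                    best, best_rms = node, rms
            return best, best_rms

        # grid would typically be a regular (x, y, depth) lattice around a preliminary location,
        # one trial hypocenter per block, as described in the record.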

  6. InSourcerer: a high-throughput method to search for unknown metabolite modifications by mass spectrometry.

    Mrzic, Aida; Lermyte, Frederik; Vu, Trung Nghia; Valkenborg, Dirk; Laukens, Kris

    2017-09-15

    Using mass spectrometry, the analysis of known metabolite structures has become feasible in a systematic high-throughput fashion. Nevertheless, the identification of previously unknown structures remains challenging, partially because many unidentified variants originate from known molecules that underwent unexpected modifications. Here, we present a method for the discovery of unknown metabolite modifications and conjugate metabolite isoforms in a high-throughput fashion. The method is based on user-controlled in-source fragmentation which is used to induce loss of weakly bound modifications. This is followed by the comparison of product ions from in-source fragmentation and collision-induced dissociation (CID). Diagonal MS2-MS3 matching allows the detection of unknown metabolite modifications, as well as substructure similarities. As the method relies heavily on the advantages of in-source fragmentation and its ability to 'magically' elucidate unknown modification, we have named it inSourcerer as a portmanteau of in-source and sorcerer. The method was evaluated using a set of 15 different cytokinin standards. Product ions from in-source fragmentation and CID were compared. Hierarchical clustering revealed that good matches are due to the presence of common substructures. Plant leaf extract, spiked with a mix of all 15 standards, was used to demonstrate the method's ability to detect these standards in a complex mixture, as well as confidently identify compounds already present in the plant material. Here we present a method that incorporates a classic liquid chromatography/mass spectrometry (LC/MS) workflow with fragmentation models and computational algorithms. The assumptions upon which the concept of the method was built were shown to be valid and the method showed that in-source fragmentation can be used to pinpoint structural similarities and indicate the occurrence of a modification. Copyright © 2017 John Wiley & Sons, Ltd.

  7. A modified harmony search method for environmental/economic load dispatch of real-world power systems

    Jeddi, Babak; Vahidinasab, Vahid

    2014-01-01

    Highlights: • A combined economic and emission load dispatch (CEELD) model is proposed. • The proposed model considers practical constraints of real-world power systems. • A new modified harmony search algorithm is proposed to solve the non-convex CEELD. • The proposed algorithm is tested by applying it to seven test systems. Abstract: The economic load dispatch (ELD) problem is one of the basic and important optimization problems in a power system. However, considering practical constraints of real-world power systems, such as ramp rate limits, prohibited operating zones, valve loading effects, multi-fuel options, spinning reserve and transmission system losses, makes the ELD problem a non-convex optimization problem, which is challenging and cannot be solved by traditional methods. Moreover, considering environmental issues results in the combined economic and emission load dispatch (CEELD) problem, a multiobjective optimization model with two non-commensurable and contradictory objectives. In this paper, a modified harmony search algorithm (MHSA) is proposed and applied to solve the ELD and CEELD problems considering the abovementioned constraints. In the proposed MHSA, a new improvising method based on wavelet mutation, together with a new memory consideration scheme based on the roulette wheel mechanism, is proposed, which improves the accuracy, convergence speed, and robustness of the classical HSA. The performance of the proposed algorithm is investigated by applying it to various test systems having non-convex solution spaces. To show the effectiveness of the proposed method, the obtained results are compared with the classical harmony search algorithm (HSA) and some of the most recently published papers in the area.

  8. In Search of Easy-to-Use Methods for Calibrating ADCP's for Velocity and Discharge Measurements

    Oberg, K.; ,

    2002-01-01

    A cost-effective procedure for calibrating acoustic Doppler current profilers (ADCP) in the field was presented. The advantages and disadvantages of various methods which are used for calibrating ADCP were discussed. The proposed method requires the use of differential global positioning system (DGPS) with sub-meter accuracy and standard software for collecting ADCP data. The method involves traversing a long (400-800 meter) course at a constant compass heading and speed, while collecting simultaneous DGPS and ADCP data.

  9. THE METHOD OF APPLICATION OF A COLLECTIVE SEARCH ACTIVITY AS A TOOL DEVELOPING METHODOLOGICAL THINKING OF A TEACHER

    Ibragimova Luiza Vahaevna

    2013-02-01

    Full Text Available To put any pedagogical theory into practice it is necessary to transform the theoretical concepts into teaching methods. The development of all abilities, including thinking, occurs only in activity that is specially organized by creating the required pedagogical conditions, in this case: (a) the application of enhanced mental activity in teacher training courses and vocational training; (b) the establishment of a "virtual university" for teachers in an institute of professional training; (c) the organization of interdisciplinary interaction among teachers, based on the conditions of nonlinear didactics (training teachers of different subjects). The presented method is implemented over two years and consists of three phases: motivational and educational, intellectual and developmental, and innovative and reflective. At the motivational and educational stage, the possibilities of collective search activity are actualized during the course of training, group goals are set, and methods of achieving them are chosen using the first pedagogical condition. At the intellectual and developmental stage, skills for the collective search for effective teaching decisions are developed during in-course training under the first and second pedagogical conditions. At the innovative and reflective stage, teachers are encouraged to determine for themselves the techniques and tools that improve the quality of the educational process, assisting each other in the development of teaching manuals, which is achieved with the help of all three pedagogical conditions.

  10. THE METHOD OF APPLICATION OF A COLLECTIVE SEARCH ACTIVITY AS A TOOL DEVELOPING METHODOLOGICAL THINKING OF A TEACHER

    Луиза Вахаевна Ибрагимова

    2013-04-01

    Full Text Available To put any pedagogical theory into practice it is necessary to transform the theoretical concepts into teaching methods. The development of all abilities, including thinking, occurs only in activity that is specially organized by creating the required pedagogical conditions, in this case: (a) the application of enhanced mental activity in teacher training courses and vocational training; (b) the establishment of a "virtual university" for teachers in an institute of professional training; (c) the organization of interdisciplinary interaction among teachers, based on the conditions of nonlinear didactics (training teachers of different subjects). The presented method is implemented over two years and consists of three phases: motivational and educational, intellectual and developmental, and innovative and reflective. At the motivational and educational stage, the possibilities of collective search activity are actualized during the course of training, group goals are set, and methods of achieving them are chosen using the first pedagogical condition. At the intellectual and developmental stage, skills for the collective search for effective teaching decisions are developed during in-course training under the first and second pedagogical conditions. At the innovative and reflective stage, teachers are encouraged to determine for themselves the techniques and tools that improve the quality of the educational process, assisting each other in the development of teaching manuals, which is achieved with the help of all three pedagogical conditions. DOI: http://dx.doi.org/10.12731/2218-7405-2013-2-17

  11. Methods for a systematic, comprehensive search for fast, heavy scintillator materials

    Derenzo, S.E.; Moses, W.W.; Weber, M.J.; West, A.C.

    1994-01-01

    Over the years a number of scintillator materials have been developed for a wide variety of nuclear detection applications in industry, high energy physics, and medical instrumentation. To expand the list of useful scintillators, the authors are pursuing the following systematic, comprehensive search: (1) select materials with good gamma-ray interaction properties from the 200,000-entry NIST crystal diffraction file, (2) synthesize samples (doped and undoped) in powdered or single crystal form, (3) test the samples using sub-nanosecond pulsed x-rays to measure important scintillation properties such as rise times, decay times, emission wavelengths, and light output, (4) prepare large, high quality crystals of the most promising candidates, and (5) test the crystals as gamma-ray detectors in representative configurations. An important parallel effort is the computation of electronic energy levels of activators and the band structure of intrinsic and host crystals to aid in the materials selection process. In this paper the authors are interested mainly in scintillator materials for detecting 511 keV gamma rays in positron emission tomography.

  12. Theoretical Investigation of Combined Use of PSO, Tabu Search and Lagrangian Relaxation methods to solve the Unit Commitment Problem

    Sahbi Marrouchi

    2018-02-01

    Full Text Available Solving the unit commitment problem (UCP) optimizes the combination of production unit operations and determines the appropriate operational schedule of each production unit to satisfy the expected consumption, which varies from one day to one month. Moreover, each production unit is subject to constraints that render this problem complex, combinatorial and nonlinear. In this paper, we propose a new strategy based on the combination of three optimization methods: Tabu search, particle swarm optimization and Lagrangian relaxation, in order to develop a proper unit commitment schedule of the production units while reducing the production cost over a definite period. The proposed strategy has been implemented on the IEEE 9-bus test system containing 3 production units, and the results were promising compared to strategies based on meta-heuristic and deterministic methods.

  13. Search Strategy of Detector Position For Neutron Source Multiplication Method by Using Detected-Neutron Multiplication Factor

    Endo, Tomohiro

    2011-01-01

    In this paper, an alternative definition of a neutron multiplication factor, the detected-neutron multiplication factor kdet, is produced for the neutron source multiplication method (NSM). By using kdet, a search strategy for an appropriate detector position for NSM is also proposed. The NSM is one of the practical subcritical measurement techniques, i.e., the NSM does not require any special equipment other than a stationary external neutron source and an ordinary neutron detector. Additionally, the NSM method is based on steady-state analysis, so this technique is very suitable for quasi real-time measurement. It is noted that the correction factors play important roles in accurately estimating subcriticality from the measured neutron count rates. The present paper aims to clarify how to correct the subcriticality measured by the NSM method, the physical meaning of the correction factors, and how to reduce the impact of the correction factors by setting a neutron detector at an appropriate detector position.

  14. Searching for Rigour in the Reporting of Mixed Methods Population Health Research: A Methodological Review

    Brown, K. M.; Elliott, S. J.; Leatherdale, S. T.; Robertson-Wilson, J.

    2015-01-01

    The environments in which population health interventions occur shape both their implementation and outcomes. Hence, when evaluating these interventions, we must explore both intervention content and context. Mixed methods (integrating quantitative and qualitative methods) provide this opportunity. However, although criteria exist for establishing…

  15. In Search of Insight.

    Kaplan, Craig A.; Simon, Herbert A.

    1990-01-01

    Attaining the insight needed to solve the Mutilated Checkerboard problem, which requires discovery of an effective problem representation (EPR), is described. Performance on insight problems can be predicted from the availability of generators and constraints in the search for an EPR. Data for 23 undergraduates were analyzed. (TJH)

  16. Searching for Movies

    Bogers, Toine

    2015-01-01

    Despite a surge in popularity of work on casual leisure search, some leisure domains are still relatively underrepresented. Movies are good example of such a domain, which is peculiar given the popularity of movie-centered websites and discovery services such as IMDB, RottenTomatoes, and Netflix...

  17. Heuristic methods using grasp, path relinking and variable neighborhood search for the clustered traveling salesman problem

    Mário Mestria

    2013-08-01

    Full Text Available The Clustered Traveling Salesman Problem (CTSP) is a generalization of the Traveling Salesman Problem (TSP) in which the set of vertices is partitioned into disjoint clusters and the objective is to find a minimum-cost Hamiltonian cycle such that the vertices of each cluster are visited contiguously. The CTSP is NP-hard and, in this context, we propose heuristic methods for the CTSP using GRASP, Path Relinking and Variable Neighborhood Descent (VND). The heuristic methods were tested using Euclidean instances with up to 2000 vertices and clusters varying between 4 and 150 vertices. Computational tests were performed to compare the performance of the heuristic methods with an exact algorithm using the parallel CPLEX software. The computational results showed that the hybrid heuristic method using VND outperforms the other heuristic methods.
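
    A generic GRASP skeleton with a VND improvement phase, sketched to illustrate the metaheuristics named above; the construction routine, neighborhood operators and objective are placeholders rather than the paper's CTSP-specific components.

        import random

        def vnd(x, neighborhoods, objective):
            """Variable Neighborhood Descent over an ordered list of neighborhood functions."""
            fx, k = objective(x), 0
            while k < len(neighborhoods):
                y = min(neighborhoods[k](x), key=objective, default=None)
                if y is not None and objective(y) < fx:
                    x, fx, k = y, objective(y), 0   # improvement: restart from the first neighborhood
                else:
                    k += 1                          # no improvement: try the next neighborhood
            return x, fx

        def grasp(construct, neighborhoods, objective, n_starts=20, alpha=0.3, seed=0):
            """Generic GRASP: greedy-randomized construction followed by VND improvement."""
            rng = random.Random(seed)
            best, best_f = None, float("inf")
            for _ in range(n_starts):
                x = construct(alpha, rng)            # greedy-randomized construction
                x, fx = vnd(x, neighborhoods, objective)
                if fx < best_f:
                    best, best_f = x, fx
            return best, best_f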

  18. Quantifying the Ease of Scientific Discovery.

    Arbesman, Samuel

    2011-02-01

    It has long been known that scientific output proceeds on an exponential increase, or more properly, a logistic growth curve. The interplay between effort and discovery is clear, and the nature of the functional form has been thought to be due to many changes in the scientific process over time. Here I show a quantitative method for examining the ease of scientific progress, another necessary component in understanding scientific discovery. Using examples from three different scientific disciplines - mammalian species, chemical elements, and minor planets - I find the ease of discovery to conform to an exponential decay. In addition, I show how the pace of scientific discovery can be best understood as the outcome of both scientific output and ease of discovery. A quantitative study of the ease of scientific discovery in the aggregate, such as done here, has the potential to provide a great deal of insight into both the nature of future discoveries and the technical processes behind discoveries in science.

  19. Discovery as a process

    Loehle, C.

    1994-05-01

    The three great myths, which form a sort of triumvirate of misunderstanding, are the Eureka! myth, the hypothesis myth, and the measurement myth. These myths are prevalent among scientists as well as among observers of science. The Eureka! myth asserts that discovery occurs as a flash of insight, and as such is not subject to investigation. This leads to the perception that discovery or deriving a hypothesis is a moment or event rather than a process. Events are singular and not subject to description. The hypothesis myth asserts that proper science is motivated by testing hypotheses, and that if something is not experimentally testable then it is not scientific. This myth leads to absurd posturing by some workers conducting empirical descriptive studies, who dress up their study with a "hypothesis" to obtain funding or get it published. Methods papers are often rejected because they do not address a specific scientific problem. The fact is that many of the great breakthroughs in science involve methods and not hypotheses, or arise from largely descriptive studies. Those captured by this myth also try to block funding for those developing methods. The third myth is the measurement myth, which holds that determining what to measure is straightforward, so one doesn't need a lot of introspection to do science. As one ecologist put it to me, "Don't give me any of that philosophy junk, just let me out in the field. I know what to measure." These myths lead to difficulties for scientists who must face peer review to obtain funding and to get published. These myths also inhibit the study of science as a process. Finally, these myths inhibit creativity and suppress innovation. In this paper I first explore these myths in more detail and then propose a new model of discovery that opens the supposedly miraculous process of discovery to closer scrutiny.

  20. Perspective: Role of structure prediction in materials discovery and design

    Richard J. Needs

    2016-05-01

    Full Text Available Materials informatics owes much to bioinformatics and the Materials Genome Initiative has been inspired by the Human Genome Project. But there is more to bioinformatics than genomes, and the same is true for materials informatics. Here we describe the rapidly expanding role of searching for structures of materials using first-principles electronic-structure methods. Structure searching has played an important part in unraveling structures of dense hydrogen and in identifying the record-high-temperature superconducting component in hydrogen sulfide at high pressures. We suggest that first-principles structure searching has already demonstrated its ability to determine structures of a wide range of materials and that it will play a central and increasing part in materials discovery and design.

  1. N- versus O-alkylation: utilizing NMR methods to establish reliable primary structure determinations for drug discovery.

    LaPlante, Steven R; Bilodeau, François; Aubry, Norman; Gillard, James R; O'Meara, Jeff; Coulombe, René

    2013-08-15

    A classic synthetic issue that remains unresolved is the reaction that involves the control of N- versus O-alkylation of ambident anions. This common chemical transformation is important for medicinal chemists, who require predictable and reliable protocols for the rapid synthesis of inhibitors. Uncertainty over whether the product(s) are N- and/or O-alkylated is common and can be costly if left undetermined. Herein, we report an NMR-based strategy that focuses on distinguishing inhibitors and intermediates that are N- or O-alkylated. The NMR strategy involves three independent and complementary methods. However, any combination of two of the methods can be reliable if the third were compromised due to resonance overlap or other issues. The timely nature of these methods (HSQC/HMQC, HMBC, ROESY, and (13)C shift predictions) allows for contemporaneous determination of regioselective alkylation as needed during the optimization of synthetic routes. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. [Analysis on traditional Chinese medicine prescriptions treating cancer-related anorexia syndrome based on grey system theory combined with multivariate analysis method and discovery of new prescriptions].

    Chen, Song-Lin; Chen, Cong; Zhu, Hui; Li, Jing; Pang, Yan

    2016-01-01

    Cancer-related anorexia syndrome (CACS) is one of the main causes of death at present, as well as a syndrome that seriously harms patients' quality of life, treatment effect and survival time. In current clinical research there are few reports on empirical traditional Chinese medicine (TCM) prescriptions and patent prescriptions for treating CACS, and prescription rules are rarely analyzed in a systematic manner. As the hidden rules are not excavated, it is hard to arrive at an innovative discovery and knowledge of clinical medication. In this paper, the grey screening method combined with the multivariate statistical method was used to build the "CACS prescriptions database". Based on the database, a total of 359 prescriptions were selected, the frequency of herbs in the prescriptions was determined, and commonly combined drugs were developed into 4 new prescriptions for different syndromes. TCM prescriptions for the treatment of CACS give priority to benefiting qi and strengthening the spleen, and also lay emphasis on replenishing kidney essence, dispersing stagnated liver-qi and dispersing lung-qi. Moreover, the interdependence and mutual promotion of yin and yang should be taken into account to reflect TCM's holism and its theory of treatment based on syndrome differentiation. The grey screening method, as a valuable traditional Chinese medicine research-supporting method, can be used to subjectively and objectively analyze prescription rules, and the new prescriptions can provide a reference for the clinical use of TCM for treating CACS and for drug development. Copyright© by the Chinese Pharmaceutical Association.

  3. Glycoblotting method allows for rapid and efficient glycome profiling of human Alzheimer's disease brain, serum and cerebrospinal fluid towards potential biomarker discovery.

    Gizaw, Solomon T; Ohashi, Tetsu; Tanaka, Masakazu; Hinou, Hiroshi; Nishimura, Shin-Ichiro

    2016-08-01

    Understanding the significance of posttranslational glycosylation in Alzheimer's disease (AD) is of growing importance for the investigation of the pathogenesis of AD as well as for discovery research on disease-specific serum biomarkers. We designed a standard protocol for glycoblotting combined with MALDI-TOFMS to perform rapid and quantitative profiling of the glycan parts of glycoproteins (N-glycans) and glycosphingolipids (GSLs) using human AD post-mortem samples such as brain tissues (dissected cerebral cortices including the frontal, parietal, occipital, and temporal domains), serum and cerebrospinal fluid (CSF). The structural profiles of the major N-glycans released from glycoproteins and the total expression levels of the glycans were found to be mostly similar between the brain tissues of the AD patients and those of the normal control group. In contrast, the expression levels of the serum and CSF protein N-glycans such as bisect-type and multiply branched glycoforms were increased significantly in the AD patient group. In addition, the levels of some gangliosides such as GM1, GM2 and GM3 appeared to be altered in the AD patient brain and serum samples when compared with the normal control groups. Alteration of the expression levels of major N- and GSL-glycans in human brain tissues, serum and CSF of AD patients can be monitored quantitatively by means of the glycoblotting-based standard protocols. The changes in the expression levels of the glycans derived from the human post-mortem samples uncovered by the standardized glycoblotting method provide potential serum biomarkers in central nervous system disorders and can contribute to insight into the molecular mechanisms of the pathogenesis of neurodegenerative diseases and future drug discovery. Most importantly, the present preliminary trials using human post-mortem samples of AD patients suggest that large-scale serum glycomics cohort by means of various types of human AD patients as well as the normal

  4. STATISTICAL CHALLENGES FOR SEARCHES FOR NEW PHYSICS AT THE LHC.

    CRANMER, K.

    2005-09-12

    Because the emphasis of the LHC is on 5σ discoveries and the LHC environment induces high systematic errors, many of the common statistical procedures used in High Energy Physics are not adequate. I review the basic ingredients of LHC searches, the sources of systematics, and the performance of several methods. Finally, I indicate the methods that seem most promising for the LHC and areas that are in need of further study.
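
    The record does not spell out any formula; as a hedged illustration of the kind of calculation involved, the sketch below converts a p-value into a one-sided Gaussian significance Z and evaluates the widely used Asimov approximation for the median discovery significance of a counting experiment with signal s over background b. These formulas are standard in the LHC statistics literature and are not claimed to be specific to the talk summarized above.

      import math
      from scipy.stats import norm

      def z_from_p(p):
          """One-sided Gaussian significance corresponding to a p-value."""
          return norm.isf(p)

      def asimov_discovery_significance(s, b):
          """Median discovery significance for a counting experiment,
          Z = sqrt(2 * ((s + b) * ln(1 + s/b) - s))  (Asimov approximation)."""
          return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

      # A 5-sigma one-sided significance corresponds to p ~ 2.87e-7
      print(norm.sf(5.0))                             # ~2.87e-7
      print(z_from_p(2.866e-7))                       # ~5.0
      print(asimov_discovery_significance(50, 100))   # ~4.65, vs naive s/sqrt(b) = 5.0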

  5. Self-guided method to search maximal Bell violations for unknown quantum states

    Yang, Li-Kai; Chen, Geng; Zhang, Wen-Hao; Peng, Xing-Xiang; Yu, Shang; Ye, Xiang-Jun; Li, Chuan-Feng; Guo, Guang-Can

    2017-11-01

    In recent decades, a great variety of research and applications concerning Bell nonlocality have been developed with the advent of quantum information science. Provided that Bell nonlocality can be revealed by the violation of a family of Bell inequalities, finding the maximal Bell violation (MBV) for unknown quantum states becomes an important and inevitable task during Bell experiments. In this paper we introduce a self-guided method to find MBVs for unknown states using a stochastic gradient ascent algorithm (SGA), by parametrizing the corresponding Bell operators. For three investigated systems (two-qubit, three-qubit, and two-qutrit), this method can ascertain the MBV of general two-setting inequalities within 100 iterations. Furthermore, we prove SGA is also feasible when facing more complex Bell scenarios, e.g., d-setting d-outcome Bell inequalities. Moreover, compared to other possible methods, SGA exhibits significant superiority in efficiency, robustness, and versatility.
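
    As a toy illustration of the idea (not the authors' exact SGA protocol), the sketch below parametrizes two measurement settings per party by angles in the X-Z plane and climbs the CHSH value for a given two-qubit state with a simple finite-difference gradient ascent; for the maximally entangled state it approaches the Tsirelson bound 2*sqrt(2). All step sizes and iteration counts are illustrative assumptions.

      import numpy as np

      sx = np.array([[0, 1], [1, 0]], dtype=complex)
      sz = np.array([[1, 0], [0, -1]], dtype=complex)

      def obs(theta):
          """Dichotomic observable cos(theta)*Z + sin(theta)*X."""
          return np.cos(theta) * sz + np.sin(theta) * sx

      def chsh_value(angles, rho):
          """CHSH expectation <A0B0> + <A0B1> + <A1B0> - <A1B1> for state rho."""
          a0, a1, b0, b1 = angles
          A0, A1, B0, B1 = obs(a0), obs(a1), obs(b0), obs(b1)
          bell = np.kron(A0, B0) + np.kron(A0, B1) + np.kron(A1, B0) - np.kron(A1, B1)
          return np.real(np.trace(rho @ bell))

      def ascend(rho, iters=200, step=0.1, eps=1e-4, seed=1):
          """Finite-difference gradient ascent over the four measurement angles."""
          rng = np.random.default_rng(seed)
          angles = rng.uniform(0, 2 * np.pi, size=4)
          for _ in range(iters):
              grad = np.zeros(4)
              for k in range(4):
                  d = np.zeros(4); d[k] = eps
                  grad[k] = (chsh_value(angles + d, rho) - chsh_value(angles - d, rho)) / (2 * eps)
              angles += step * grad
          return chsh_value(angles, rho), angles

      # Maximally entangled state |Phi+> = (|00> + |11>)/sqrt(2)
      phi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
      rho = np.outer(phi, phi.conj())
      value, _ = ascend(rho)
      print(value)   # approaches 2*sqrt(2) ~ 2.828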

  6. Foreshock search over a long duration using a method of setting appropriate criteria

    Toyomoto, Y.; Kawakata, H.; Hirano, S.; Doi, I.

    2016-12-01

    Recently, small foreshocks have been detected using cross-correlation techniques (e.g., Bouchon et al., 2011) in which a foreshock is identified when the cross-correlation coefficient (CC) exceeds a certain threshold. For some shallow intraplate earthquakes, foreshocks whose hypocenters were estimated to be adjacent to the main shock hypocenter were detected from several tens of minutes before the main shock occurrence (Doi and Kawakata, 2012; 2013). At least two problems remain in the cross-correlation techniques employed. First, previous studies on foreshocks used data whose durations are at most a month (Kato et al., 2013); this is insufficient to check whether such events occurred only before the main shock. Second, CC is used as the detection criterion without considering the validity of the threshold. In this study, we search for foreshocks of an M 5.4 earthquake in central Nagano prefecture, Japan, on June 30, 2011, using the vertical-component waveform recorded at station N.MWDH (Hi-net) for one of the cataloged foreshocks (M 1) as a template for calculating CC. We calculate CC between the template and continuous waveforms of the same component at the same station for two years before the main shock occurrence, and we try to overcome the problems mentioned above. We find that the histogram of CC is well modeled by a normal distribution, which is similar to previous studies on tremors (e.g., Ohta and Ide, 2008). According to the model, the expected number of misdetections is less than 1 when CC > 0.63. Therefore, we regard a waveform as being due to a foreshock when CC > 0.63. As a result, over the two years, foreshocks are detected only within the thirteen hours immediately before the main shock. By setting an appropriate threshold, we conclude that the foreshocks just before the main shock occurrence are not stationary events. Acknowledgments: We use continuous waveform records of NIED high sensitivity seismograph network in Japan (Hi-net) and the JMA
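
    A minimal sketch of the kind of matched-filter scan described above, on synthetic data: a normalized cross-correlation coefficient is computed between a template and every window of a continuous trace, the bulk of the CC histogram is modeled by a normal distribution, and a detection threshold is chosen so that roughly one chance exceedance is expected across the whole scan. The station, amplitudes and threshold logic here are illustrative assumptions, not the study's actual data handling.

      import numpy as np
      from scipy.stats import norm

      def sliding_cc(trace, template):
          """Normalized cross-correlation of a template against every window of a trace."""
          n = len(template)
          t = (template - template.mean()) / template.std()
          cc = np.empty(len(trace) - n + 1)
          for i in range(len(cc)):
              w = trace[i:i + n]
              cc[i] = np.dot(t, (w - w.mean()) / w.std()) / n
          return cc

      rng = np.random.default_rng(0)
      template = rng.standard_normal(200)                 # stand-in for a foreshock waveform
      trace = rng.standard_normal(50_000)                 # continuous noise record
      trace[10_000:10_200] += 0.8 * template              # buried copy of the template

      cc = sliding_cc(trace, template)
      mu, sigma = cc.mean(), cc.std()                     # normal model of the CC histogram

      # Threshold giving roughly one expected chance exceedance over the scan
      thr = mu + sigma * norm.isf(1.0 / len(cc))
      detections = np.flatnonzero(cc > thr)
      print(thr, detections[:5])                          # detections cluster near index 10000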

  7. Untargeted metabolomic profiling plasma samples of patients with lung cancer for searching significant metabolites by HPLC-MS method

    Dementeva, N.; Ivanova, K.; Kokova, D.; Kurzina, I.; Ponomaryova, A.; Kzhyshkowska, J.

    2017-09-01

    Lung cancer is one of the most common types of cancer leading to death. Consequently, the search for and identification of the metabolites associated with the risk of developing cancer are very valuable. For this purpose, untargeted metabolic profiling of the plasma samples collected from patients with lung cancer (n = 100) and a control group (n = 100) was conducted. After sample preparation, the plasma samples were analyzed using the LC-MS method. Biostatistical methods were applied to pre-process the data and to extract the dominant metabolites that account for the difference between the case and control groups. At least seven significant metabolites were evaluated and annotated. Most of the identified metabolites are connected with lipid metabolism, and their combination could be useful for follow-up studies of lung cancer pathogenesis.

  8. Methods, analysis, and the treatment of systematic errors for the electron electric dipole moment search in thorium monoxide

    Baron, J.; Campbell, W. C.; DeMille, D.; Doyle, J. M.; Gabrielse, G.; Gurevich, Y. V.; Hess, P. W.; Hutzler, N. R.; Kirilov, E.; Kozyryev, I.; O'Leary, B. R.; Panda, C. D.; Parsons, M. F.; Spaun, B.; Vutha, A. C.; West, A. D.; West, E. P.; ACME Collaboration

    2017-07-01

    We recently set a new limit on the electric dipole moment of the electron (eEDM) (J Baron et al and ACME collaboration 2014 Science 343 269-272), which represented an order-of-magnitude improvement on the previous limit and placed more stringent constraints on many charge-parity-violating extensions to the standard model. In this paper we discuss the measurement in detail. The experimental method and associated apparatus are described, together with the techniques used to isolate the eEDM signal. In particular, we detail the way experimental switches were used to suppress effects that can mimic the signal of interest. The methods used to search for systematic errors, and models explaining observed systematic errors, are also described. We briefly discuss possible improvements to the experiment.

  9. Multivariate methods and the search for single top-quark production in association with a W boson in ATLAS

    Koevesarki, Peter

    2012-11-01

    This thesis describes three machine learning algorithms that can be used for physics analyses. The first is a density estimator that was derived from the Green's function identity of the Laplace operator and is capable of tagging data samples according to the signal purity. This latter task can also be performed with regression methods, and such an algorithm was implemented based on fast multi-dimensional polynomial regression. The accuracy was improved with a decision tree using smooth boundaries. Both methods apply rigorous checks against overtraining to make sure the results are drawn from statistically significant features. These two methods were applied in the search for single top-quark production with a W boson. Their separation powers differ greatly in favour of the regression method, mainly because it can exploit the extra information available during training. The third method is an unsupervised learning algorithm that finds an optimal coordinate system for a sample in the sense of maximal information entropy, which may aid future methods to model data.

  10. Multivariate methods and the search for single top-quark production in association with a W boson in ATLAS

    Koevesarki, Peter

    2012-11-15

    This thesis describes three machine learning algorithms that can be used for physics analyses. The first is a density estimator that was derived from the Green's function identity of the Laplace operator and is capable of tagging data samples according to the signal purity. This latter task can also be performed with regression methods, and such an algorithm was implemented based on fast multi-dimensional polynomial regression. The accuracy was improved with a decision tree using smooth boundaries. Both methods apply rigorous checks against overtraining to make sure the results are drawn from statistically significant features. These two methods were applied in the search for single top-quark production with a W boson. Their separation powers differ greatly in favour of the regression method, mainly because it can exploit the extra information available during training. The third method is an unsupervised learning algorithm that finds an optimal coordinate system for a sample in the sense of maximal information entropy, which may aid future methods to model data.

  11. Double digest RADseq: an inexpensive method for de novo SNP discovery and genotyping in model and non-model species.

    Brant K Peterson

    Full Text Available The ability to efficiently and accurately determine genotypes is a keystone technology in modern genetics, crucial to studies ranging from clinical diagnostics, to genotype-phenotype association, to reconstruction of ancestry and the detection of selection. To date, high capacity, low cost genotyping has been largely achieved via "SNP chip" microarray-based platforms which require substantial prior knowledge of both genome sequence and variability, and once designed are suitable only for those targeted variable nucleotide sites. This method introduces substantial ascertainment bias and inherently precludes detection of rare or population-specific variants, a major source of information for both population history and genotype-phenotype association. Recent developments in reduced-representation genome sequencing experiments on massively parallel sequencers (commonly referred to as RAD-tag or RADseq) have brought direct sequencing to the problem of population genotyping, but increased cost and procedural and analytical complexity have limited their widespread adoption. Here, we describe a complete laboratory protocol, including a custom combinatorial indexing method, and accompanying software tools to facilitate genotyping across large numbers (hundreds or more) of individuals for a range of markers (hundreds to hundreds of thousands). Our method requires no prior genomic knowledge and achieves per-site and per-individual costs below that of current SNP chip technology, while requiring similar hands-on time investment, comparable amounts of input DNA, and downstream analysis times on the order of hours. Finally, we provide empirical results from the application of this method to both genotyping in a laboratory cross and in wild populations. Because of its flexibility, this modified RADseq approach promises to be applicable to a diversity of biological questions in a wide range of organisms.

  12. Short segment search method for phylogenetic analysis using nested sliding windows

    Iskandar, A. A.; Bustamam, A.; Trimarsanto, H.

    2017-10-01

    In phylogenetic analysis in bioinformatics, the coding DNA sequence (CDS) segment is needed for maximal accuracy. However, analysis of the full CDS costs a lot of time and money, so a short segment representative of the CDS, such as the envelope protein segment or the non-structural 3 (NS3) segment, is necessary. After the sliding-window procedure is implemented, a short segment better than the envelope protein segment and NS3 is found. This paper discusses a mathematical method for analyzing sequences using nested sliding windows to find a short segment that is representative of the whole genome. The result shows that our method can find a short segment that is about 6.57% more representative of the CDS segment, in terms of tree topology, than the envelope segment or the NS3 segment.

  13. A method for stochastic constrained optimization using derivative-free surrogate pattern search and collocation

    Sankaran, Sethuraman; Audet, Charles; Marsden, Alison L.

    2010-01-01

    Recent advances in coupling novel optimization methods to large-scale computing problems have opened the door to tackling a diverse set of physically realistic engineering design problems. A large computational overhead is associated with computing the cost function for most practical problems involving complex physical phenomena. Such problems are also plagued with uncertainties in a diverse set of parameters. We present a novel stochastic derivative-free optimization approach for tackling such problems. Our method extends the previously developed surrogate management framework (SMF) to allow for uncertainties in both simulation parameters and design variables. The stochastic collocation scheme is employed for stochastic variables whereas Kriging based surrogate functions are employed for the cost function. This approach is tested on four numerical optimization problems and is shown to have significant improvement in efficiency over traditional Monte-Carlo schemes. Problems with multiple probabilistic constraints are also discussed.
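
    The surrogate management framework described above is more elaborate, but the derivative-free polling step it builds on can be illustrated with a plain compass/pattern search. The sketch below is a generic version under that assumption, without the Kriging surrogate or the stochastic collocation layers, and the test function and step parameters are illustrative only.

      import numpy as np

      def pattern_search(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=1000):
          """Compass (pattern) search: poll +/- step along each coordinate,
          move to the first improving point, otherwise shrink the step."""
          x = np.asarray(x0, dtype=float)
          fx = f(x)
          it = 0
          while step > tol and it < max_iter:
              improved = False
              for k in range(len(x)):
                  for sign in (+1.0, -1.0):
                      cand = x.copy()
                      cand[k] += sign * step
                      fc = f(cand)
                      if fc < fx:
                          x, fx, improved = cand, fc, True
                          break
                  if improved:
                      break
              if not improved:
                  step *= shrink
              it += 1
          return x, fx

      # Example: minimize the Rosenbrock function from a poor starting point
      rosen = lambda v: (1 - v[0]) ** 2 + 100 * (v[1] - v[0] ** 2) ** 2
      x_opt, f_opt = pattern_search(rosen, [-1.2, 1.0])
      print(x_opt, f_opt)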

  14. A gravitational wave burst search method based on the S transform

    Clapson, Andre-Claude; Barsuglia, Matteo; Bizouard, Marie-Anne; Brisson, Violette; Cavalier, Fabien; Davier, Michel; Hello, Patrice; Kreckelberg, Stephane; Varvella, Monica

    2005-01-01

    The detection of burst-type events in the output of ground-based gravitational wave observatories is particularly challenging due to the expected variety of astrophysical waveforms and the issue of discriminating them from instrumental noise. Robust methods that achieve reasonable detection performance over a wide range of signals would be most useful. We present a burst-detection pipeline based on a time-frequency transform, the S transform. This transform offers good time-frequency localization of energy without requiring prior knowledge of the event structure. We set up a simple (and robust) event extraction chain. Results are provided for a variety of signals injected into simulated data with Gaussian statistics (from the LIGO-Virgo joint working group). Indications are that detection is robust with respect to event type and that efficiency compares reasonably with reference methods. The time-frequency representation is shown to be affected by spectral features such as resonant lines. This emphasizes the role of pre-processing
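
    For reference, a direct (slow but transparent) discrete version of the S transform of a real series is sketched below, following the textbook definition S(tau, f) = sum_t h[t] * (|f|/sqrt(2*pi)) * exp(-(tau - t)^2 f^2 / 2) * exp(-2*pi*i*f*t). Production pipelines use an FFT-based implementation instead, and nothing here reproduces the paper's event-extraction chain; the toy signal is an assumption.

      import numpy as np

      def s_transform(h, freqs):
          """Direct discrete S transform of a 1-D signal h at the given
          frequencies (in cycles per sample). Returns a (len(freqs), len(h)) array."""
          n = len(h)
          t = np.arange(n)
          out = np.empty((len(freqs), n), dtype=complex)
          for i, f in enumerate(freqs):
              phase = np.exp(-2j * np.pi * f * t)              # Fourier kernel
              for tau in range(n):
                  # frequency-dependent Gaussian window centred on tau
                  window = (abs(f) / np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((tau - t) * f) ** 2)
                  out[i, tau] = np.sum(h * window * phase)
          return out

      # Toy example: a short sine burst buried in noise
      rng = np.random.default_rng(0)
      n = 512
      sig = 0.2 * rng.standard_normal(n)
      sig[200:260] += np.sin(2 * np.pi * 0.1 * np.arange(60))
      freqs = np.linspace(0.02, 0.25, 24)
      tfmap = np.abs(s_transform(sig, freqs))
      print(tfmap.shape, np.unravel_index(tfmap.argmax(), tfmap.shape))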

  15. Searching for degenerate Higgs bosons using a profile likelihood ratio method

    Heikkilä, Jaana

    The ATLAS and CMS collaborations at the Large Hadron Collider have observed a new resonance consistent with the standard model Higgs boson. However, it has been suggested that the observed signal could also be produced by multiple nearly mass-degenerate states that couple differently to the standard model particles. In this work, a method to discriminate between the hypothesis of a single Higgs boson and that of multiple mass-degenerate Higgs bosons was developed. Using the matrix of measured signal strengths in different production and decay modes, parametrizations for the two hypotheses were constructed as a general rank 1 matrix and the most general $5 \times 4$ matrix, respectively. The test statistic was defined as a ratio of profile likelihoods for the two hypotheses. The method was applied to the CMS measurements. The expected test statistic distribution was estimated twice by generating pseudo-experiments according to both the standard model hypothesis and the single Higgs boson hypothesis best fitting...

  16. Intelligent Search Method Based ACO Techniques for a Multistage Decision Problem EDP/LFP

    Mostefa RAHLI

    2006-07-01

    Full Text Available The implementation of a numerical optimization library for electrical supply networks is at the centre of current research orientations; our project is therefore centred on the development of the NMSS platform. It is a software environment intended to save considerable effort in load calculation, curve smoothing, loss calculation and economic planning of generated power [23]. Operational research [17] on the one hand and industrial practice on the other show that simulation means and processes have reached a very appreciable level of reliability and mathematical confidence [4, 5, 14]; it is on the basis of this observation that many processes place confidence in simulation results. The drawback of this methodology is that it bases its judgments and handling on simplified assumptions and constraints whose influence is deliberately neglected so as not to add to the cost [14]. By combining simulation methods with artificial intelligence techniques, the resulting set of numerical methods acquires a reliability that leaves little room for doubt. The NMSS software environment [23] brings together simulation techniques and electric network calculation via a graphic interface, and the same software integrates an AI capability via an expert-system module. Our problem is a multistage case in which the stages are completely dependent and cannot be performed separately. For a multistage problem [21, 22], the results obtained from a credible (large-size) problem calculation raise the following question: can the choice of the set of numerical methods make the calculation of a complete problem, using more than two treatment levels, attain the smallest possible total error? It is well known that, from an algorithmic point of view, each treatment can be characterized by a function called its mathematical complexity. This complexity is in fact a cost (a weight overloading

  17. A gravitational wave burst search method based on the S transform

    Clapson, Andre-Claude; Barsuglia, Matteo; Bizouard, Marie-Anne; Brisson, Violette; Cavalier, Fabien; Davier, Michel; Hello, Patrice; Kreckelberg, Stephane; Varvella, Monica [Groupe Virgo, LAL, Universite Paris-Sud, Batiment 208, BP 34, F-91898 Orsay Cedex (France)

    2005-09-21

    The detection of burst-type events in the output of ground-based gravitational wave observatories is particularly challenging due to the expected variety of astrophysical waveforms and the issue of discriminating them from instrumental noise. Robust methods that achieve reasonable detection performance over a wide range of signals would be most useful. We present a burst-detection pipeline based on a time-frequency transform, the S transform. This transform offers good time-frequency localization of energy without requiring prior knowledge of the event structure. We set up a simple (and robust) event extraction chain. Results are provided for a variety of signals injected into simulated data with Gaussian statistics (from the LIGO-Virgo joint working group). Indications are that detection is robust with respect to event type and that efficiency compares reasonably with reference methods. The time-frequency representation is shown to be affected by spectral features such as resonant lines. This emphasizes the role of pre-processing.

  18. A Hybrid Method for the Modelling and Optimisation of Constrained Search Problems

    Sitek Pawel

    2014-08-01

    Full Text Available The paper presents a concept and an outline of the implementation of a hybrid approach to modelling and solving constrained problems. Two environments, mathematical programming (in particular, integer programming) and declarative programming (in particular, constraint logic programming), were integrated. The strengths of integer programming and constraint logic programming, in which constraints are treated in different ways and different methods are implemented, were combined to exploit the advantages of both. The hybrid method is not worse than either of its components used independently. The proposed approach is particularly important for decision models with an objective function and many discrete decision variables added up in multiple constraints. To validate the proposed approach, two illustrative examples are presented and solved. The first example is the authors' original model of cost optimisation in the supply chain with multimodal transportation. The second one is the two-echelon variant of the well-known capacitated vehicle routing problem.

  19. Assessing the search for information on Three Rs methods, and their subsequent implementation: a national survey among scientists in the Netherlands.

    van Luijk, Judith; Cuijpers, Yvonne; van der Vaart, Lilian; Leenaars, Marlies; Ritskes-Hoitinga, Merel

    2011-10-01

    A local survey conducted among scientists into the current practice of searching for information on Three Rs (i.e. Replacement, Reduction and Refinement) methods has highlighted the gap between the statutory requirement to apply Three Rs methods and the lack of criteria to search for them. To verify these findings on a national level, we conducted a survey among scientists throughout The Netherlands. Due to the low response rate, the results give an impression of opinions, rather than being representative of The Netherlands as a whole. The findings of both surveys complement each other, and indicate that there is room for improvement. Scientists perceive searching the literature for information on Three Rs methods to be a difficult task, and specific Three Rs search skills and knowledge of Three Rs databases are limited. Rather than using a literature search, many researchers obtain information on these methods through personal communication, which means that published information on possible Three Rs methods often remains unfound and unused. A solution might be to move beyond the direct search for information on Three Rs methods and choose another approach. One approach that seems rather appropriate is that of systematic review. This provides insight into the necessity for any new animal studies, as well as optimal implementation of available data and the prevention of unnecessary animal use in the future. 2011 FRAME.

  20. Search for brown dwarfs by gravitational microlensing effect with the pixels method. Analysis of AGAPE and EROS collaborations data

    Melchior, Anne-Laure

    1995-01-01

    This work is concerned with the search for baryonic dark matter in galactic halos. An important collection of observational data has been initiated to test the hypothesis that this dark mass is made of compact objects such as brown dwarfs or low-mass stars. The gravitational microlensing effect makes it possible to probe the distribution of this kind of mass along the line of sight towards nearby galaxies such as the Large Magellanic Cloud. A new way to detect these microlensing events has been proposed by P. Baillon et al.: the pixel method. The aim is to detect the amplification of stars which are unresolved or too faint to be seen by classical analysis. First, we present this method and the simulations which establish its feasibility. Then, we describe the pixel analysis of the 91-92 EROS data on the Large Magellanic Cloud. The selection of luminosity variations with a shape compatible with microlensing events allows us to study the sensitivity of this analysis. These results validate the pixel method as applied to a large volume of data, and also show that it is possible to find luminosity variations which escape classical analyses. Encouraged by these results, we finally describe the analysis of the AGAPE 94 data on the Andromeda galaxy, which uses the same pixel method. Being ten times farther away than the Large Magellanic Cloud, the Andromeda galaxy has very few resolved stars, making the pixel method the only way of looking for microlensing events. (author)

  1. Large Neighborhood Search

    Pisinger, David; Røpke, Stefan

    2010-01-01

    Heuristics based on large neighborhood search have recently shown outstanding results in solving various transportation and scheduling problems. Large neighborhood search methods explore a complex neighborhood by use of heuristics. Using large neighborhoods makes it possible to find better candidate solutions in each iteration and hence traverse a more promising search path. Starting from the large neighborhood search method, we give an overview of very large scale neighborhood search methods and discuss recent variants and extensions like variable depth search and adaptive large neighborhood search ...
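
    A minimal destroy-and-repair sketch of the large neighborhood search idea, here applied to a plain travelling-salesman tour (remove a random subset of cities, reinsert them greedily, keep the result if it improves). This is a generic illustration, not code from the chapter, and the instance size and acceptance rule are assumptions.

      import random

      def tour_len(tour, d):
          return sum(d[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

      def destroy(tour, k, rng):
          """Remove k random cities from the tour."""
          removed = rng.sample(tour, k)
          return [c for c in tour if c not in removed], removed

      def repair(partial, removed, d):
          """Greedy repair: insert each removed city at its cheapest position."""
          tour = partial[:]
          for c in removed:
              best_pos, best_cost = 0, float("inf")
              for pos in range(len(tour)):
                  a, b = tour[pos - 1], tour[pos]       # edge (a, b) of the closed tour
                  cost = d[a][c] + d[c][b] - d[a][b]
                  if cost < best_cost:
                      best_pos, best_cost = pos, cost
              tour.insert(best_pos, c)
          return tour

      def lns(d, iters=500, k=3, seed=0):
          rng = random.Random(seed)
          best = list(range(len(d)))
          rng.shuffle(best)
          best_len = tour_len(best, d)
          for _ in range(iters):
              partial, removed = destroy(best, k, rng)
              cand = repair(partial, removed, d)
              cand_len = tour_len(cand, d)
              if cand_len < best_len:                   # accept only improving solutions
                  best, best_len = cand, cand_len
          return best, best_len

      if __name__ == "__main__":
          rng = random.Random(1)
          pts = [(rng.random(), rng.random()) for _ in range(20)]
          d = [[((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 for bx, by in pts] for ax, ay in pts]
          print(lns(d)[1])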

  2. Thermal Unit Commitment Scheduling Problem in Utility System by Tabu Search Embedded Genetic Algorithm Method

    C. Christober Asir Rajan

    2008-06-01

    Full Text Available The objective of this paper is to find a generation schedule such that the total operating cost is minimized, subject to a variety of constraints. This also means that it is desirable to find the optimal unit commitment in the power system for the next H hours. A 66-bus utility power system in India demonstrates the effectiveness of the proposed approach; extensive studies have also been performed for different IEEE test systems consisting of 24, 57 and 175 buses. Numerical results are shown comparing the cost solutions and computation time obtained by different intelligence and conventional methods.

  3. A fuzzy method for improving the functionality of search engines based on user's web interactions

    Farzaneh Kabirbeyk

    2015-04-01

    Full Text Available Web mining has been widely used to discover knowledge from various sources on the web. One of the important tools in web mining is the mining of web users' behaviour, which is considered a way to discover the potential knowledge in web users' interactions. Nowadays, website personalization is regarded as a popular phenomenon among web users, and it plays an important role in facilitating user access and in providing information matched to users' requirements based on their own interests. Extracting important features of web user behaviour plays a significant role in web usage mining. Such features include the page visit frequency in each session, the visit duration, and the dates on which certain pages are visited. This paper presents a method to predict users' interests and to propose a list of pages based on those interests, by identifying user behaviour with a fuzzy technique, the fuzzy clustering method. Because a user has several interests and may pursue one or more of them at a time, a user's interests may belong to several clusters, and fuzzy clustering allows such overlap. The resulting clusters are used to extract fuzzy rules, which help detect users' movement patterns; a neural network then provides a list of suggested pages to the users.
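
    The paper's full pipeline (fuzzy rules plus a neural network) is not reproduced here, but the core step it names, fuzzy clustering, can be illustrated with a standard fuzzy c-means iteration on toy user-feature vectors. The feature choices and cluster count below are assumptions for illustration only.

      import numpy as np

      def fuzzy_c_means(X, c, m=2.0, iters=100, tol=1e-5, seed=0):
          """Standard fuzzy c-means: alternate membership and centroid updates.
          Returns (memberships U of shape (n, c), centroids of shape (c, d))."""
          rng = np.random.default_rng(seed)
          n = X.shape[0]
          U = rng.random((n, c))
          U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per sample
          for _ in range(iters):
              W = U ** m
              centroids = (W.T @ X) / W.sum(axis=0)[:, None]
              dist = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
              # u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
              U_new = 1.0 / np.sum((dist[:, :, None] / dist[:, None, :]) ** (2.0 / (m - 1)), axis=2)
              if np.linalg.norm(U_new - U) < tol:
                  U = U_new
                  break
              U = U_new
          return U, centroids

      # Toy "user behaviour" features: page-visit frequency and average visit duration
      rng = np.random.default_rng(1)
      X = np.vstack([rng.normal([2, 30], 1, (50, 2)),    # casual visitors, long stays
                     rng.normal([10, 5], 1, (50, 2))])   # frequent visitors, short stays
      U, centroids = fuzzy_c_means(X, c=2)
      print(centroids)
      print(U[:3])   # soft memberships allow a user to belong to several clusters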

  4. EVALUATION OF WEB SEARCHING METHOD USING A NOVEL WPRR ALGORITHM FOR TWO DIFFERENT CASE STUDIES

    V. Lakshmi Praba

    2012-04-01

    Full Text Available The World-Wide Web provides every internet citizen with access to an abundance of information, but it becomes increasingly difficult to identify the relevant pieces of information. Research in web mining tries to address this problem by applying techniques from data mining and machine learning to web data and documents. Web content mining and web structure mining have important roles in identifying the relevant web page. The relevancy of a web page denotes how well a retrieved web page or set of web pages meets the information need of the user. Page Rank, Weighted Page Rank and Hypertext Induced Topic Selection (HITS) are existing algorithms which consider only web structure mining. The Vector Space Model (VSM), Cover Density Ranking (CDR), Okapi similarity measurement (Okapi) and the Three-Level Scoring method (TLS) are some existing relevancy scoring methods which consider only web content mining. In this paper, we propose a new algorithm, Weighted Page with Relevant Rank (WPRR), which is a blend of both web content mining and web structure mining and demonstrates the relevancy of the page with respect to a given query for two different case scenarios. It is shown that WPRR's performance is better than that of the existing algorithms.
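
    WPRR itself is not specified in enough detail above to reproduce; as a generic illustration of blending structure and content signals, the sketch below computes a PageRank score by power iteration over a tiny link graph and combines it linearly with a simple query-term score. The toy pages, link matrix and blending weight are assumptions, not the paper's algorithm.

      import numpy as np

      def pagerank(adj, damping=0.85, iters=100):
          """Power-iteration PageRank on an adjacency matrix (adj[i, j] = 1 if i links to j)."""
          n = adj.shape[0]
          out_deg = adj.sum(axis=1, keepdims=True)
          out_deg[out_deg == 0] = 1.0
          M = (adj / out_deg).T                        # column-stochastic transition matrix
          r = np.full(n, 1.0 / n)
          for _ in range(iters):
              r = (1 - damping) / n + damping * (M @ r)
          return r

      def content_score(pages, query_terms):
          """Very simple content relevance: fraction of query terms present in each page."""
          return np.array([sum(t in p.lower() for t in query_terms) / len(query_terms)
                           for p in pages])

      pages = ["web mining and page rank tutorial",
               "cooking recipes for pasta",
               "weighted page rank for web search relevancy"]
      adj = np.array([[0, 1, 1],
                      [0, 0, 1],
                      [1, 0, 0]], dtype=float)
      query = ["page", "rank", "relevancy"]

      structure = pagerank(adj)
      content = content_score(pages, query)
      alpha = 0.5                                      # assumed blending weight
      combined = alpha * structure / structure.sum() + (1 - alpha) * content
      print(np.argsort(-combined), combined)           # ranking of pages for the query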

  5. A Hybrid Maximum Power Point Search Method Using Temperature Measurements in Partial Shading Conditions

    Mroczka Janusz

    2014-12-01

    Full Text Available Photovoltaic panels have a non-linear current-voltage characteristic and produce maximum power at only one point, called the maximum power point. Under uniform illumination a single solar panel shows only one power maximum, which is also the global maximum power point. For an irregularly illuminated photovoltaic panel, many local maxima can be observed on the power-voltage curve, and only one of them is the global maximum. The proposed algorithm detects whether a solar panel is under uniform insolation conditions; an appropriate maximum power point tracking strategy is then selected by a decision algorithm. The proposed method is simulated in an environment created by the authors, which allows photovoltaic panels to be simulated under realistic conditions of lighting, temperature and shading.

  6. A novel analytical method for pharmaceutical polymorphs by terahertz spectroscopy and the optimization of crystal form at the discovery stage.

    Ikeda, Yukihiro; Ishihara, Yoko; Moriwaki, Toshiya; Kato, Eiji; Terada, Katsuhide

    2010-01-01

    A novel analytical method for the determination of pharmaceutical polymorphs was developed using terahertz spectroscopy. It was found that each polymorph of a substance shows a specific terahertz absorption spectrum. In particular, analysis of the second derivative spectrum was enormously beneficial in the discrimination of closely related polymorphs that were difficult to discern by powder X-ray diffractometry. Crystal forms that were obtained by crystallization from various solvents and stored under various conditions were specifically characterized by the second derivative of each terahertz spectrum. Fractional polymorphic transformation for substances stored under stressed conditions was also identified by terahertz spectroscopy during a solid-state stability test, but could not be detected by powder X-ray diffractometry. Since polymorphs could be characterized clearly by terahertz spectroscopy, further physicochemical studies could be conducted in a timely manner. The development form of the compound examined was determined by the results of comprehensive physicochemical studies that included thermodynamic relationships, as well as chemical and physicochemical stability. In conclusion, terahertz spectroscopy, which has unique power in the elucidation of molecular interactions within a crystal lattice, can play a more important role in physicochemical research. Terahertz spectroscopy has great potential as a tool for polymorphic determination, particularly since the second derivative of the terahertz spectrum possesses high sensitivity for pharmaceutical polymorphs.

  7. MSblender: A probabilistic approach for integrating peptide identifications from multiple database search engines.

    Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I; Marcotte, Edward M

    2011-07-01

    Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers from limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high-scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for every possible PSM and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. The increased identifications increment spectral counts for most proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that the enhanced quantification contributes to improved sensitivity in differential expression analyses.
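
    One standard way to turn PSM posterior probabilities into a false discovery rate estimate (not necessarily MSblender's exact procedure) is to accumulate posterior error probabilities down a ranked list: the estimated FDR among the top k PSMs is the mean of their (1 - probability) values. The toy probabilities below are assumptions for illustration.

      import numpy as np

      def fdr_from_posteriors(posteriors):
          """Given PSM posterior probabilities, return PSM indices sorted by confidence
          and the estimated FDR (mean posterior error probability) at each rank."""
          order = np.argsort(-np.asarray(posteriors))
          pep = 1.0 - np.asarray(posteriors)[order]           # posterior error probabilities
          fdr = np.cumsum(pep) / np.arange(1, len(pep) + 1)   # running mean of sorted PEPs
          return order, fdr

      def accept_at_fdr(posteriors, threshold=0.01):
          """Indices of PSMs accepted at the requested FDR level."""
          order, fdr = fdr_from_posteriors(posteriors)
          # valid because the running mean of ascending PEPs is non-decreasing
          k = np.searchsorted(fdr, threshold, side="right")
          return order[:k]

      posteriors = [0.999, 0.98, 0.95, 0.90, 0.60, 0.30]
      order, fdr = fdr_from_posteriors(posteriors)
      print(fdr)                          # 0.001, 0.0105, 0.0237, ...
      print(accept_at_fdr(posteriors, 0.02))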

  8. Supernovae Discovery Efficiency

    John, Colin

    2018-01-01

    Abstract: We present supernova (SN) search efficiency measurements for recent Hubble Space Telescope (HST) surveys. Efficiency is a key component of any search and an important parameter as a correction factor for SN rates. To achieve an accurate value for the efficiency, many supernovae need to be discoverable in surveys. This cannot be achieved from real SN alone, due to their scarcity, so fake SN are planted. These fake supernovae, planted with realism as the goal, yield an understanding of efficiency based on position relative to other celestial objects and on brightness. To improve realism, we built a more accurate model of supernovae using a point-spread function. The next improvement to realism is planting these objects close to galaxies and with a range of brightness, magnitude, local galactic brightness and redshift. Once these are planted, a realistic SN is visible and discoverable by the searcher. It is very important to find the factors that affect this discovery efficiency, since exploring the factors that affect detection yields a more accurate correction factor. Further inquiries into efficiency give us a better understanding of image processing, searching techniques and survey strategies, and result in an overall higher likelihood of finding these events in future surveys with the Hubble, James Webb, and WFIRST telescopes. After the efficiency is measured and refined with many unique surveys, it factors into measurements of SN rates versus redshift. By comparing SN rates versus redshift against the star formation rate we can test models to determine how long star systems take from the point of inception to explosion (the delay time distribution). This delay time distribution is compared to SN progenitor models to get an accurate idea of what these stars were like before their deaths.

  9. Identifying the plant-associated microbiome across aquatic and terrestrial environments: the effects of amplification method on taxa discovery

    Jackrel, Sara L. [Department of Ecology and Evolution, The University of Chicago, 1101 E 57th Street Chicago IL 60637 USA; Owens, Sarah M. [Biosciences Division, Argonne National Laboratory, 9700 S. Cass Avenue Lemont IL 60439 USA; Gilbert, Jack A. [Biosciences Division, Argonne National Laboratory, 9700 S. Cass Avenue Lemont IL 60439 USA; The Microbiome Center, Department of Surgery, The University of Chicago, 5841 S Maryland Ave Chicago IL 60637 USA; Pfister, Catherine A. [Department of Ecology and Evolution, The University of Chicago, 1101 E 57th Street Chicago IL 60637 USA

    2017-01-25

    Plants in terrestrial and aquatic environments contain a diverse microbiome. Yet, the chloroplast and mitochondria organelles of the plant eukaryotic cell originate from free-living cyanobacteria and Rickettsiales. This represents a challenge for sequencing the plant microbiome with universal primers, as ~99% of 16S rRNA sequences may consist of chloroplast and mitochondrial sequences. Peptide nucleic acid clamps offer a potential solution by blocking amplification of host-associated sequences. We assessed the efficacy of chloroplast and mitochondria-blocking clamps against a range of microbial taxa from soil, freshwater and marine environments. While we found that the mitochondrial blocking clamps appear to be a robust method for assessing animal-associated microbiota, Proteobacterial 16S rRNA binds to the chloroplast-blocking clamp, resulting in a strong sequencing bias against this group. We attribute this bias to a conserved 14-bp sequence in the Proteobacteria that matches the 17-bp chloroplast-blocking clamp sequence. By scanning the Greengenes database, we provide a reference list of nearly 1500 taxa that contain this 14-bp sequence, including 48 families such as the Rhodobacteraceae, Phyllobacteriaceae, Rhizobiaceae, Kiloniellaceae and Caulobacteraceae. To determine where these taxa are found in nature, we mapped this taxa reference list against the Earth Microbiome Project database. These taxa are abundant in a variety of environments, particularly aquatic and semiaquatic freshwater and marine habitats. To facilitate informed decisions on effective use of organelle-blocking clamps, we provide a searchable database of microbial taxa in the Greengenes and Silva databases matching various n-mer oligonucleotides of each PNA sequence.

  10. THE USE OF SELECTED ANAESTHETIC DRUGS IN SEARCH OF A METHOD FOR IMPROVING EARTHWORMS’ WELFARE

    Agnieszka Podolak-Machowska

    2013-07-01

    Full Text Available This paper describes selected effects of body contact of the earthworm Dendrobaena veneta Rosa with local anaesthetic (LA) drugs used for human anaesthesia (lidocaine and prilocaine) and anaesthetics for aquatic animals (MS-222). The findings showed safe and effective immobilization of earthworms with prilocaine at a concentration of 0.25-1%. At the applied concentrations lidocaine was safe, but less effective. On the other hand, MS-222 at the applied concentrations had a strongly irritating effect on earthworms and induced convulsive body movements connected with a discharge of coelomic fluid. The results may be relevant both for improving the welfare of earthworms during experiments and for the organization of research involving testing drugs on invertebrates. In this case, by using earthworms as an experimental model and by applying the method for measuring their mobility after contact with anaesthetics described in this article, it might be possible to replace experiments on guinea pigs, rabbits, rats and mice, which are expensive and require the approval of an ethics committee, with laboratory tests on earthworms.

  11. IN SEARCH OF A FAST SCREENING METHOD FOR DETECTING THE MALINGERING OF COGNITIVE IMPAIRMENT

    Amada Ampudia

    2012-07-01

    Full Text Available Forensic settings demand expedient and conclusive forensic psychological assessment. The aim of this study was to design a simple and fast, but reliable, psychometric instrument for detecting the malingering of cognitive impairment. In a quasi-experimental design, 156 individuals were divided into three groups: a normal group with no cognitive impairment; a Mild Cognitive Impairment (MCI) group; and a group of informed malingerers with no MCI who feigned cognitive impairment. Receiver Operating Characteristic (ROC) analysis of the Test of Memory Malingering (TOMM), and of several subtests of the Wechsler Memory Scale (WMS-III), revealed that the WMS-III was as reliable and accurate as the TOMM in discriminating malingerers from honest respondents. The results revealed that the diagnostic accuracy, sensitivity and specificity of the WMS-III Auditory Recognition Delayed Verbal Paired Associates subtest were similar to those of the TOMM in discriminating malingering from genuine memory impairment. In conclusion, the WMS-III Recognition of Verbal Paired Associates subtest and the TOMM provide a fast, valid and reliable screening method for detecting the malingering of cognitive impairment.

  12. The trouble with justification, in search of an ethics of method for energy governance

    Meskens, G.

    2015-01-01

    In this series of slides the author tries to clarify the relationships between risk, knowledge, fairness and public acceptance. Fairness appears to be an important concept when dealing with risk: fairness means a fair distribution of benefits and burdens, and implies an informed consent or the possibility to avoid the risk. Fair risk justification in energy governance is risk justification whose method of knowledge generation and decision making is trusted as fair by society. Seeking societal trust is the aim of science and technology, but the global social challenges we face are ultimately complex, and our modern societies based on representative democracy do not favour consensus and remain in various comforts of polarization: for or against nuclear energy, for instance, with the same arguments being used on both sides. The result is a polarisation maintained by a lack of methodological intellectual confrontation in the structures of politics, science and civil society. There is a need for new practical forms of democracy, research and education.

  13. A search for susceptibility loci for anorexia nervosa: methods and sample description.

    Kaye, W H; Lilenfeld, L R; Berrettini, W H; Strober, M; Devlin, B; Klump, K L; Goldman, D; Bulik, C M; Halmi, K A; Fichter, M M; Kaplan, A; Woodside, D B; Treasure, J; Plotnicov, K H; Pollice, C; Rao, R; McConaha, C W

    2000-05-01

    Eating disorders have not traditionally been viewed as heritable illnesses; however, recent family and twin studies lend credence to the potential role of genetic transmission. The Price Foundation funded an international, multisite study to identify genetic factors contributing to the pathogenesis of anorexia nervosa (AN) by recruiting affected relative pairs. This article is an overview of the study methods and the clinical characteristics of the sample. All probands met modified DSM-IV criteria for AN; all affected first-, second-, and third-degree relatives met DSM-IV criteria for AN, bulimia nervosa (BN), or eating disorder not otherwise specified (NOS). Probands and affected relatives were assessed diagnostically with the Structured Interview for Anorexia and Bulimia. DNA was collected from probands, affected relatives and a subset of their biological parents. Assessments were obtained from 196 probands and 237 affected relatives, over 98% of whom are of Caucasian ancestry. Overall, there were 229 relative pairs who were informative for linkage analysis. Of the proband-relative pairs, 63% were AN-AN, 20% were AN-BN, and 16% were AN-NOS. For family-based association analyses, DNA has been collected from both biological parents of 159 eating-disordered subjects. Few significant differences in demographic characteristics were found between the proband and relative groups. The present study represents the first large-scale molecular genetic investigation of AN. Our successful recruitment of over 500 subjects, consisting of affected probands, affected relatives, and their biological parents, will provide the basis to investigate the genetic transmission of eating disorders via a genome scan and assessment of candidate genes.

  14. Search Help

    Guidance and search help resource listing examples of common queries that can be used in the Google Search Appliance search request, including examples of special characters or query-term separators that the Google Search Appliance recognizes.

  15. Applying systematic review search methods to the grey literature: a case study examining guidelines for school-based breakfast programs in Canada.

    Godin, Katelyn; Stapleton, Jackie; Kirkpatrick, Sharon I; Hanning, Rhona M; Leatherdale, Scott T

    2015-10-22

    Grey literature is an important source of information for large-scale review syntheses. However, there are many characteristics of grey literature that make it difficult to search systematically. Further, there is no 'gold standard' for rigorous systematic grey literature search methods, and there are few resources on how to conduct this type of search. This paper describes systematic review search methods that were developed and applied to complete a case study systematic review of grey literature that examined guidelines for school-based breakfast programs in Canada. A grey literature search plan was developed to incorporate four different search strategies: (1) grey literature databases, (2) customized Google search engines, (3) targeted websites, and (4) consultation with contact experts. These complementary strategies were used to minimize the risk of omitting relevant sources. Since abstracts are often unavailable in grey literature documents, items' abstracts, executive summaries, or tables of contents (whichever was available) were screened. Screening of publications' full text followed. Data were extracted on the organization, the year published, who developed the guidelines, the intended audience, the goals/objectives of the document, the sources of evidence/resources cited, the meals mentioned in the guidelines, and the recommendations for program delivery. The search strategies for identifying and screening publications for inclusion in the case study review were found to be manageable, comprehensive, and intuitive when applied in practice. The four search strategies of the grey literature search plan yielded 302 potentially relevant items for screening. Following the screening process, 15 publications that met all eligibility criteria remained and were included in the case study systematic review. The high-level findings of the case study systematic review are briefly described. This article demonstrated a feasible and seemingly robust method for applying systematic search strategies to

  16. Search for neutral leptons

    Perl, M.L.

    1984-12-01

    At present we know of three kinds of neutral leptons: the electron neutrino, the muon neutrino, and the tau neutrino. This paper reviews the search for additional neutral leptons. The method and significance of a search depends upon the model used for the neutral lepton being sought. Some models for the properties and decay modes of proposed neutral leptons are described. Past and present searches are reviewed. The limits obtained by some completed searches are given, and the methods of searches in progress are described. Future searches are discussed. 41 references

  17. Information-Fusion Methods Based Simultaneous Localization and Mapping for Robot Adapting to Search and Rescue Postdisaster Environments

    Hongling Wang

    2018-01-01

    Full Text Available This paper develops the first application of unique information-fusion SLAM (IF-SLAM) methods for mobile robots performing simultaneous localization and mapping (SLAM) adapted to search and rescue (SAR) environments. Several fusion approaches are proposed: parallel filtering of measurements, fusing of exploration trajectories, and combination of sensors' measurements with mobile robots' trajectories. The novel integration particle filter (IPF) and optimal improved EKF (IEKF) algorithms are derived for information-fusion systems to perform the SLAM task in SAR scenarios. The information-fusion architecture consists of multiple robots and multiple sensors (MAM); the robots carry on-board laser range finder (LRF) sensors, localization sonars, gyro odometry, a Kinect sensor, an RGB-D camera, and other proprioceptive sensors. This information-fusion SLAM (IF-SLAM) is compared with conventional methods, and the comparison indicates that the fused trajectory is more consistent with the estimated and actually observed trajectories. Simulations and experiments of the SLAM process are conducted in both a cluttered indoor environment and an outdoor collapsed, unstructured scenario, and the experimental results validate the effectiveness of the proposed information-fusion methods in improving SLAM performance in SAR scenarios.
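
    The abstract names an integration particle filter and an improved EKF but gives no equations; as a hedged, generic illustration of the particle-filter half only, the sketch below localizes a robot moving in one dimension from noisy odometry and noisy range measurements to a known beacon (predict, weight, resample). Nothing here reproduces the paper's multi-robot fusion architecture, and all noise levels are assumptions.

      import numpy as np

      def particle_filter_step(particles, control, measurement, beacon, rng,
                               motion_std=0.1, meas_std=0.3):
          """One predict-update-resample cycle of a bootstrap particle filter (1-D)."""
          # Predict: apply the odometry control with motion noise
          particles = particles + control + rng.normal(0.0, motion_std, size=particles.shape)
          # Update: weight by the likelihood of the range measurement to the beacon
          expected = np.abs(beacon - particles)
          w = np.exp(-0.5 * ((measurement - expected) / meas_std) ** 2)
          w /= w.sum()
          # Resample: draw particles proportionally to their weights
          idx = rng.choice(len(particles), size=len(particles), p=w)
          return particles[idx]

      rng = np.random.default_rng(0)
      beacon = 10.0
      true_x = 0.0
      particles = rng.uniform(-5.0, 5.0, size=1000)

      for _ in range(30):
          control = 0.3                                    # commanded forward motion
          true_x += control + rng.normal(0.0, 0.1)         # true (noisy) motion
          z = abs(beacon - true_x) + rng.normal(0.0, 0.3)  # noisy range to the beacon
          particles = particle_filter_step(particles, control, z, beacon, rng)

      print(true_x, particles.mean())                      # the estimate tracks the true position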

  18. Cochrane Qualitative and Implementation Methods Group guidance series-paper 2: methods for question formulation, searching, and protocol development for qualitative evidence synthesis.

    Harris, Janet L; Booth, Andrew; Cargo, Margaret; Hannes, Karin; Harden, Angela; Flemming, Kate; Garside, Ruth; Pantoja, Tomas; Thomas, James; Noyes, Jane

    2018-05-01

    This paper updates previous Cochrane guidance on question formulation, searching, and protocol development, reflecting recent developments in methods for conducting qualitative evidence syntheses to inform Cochrane intervention reviews. Examples are used to illustrate how decisions about boundaries for a review are formed via an iterative process of constructing lines of inquiry and mapping the available information to ascertain whether evidence exists to answer questions related to effectiveness, implementation, feasibility, appropriateness, economic evidence, and equity. The process of question formulation allows reviewers to situate the topic in relation to how it informs and explains effectiveness, using the criterion of meaningfulness, appropriateness, feasibility, and implementation. Questions related to complex questions and interventions can be structured by drawing on an increasingly wide range of question frameworks. Logic models and theoretical frameworks are useful tools for conceptually mapping the literature to illustrate the complexity of the phenomenon of interest. Furthermore, protocol development may require iterative question formulation and searching. Consequently, the final protocol may function as a guide rather than a prescriptive route map, particularly in qualitative reviews that ask more exploratory and open-ended questions. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Nutrition and biomarkers in psychiatry : research on micronutrient deficiencies in schizophrenia, the role of the intestine in the hyperserotonemia of autism, and a method for non-hypothesis driven discovery of biomarkers in urine

    Kemperman, Ramses Franciscus Jacobus

    2007-01-01

    This thesis describes the study of markers of nutrition and intestinal motility in mental disorders with a focus on schizophrenia and autism, and the development, evaluation and application of a biomarker discovery method for urine. The aim of the thesis is to investigate the role of long-chain

  20. Searching for Life with Rovers: Exploration Methods & Science Results from the 2004 Field Campaign of the "Life in the Atacama" Project and Applications to Future Mars Missions

    Cabrol, N. A.; Wettergreen, D. S.; Whittaker, R.; Grin, E. A.; Moersch, J.; Diaz, G. Chong; Cockell, C.; Coppin, P.; Dohm, J. M.; Fisher, G.

    2005-01-01

    The Life In The Atacama (LITA) project develops and field tests a long-range, solar-powered, automated rover platform (Zoë) and a science payload assembled to search for microbial life in the Atacama desert. Life is barely detectable over most of the driest desert on Earth. Its unique geological, climatic, and biological evolution has created a unique training site for designing and testing exploration strategies and life detection methods for the robotic search for life on Mars.

  1. Applications and Methods Utilizing the Simple Semantic Web Architecture and Protocol (SSWAP) for Bioinformatics Resource Discovery and Disparate Data and Service Integration

    Scientific data integration and computational service discovery are challenges for the bioinformatic community. This process is made more difficult by the separate and independent construction of biological databases, which makes the exchange of scientific data between information resources difficu...

  2. Fluorination methods in drug discovery

    Yerien, Damián Emilio; Bonesi, Sergio Mauricio; Postigo, Jose Alberto

    2017-01-01

    Fluorination reactions of medicinal and biologically active compounds will be discussed. Late-stage fluorination strategies for medicinal targets have recently attracted considerable attention on account of the influence that the fluorine atom can impart on targets of medicinal importance, such as modulation of lipophilicity, electronegativity, basicity, and bioavailability, the latter as a consequence of membrane permeability. Therefore, the recourse to late-stage fluorine substitution on c...

  3. Strategies for the search of life in the universe

    Schneider, Jean

    1996-01-01

    The discovery of an increasing number of Jupiter-like planets in orbit around other stars (extra-solar planets) is a promising first step toward the search for Life in the Universe. We review all aspects of the question: the definition of Life; the definition and characterization of the 'habitable zone' around a star; an overview of detection methods for planets, with special attention to habitable planets; present findings; and future projects.

  4. Search route decision of environmental monitoring at emergency time

    Aoyama, Isao

    1979-01-01

    The search route decision method is reviewed, in particular the adequate arrangement of monitors over time when information is gathered by moving monitors across the horizontal space after an abnormal release of radioactive material has been confirmed. The development of the theory of search is outlined, from naval anti-submarine operations in World War II to salvage activities and search problems at sea. The kinematics of search, the probability theory of detection, and the optimum distribution of search effort are the most important elements of search theory as applied to environmental monitoring under emergency conditions. A search model combines the characteristics of the targets, the characteristics of the observers, and the standard of optimality. The characteristics of the targets comprise the search space, the number of targets, the way the targets appear, and the motion of the targets. The characteristics of the observers comprise the number of observers, the divisibility of the search effort, the credibility of the search information, and the search process. The standard of optimality may be the maximum probability of detection, the minimum expected risk, or other criteria. Each element of the search model is explained. For the formulation of the search model, theoretical equations for the detection probability, the discovery potential, and the instantaneous detection probability density are derived, evaluated, and explained. The future plan is to advance the search technology so that the detection potential can be evaluated to decide the route of a monitoring car for a nuclear power plant under accident conditions. (Nakai, Y.)
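
    For orientation, two standard expressions from the theory of search that correspond to the quantities named above: the cumulative detection probability built from an instantaneous detection probability density gamma(t), and Koopman's random-search formula for a sensor of sweep width W covering track length L in an area A. These are textbook forms, not necessarily the exact equations derived in the review:

        P_{\mathrm{det}}(t) \;=\; 1 - \exp\!\Big(-\int_0^t \gamma(\tau)\,\mathrm{d}\tau\Big),
        \qquad
        P_{\mathrm{random\ search}} \;=\; 1 - e^{-WL/A}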

  5. Searching for intermediate-mass black holes in galaxies with low-luminosity AGN: a multiple-method approach

    Koliopanos, F.; Ciambur, B.; Graham, A.; Webb, N.; Coriat, M.; Mutlu-Pakdil, B.; Davis, B.; Godet, O.; Barret, D.; Seigar, M.

    2017-10-01

    Intermediate-mass black holes (IMBHs) are predicted by a variety of models and are the likely seeds for supermassive BHs (SMBHs). However, we have yet to establish their existence. One method by which we can discover IMBHs is by measuring the mass of an accreting BH using X-ray and radio observations, drawing on the correlation between radio luminosity, X-ray luminosity, and BH mass known as the fundamental plane of BH activity (FP-BH). Furthermore, the mass of BHs in the centers of galaxies can be estimated using scaling relations between BH mass and galactic properties. We are initiating a campaign to search for IMBH candidates in dwarf galaxies with low-luminosity AGN, using, for the first time, three different scaling relations and the FP-BH simultaneously. In this first stage of our campaign, we measure the masses of seven LLAGN that have previously been suggested to host central IMBHs, investigate the consistency between the predictions of the BH scaling relations and the FP-BH in the low-mass regime, and demonstrate that this multiple-method approach provides a robust average mass prediction. In my talk, I will discuss our methodology, results, and the next steps of this campaign.
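
    For reference only, a schematic form of the fundamental plane of BH activity used in such mass estimates, relating radio luminosity L_R, X-ray luminosity L_X, and BH mass M_BH; the coefficients xi_RX, xi_RM, and b_R are left symbolic here rather than taken from any particular published fit:

        \log L_R \;=\; \xi_{RX}\,\log L_X \;+\; \xi_{RM}\,\log M_{\mathrm{BH}} \;+\; b_R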

  6. AGAPEROS Searches for microlensing in the LMC with the Pixel Method; 1, Data treatment and pixel light curves production

    Melchior, A.-L.; Ansari, R.; Aubourg, E.; Baillon, P.; Bareyre, P.; Bauer, F.; Beaulieu, J.-Ph.; Bouquet, A.; Brehin, S.; Cavalier, F.; Char, S.; Couchot, F.; Coutures, C.; Ferlet, R.; Fernandez, J.; Gaucherel, C.; Giraud-Heraud, Y.; Glicenstein, J.-F.; Goldman, B.; Gondolo, P.; Gros, M.; Guibert, J.; Gry, C.; Hardin, D.; Kaplan, J.; de Kat, J.; Lachieze-Rey, M.; Laurent, B.; Lesquoy, E.; Magneville, Ch.; Mansoux, B.; Marquette, J.-B.; Maurice, E.; Milsztajn, A.; Moniez, M.; Moreau, O.; Moscoso, L.; Palanque-Delabrouille, N.; Perdereau, O.; Prevot, L.; Renault, C.; Queinnec, F.; Rich, J.; Spiro, M.; Vigroux, L.; Zylberajch, S.; Vidal-Madjar, A.; Magneville, Ch.

    1999-01-01

    The presence and abundance of MAssive Compact Halo Objects (MACHOs) towards the Large Magellanic Cloud (LMC) can be studied with microlensing searches. The 10 events detected by the EROS and MACHO groups suggest that objects of about 0.5 solar masses could fill 50% of the dark halo. This preferred mass is quite surprising, and increasing the presently small statistics is a crucial issue. Additional microlensing of stars too dim to be resolved in crowded fields should be detectable using the Pixel Method. We present here an application of this method to the EROS 91-92 data (one tenth of the whole existing data set). We emphasize the data treatment required for monitoring pixel fluxes. Geometric and photometric alignments are performed on each image. Seeing correction and error estimates are discussed. The 3.6" x 3.6" super-pixel light curves thus produced are very stable over the 120-day time span. Fluctuations at a level of 1.8% of the flux in blue and 1.3% in red are measured on the pixel light curves. This level of stabil...

  7. Astrobiology, discovery, and societal impact

    Dick, Steven J

    2018-01-01

    The search for life in the universe, once the stuff of science fiction, is now a robust worldwide research program with a well-defined roadmap probing both scientific and societal issues. This volume examines the humanistic aspects of astrobiology, systematically discussing the approaches, critical issues, and implications of discovering life beyond Earth. What do the concepts of life and intelligence, culture and civilization, technology and communication mean in a cosmic context? What are the theological and philosophical implications if we find life - and if we do not? Steven J. Dick argues that given recent scientific findings, the discovery of life in some form beyond Earth is likely and so we need to study the possible impacts of such a discovery and formulate policies to deal with them. The remarkable and often surprising results are presented here in a form accessible to disciplines across the sciences, social sciences, and humanities.

  8. Search for New Quantum Algorithms

    Lomonaco, Samuel J; Kauffman, Louis H

    2006-01-01

    .... Additionally, methods and techniques of quantum topology have been used to obtain new results in quantum computing including discovery of a relationship between quantum entanglement and topological linking...

  9. Discovery of a general method of solving the Schrödinger and dirac equations that opens a way to accurately predictive quantum chemistry.

    Nakatsuji, Hiroshi

    2012-09-18

    Just as Newtonian law governs classical physics, the Schrödinger equation (SE) and the relativistic Dirac equation (DE) rule the world of chemistry. So, if we can solve these equations accurately, we can use computation to predict chemistry precisely. However, for approximately 80 years after the discovery of these equations, chemists believed that they could not solve SE and DE for atoms and molecules that included many electrons. This Account reviews ideas developed over the past decade to further the goal of predictive quantum chemistry. Between 2000 and 2005, I discovered a general method of solving the SE and DE accurately. As a first inspiration, I formulated the structure of the exact wave function of the SE in a compact mathematical form. The explicit inclusion of the exact wave function's structure within the variational space allows for the calculation of the exact wave function as a solution of the variational method. Although this process sounds almost impossible, it is indeed possible, and I have published several formulations and applied them to solve the full configuration interaction (CI) with a very small number of variables. However, when I examined analytical solutions for atoms and molecules, the Hamiltonian integrals in their secular equations diverged. This singularity problem occurred in all atoms and molecules because it originates from the singularity of the Coulomb potential in their Hamiltonians. To overcome this problem, I first introduced the inverse SE and then the scaled SE. The latter simpler idea led to immediate and surprisingly accurate solution for the SEs of the hydrogen atom, helium atom, and hydrogen molecule. The free complement (FC) method, also called the free iterative CI (free ICI) method, was efficient for solving the SEs. In the FC method, the basis functions that span the exact wave function are produced by the Hamiltonian of the system and the zeroth-order wave function. These basis functions are called complement

  10. Search for antimatter in 10^12 eV cosmic rays using the ARTEMIS method and interpretation of the cosmic-ray spectrum

    Pomarede, D.

    1999-04-01

    This thesis is divided into three parts. The first part is a review of the present knowledge of antimatter and of cosmic rays. Theoretical and experimental aspects are presented. It is demonstrated that a measurement of the antimatter abundance in TeV cosmic rays is of fundamental interest and would establish the symmetric or asymmetric nature of the Universe. The second part is dedicated to the method of antimatter research through the Earth-Moon ion spectrometer (ARTEMIS). An account is given of the 41-night observation campaign undertaken in the winter of 1996-97 at the Whipple Observatory in Arizona (USA). A 109-photomultiplier camera is operated on the 40 meter telescope to detect, by Cherenkov imaging, the showers initiated by cosmic rays. We describe the performance of an optical filter used to reduce the noise. The development and use of a simulation program are described. The main work is the analysis of the data: data characterization, understanding of the apparatus, understanding of the noise and its influence, calibration, and the search for signals by different methods. Subtle systematic effects are uncovered. The simulations establish that the amount of data is insufficient to reveal a shadow effect in the cosmic-ray flux. The conclusion of this work is that the experimental setup was not suitable, and we propose important improvements to the method, based on a bigger focal plane, that would allow a one percent sensitivity on the antimatter content of the cosmic rays to be reached. In the third part of the thesis, an interpretation of the total cosmic-ray spectrum is proposed and discussed. (author)

  11. Usability of Discovery Portals

    Bulens, J.D.; Vullings, L.A.E.; Houtkamp, J.M.; Vanmeulebrouk, B.

    2013-01-01

    As INSPIRE progresses to be implemented in the EU, many new discovery portals are built to facilitate finding spatial data. Currently the structure of the discovery portals is determined by the way spatial data experts like to work. However, we argue that the main target group for discovery portals are not spatial data experts but professionals with limited spatial knowledge, and a focus outside the spatial domain. An exploratory usability experiment was carried out in which three discovery p...

  12. User needs analysis and usability assessment of DataMed - a biomedical data discovery index.

    Dixit, Ram; Rogith, Deevakar; Narayana, Vidya; Salimi, Mandana; Gururaj, Anupama; Ohno-Machado, Lucila; Xu, Hua; Johnson, Todd R

    2017-11-30

    To present user needs and usability evaluations of DataMed, a Data Discovery Index (DDI) that allows searching for biomedical data from multiple sources. We conducted 2 phases of user studies. Phase 1 was a user needs analysis conducted before the development of DataMed, consisting of interviews with researchers. Phase 2 involved iterative usability evaluations of DataMed prototypes. We analyzed data qualitatively to document researchers' information and user interface needs. Biomedical researchers' information needs in data discovery are complex, multidimensional, and shaped by their context, domain knowledge, and technical experience. User needs analyses validate the need for a DDI, while usability evaluations of DataMed show that even though aggregating metadata into a common search engine and applying traditional information retrieval tools are promising first steps, there remain challenges for DataMed due to incomplete metadata and the complexity of data discovery. Biomedical data poses distinct problems for search when compared to websites or publications. Making data available is not enough to facilitate biomedical data discovery: new retrieval techniques and user interfaces are necessary for dataset exploration. Consistent, complete, and high-quality metadata are vital to enable this process. While available data and researchers' information needs are complex and heterogeneous, a successful DDI must meet those needs and fit into the processes of biomedical researchers. Research directions include formalizing researchers' information needs, standardizing overviews of data to facilitate relevance judgments, implementing user interfaces for concept-based searching, and developing evaluation methods for open-ended discovery systems such as DDIs.

  13. A Novel Method Using Abstract Convex Underestimation in Ab-Initio Protein Structure Prediction for Guiding Search in Conformational Feature Space.

    Hao, Xiao-Hu; Zhang, Gui-Jun; Zhou, Xiao-Gen; Yu, Xu-Feng

    2016-01-01

    To address the problem of searching the protein conformational space in ab initio protein structure prediction, a novel method using abstract convex underestimation (ACUE), based on the framework of an evolutionary algorithm, was proposed. Computing such conformations, essential to associate structural and functional information with gene sequences, is challenging due to the high dimensionality and rugged energy surface of the protein conformational space. As a consequence, the dimension of the protein conformational space should be reduced to a proper level. In this paper, the high-dimensional original conformational space was converted into a feature space whose dimension is considerably reduced by a feature-extraction technique, and the underestimate space could then be constructed according to abstract convex theory. Thus, the entropy effect caused by searching in the high-dimensional conformational space could be avoided through this conversion. Tight lower-bound estimate information was obtained to guide the search direction, and invalid search areas in which the global optimal solution is not located could be eliminated in advance. Moreover, instead of expensively calculating the energy of conformations in the original conformational space, the estimated value is employed to judge whether a conformation is worth exploring, which reduces the evaluation time and thereby makes the computational cost lower and the search process more efficient. Additionally, fragment assembly and the Monte Carlo method are combined to generate a series of metastable conformations by sampling in the conformational space. The proposed method provides a novel technique for solving the search problem of the protein conformational space. Twenty small-to-medium structurally diverse proteins were tested, and the proposed ACUE method was compared with ItFix, HEA, Rosetta, and the developed method LEDE without underestimate information. Test results show that the ACUE method can more rapidly and more

  14. The limits of de novo DNA motif discovery.

    David Simcha

    A major challenge in molecular biology is reverse-engineering the cis-regulatory logic that plays a major role in the control of gene expression. This program includes searching through DNA sequences to identify "motifs" that serve as the binding sites for transcription factors or, more generally, are predictive of gene expression across cellular conditions. Several approaches have been proposed for de novo motif discovery, that is, searching sequences without prior knowledge of binding sites or nucleotide patterns. However, unbiased validation is not straightforward. We consider two approaches to unbiased validation of discovered motifs: testing the statistical significance of a motif using a DNA "background" sequence model to represent the null hypothesis, and measuring performance in predicting membership in gene clusters. We demonstrate that the background models typically used are "too null," resulting in overly optimistic assessments of significance, and argue that performance in predicting TF binding or expression patterns from DNA motifs should be assessed on held-out data, as in predictive learning. Applying this criterion to common motif discovery methods resulted in universally poor performance, although there is a marked improvement when motifs are statistically significant against real background sequences. Moreover, on synthetic data where "ground truth" is known, the discriminative performance of all algorithms is far below the theoretical upper bound, with pronounced "over-fitting" in training. A key conclusion from this work is that the failure of de novo discovery approaches to accurately identify motifs is basically due to statistical intractability resulting from the fixed size of co-regulated gene clusters, and thus such failures do not necessarily provide evidence that unfound motifs are not active biologically. Consequently, the use of prior knowledge to enhance motif discovery is not just advantageous but necessary. An implementation of

  15. Usability of Discovery Portals

    Bulens, J.D.; Vullings, L.A.E.; Houtkamp, J.M.; Vanmeulebrouk, B.

    2013-01-01

    As INSPIRE progresses to be implemented in the EU, many new discovery portals are built to facilitate finding spatial data. Currently the structure of the discovery portals is determined by the way spatial data experts like to work. However, we argue that the main target group for discovery portals

  16. Discovery and the atom

    1989-01-01

    "Discovery and the Atom" tells the story of the founding of nuclear physics. This programme looks at nuclear physics up to the discovery of the neutron in 1932. Animation explains the science of the classic experiments, such as the scattering of alpha particles by Rutherford and the discovery of the nucleus. Archive film shows the people: Lord Rutherford, James Chadwick, Marie Curie. (author)

  17. Mapping online consumer search

    Bronnenberg, B.J.; Kim, J.; Albuquerque, P.

    2011-01-01

    The authors propose a new method to visualize browsing behavior in so-called product search maps. Manufacturers can use these maps to understand how consumers search for competing products before choice, including how information acquisition and product search are organized along brands, product

  18. Promise Fulfilled? An EBSCO Discovery Service Usability Study

    Williams, Sarah C.; Foster, Anita K.

    2011-01-01

    Discovery tools are the next phase of library search systems. Illinois State University's Milner Library implemented EBSCO Discovery Service in August 2010. The authors conducted usability studies on the system in the fall of 2010. The aims of the study were twofold: first, to determine how Milner users set about using the system in order to…

  19. Measurements of the Stiffness and Thickness of the Pavement Asphalt Layer Using the Enhanced Resonance Search Method

    Nur Mustakiza Zakaria

    2014-01-01

    Enhanced resonance search (ERS) is a nondestructive testing method created to evaluate the quality of a pavement by means of a special instrument called the pavement integrity scanner (PiScanner). This technique can be used to assess the thickness of the road pavement structure and the profile of shear wave velocity by using the principle of surface wave and body wave propagation. In this study, the ERS technique was used to determine the actual thickness of the asphaltic pavement surface layer, while the shear wave velocities obtained were used to determine its dynamic elastic modulus. A total of fifteen locations were identified, and the results were then compared with the specifications of the Malaysian PWD, MDD UKM, and IKRAM. It was found that the elastic modulus of the materials lies between 3929 MPa and 17726 MPa. A comparison of the average thickness of the samples with the design thickness of MDD UKM showed a difference of 20 to 60%. The thickness of the asphalt surface layer followed the specifications of the Malaysian PWD and MDD UKM, while some of the stiffness values obtained are higher than the standard.
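
    A hedged note on the conversion implied above, using standard small-strain elastodynamics (not necessarily the exact expressions of the paper): with shear-wave velocity V_s, mass density rho, and an assumed Poisson's ratio nu, the shear and elastic moduli follow as

        G = \rho V_s^2, \qquad E = 2G(1+\nu) = 2\rho V_s^2\,(1+\nu)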

  20. Search for rare processes with a Z+bb signature at the LHC, with the matrix element method

    Beluffi, Camille; Lemaitre, Vincent

    This thesis presents a detailed study of the final state with the Z boson decaying into two leptons, produced in the CMS detector at the LHC. In order to tag this topology, sophisticated b-jet tagging algorithms have been used, and the calibration of one of them, the Jet Probability (JP) tagger, is presented. A study of the tagger degradation at high energy has been done and led to a small gain in performance. This investigation is followed by the search for the associated production of the standard model (SM) Higgs boson with a Z boson and decaying into two b quarks (ZH channel), using the Matrix Element Method (MEM) and two b-taggers: JP and Combined Secondary Vertex (CSV). The MEM is an advanced tool that produces an event-by-event discriminating variable, called the weight. To apply it, several sets of transfer functions have been produced. The final results give an observed limit on the ZH production cross section times the H → bb branching ratio of 5.46 × σ_SM when using the CSV tagger and 4.89 × σ_SM when using t...

  1. A New Method for a Piezoelectric Energy Harvesting System Using a Backtracking Search Algorithm-Based PI Voltage Controller

    Mahidur R. Sarker

    2016-09-01

    This paper presents a new method for a vibration-based piezoelectric energy harvesting system using a backtracking search algorithm (BSA)-based proportional-integral (PI) voltage controller. This technique eliminates the exhaustive conventional trial-and-error procedure for obtaining optimized values of the proportional gain (Kp) and integral gain (Ki) of the PI voltage controller. The estimated values of Kp and Ki are used in the PI voltage controller that is developed through the BSA optimization technique. In this study, the mean absolute error (MAE) is used as the objective function to minimize the output error of the piezoelectric energy harvesting system (PEHS). The model of the PEHS is designed and analyzed using the BSA optimization technique. The BSA-based PI voltage controller of the PEHS produces a significant improvement in minimizing the output error of the converter and a robust, regulated pulse-width modulation (PWM) signal to drive a MOSFET switch, with the best response in terms of rise time and settling time under various load conditions.
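
    A minimal sketch of tuning PI gains by minimizing the mean absolute error, using plain random search as a simplified stand-in for the backtracking search algorithm; the first-order plant model, set-point, and gain bounds below are invented for illustration:

        import numpy as np

        def simulate_pi(kp, ki, setpoint=1.0, dt=1e-3, steps=2000, tau=0.05, gain=2.0):
            """Simulate a PI controller on a hypothetical first-order plant
            y' = (gain*u - y)/tau and return the mean absolute error (MAE)."""
            y, integral, errors = 0.0, 0.0, []
            for _ in range(steps):
                e = setpoint - y
                integral += e * dt
                u = kp * e + ki * integral          # PI control law
                y += dt * (gain * u - y) / tau      # plant update (Euler step)
                errors.append(abs(e))
            return float(np.mean(errors))

        # Random search over (Kp, Ki), a simplified stand-in for BSA optimization.
        rng = np.random.default_rng(0)
        best = min(((simulate_pi(kp, ki), kp, ki)
                    for kp, ki in rng.uniform([0.0, 0.0], [10.0, 200.0], size=(500, 2))),
                   key=lambda t: t[0])
        print("MAE=%.4f  Kp=%.2f  Ki=%.1f" % best)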

  2. Search for supersymmetry in dileptonic final states with jets and missing transverse energy with the JZB method at CMS

    Feld, Lutz; Schulte, Jan-Frederik; Teroerde, Marius [I. Physikalisches Institut B, RWTH Aachen University (Germany)

    2016-07-01

    Supersymmetry (SUSY) is a popular extension of the Standard Model of particle physics, as it would solve a variety of problems in modern physics. In the model considered in the presented analysis, a possible final state contains jets as well as the stable lightest supersymmetric particle (LSP), which is produced together with a leptonically decaying Z boson. Thus, the signal is characterized by two same-flavour opposite-sign leptons, missing transverse energy (MET), and the presence of two or more jets. An important background for this search is the Drell-Yan process with additional jets, as it has a similar event topology. In contrast to SUSY events, Drell-Yan events only contain instrumental MET. Therefore, the variable "jet-Z balance" (JZB), which takes the transverse momentum of the Z boson and the hadronic recoil into account, is distributed differently for Drell-Yan and SUSY events. This allows the definition of a signal-depleted control region which is used to predict the Drell-Yan background. The JZB method was successfully used in several analyses at √(s)=7-8 TeV. This talk shows the current progress towards its application to the dataset collected at √(s)=13 TeV.
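
    For reference, the jet-Z balance is commonly defined as the difference between the magnitude of the hadronic recoil (the vector sum of the jet transverse momenta) and the transverse momentum of the Z boson; the exact jet and lepton selections of the analysis are not reproduced here:

        \mathrm{JZB} \;=\; \Big|\sum_{\text{jets}} \vec{p}_T\Big| \;-\; \big|\vec{p}_T^{\,Z}\big|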

  3. Evaluation of the fast orthogonal search method for forecasting chloride levels in the Deltona groundwater supply (Florida, USA)

    El-Jaat, Majda; Hulley, Michael; Tétreault, Michel

    2018-02-01

    Despite the broad impact and importance of saltwater intrusion in coastal aquifers, little research has been directed towards forecasting saltwater intrusion in areas where the source of saltwater is uncertain. Saline contamination in inland groundwater supplies is a concern for numerous communities in the southern US including the city of Deltona, Florida. Furthermore, conventional numerical tools for forecasting saltwater contamination are heavily dependent on reliable characterization of the physical characteristics of underlying aquifers, information that is often absent or challenging to obtain. To overcome these limitations, a reliable alternative data-driven model for forecasting salinity in a groundwater supply was developed for Deltona using the fast orthogonal search (FOS) method. FOS was applied on monthly water-demand data and corresponding chloride concentrations at water supply wells. Groundwater salinity measurements from Deltona water supply wells were applied to evaluate the forecasting capability and accuracy of the FOS model. Accurate and reliable groundwater salinity forecasting is necessary to support effective and sustainable coastal-water resource planning and management. The available (27) water supply wells for Deltona were randomly split into three test groups for the purposes of FOS model development and performance assessment. Based on four performance indices (RMSE, RSR, NSEC, and R), the FOS model proved to be a reliable and robust forecaster of groundwater salinity. FOS is relatively inexpensive to apply, is not based on rigorous physical characterization of the water supply aquifer, and yields reliable estimates of groundwater salinity in active water supply wells.
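
    A rough sketch of the flavour of fast orthogonal search: greedy forward selection of candidate terms by residual reduction. This simplified version uses ordinary least squares instead of the Gram-Schmidt orthogonalization of the full FOS algorithm, and the candidate terms and data below are invented for illustration:

        import numpy as np

        def greedy_forward_selection(candidates, y, n_terms=3):
            """Pick candidate regressors one at a time, each time choosing the
            column that most reduces the residual sum of squares of a fit."""
            chosen = []
            for _ in range(n_terms):
                best = None
                for j in range(candidates.shape[1]):
                    if j in chosen:
                        continue
                    X = candidates[:, chosen + [j]]
                    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
                    rss = float(np.sum((y - X @ coef) ** 2))
                    if best is None or rss < best[0]:
                        best = (rss, j)
                chosen.append(best[1])
            return chosen

        # Hypothetical example: monthly water demand and lagged demand as
        # candidate terms for forecasting chloride concentration.
        rng = np.random.default_rng(1)
        demand = rng.normal(size=120)
        cands = np.column_stack([np.ones(120), demand,
                                 np.roll(demand, 1), np.roll(demand, 2), demand ** 2])
        chloride = 3.0 + 0.8 * demand + 0.3 * np.roll(demand, 1) + rng.normal(scale=0.1, size=120)
        print(greedy_forward_selection(cands, chloride))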

  4. A hybrid artificial bee colony algorithm and pattern search method for inversion of particle size distribution from spectral extinction data

    Wang, Li; Li, Feng; Xing, Jian

    2017-10-01

    In this paper, a hybrid artificial bee colony (ABC) algorithm and pattern search (PS) method is proposed and applied for the recovery of particle size distribution (PSD) from spectral extinction data. To be more useful and practical, the size distribution function is modelled as the general Johnson's S_B function, which can overcome the difficulty of not knowing the exact distribution type beforehand, as encountered in many real circumstances. The proposed hybrid algorithm is evaluated through simulated examples involving unimodal, bimodal and trimodal PSDs with different widths and mean particle diameters. For comparison, all examples are additionally validated by the single ABC algorithm. In addition, the performance of the proposed algorithm is further tested by actual extinction measurements with real standard polystyrene samples immersed in water. Simulation and experimental results illustrate that the hybrid algorithm can be used as an effective technique to retrieve PSDs with high reliability and accuracy. Compared with the single ABC algorithm, the proposed algorithm produces more accurate and robust inversion results while taking almost comparable CPU time. The superiority of the ABC and PS hybridization strategy, in terms of reaching a better balance of estimation accuracy and computation effort, increases its potential as an excellent inversion technique for reliable and efficient actual measurement of PSD.
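
    A minimal sketch of the local pattern-search (compass search) polish that such a hybrid typically applies to the best candidate from the global ABC stage; the objective below is a placeholder, not the spectral-extinction misfit used in the paper:

        import numpy as np

        def compass_search(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=1000):
            """Refine x0 by probing +/- step along each coordinate; if no probe
            improves f, shrink the step until it falls below tol."""
            x, fx = np.asarray(x0, dtype=float), f(x0)
            for _ in range(max_iter):
                improved = False
                for i in range(x.size):
                    for sign in (+1.0, -1.0):
                        trial = x.copy()
                        trial[i] += sign * step
                        ft = f(trial)
                        if ft < fx:
                            x, fx, improved = trial, ft, True
                if not improved:
                    step *= shrink
                    if step < tol:
                        break
            return x, fx

        # Placeholder objective: squared distance to a "true" parameter vector.
        target = np.array([1.2, -0.7, 3.4])
        x, fx = compass_search(lambda p: float(np.sum((np.asarray(p) - target) ** 2)),
                               x0=[0.0, 0.0, 0.0])
        print(x, fx)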

  5. A method for combining search coil and fluxgate magnetometer data to reveal finer structures in reconnection physics

    Argall, M. R.; Caide, A.; Chen, L.; Torbert, R. B.

    2012-12-01

    Magnetometers have been used to measure terrestrial and extraterrestrial magnetic fields in space exploration ever since Sputnik 3. Modern space missions, such as Cluster, RBSP, and MMS, incorporate both search coil magnetometers (SCMs) and fluxgate magnetometers (FGMs) in their instrument suites: FGMs work well at low frequencies, while SCMs perform better at high frequencies. In analyzing the noise floor of these instruments, a cross-over region is apparent around 0.3-1.5 Hz. The satellite separation of MMS and the average speeds of field convection and plasma flows at the subsolar magnetopause make this a crucial range for the upcoming MMS mission. The method presented here combines the signals from the SCM and the FGM by taking a weighted average of both in this frequency range in order to draw out key features, such as narrow current-sheet structures, that would otherwise not be visible. The technique is applied to burst-mode Cluster data for reported magnetopause and magnetotail reconnection events to demonstrate the power of the combined data. This technique is also applied to data from the EMFISIS instrument on the RBSP mission. The authors acknowledge and thank the FGM and STAFF teams for the use of their data from the Cluster Active Archive.
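
    A hedged sketch of one way such a frequency-domain merge can be implemented: a linear cross-fade between the two spectra across an assumed 0.3-1.5 Hz transition band. The actual mission processing and calibration steps are not reproduced here:

        import numpy as np

        def merge_fgm_scm(b_fgm, b_scm, fs, f_lo=0.3, f_hi=1.5):
            """Blend two equally sampled magnetic-field time series in the
            frequency domain: keep FGM below f_lo, SCM above f_hi, and
            cross-fade linearly in between."""
            freqs = np.fft.rfftfreq(len(b_fgm), d=1.0 / fs)
            w_scm = np.clip((freqs - f_lo) / (f_hi - f_lo), 0.0, 1.0)  # 0 -> FGM, 1 -> SCM
            spec = (1.0 - w_scm) * np.fft.rfft(b_fgm) + w_scm * np.fft.rfft(b_scm)
            return np.fft.irfft(spec, n=len(b_fgm))

        # Synthetic example: the same underlying signal seen by both instruments,
        # each with its own additive noise.
        fs = 64.0
        t = np.arange(0, 60, 1 / fs)
        signal = np.sin(2 * np.pi * 0.1 * t) + 0.2 * np.sin(2 * np.pi * 4.0 * t)
        merged = merge_fgm_scm(signal + 0.05 * np.random.randn(t.size),
                               signal + 0.05 * np.random.randn(t.size), fs)
        print(merged.shape)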

  6. Methods to improve and understand the sensitivity of high purity germanium detectors for searches of rare events

    Volynets, Oleksandr

    2012-01-01

    Observation of neutrinoless double beta-decay could answer fundamental questions on the nature of neutrinos. High purity germanium detectors are well suited to search for this rare process in germanium. Successful operation of such experiments requires a good understanding of the detectors and the sources of background. Possible background sources not considered before in the presently running GERDA high purity germanium detector experiment were studied. Pulse shape analysis using artificial neural networks was used to distinguish between signal-like and background-like events. Pulse shape simulation was used to investigate systematic effects influencing the efficiency of the method. Possibilities to localize the origin of unwanted radiation using Compton back-tracking in a granular detector system were examined. Systematic effects in high purity germanium detectors influencing their performance have been further investigated using segmented detectors. The behavior of the detector response at different operational temperatures was studied. The anisotropy effects due to the crystallographic structure of germanium were facilitated in a novel way to determine the orientation of the crystallographic axes.

  7. Methods to improve and understand the sensitivity of high purity germanium detectors for searches of rare events

    Volynets, Oleksandr

    2012-07-27

    Observation of neutrinoless double beta-decay could answer fundamental questions on the nature of neutrinos. High purity germanium detectors are well suited to search for this rare process in germanium. Successful operation of such experiments requires a good understanding of the detectors and the sources of background. Possible background sources not considered before in the presently running GERDA high purity germanium detector experiment were studied. Pulse shape analysis using artificial neural networks was used to distinguish between signal-like and background-like events. Pulse shape simulation was used to investigate systematic effects influencing the efficiency of the method. Possibilities to localize the origin of unwanted radiation using Compton back-tracking in a granular detector system were examined. Systematic effects in high purity germanium detectors influencing their performance have been further investigated using segmented detectors. The behavior of the detector response at different operational temperatures was studied. The anisotropy effects due to the crystallographic structure of germanium were facilitated in a novel way to determine the orientation of the crystallographic axes.

  8. In-silico guided discovery of novel CCR9 antagonists

    Zhang, Xin; Cross, Jason B.; Romero, Jan; Heifetz, Alexander; Humphries, Eric; Hall, Katie; Wu, Yuchuan; Stucka, Sabrina; Zhang, Jing; Chandonnet, Haoqun; Lippa, Blaise; Ryan, M. Dominic; Baber, J. Christian

    2018-03-01

    Antagonism of CCR9 is a promising mechanism for treatment of inflammatory bowel disease, including ulcerative colitis and Crohn's disease. There is limited experimental data on CCR9 and its ligands, complicating efforts to identify new small molecule antagonists. We present here results of a successful virtual screening and rational hit-to-lead campaign that led to the discovery and initial optimization of novel CCR9 antagonists. This work uses a novel data fusion strategy to integrate the output of multiple computational tools, such as 2D similarity search, shape similarity, pharmacophore searching, and molecular docking, as well as the identification and incorporation of privileged chemokine fragments. The application of various ranking strategies, which combined consensus and parallel selection methods to achieve a balance of enrichment and novelty, resulted in 198 virtual screening hits in total, with an overall hit rate of 18%. Several hits were developed into early leads through targeted synthesis and purchase of analogs.
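
    As an illustration of one common consensus-ranking scheme that could combine the outputs of such tools (reciprocal rank fusion); this is a generic technique, not necessarily the ranking strategy used in the paper, and the compound identifiers are hypothetical:

        def reciprocal_rank_fusion(rankings, k=60):
            """Combine several ranked lists of compound IDs into one consensus
            list. Each compound scores sum(1 / (k + rank)) over the lists."""
            scores = {}
            for ranking in rankings:
                for rank, compound in enumerate(ranking, start=1):
                    scores[compound] = scores.get(compound, 0.0) + 1.0 / (k + rank)
            return sorted(scores, key=scores.get, reverse=True)

        # Hypothetical outputs of a 2D similarity search, a shape screen, and docking.
        print(reciprocal_rank_fusion([
            ["cmpd_7", "cmpd_3", "cmpd_9"],
            ["cmpd_3", "cmpd_1", "cmpd_7"],
            ["cmpd_9", "cmpd_3", "cmpd_5"],
        ]))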

  9. Reflections on New Search Engines (新型搜索引擎畅想)

    Huang, Jiannian

    2007-01-01

    [English abstract] The rapid growth of demand for Internet information resources has led to a proliferation of search engines. This article introduces some new types of search engines that are appearing or will appear. These search engines include the following: grey-document search engines, invisible-web search engines, knowledge-discovery search engines, clustering meta search engines, academic clustering search engines, conception-comparison and conception-analogy search engines, consultation search engines, teachi...

  10. Complementary, alternative, and other noncomplete decongestive therapy treatment methods in the management of lymphedema: a systematic search and review.

    Rodrick, Julia R; Poage, Ellen; Wanchai, Ausanee; Stewart, Bob R; Cormier, Janice N; Armer, Jane M

    2014-03-01

    (1) To provide a critical analysis of the contemporary published research that pertains to complementary, alternative, and other noncomplete decongestive therapies for treatment of lymphedema (LE), and (2) to provide practical applications of that evidence to improve care of patients with or at risk for LE. TYPE: This study meets the defining criteria as a systematic search and review because it includes varied study types. All studies that met the inclusion criteria were evaluated for weight of evidence and value. The systematic search and review includes articles published in the contemporary literature (2004-2012). Publications published from 2004-2011 were retrieved from 11 major medical indices by using search terms for LE and management approaches. Literature archives were examined through 2012. Data extraction included study design, objectives pertaining to LE, number and characteristics of participants, interventions, and outcomes. Study strengths and weaknesses were summarized. Study evidence was categorized according to the Oncology Nursing Society Putting Evidence into Practice level-of-evidence guidelines after achieving consensus among the authors. No authors participated in development of nor benefitted from the review of these modality methods or devices. Extracted data from 85 studies were reviewed in 4 subcategories: botanical, pharmaceutical, physical agent modality, and modalities of contemporary value. After review, 47 articles were excluded, which left 16 articles on botanicals and pharmaceuticals and 22 articles for physical agent modality and/or modalities of contemporary value. Pharmaceuticals were later excluded. The authors concluded that botanicals had generated sufficient studies to support a second, more specific systematic review; thus, botanicals are reported elsewhere. It was found that limited high-level evidence was available for all categories. Well-constructed randomized controlled trials related specifically to LE were limited

  11. Discovery in a World of Mashups

    King, T. A.; Ritschel, B.; Hourcle, J. A.; Moon, I. S.

    2014-12-01

    When the first digital information was stored electronically, discovery of what existed was through file names and the organization of the file system. With the advent of networks, digital information was shared on a wider scale, but discovery remained based on file and folder names. With a growing number of information sources, name-based discovery quickly became ineffective. The keyword-based search engine was one of the first types of mashup in the world of Web 1.0. With embedded links from one document to another carrying prescribed relationships between files, the world of Web 2.0 was formed. Search engines like Google used the links to improve search results, and a worldwide mashup was formed. While a vast improvement, the need for semantic (meaning-rich) discovery was clear, especially for the discovery of scientific data. In response, every science discipline defined schemas to describe their type of data. Some core schemas were shared, but most schemas are custom tailored even though they share many common concepts. As with the networking of information sources, science increasingly relies on data from multiple disciplines. So there is a need to bring together multiple sources of semantically rich information. We explore how harvesting, conceptual mapping, facet-based search engines, search term promotion, and style sheets can be combined to create the next generation of mashups in the emerging world of Web 3.0. We use NASA's Planetary Data System and NASA's Heliophysics Data Environment to illustrate how to create a multi-discipline mash-up.

  12. Accounting for discovery bias in genomic prediction

    Our objective was to evaluate an approach to mitigating discovery bias in genomic prediction. Accuracy may be improved by placing greater emphasis on regions of the genome expected to be more influential on a trait. Methods emphasizing regions result in a phenomenon known as “discovery bias” if info...

  13. Methods of Data Collection, Sample Processing, and Data Analysis for Edge-of-Field, Streamgaging, Subsurface-Tile, and Meteorological Stations at Discovery Farms and Pioneer Farm in Wisconsin, 2001-7

    Stuntebeck, Todd D.; Komiskey, Matthew J.; Owens, David W.; Hall, David W.

    2008-01-01

    The University of Wisconsin (UW)-Madison Discovery Farms (Discovery Farms) and UW-Platteville Pioneer Farm (Pioneer Farm) programs were created in 2000 to help Wisconsin farmers meet environmental and economic challenges. As a partner with each program, and in cooperation with the Wisconsin Department of Natural Resources and the Sand County Foundation, the U.S. Geological Survey (USGS) Wisconsin Water Science Center (WWSC) installed, maintained, and operated equipment to collect water-quantity and water-quality data from 25 edge-of-field, 6 streamgaging, and 5 subsurface-tile stations at 7 Discovery Farms and Pioneer Farm. The farms are located in the southern half of Wisconsin and represent a variety of landscape settings and crop- and animal-production enterprises common to Wisconsin agriculture. Meteorological stations were established at most farms to measure precipitation, wind speed and direction, air and soil temperature (in profile), relative humidity, solar radiation, and soil moisture (in profile). Data collection began in September 2001 and is continuing through the present (2008). This report describes methods used by USGS WWSC personnel to collect, process, and analyze water-quantity, water-quality, and meteorological data for edge-of-field, streamgaging, subsurface-tile, and meteorological stations at Discovery Farms and Pioneer Farm from September 2001 through October 2007. Information presented includes equipment used; event-monitoring and sample-collection procedures; station maintenance; sample handling and processing procedures; water-quantity, water-quality, and precipitation data analyses; and procedures for determining estimated constituent concentrations for unsampled runoff events.

  14. Deep Learning in Drug Discovery.

    Gawehn, Erik; Hiss, Jan A; Schneider, Gisbert

    2016-01-01

    Artificial neural networks had their first heyday in molecular informatics and drug discovery approximately two decades ago. Currently, we are witnessing renewed interest in adapting advanced neural network architectures for pharmaceutical research by borrowing from the field of "deep learning". Compared with some of the other life sciences, their application in drug discovery is still limited. Here, we provide an overview of this emerging field of molecular informatics, present the basic concepts of prominent deep learning methods and offer motivation to explore these techniques for their usefulness in computer-assisted drug discovery and design. We specifically emphasize deep neural networks, restricted Boltzmann machine networks and convolutional networks.

  15. Bioinformatics in translational drug discovery.

    Wooller, Sarah K; Benstead-Hume, Graeme; Chen, Xiangrong; Ali, Yusuf; Pearl, Frances M G

    2017-08-31

    Bioinformatics approaches are becoming ever more essential in translational drug discovery both in academia and within the pharmaceutical industry. Computational exploitation of the increasing volumes of data generated during all phases of drug discovery is enabling key challenges of the process to be addressed. Here, we highlight some of the areas in which bioinformatics resources and methods are being developed to support the drug discovery pipeline. These include the creation of large data warehouses, bioinformatics algorithms to analyse 'big data' that identify novel drug targets and/or biomarkers, programs to assess the tractability of targets, and prediction of repositioning opportunities that use licensed drugs to treat additional indications.

  16. Computational Discovery of Materials Using the Firefly Algorithm

    Avendaño-Franco, Guillermo; Romero, Aldo

    Our current ability to model physical phenomena accurately, the increase in computational power, and better algorithms are the driving forces behind the computational discovery and design of novel materials, allowing for virtual characterization before their realization in the laboratory. We present the implementation of a novel firefly algorithm, a population-based algorithm for global optimization, for searching the structure/composition space. This novel, computation-intensive approach naturally takes advantage of concurrency and targeted exploration while still keeping enough diversity. We apply the new method to both periodic and non-periodic structures, and we present the implementation challenges and the solutions used to improve efficiency. The implementation makes use of computational materials databases and network analysis to optimize the search and to gain insights about the geometric structure of local minima on the energy landscape. The method has been implemented in our software PyChemia, an open-source package for materials discovery. We acknowledge the support of DMREF-NSF 1434897 and the Donors of the American Chemical Society Petroleum Research Fund for partial support of this research under Contract 54075-ND10.
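
    A minimal sketch of the core firefly update rule (attractiveness decaying with distance plus a random perturbation), applied to a stand-in objective rather than a materials energy landscape:

        import numpy as np

        def firefly_step(pop, fitness, alpha=0.2, beta0=1.0, gamma=1.0, rng=None):
            """One sweep of the firefly update: each firefly moves toward every
            brighter (lower-fitness) firefly, attraction decaying with distance."""
            rng = rng or np.random.default_rng()
            new_pop = pop.copy()
            for i in range(len(pop)):
                for j in range(len(pop)):
                    if fitness[j] < fitness[i]:  # j is brighter, so i moves toward j
                        r2 = np.sum((pop[i] - pop[j]) ** 2)
                        beta = beta0 * np.exp(-gamma * r2)
                        new_pop[i] += beta * (pop[j] - pop[i]) \
                                      + alpha * (rng.random(pop.shape[1]) - 0.5)
            return new_pop

        # Stand-in objective: sphere function in 3 dimensions.
        rng = np.random.default_rng(0)
        pop = rng.uniform(-5, 5, size=(20, 3))
        for _ in range(100):
            fit = np.sum(pop ** 2, axis=1)
            pop = firefly_step(pop, fit, rng=rng)
        print(np.min(np.sum(pop ** 2, axis=1)))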

  17. A method for the design and development of medical or health care information websites to optimize search engine results page rankings on Google.

    Dunne, Suzanne

    2013-01-01

    The Internet is a widely used source of information for patients searching for medical/health care information. While many studies have assessed existing medical/health care information on the Internet, relatively few have examined methods for the design and delivery of such websites, particularly those aimed at the general public.

  18. Sea Level Rise Data Discovery

    Quach, N.; Huang, T.; Boening, C.; Gill, K. M.

    2016-12-01

    Research related to sea level rise crosses multiple disciplines from sea ice to land hydrology. The NASA Sea Level Change Portal (SLCP) is a one-stop source for current sea level change information and data, including interactive tools for accessing and viewing regional data, a virtual dashboard of sea level indicators, and ongoing updates through a suite of editorial products that include content articles, graphics, videos, and animations. The architecture behind the SLCP makes it possible to integrate web content and data relevant to sea level change that are archived across various data centers as well as new data generated by sea level change principal investigators. The Extensible Data Gateway Environment (EDGE) is incorporated into the SLCP architecture to provide a unified platform for web content and science data discovery. EDGE is a data integration platform designed to facilitate high-performance geospatial data discovery and access with the ability to support multi-metadata standard specifications. EDGE has the capability to retrieve data from one or more sources and package the resulting sets into a single response to the requestor. With this unified endpoint, the Data Analysis Tool that is available on the SLCP can retrieve dataset and granule level metadata as well as perform geospatial search on the data. This talk focuses on the architecture that makes it possible to seamlessly integrate and enable discovery of disparate data relevant to sea level rise.

  19. Discovery of a Makemakean Moon

    Parker, Alex H.; Buie, Marc W.; Grundy, Will M.; Noll, Keith S.

    2016-01-01

    We describe the discovery of a satellite in orbit about the dwarf planet (136472) Makemake. This satellite, provisionally designated S/2015 (136472) 1, was detected in imaging data collected with the Hubble Space Telescope's Wide Field Camera 3 on UTC 2015 April 27, at 7.80 +/- 0.04 mag fainter than Makemake and at a separation of 0.57 arcseconds. It likely evaded detection in previous satellite searches due to a nearly edge-on orbital configuration, placing it deep within the glare of Makemake during a substantial fraction of its orbital period. This configuration would place Makemake and its satellite near a mutual event season. Insufficient orbital motion was detected to make a detailed characterization of its orbital properties, prohibiting a measurement of the system mass with the discovery data alone. Preliminary analysis indicates that if the orbit is circular, its orbital period must be longer than 12.4 days and its semimajor axis must be at least approximately 21,000 km. We find that the properties of Makemake's moon suggest that the majority of the dark material detected in the system by thermal observations may not reside on the surface of Makemake, but may instead be attributable to S/2015 (136472) 1 having a uniform dark surface. This dark moon hypothesis can be directly tested with future James Webb Space Telescope observations. We discuss the implications of this discovery for the spin state, figure, and thermal properties of Makemake and the apparent ubiquity of trans-Neptunian dwarf planet satellites.

  20. Searches for Supersymmetry in ATLAS

    Cervelli, Alberto; The ATLAS collaboration

    2017-01-01

    After the discovery of the Higgs boson in ATLAS's first run of data taking, and given the lack of observation of new physics, searches for new particles such as supersymmetric states are one of the main areas of interest for the general-purpose detectors operating at the LHC. In this talk we present a review of the searches for supersymmetric particles performed by the ATLAS experiment.

  1. Random searching

    Shlesinger, Michael F

    2009-01-01

    There are a wide variety of searching problems from molecules seeking receptor sites to predators seeking prey. The optimal search strategy can depend on constraints on time, energy, supplies or other variables. We discuss a number of cases and especially remark on the usefulness of Levy walk search patterns when the targets of the search are scarce.
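
    A small sketch of what a Levy-walk search pattern looks like in practice: step lengths drawn from a heavy-tailed power-law distribution with uniformly random headings. The exponent and minimum step length are illustrative choices, not values from the article:

        import numpy as np

        def levy_walk(n_steps, mu=2.0, l_min=1.0, rng=None):
            """Return the 2-D positions of a walker whose step lengths follow a
            power-law tail p(l) ~ l**(-mu) for l >= l_min, with uniform headings."""
            rng = rng or np.random.default_rng()
            # Inverse-transform sampling of a Pareto-type step length.
            lengths = l_min * rng.random(n_steps) ** (-1.0 / (mu - 1.0))
            angles = rng.uniform(0.0, 2.0 * np.pi, n_steps)
            steps = np.column_stack([lengths * np.cos(angles), lengths * np.sin(angles)])
            return np.cumsum(steps, axis=0)

        positions = levy_walk(1000, rng=np.random.default_rng(42))
        print(positions[-1])  # final position after 1000 Levy-distributed steps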

  2. Search Patterns

    Morville, Peter

    2010-01-01

    What people are saying about Search Patterns "Search Patterns is a delight to read -- very thoughtful and thought provoking. It's the most comprehensive survey of designing effective search experiences I've seen." --Irene Au, Director of User Experience, Google "I love this book! Thanks to Peter and Jeffery, I now know that search (yes, boring old yucky who cares search) is one of the coolest ways around of looking at the world." --Dan Roam, author, The Back of the Napkin (Portfolio Hardcover) "Search Patterns is a playful guide to the practical concerns of search interface design. It cont

  3. Methods and Results of a Search for Gravitational Waves Associated with Gamma-Ray Bursts Using the GEO 600, LIGO, and Virgo Detectors

    Aasi, J.; Abbott, B. P.; Abbott, R.; Abbott, T.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Blackburn, Lindy L.; hide

    2013-01-01

    In this paper we report on a search for short-duration gravitational wave bursts in the frequency range 64 Hz-1792 Hz associated with gamma-ray bursts (GRBs), using data from GEO600 and one of the LIGO or Virgo detectors. We introduce the method of a linear search grid to analyze GRB events with large sky localization uncertainties, such as the localizations provided by the Fermi Gamma-ray Burst Monitor (GBM). Coherent searches for gravitational waves (GWs) can be computationally intensive when the GRB sky position is not well localized, due to the corrections required for the difference in arrival time between detectors. Using a linear search grid we are able to reduce the computational cost of the analysis by a factor of O(10) for GBM events. Furthermore, we demonstrate that our analysis pipeline can improve upon the sky localization of GRBs detected by the GBM if a high-frequency GW signal is observed in coincidence. We use the linear search grid method in a search for GWs associated with 129 GRBs observed by satellite-based gamma-ray experiments between 2006 and 2011. The GRBs in our sample had not been previously analyzed for GW counterparts. A fraction of our GRB events are analyzed using data from GEO600 while the detector was using squeezed-light states to improve its sensitivity; this is the first search for GWs using data from a squeezed-light interferometric observatory. We find no evidence for GW signals, either with any individual GRB in this sample or with the population as a whole. For each GRB we place lower bounds on the distance to the progenitor, assuming a fixed GW emission energy of 10^-2 M_sun c^2, with a median exclusion distance of 0.8 Mpc for emission at 500 Hz and 0.3 Mpc at 1 kHz. The reduced computational cost associated with a linear search grid will enable rapid searches for GWs associated with Fermi GBM events in the Advanced detector era.

  4. The application of the Luus-Jaakola direct search method to the optimization of a hybrid renewable energy system

    Jatzeck, Bernhard Michael

    2000-10-01

    The application of the Luus-Jaakola direct search method to the optimization of stand-alone hybrid energy systems consisting of wind turbine generators (WTG's), photovoltaic (PV) modules, batteries, and an auxiliary generator was examined. The loads for these systems were for agricultural applications, with the optimization conducted on the basis of minimum capital, operating, and maintenance costs. Five systems were considered: two near Edmonton, Alberta, and one each near Lethbridge, Alberta, Victoria, British Columbia, and Delta, British Columbia. The optimization algorithm used hourly data for the load demand, WTG output power/area, and PV module output power. These hourly data were in two sets: seasonal (summer and winter values separated) and total (summer and winter values combined). The costs for the WTG's, PV modules, batteries, and auxiliary generator fuel were full market values. To examine the effects of price discounts or tax incentives, these values were lowered to 25% of the full costs for the energy sources and two-thirds of the full cost for agricultural fuel. Annual costs for a renewable energy system depended upon the load, location, component costs, and which data set (seasonal or total) was used. For one Edmonton load, the cost for a renewable energy system consisting of 27.01 m2 of WTG area, 14 PV modules, and 18 batteries (full price, total data set) was 6873/year. For Lethbridge, a system with 22.85 m2 of WTG area, 47 PV modules, and 5 batteries (reduced prices, seasonal data set) cost 2913/year. The performance of renewable energy systems based on the obtained results was tested in a simulation using load and weather data for selected days. Test results for one Edmonton load showed that the simulations for most of the systems examined ran for at least 17 hours per day before failing due to either an excessive load on the auxiliary generator or a battery constraint being violated. Additional testing indicated that increasing the generator
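
    A compact sketch of the Luus-Jaakola direct search itself: random sampling within a box around the current best point, with the sampling region contracted after each outer iteration. The cost function and bounds below are placeholders, not the hybrid-energy cost model of the thesis:

        import numpy as np

        def luus_jaakola(f, lower, upper, n_outer=50, n_inner=100, contraction=0.95, rng=None):
            """Minimize f over box bounds by sampling around the best point within
            a region whose half-width shrinks by `contraction` each outer loop."""
            rng = rng or np.random.default_rng()
            lower, upper = np.asarray(lower, float), np.asarray(upper, float)
            x_best = rng.uniform(lower, upper)
            f_best = f(x_best)
            half_width = (upper - lower) / 2.0
            for _ in range(n_outer):
                for _ in range(n_inner):
                    x = np.clip(x_best + rng.uniform(-half_width, half_width), lower, upper)
                    fx = f(x)
                    if fx < f_best:
                        x_best, f_best = x, fx
                half_width *= contraction
            return x_best, f_best

        # Placeholder cost function of two sizing variables (e.g. WTG area, PV modules).
        cost = lambda x: (x[0] - 20.0) ** 2 + 3.0 * (x[1] - 14.0) ** 2 + 100.0
        print(luus_jaakola(cost, lower=[0.0, 0.0], upper=[50.0, 60.0],
                           rng=np.random.default_rng(3)))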

  5. Search for a planet

    Tokovinin, A.A.

    1986-01-01

    The problem of searching for planets around other stars is discussed in popular form. Two methods of searching for planets are considered: astrometric and spectral. The two methods complement one another. It is assumed that the most likely possessors of planets are, in the first place, yellow and red dwarfs with slow axial rotation. These stars are the most numerous representatives of the Galaxy's population.

  6. IMPROVEMENT EFFORTS TO LEARN LESSONS ACTIVITIES CHASSIS POWER TRANSFER STANDARD COMPETENCE AND CORRECT STEERING SYSTEM WITH LEARNING METHOD DISCOVERY INQUIRY CLASS XIB SMK MUHAMMADIYAH GAMPING ACADEMIC YEAR 2013/2014

    Harry Suharto

    2013-12-01

    The purpose of the study was to determine the increase in learners' learning activities in the chassis and power-transfer subjects, for the competency standard of repairing the steering system, through the implementation of discovery-inquiry learning in class XIB of Light Vehicle Technology at SMK Muhammadiyah Gamping, Sleman, in the 2013/2014 academic year. This research is classroom action research. The research was conducted in class XIB of SMK Muhammadiyah Gamping in the 2013/2014 academic year with a sample of 26 students. Data were collected using questionnaire sheets, observation sheets, and documentation to determine the increase in student activity. The instruments were validated by expert judgment. The analysis used descriptive statistics. The results showed the following increases in activity from the first cycle to the second cycle: visual activities, 57.7%; oral activities, 61.6%; listening activities, 23.04%; writing activities, 8.7%; mental activities, 73.1%; emotional activities, 42.3% (reflecting the students' enthusiasm in learning activities); and motor activities, -7.7% (a decrease in negative activity). Based on these results, most students at SMK Muhammadiyah Gamping gave a positive opinion of the use of discovery-inquiry learning and held the view that the method can be useful for students and for the school itself. Learners who have a good perception of the discovery-inquiry method are fully aware of the standards for achieving competence in the theory of repairing the steering system. Learners taught with the discovery-inquiry method for this competency standard were pleased with the learning process, and the method was also able to: 1) increase motivation to learn; 2) improve learning achievement; 3) enhance creativity; 4) encourage students to listen to, respect, and accept the opinions of other students; and 5) reduce boredom.

  7. Improved accuracy of supervised CRM discovery with interpolated Markov models and cross-species comparison.

    Kazemian, Majid; Zhu, Qiyun; Halfon, Marc S; Sinha, Saurabh

    2011-12-01

    Despite recent advances in experimental approaches for identifying transcriptional cis-regulatory modules (CRMs, 'enhancers'), direct empirical discovery of CRMs for all genes in all cell types and environmental conditions is likely to remain an elusive goal. Effective methods for computational CRM discovery are thus a critically needed complement to empirical approaches. However, existing computational methods that search for clusters of putative binding sites are ineffective if the relevant TFs and/or their binding specificities are unknown. Here, we provide a significantly improved method for 'motif-blind' CRM discovery that does not depend on knowledge or accurate prediction of TF-binding motifs and is effective when limited knowledge of functional CRMs is available to 'supervise' the search. We propose a new statistical method, based on 'Interpolated Markov Models', for motif-blind, genome-wide CRM discovery. It captures the statistical profile of variable length words in known CRMs of a regulatory network and finds candidate CRMs that match this profile. The method also uses orthologs of the known CRMs from closely related genomes. We perform in silico evaluation of predicted CRMs by assessing whether their neighboring genes are enriched for the expected expression patterns. This assessment uses a novel statistical test that extends the widely used Hypergeometric test of gene set enrichment to account for variability in intergenic lengths. We find that the new CRM prediction method is superior to existing methods. Finally, we experimentally validate 12 new CRM predictions by examining their regulatory activity in vivo in Drosophila; 10 of the tested CRMs were found to be functional, while 6 of the top 7 predictions showed the expected activity patterns. We make our program available as downloadable source code, and as a plugin for a genome browser installed on our servers. © The Author(s) 2011. Published by Oxford University Press.
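
    As a rough illustration of the idea behind interpolated Markov models, which are a generic sequence-modelling technique rather than this paper's specific pipeline, the toy sketch below trains k-mer counts of several orders on example "known CRM" sequences and scores a candidate window by blending the orders, trusting a higher-order context only when it was observed often enough. The weighting rule, pseudo-counts, and sequences are simplified placeholders, not the published method's parameters.

        from collections import defaultdict
        import math

        class InterpolatedMarkovModel:
            """Toy interpolated Markov model over a DNA alphabet.

            Conditional probabilities of several orders are blended; a
            higher-order context is trusted in proportion to how often it was
            seen in training (a simplified stand-in for full IMM weighting).
            """

            def __init__(self, max_order=2, pseudo=1.0, saturation=400.0):
                self.max_order = max_order
                self.pseudo = pseudo
                self.saturation = saturation
                self.counts = [defaultdict(float) for _ in range(max_order + 1)]
                self.totals = [defaultdict(float) for _ in range(max_order + 1)]

            def train(self, sequences):
                for seq in sequences:
                    for i, base in enumerate(seq):
                        for k in range(self.max_order + 1):
                            if i >= k:
                                ctx = seq[i - k:i]
                                self.counts[k][(ctx, base)] += 1.0
                                self.totals[k][ctx] += 1.0

            def _prob(self, ctx, base, k):
                # Add-`pseudo` smoothing over the 4-letter DNA alphabet.
                tot = self.totals[k].get(ctx, 0.0)
                cnt = self.counts[k].get((ctx, base), 0.0)
                return (cnt + self.pseudo) / (tot + 4.0 * self.pseudo)

            def log_score(self, seq):
                score = 0.0
                for i, base in enumerate(seq):
                    p, weight_left = 0.0, 1.0
                    for k in range(min(self.max_order, i), -1, -1):
                        ctx = seq[i - k:i]
                        if k > 0:
                            w = weight_left * min(1.0, self.totals[k].get(ctx, 0.0) / self.saturation)
                        else:
                            w = weight_left   # order 0 absorbs the remaining weight
                        p += w * self._prob(ctx, base, k)
                        weight_left -= w
                    score += math.log(p)
                return score

        # Train on example "known CRM" sequences and score a candidate window.
        imm = InterpolatedMarkovModel(max_order=2)
        imm.train(["ACGTGACGTTACGT", "TTACGTGACGTGCA"])
        print(imm.log_score("ACGTGACG"))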

  8. Service Discovery At Home

    Sundramoorthy, V.; Scholten, Johan; Jansen, P.G.; Hartel, Pieter H.

    Service discovery is a fairly new field that kicked off since the advent of ubiquitous computing and has been found essential in the making of intelligent networks by implementing automated discovery and remote control between devices. This paper provides an overview and comparison of several prominent

  9. Academic Drug Discovery Centres

    Kirkegaard, Henriette Schultz; Valentin, Finn

    2014-01-01

    Academic drug discovery centres (ADDCs) are seen as one of the solutions to fill the innovation gap in early drug discovery, which has proven challenging for previous organisational models. Prior studies of ADDCs have identified the need to analyse them from the angle of their economic...

  10. Decades of Discovery

    2011-06-01

    For the past two-and-a-half decades, the Office of Science at the U.S. Department of Energy has been at the forefront of scientific discovery. Over 100 important discoveries supported by the Office of Science are represented in this document.

  11. Service discovery at home

    Sundramoorthy, V.; Scholten, Johan; Jansen, P.G.; Hartel, Pieter H.

    2003-01-01

    Service discovery is a fairly new field that kicked off since the advent of ubiquitous computing and has been found essential in the making of intelligent networks by implementing automated discovery and remote control between devices. This paper provides an overview and comparison of several

  12. Searching for Multi-Targeting Neurotherapeutics against Alzheimer’s: Discovery of Potent AChE-MAO B Inhibitors through the Decoration of the 2H-Chromen-2-one Structural Motif

    Leonardo Pisani

    2016-03-01

    Full Text Available The need for developing real disease-modifying drugs against neurodegenerative syndromes, particularly Alzheimer’s disease (AD), shifted research towards reliable drug discovery strategies to unveil clinical candidates with higher therapeutic efficacy than single-targeting drugs. By following the multi-target approach, we designed and synthesized a novel class of dual acetylcholinesterase (AChE)-monoamine oxidase B (MAO-B) inhibitors through the decoration of the 2H-chromen-2-one skeleton. Compounds bearing a propargylamine moiety at position 3 displayed the highest in vitro inhibitory activities against MAO-B. Within this series, derivative 3h emerged as the most interesting hit compound, being a moderate AChE inhibitor (IC50 = 8.99 µM) and a potent and selective MAO-B inhibitor (IC50 = 2.8 nM). Preliminary studies in human neuroblastoma SH-SY5Y cell lines demonstrated its low cytotoxicity and disclosed a promising neuroprotective effect at low doses (0.1 µM) under oxidative stress conditions promoted by two mitochondrial toxins (oligomycin-A and rotenone). In a Madin-Darby canine kidney (MDCKII-MDR1) cell-based transport study, compound 3h was able to permeate the BBB-mimicking monolayer and was not a P-glycoprotein (P-gp) substrate, showing an efflux ratio = 0.96, close to that of diazepam.

  13. Proteomic and metabolomic approaches to biomarker discovery

    Issaq, Haleem J

    2013-01-01

    Proteomic and Metabolomic Approaches to Biomarker Discovery demonstrates how to leverage biomarkers to improve accuracy and reduce errors in research. Disease biomarker discovery is one of the most vibrant and important areas of research today, as the identification of reliable biomarkers has an enormous impact on disease diagnosis, selection of treatment regimens, and therapeutic monitoring. Various techniques are used in the biomarker discovery process, including techniques used in proteomics, the study of the proteins that make up an organism, and metabolomics, the study of chemical fingerprints created from cellular processes. Proteomic and Metabolomic Approaches to Biomarker Discovery is the only publication that covers techniques from both proteomics and metabolomics and includes all steps involved in biomarker discovery, from study design to study execution.  The book describes methods, and presents a standard operating procedure for sample selection, preparation, and storage, as well as data analysis...

  14. Self-learning search engines

    Schuth, A.

    2015-01-01

    How does a search engine such as Google know which search results to display? There are many competing algorithms that generate search results, but which one works best? We developed a new probabilistic method for quickly comparing large numbers of search algorithms by examining the results users
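
    One widely used way to compare two result rankings with real user clicks is team-draft interleaving; the sketch below is only a generic illustration of click-based comparison and is not necessarily the probabilistic method developed in this thesis. Document ids and the simulated click are made up.

        import random

        def team_draft_interleave(ranking_a, ranking_b, seed=0):
            """Merge two rankings into one list, remembering which ranker
            'drafted' each document; clicks are later credited to that ranker."""
            rng = random.Random(seed)
            merged, team = [], {}
            a, b = list(ranking_a), list(ranking_b)
            while a or b:
                order = [("A", a), ("B", b)]
                rng.shuffle(order)               # random pick order each round
                for name, ranking in order:
                    while ranking and ranking[0] in team:
                        ranking.pop(0)           # skip documents already placed
                    if ranking:
                        doc = ranking.pop(0)
                        merged.append(doc)
                        team[doc] = name
            return merged, team

        def credit_clicks(clicked_docs, team):
            """Credit each click to the ranker that contributed the document."""
            wins = {"A": 0, "B": 0}
            for doc in clicked_docs:
                if doc in team:
                    wins[team[doc]] += 1
            return wins

        merged, team = team_draft_interleave(["d1", "d2", "d3"], ["d2", "d4", "d1"])
        print(merged, credit_clicks(["d4"], team))   # a click on d4 favours ranker B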

  15. Maximum Entropy in Drug Discovery

    Chih-Yuan Tseng

    2014-07-01

    Full Text Available Drug discovery applies multidisciplinary approaches either experimentally, computationally or both ways to identify lead compounds to treat various diseases. While conventional approaches have yielded many US Food and Drug Administration (FDA-approved drugs, researchers continue investigating and designing better approaches to increase the success rate in the discovery process. In this article, we provide an overview of the current strategies and point out where and how the method of maximum entropy has been introduced in this area. The maximum entropy principle has its root in thermodynamics, yet since Jaynes’ pioneering work in the 1950s, the maximum entropy principle has not only been used as a physics law, but also as a reasoning tool that allows us to process information in hand with the least bias. Its applicability in various disciplines has been abundantly demonstrated. We give several examples of applications of maximum entropy in different stages of drug discovery. Finally, we discuss a promising new direction in drug discovery that is likely to hinge on the ways of utilizing maximum entropy.
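
    As a toy illustration of the maximum entropy principle discussed above (not of any specific application in the review), the sketch below finds the least-biased distribution over a handful of hypothetical compound scores that reproduces a given mean: the answer is the exponential-family (Gibbs) distribution, with its Lagrange multiplier solved numerically. All numbers are made up.

        import numpy as np
        from scipy.optimize import brentq

        scores = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # hypothetical discrete outcomes
        target_mean = 2.6                               # hypothetical measured average

        def mean_given_lambda(lam):
            # Maximum-entropy form under a mean constraint: p_i ∝ exp(lam * x_i).
            w = np.exp(lam * scores)
            p = w / w.sum()
            return p @ scores

        # Solve for the Lagrange multiplier that reproduces the constraint.
        lam = brentq(lambda l: mean_given_lambda(l) - target_mean, -50.0, 50.0)
        p = np.exp(lam * scores)
        p /= p.sum()

        entropy = -(p * np.log(p)).sum()
        print("lambda =", lam)
        print("max-entropy distribution:", np.round(p, 4))
        print("entropy =", round(float(entropy), 4))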

  16. Evaluation of Proteomic Search Engines for the Analysis of Histone Modifications

    2015-01-01

    Identification of histone post-translational modifications (PTMs) is challenging for proteomics search engines. Including many histone PTMs in one search increases the number of candidate peptides dramatically, leading to low search speed and fewer identified spectra. To evaluate database search engines on identifying histone PTMs, we present a method in which one kind of modification is searched each time, for example, unmodified, individually modified, and multimodified, each search result is filtered with false discovery rate less than 1%, and the identifications of multiple search engines are combined to obtain confident results. We apply this method for eight search engines on histone data sets. We find that two search engines, pFind and Mascot, identify most of the confident results at a reasonable speed, so we recommend using them to identify histone modifications. During the evaluation, we also find some important aspects for the analysis of histone modifications. Our evaluation of different search engines on identifying histone modifications will hopefully help those who are hoping to enter the histone proteomics field. The mass spectrometry proteomics data have been deposited to the ProteomeXchange Consortium with the data set identifier PXD001118. PMID:25167464
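
    The search strategy described in this record — one modification type per search, each result list filtered to a false discovery rate below 1%, and identifications combined across engines — can be sketched roughly as follows. The PSM records, score conventions, engine names in the usage example, and the simple target-decoy FDR estimate are illustrative assumptions, not the paper's actual pipeline.

        from collections import defaultdict

        def filter_at_fdr(psms, fdr_cutoff=0.01):
            """Keep target PSMs while the simple target-decoy FDR estimate
            (#decoys / #targets above the score threshold) stays below the
            cutoff.  Each PSM is a dict with 'spectrum', 'peptide', 'score',
            and 'is_decoy'; higher scores are assumed better."""
            kept, targets, decoys = [], 0, 0
            for psm in sorted(psms, key=lambda p: p["score"], reverse=True):
                if psm["is_decoy"]:
                    decoys += 1
                else:
                    targets += 1
                if targets and decoys / targets > fdr_cutoff:
                    break
                if not psm["is_decoy"]:
                    kept.append(psm)
            return kept

        def combine_engines(results_per_engine, min_engines=2):
            """Call a (spectrum, peptide) pair confident when at least
            `min_engines` search engines report it after FDR filtering."""
            support = defaultdict(set)
            for engine, psms in results_per_engine.items():
                for psm in psms:
                    support[(psm["spectrum"], psm["peptide"])].add(engine)
            return {key: engines for key, engines in support.items()
                    if len(engines) >= min_engines}

        # Hypothetical usage: each engine was already run once per modification
        # type (unmodified, acetylation, trimethylation, ...) before filtering.
        pfind_hits = filter_at_fdr([
            {"spectrum": "s1", "peptide": "KSAPATGGVK(ac)",   "score": 98.0, "is_decoy": False},
            {"spectrum": "s2", "peptide": "KQTARK(me3)STGGK", "score": 60.0, "is_decoy": False},
            {"spectrum": "s3", "peptide": "REVERSEDECOY",     "score": 20.0, "is_decoy": True},
        ])
        mascot_hits = filter_at_fdr([
            {"spectrum": "s1", "peptide": "KSAPATGGVK(ac)",   "score": 45.0, "is_decoy": False},
        ])
        print(combine_engines({"pFind": pfind_hits, "Mascot": mascot_hits}))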

  17. Evaluation of proteomic search engines for the analysis of histone modifications.

    Yuan, Zuo-Fei; Lin, Shu; Molden, Rosalynn C; Garcia, Benjamin A

    2014-10-03

    Identification of histone post-translational modifications (PTMs) is challenging for proteomics search engines. Including many histone PTMs in one search increases the number of candidate peptides dramatically, leading to low search speed and fewer identified spectra. To evaluate database search engines on identifying histone PTMs, we present a method in which one kind of modification is searched each time, for example, unmodified, individually modified, and multimodified, each search result is filtered with false discovery rate less than 1%, and the identifications of multiple search engines are combined to obtain confident results. We apply this method for eight search engines on histone data sets. We find that two search engines, pFind and Mascot, identify most of the confident results at a reasonable speed, so we recommend using them to identify histone modifications. During the evaluation, we also find some important aspects for the analysis of histone modifications. Our evaluation of different search engines on identifying histone modifications will hopefully help those who are hoping to enter the histone proteomics field. The mass spectrometry proteomics data have been deposited to the ProteomeXchange Consortium with the data set identifier PXD001118.

  18. Personalized Search

    AUTHOR|(SzGeCERN)749939

    2015-01-01

    As the volume of electronically available information grows, relevant items become harder to find. This work presents an approach to personalizing search results in scientific publication databases, focusing on re-ranking results from existing search engines such as Solr or Elasticsearch. It also includes the development of Obelix, a new recommendation system used to re-rank search results. The project was proposed and performed at CERN, using the scientific publications available on the CERN Document Server (CDS). The work experiments with re-ranking using offline and online evaluation of users and documents in CDS. The experiments conclude that the personalized search results outperform both latest-first and word-similarity ranking in terms of click position in the search results for global search in CDS.
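
    A generic way to carry out such re-ranking, shown purely as a sketch and not as the Obelix algorithm itself, is to blend each document's original engine score with a per-user affinity score produced by a recommender. The document ids, affinities, and the weighting parameter below are hypothetical.

        def rerank(results, user_affinity, alpha=0.7):
            """Blend the engine score with a per-user affinity score and re-sort.
            `results` is a list of (doc_id, engine_score); `user_affinity` maps
            doc_id -> a value in [0, 1]; `alpha` weights the original ranking."""
            max_engine = max(score for _, score in results) or 1.0

            def blended(item):
                doc_id, engine_score = item
                return (alpha * (engine_score / max_engine)
                        + (1.0 - alpha) * user_affinity.get(doc_id, 0.0))

            return sorted(results, key=blended, reverse=True)

        # Hypothetical CDS-style record ids, engine scores, and user affinities.
        results = [("CDS:1001", 12.3), ("CDS:1002", 11.8), ("CDS:1003", 9.4)]
        affinity = {"CDS:1002": 0.9, "CDS:1003": 0.2}
        print(rerank(results, affinity))   # CDS:1002 moves ahead of CDS:1001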

  19. Search for the Higgs boson decaying into $\tau$ lepton pairs with the Matrix Element Method and $\tau$ trigger optimization in the CMS experiment at the LHC

    AUTHOR|(CDS)2083962; Beaudette, Florian

    2016-01-01

    I performed my thesis work in particle physics at the Laboratoire Leprince-Ringuet of the École Polytechnique. I have been involved in the analysis of the data produced in the proton-proton collisions at the Large Hadron Collider (CERN) and collected by the CMS experiment. Particle physics is a scientific field currently undergoing very important breakthroughs. The discovery of the Higgs boson is a major step forward, as the masses of the vector bosons are explained through their interactions with the corresponding field. I worked on the analysis of the newly discovered heavy boson. As its direct coupling to fermions remained to be exhibited, I focused on the search for the Higgs boson decaying into tau lepton pairs. The Higgs decay into tau pairs is the most promising channel for measuring the couplings between the Standard Model Higgs boson and the fermions. Indeed, this decay channel benefits from a large expected event rate compared to the other leptonic decay modes. The Higgs boson decaying to tau lepton ana...

  20. "Eureka, Eureka!" Discoveries in Science

    Agarwal, Pankaj

    2011-01-01

    Accidental discoveries have been of significant value in the progress of science. Although accidental discoveries are more common in pharmacology and chemistry, other branches of science have also benefited from such discoveries. While most discoveries are the result of persistent research, famous accidental discoveries provide a fascinating…