WorldWideScience

Sample records for searching method developed

  1. Text mining for search term development in systematic reviewing: A discussion of some methods and challenges.

    Science.gov (United States)

    Stansfield, Claire; O'Mara-Eves, Alison; Thomas, James

    2017-09-01

    Using text mining to aid the development of database search strings for topics described by diverse terminology has potential benefits for systematic reviews; however, methods and tools for accomplishing this are poorly covered in the research methods literature. We briefly review the literature on applications of text mining for search term development for systematic reviewing. We found that the tools can be used in 5 overarching ways: improving the precision of searches; identifying search terms to improve search sensitivity; aiding the translation of search strategies across databases; searching and screening within an integrated system; and developing objectively derived search strategies. Using a case study and selected examples, we then reflect on the utility of certain technologies (term frequency-inverse document frequency and Termine, term frequency, and clustering) in improving the precision and sensitivity of searches. Challenges in using these tools are discussed. The utility of these tools is influenced by the different capabilities of the tools, the way the tools are used, and the text that is analysed. Increased awareness of how the tools perform facilitates the further development of methods for their use in systematic reviews. Copyright © 2017 John Wiley & Sons, Ltd.
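    Of the technologies this record discusses, term frequency-inverse document frequency is the easiest to sketch. The toy corpus and tokenizer below are invented illustrations, not the review's data: terms concentrated in one abstract score high, terms spread across the corpus score low.

```python
import math
import re
from collections import Counter

def tfidf_scores(docs):
    """Weight each term in each document by tf * idf (natural log)."""
    tokenized = [re.findall(r"[a-z]+", d.lower()) for d in docs]
    df = Counter()                         # document frequency per term
    for tokens in tokenized:
        df.update(set(tokens))
    n = len(docs)
    return [{t: tf * math.log(n / df[t]) for t, tf in Counter(tokens).items()}
            for tokens in tokenized]

abstracts = [
    "systematic review search strategy development",
    "search strategy precision and sensitivity",
    "randomised trial of a new drug",
]
scores = tfidf_scores(abstracts)
```

    In search term development, high-scoring terms from a set of known relevant records are candidate additions to the search string.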

  2. An automated full-symmetry Patterson search method

    International Nuclear Information System (INIS)

    Rius, J.; Miravitlles, C.

    1987-01-01

    A full-symmetry Patterson search method is presented that performs a molecular coarse rotation search in vector space and orientation refinement using the σ function. The oriented molecule is positioned using the fast translation function τ₀, which is based on the automated interpretation of τ projections using the sum function. This strategy reduces the number of Patterson-function values to be stored in the rotation search, and the use of the τ₀ function minimizes the time required to develop all probable rotation search solutions. The application of this method to five representative test examples is shown. (orig.)

  3. Real-time earthquake monitoring using a search engine method.

    Science.gov (United States)

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-12-04

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a fast computer search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data.
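    The database lookup at the core of such an engine can be illustrated with an exhaustive nearest-neighbour stand-in; the indexing that makes the real system thousands of times faster is omitted, and the two-template "database" below is invented.

```python
import math

def cosine(u, v):
    """Normalised dot product between two equal-length waveforms."""
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    return sum(a * b for a, b in zip(u, v)) / (nu * nv)

# toy waveform database keyed by invented source labels
db = {
    "strike_slip": [math.sin(t / 3) for t in range(50)],
    "thrust":      [math.sin(t / 5) for t in range(50)],
}
# a rescaled, offset observation resembling the second template
query = [1.1 * math.sin(t / 5) + 0.05 for t in range(50)]
label = max(db, key=lambda k: cosine(db[k], query))
```

    The best-matching stored waveform carries known source parameters, which are then reported for the new event.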

  4. Routine development of objectively derived search strategies

    Directory of Open Access Journals (Sweden)

    Hausner Elke

    2012-02-01

    Full Text Available Abstract Background Over the past few years, information retrieval has become more and more professionalized, and information specialists are considered full members of a research team conducting systematic reviews. Research groups preparing systematic reviews and clinical practice guidelines have been the driving force in the development of search strategies, but open questions remain regarding the transparency of the development process and the available resources. An empirically guided approach to the development of a search strategy provides a way to increase transparency and efficiency. Methods Our aim in this paper is to describe the empirically guided development process for search strategies as applied by the German Institute for Quality and Efficiency in Health Care (Institut für Qualität und Wirtschaftlichkeit im Gesundheitswesen, or "IQWiG"). This strategy consists of the following steps: generation of a test set, as well as the development, validation and standardized documentation of the search strategy. Results We illustrate our approach by means of an example, that is, a search for literature on brachytherapy in patients with prostate cancer. For this purpose, a test set was generated, including a total of 38 references from 3 systematic reviews. The development set for the generation of the strategy included 25 references. After application of textual analytic procedures, a strategy was developed that included all references in the development set. To test the search strategy on an independent set of references, the remaining 13 references in the test set (the validation set) were used. The validation set was also completely identified. Discussion Our conclusion is that an objectively derived approach similar to that used in search filter development is a feasible way to develop and validate reliable search strategies. Besides creating high-quality strategies, the widespread application of this approach will result in a ...
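    The split described above (a development set to derive terms, a held-out validation set to check retrieval) can be sketched with a toy Boolean strategy. The strategy, titles, and matching rule below are invented for illustration, not IQWiG's actual data.

```python
def retrieves(strategy, title):
    """Toy Boolean matcher: the strategy is a list of OR-groups (sets of
    synonyms); a record matches if every group contributes a term."""
    words = set(title.lower().split())
    return all(words & group for group in strategy)

# hypothetical strategy and reference sets
strategy = [{"brachytherapy", "radiotherapy"}, {"prostate", "prostatic"}]
development_set = [
    "brachytherapy outcomes in prostate cancer",
    "radiotherapy for prostatic carcinoma",
]
validation_set = ["permanent brachytherapy seeds for prostate tumours"]

dev_recall = sum(retrieves(strategy, t) for t in development_set) / len(development_set)
val_recall = sum(retrieves(strategy, t) for t in validation_set) / len(validation_set)
```

    A strategy is accepted once it retrieves the whole development set; recall on the untouched validation set then estimates how well it generalizes.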

  5. Job Search as Goal-Directed Behavior: Objectives and Methods

    Science.gov (United States)

    Van Hoye, Greet; Saks, Alan M.

    2008-01-01

    This study investigated the relationship between job search objectives (finding a new job/turnover, staying aware of job alternatives, developing a professional network, and obtaining leverage against an employer) and job search methods (looking at job ads, visiting job sites, networking, contacting employment agencies, contacting employers, and…

  6. Knowing How Good Our Searches Are: An Approach Derived from Search Filter Development Methodology

    Directory of Open Access Journals (Sweden)

    Sarah Hayman

    2015-12-01

    Full Text Available Objective – Effective literature searching is of paramount importance in supporting evidence based practice, research, and policy. Missed references can have adverse effects on outcomes. This paper reports on the development and evaluation of an online learning resource, designed for librarians and other interested searchers, presenting an evidence based approach to enhancing and testing literature searches. Methods – We developed and evaluated the set of free online learning modules for librarians called Smart Searching, suggesting the use of techniques derived from search filter development undertaken by the CareSearch Palliative Care Knowledge Network and its associated project Flinders Filters. The searching module content has been informed by the processes and principles used in search filter development. The self-paced modules are intended to help librarians and other interested searchers test the effectiveness of their literature searches, provide evidence of search performance that can be used to improve searches, as well as to evaluate and promote searching expertise. Each module covers one of four techniques, or core principles, employed in search filter development: (1) collaboration with subject experts; (2) use of a reference sample set; (3) term identification through frequency analysis; and (4) iterative testing. Evaluation of the resource comprised ongoing monitoring of web analytics to determine factors such as numbers of users and geographic origin; a user survey conducted online elicited qualitative information about the usefulness of the resource. Results – The resource was launched in May 2014. Web analytics show over 6,000 unique users from 101 countries (at 9 August 2015). Responses to the survey (n=50) indicated that 80% would recommend the resource to a colleague. Conclusions – An evidence based approach to searching, derived from search filter development methodology, has been shown to have value as an online learning ...

  7. Employed and unemployed job search methods: Australian evidence on search duration, wages and job stability

    OpenAIRE

    Colin Green

    2012-01-01

    This paper examines the use and impact of job search methods of both unemployed and employed job seekers. Informal job search methods are associated with relatively high levels of job exit and shorter search durations. Job exits through the public employment agency (PEA) display positive duration dependence for the unemployed. This may suggest that the PEA is used as a job search method of last resort. Informal job search methods have lower associated search durations and higher wages than th...

  8. Search Results | Page 88 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Search Results. Showing 871 - 880 of 8491 results. Studies ... Strengthening Nurses' Capacity in HIV Policy Development in Sub-Saharan Africa and the Caribbean ... Novel Epidemiological Method: Using Newspapers To Provide Insight Into ...

  9. Comparison tomography relocation hypocenter grid search and guided grid search method in Java island

    International Nuclear Information System (INIS)

    Nurdian, S. W.; Adu, N.; Palupi, I. R.; Raharjo, W.

    2016-01-01

    The main data in this research are earthquake records from 1952 to 2012, comprising 9162 P-wave arrivals from 2426 events recorded by 30 stations located around Java island. Hypocenters were relocated using the grid search and guided grid search methods. The relocated hypocenters then served as input for a pseudo-bending tomographic inversion, which can be used to identify the velocity distribution in the subsurface. After the tomography step, the relocation results from the two methods were compared locally and globally. Locally, the grid search method gives the better result, in agreement with the geology of the research area. Globally, however, the guided grid search method is better suited to a broad area, because it captures more diverse velocity variation while remaining consistent with local geological conditions. (paper)
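    The grid search itself can be sketched for a toy 2-D, constant-velocity case (stations, velocity, and grid spacing below are invented; real relocations use 3-D velocity models): every grid node is scored by its travel-time residuals, and the guided variant would then refine around the best node.

```python
import itertools
import math

def locate(stations, arrivals, v=6.0, step=10.0, extent=100.0):
    """Exhaustive grid search: try every node, keep the one whose predicted
    arrival times (constant velocity v km/s) best fit the observations."""
    best, best_err = None, float("inf")
    grid = [i * step for i in range(int(extent / step) + 1)]
    for x, y in itertools.product(grid, grid):
        # predicted travel time to each station, plus best-fit origin time
        tt = [math.hypot(x - sx, y - sy) / v for sx, sy in stations]
        t0 = sum(a - t for a, t in zip(arrivals, tt)) / len(tt)
        err = sum((a - (t0 + t)) ** 2 for a, t in zip(arrivals, tt))
        if err < best_err:
            best, best_err = (x, y), err
    return best

stations = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0), (100.0, 100.0)]
true = (60.0, 30.0)
arrivals = [math.hypot(true[0] - sx, true[1] - sy) / 6.0 for sx, sy in stations]
est = locate(stations, arrivals)
```

    With noise-free synthetic arrivals and the true source on a grid node, the search recovers it exactly; a guided search would repeat this with a finer grid centred on `est`.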

  10. Heuristic method for searching global maximum of multimodal unknown function

    Energy Technology Data Exchange (ETDEWEB)

    Kamei, K; Araki, Y; Inoue, K

    1983-06-01

    The method is composed of three kinds of searches, called g (grasping)-mode search, f (finding)-mode search and c (confirming)-mode search. In the g-mode and c-mode searches, a heuristic method is used that was extracted from the search behaviors of human subjects. In the f-mode search, the simplex method is used, which is well known as a search method for unimodal unknown functions. Each mode search and its transitions are shown in the form of a flowchart. The numerical results for one-dimensional through six-dimensional multimodal functions prove the proposed search method to be effective. 11 references.
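    A rough analogue of the g/f/c-mode decomposition: coarse random sampling "grasps" a promising region, and a shrinking-step local climb stands in for the simplex "finding" phase. The test function, interval, and parameters are assumptions, not the paper's.

```python
import math
import random

def g(x):
    """Assumed 1-D multimodal test function (not from the paper)."""
    return math.sin(5 * x) - 0.1 * (x - 2) ** 2

def search_max(func, lo=0.0, hi=4.0, grasps=200, seed=1):
    """Grasp a promising region by coarse sampling, then climb locally."""
    rng = random.Random(seed)
    # g-mode analogue: coarse random sampling over the whole interval
    x = max((rng.uniform(lo, hi) for _ in range(grasps)), key=func)
    # f-mode analogue: shrinking-step local climb (stands in for simplex)
    step = (hi - lo) / 10
    while step > 1e-6:
        best = max([x, min(hi, x + step), max(lo, x - step)], key=func)
        if best == x:
            step /= 2      # c-mode analogue: confirm, then refine finer
        x = best
    return x

x_star = search_max(g)
```

    The coarse phase guards against the local climb settling in a poor basin, which is the core difficulty with multimodal unknown functions.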

  11. Phonetic search methods for large speech databases

    CERN Document Server

    Moyal, Ami; Tetariy, Ella; Gishri, Michal

    2013-01-01

    “Phonetic Search Methods for Large Databases” focuses on Keyword Spotting (KWS) within large speech databases. The brief will begin by outlining the challenges associated with Keyword Spotting within large speech databases using dynamic keyword vocabularies. It will then continue by highlighting the various market segments in need of KWS solutions, as well as the specific requirements of each market segment. The work also includes a detailed description of the complexity of the task and the different methods that are used, including the advantages and disadvantages of each method and an in-depth comparison. The main focus will be on the Phonetic Search method and its efficient implementation. This will include a literature review of the various methods used for the efficient implementation of Phonetic Search Keyword Spotting, with an emphasis on the authors’ own research which entails a comparative analysis of the Phonetic Search method which includes algorithmic details. This brief is useful for resea...

  12. Non-contact method of search and analysis of pulsating vessels

    Science.gov (United States)

    Avtomonov, Yuri N.; Tsoy, Maria O.; Postnov, Dmitry E.

    2018-04-01

    Despite the variety of existing methods for recording the human pulse and the solid history of their development, there is still considerable interest in this topic. The development of new non-contact methods based on advanced image processing has caused a new wave of interest in the issue. We present a simple but quite effective method for analyzing the mechanical pulsations of blood vessels lying close to the surface of the skin. Our technique is a modification of imaging (or remote) photoplethysmography (i-PPG). We supplemented this method with a laser light source, which made it possible to use other methods of searching for the proposed pulsation zone. During testing of the method, several series of experiments were carried out both with artificial oscillating objects and with the target signal source (a human wrist). The obtained results show that our method allows correct interpretation of complex data. To summarize, we proposed and tested an alternative method for the search and analysis of pulsating vessels.

  13. Exploration of Stellarator Configuration Space with Global Search Methods

    International Nuclear Information System (INIS)

    Mynick, H.E.; Pomphrey, N.; Ethier, S.

    2001-01-01

    An exploration of stellarator configuration space z for quasi-axisymmetric stellarator (QAS) designs is discussed, using methods which provide a more global view of that space. To this end, we have implemented a "differential evolution" (DE) search algorithm in an existing stellarator optimizer, which is much less prone to become trapped in local, suboptimal minima of the cost function χ than the local search methods used previously. This search algorithm is complemented by mapping studies of χ over z aimed at gaining insight into the results of the automated searches. We find that a wide range of the attractive QAS configurations previously found fall into a small number of classes, with each class corresponding to a basin of χ(z). We develop maps on which these earlier stellarators can be placed, the relations among them seen, and understanding gained into the physics differences between them. It is also found that, while still large, the region of z space containing practically realizable QAS configurations is much smaller than earlier supposed.
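    The differential evolution search the authors plug into their optimizer follows the standard DE/rand/1/bin scheme. A minimal sketch, with a quadratic stand-in for the physics cost function χ (the real optimizer evaluates plasma targets):

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.6, CR=0.9,
                           gens=200, seed=5):
    """Plain DE/rand/1/bin minimiser: mutate with a scaled difference
    vector, binomially cross over, keep the trial if it is no worse."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)
            trial = []
            for j, (lo, hi) in enumerate(bounds):
                if j == j_rand or rng.random() < CR:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                else:
                    v = pop[i][j]
                trial.append(min(hi, max(lo, v)))
            c_trial = f(trial)
            if c_trial <= cost[i]:
                pop[i], cost[i] = trial, c_trial
    i_best = min(range(pop_size), key=cost.__getitem__)
    return pop[i_best], cost[i_best]

chi = lambda z: sum(v * v for v in z)        # stand-in cost function
z_best, chi_best = differential_evolution(chi, [(-5.0, 5.0)] * 3)
```

    Because the whole population explores in parallel and mutations are built from population differences, DE is far less likely to stall in a single basin of χ than a local descent started from one point.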

  14. THE METHOD OF APPLICATION OF A COLLECTIVE SEARCH ACTIVITY AS A TOOL DEVELOPING METHODOLOGICAL THINKING OF A TEACHER

    Directory of Open Access Journals (Sweden)

    Ibragimova Luiza Vahaevna

    2013-02-01

    Full Text Available To put any pedagogical theory into practice, it is necessary to transform its theoretical concepts into teaching methods. The development of all abilities, including thinking, occurs only in activity that is specially organized by creating the required pedagogical conditions, in this case: (a) the application of enhanced mental activity in teachers' training courses and vocational training; (b) the establishment of a "virtual university" for teachers in an institute of professional training; (c) the organization of interdisciplinary interaction of teachers, based on the conditions of nonlinear didactics (training teachers of different subjects). The presented method has been implemented over two years and consists of three phases: motivational and educational, intellectual and developmental, and innovative and reflective. At the motivational and educational stage, the possibilities of collective search activity are actualized during the course of training; group goals are set and methods of achieving them are chosen, using the first pedagogical condition. At the intellectual and developmental stage, skills for the collective search for effective teaching decisions are developed during the course of training, using the first and second pedagogical conditions. The innovative and reflective stage promotes teachers toward self-determination of the techniques and tools that improve the quality of the educational process, with teachers assisting each other in the development of teaching manuals; this is achieved with the help of all three pedagogical conditions.

  15. Developing topic-specific search filters for PubMed with click-through data.

    Science.gov (United States)

    Li, J; Lu, Z

    2013-01-01

    Search filters have been developed and demonstrated for better information access to the immense and ever-growing body of publications in the biomedical domain. However, to date the number of filters remains quite limited because the current filter development methods require significant human efforts in manual document review and filter term selection. In this regard, we aim to investigate automatic methods for generating search filters. We present an automated method to develop topic-specific filters on the basis of users' search logs in PubMed. Specifically, for a given topic, we first detect its relevant user queries and then include their corresponding clicked articles to serve as the topic-relevant document set accordingly. Next, we statistically identify informative terms that best represent the topic-relevant document set using a background set composed of topic irrelevant articles. Lastly, the selected representative terms are combined with Boolean operators and evaluated on benchmark datasets to derive the final filter with the best performance. We applied our method to develop filters for four clinical topics: nephrology, diabetes, pregnancy, and depression. For the nephrology filter, our method obtained performance comparable to the state of the art (sensitivity of 91.3%, specificity of 98.7%, precision of 94.6%, and accuracy of 97.2%). Similarly, high-performing results (over 90% in all measures) were obtained for the other three search filters. Based on PubMed click-through data, we successfully developed a high-performance method for generating topic-specific search filters that is significantly more efficient than existing manual methods. All data sets (topic-relevant and irrelevant document sets) used in this study and a demonstration system are publicly available at http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/downloads/CQ_filter/
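    The performance figures quoted for the nephrology filter follow directly from a confusion matrix. The sketch below shows the arithmetic with hypothetical counts, chosen only so the formulas reproduce the reported rounded percentages.

```python
def filter_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, precision, accuracy from filter counts:
    tp/fn are relevant articles retrieved/missed, tn/fp are irrelevant
    articles excluded/wrongly retrieved."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "precision":   tp / (tp + fp),
        "accuracy":    (tp + tn) / (tp + fp + tn + fn),
    }

# hypothetical counts, not the paper's actual nephrology benchmark
m = filter_metrics(tp=913, fp=52, tn=3948, fn=87)
```

    Sensitivity is the fraction of relevant articles the filter retrieves; specificity is the fraction of irrelevant ones it excludes; the two trade off as terms are added or removed.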

  16. Dual-mode nested search method for categorical uncertain multi-objective optimization

    Science.gov (United States)

    Tang, Long; Wang, Hu

    2016-10-01

    Categorical multi-objective optimization is an important issue involved in many matching design problems. Non-numerical variables and their uncertainty are the major challenges of such optimizations. Therefore, this article proposes a dual-mode nested search (DMNS) method. In the outer layer, kriging metamodels are established using standard regular simplex mapping (SRSM) from categorical candidates to numerical values. Assisted by the metamodels, a k-cluster-based intelligent sampling strategy is developed to search Pareto frontier points. The inner layer uses an interval number method to model the uncertainty of categorical candidates. To improve the efficiency, a multi-feature convergent optimization via most-promising-area stochastic search (MFCOMPASS) is proposed to determine the bounds of objectives. Finally, typical numerical examples are employed to demonstrate the effectiveness of the proposed DMNS method.

  17. An introduction to harmony search optimization method

    CERN Document Server

    Wang, Xiaolei; Zenger, Kai

    2014-01-01

    This brief provides a detailed introduction, discussion and bibliographic review of the nature-inspired optimization algorithm called Harmony Search. It uses a large number of simulation results to demonstrate the advantages of Harmony Search and its variants and also their drawbacks. The authors show how weaknesses can be amended by hybridization with other optimization methods. The Harmony Search Method with Applications will be of value to researchers in computational intelligence in demonstrating the state of the art of research on an algorithm of current interest. It also helps researche...
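    A minimal Harmony Search loop (memory consideration, pitch adjustment, random selection) conveys the algorithm the brief reviews. The sphere objective and parameter values are illustrative assumptions.

```python
import random

def harmony_search(f, dim=2, bounds=(-5.0, 5.0), hms=10, hmcr=0.9,
                   par=0.3, bw=0.1, iters=2000, seed=0):
    """Minimise f: draw each decision variable from harmony memory
    (rate hmcr), pitch-adjust it (rate par, bandwidth bw), or pick it
    at random; replace the worst stored harmony on improvement."""
    rng = random.Random(seed)
    lo, hi = bounds
    hm = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    costs = [f(x) for x in hm]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:              # memory consideration
                v = hm[rng.randrange(hms)][d]
                if rng.random() < par:           # pitch adjustment
                    v += rng.uniform(-bw, bw)
            else:                                # random selection
                v = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, v)))
        c = f(new)
        worst = max(range(hms), key=costs.__getitem__)
        if c < costs[worst]:
            hm[worst], costs[worst] = new, c
    return min(hm, key=f)

sphere = lambda x: sum(v * v for v in x)
best = harmony_search(sphere)
```

    The hybridizations the brief discusses typically replace the pitch-adjustment step with a stronger local search.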

  18. THE METHOD OF APPLICATION OF A COLLECTIVE SEARCH ACTIVITY AS A TOOL DEVELOPING METHODOLOGICAL THINKING OF A TEACHER

    Directory of Open Access Journals (Sweden)

    Луиза Вахаевна Ибрагимова

    2013-04-01

    Full Text Available To put any pedagogical theory into practice, it is necessary to transform its theoretical concepts into teaching methods. The development of all abilities, including thinking, occurs only in activity that is specially organized by creating the required pedagogical conditions, in this case: (a) the application of enhanced mental activity in teachers' training courses and vocational training; (b) the establishment of a "virtual university" for teachers in an institute of professional training; (c) the organization of interdisciplinary interaction of teachers, based on the conditions of nonlinear didactics (training teachers of different subjects). The presented method has been implemented over two years and consists of three phases: motivational and educational, intellectual and developmental, and innovative and reflective. At the motivational and educational stage, the possibilities of collective search activity are actualized during the course of training; group goals are set and methods of achieving them are chosen, using the first pedagogical condition. At the intellectual and developmental stage, skills for the collective search for effective teaching decisions are developed during the course of training, using the first and second pedagogical conditions. The innovative and reflective stage promotes teachers toward self-determination of the techniques and tools that improve the quality of the educational process, with teachers assisting each other in the development of teaching manuals; this is achieved with the help of all three pedagogical conditions. DOI: http://dx.doi.org/10.12731/2218-7405-2013-2-17

  19. Cochrane Qualitative and Implementation Methods Group guidance series-paper 2: methods for question formulation, searching, and protocol development for qualitative evidence synthesis.

    Science.gov (United States)

    Harris, Janet L; Booth, Andrew; Cargo, Margaret; Hannes, Karin; Harden, Angela; Flemming, Kate; Garside, Ruth; Pantoja, Tomas; Thomas, James; Noyes, Jane

    2018-05-01

    This paper updates previous Cochrane guidance on question formulation, searching, and protocol development, reflecting recent developments in methods for conducting qualitative evidence syntheses to inform Cochrane intervention reviews. Examples are used to illustrate how decisions about boundaries for a review are formed via an iterative process of constructing lines of inquiry and mapping the available information to ascertain whether evidence exists to answer questions related to effectiveness, implementation, feasibility, appropriateness, economic evidence, and equity. The process of question formulation allows reviewers to situate the topic in relation to how it informs and explains effectiveness, using the criteria of meaningfulness, appropriateness, feasibility, and implementation. Complex questions and interventions can be structured by drawing on an increasingly wide range of question frameworks. Logic models and theoretical frameworks are useful tools for conceptually mapping the literature to illustrate the complexity of the phenomenon of interest. Furthermore, protocol development may require iterative question formulation and searching. Consequently, the final protocol may function as a guide rather than a prescriptive route map, particularly in qualitative reviews that ask more exploratory and open-ended questions. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Cumulative query method for influenza surveillance using search engine data.

    Science.gov (United States)

    Seo, Dong-Woo; Jo, Min-Woo; Sohn, Chang Hwan; Shin, Soo-Yong; Lee, JaeHo; Yu, Maengsoo; Kim, Won Young; Lim, Kyoung Soo; Lee, Sang-Il

    2014-12-16

    Internet search queries have become an important data source in syndromic surveillance systems. However, there is currently no syndromic surveillance system using Internet search query data in South Korea. The objective of this study was to examine correlations between our cumulative query method and national influenza surveillance data. Our study was based on the local search engine, Daum (approximately 25% market share), and influenza-like illness (ILI) data from the Korea Centers for Disease Control and Prevention. A quota sampling survey was conducted with 200 participants to obtain popular queries. We divided the study period into two sets: Set 1 (the 2009/10 epidemiological year for development set 1 and 2010/11 for validation set 1) and Set 2 (2010/11 for development set 2 and 2011/12 for validation set 2). Pearson's correlation coefficients were calculated between the Daum data and the ILI data for the development set. We selected the combined queries for which the correlation coefficients were .7 or higher and listed them in descending order. Then, we created cumulative query methods, with n representing the number of combined queries accumulated in descending order of correlation coefficient. In validation set 1, 13 cumulative query methods were applied, and 8 had higher correlation coefficients (min=.916, max=.943) than that of the highest single combined query. Further, 11 of 13 cumulative query methods had an r value of ≥.7, whereas only 4 of 13 combined queries did. In validation set 2, 8 of 15 cumulative query methods showed higher correlation coefficients (min=.975, max=.987) than that of the highest single combined query. All 15 cumulative query methods had an r value of ≥.7, whereas only 6 of 15 combined queries did. The cumulative query method showed relatively higher correlation with national influenza surveillance data than single combined queries in both the development and validation sets.
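    The cumulative construction can be sketched as: rank queries by their individual Pearson correlation with the ILI series, then accumulate them in that order and re-correlate after each addition. The weekly series below are invented, not Daum or KCDC data.

```python
def pearson(xs, ys):
    """Pearson's correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# hypothetical weekly series: ILI rate and three query volumes
ili = [1, 2, 4, 8, 6, 3, 2, 1]
queries = {
    "flu":        [2, 3, 5, 9, 7, 4, 2, 2],
    "fever":      [1, 2, 3, 7, 7, 3, 3, 1],
    "cough drop": [5, 4, 4, 5, 4, 5, 5, 4],
}
# rank queries by individual correlation, then accumulate in that order
ranked = sorted(queries, key=lambda q: pearson(queries[q], ili), reverse=True)
cum, series = [], [0.0] * len(ili)
for q in ranked:
    series = [s + v for s, v in zip(series, queries[q])]
    cum.append((q, pearson(series, ili)))
```

    In the study's terms, `cum[n-1]` is the cumulative query method built from the top n queries; the point is that the pooled signal can correlate better than any single query.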

  1. Statistic methods for searching inundated radioactive entities

    International Nuclear Information System (INIS)

    Dubasov, Yu.V.; Krivokhatskij, A.S.; Khramov, N.N.

    1993-01-01

    The problem of searching for flooded radioactive objects in a given area is considered. Various models for plotting the search route are discussed. It is shown that a spiral route through random points, starting from the centre of the examined area, is the most efficient. The conclusion is made that, when searching for flooded radioactive objects, it is advisable to use multidimensional statistical methods of classification.

  2. Searching the online biomedical literature from developing countries

    African Journals Online (AJOL)

    Administrator

    This commentary highlights popular research literature databases and the use of the internet to obtain valuable research information. These literature retrieval methods include the use of the popular PubMed as well as internet search engines. Specific websites catering to developing countries' information and journals' ...

  3. Searching the online biomedical literature from developing countries ...

    African Journals Online (AJOL)

    This commentary highlights popular research literature databases and the use of the internet to obtain valuable research information. These literature retrieval methods include the use of the popular PubMed as well as internet search engines. Specific websites catering to developing countries' information and journals' ...

  4. A semantics-based method for clustering of Chinese web search results

    Science.gov (United States)

    Zhang, Hui; Wang, Deqing; Wang, Li; Bi, Zhuming; Chen, Yong

    2014-01-01

    Information explosion is a critical challenge to the development of modern information systems. In particular, when the application of an information system is over the Internet, the amount of information over the web has been increasing exponentially and rapidly. Search engines, such as Google and Baidu, are essential tools for people to find the information from the Internet. Valuable information, however, is still likely submerged in the ocean of search results from those tools. By clustering the results into different groups based on subjects automatically, a search engine with the clustering feature allows users to select the most relevant results quickly. In this paper, we propose an online semantics-based method to cluster Chinese web search results. First, we employ the generalised suffix tree to extract the longest common substrings (LCSs) from search snippets. Second, we use the HowNet to calculate the similarities of the words derived from the LCSs, and extract the most representative features by constructing the vocabulary chain. Third, we construct a vector of text features and calculate snippets' semantic similarities. Finally, we improve the Chameleon algorithm to cluster snippets. Extensive experimental results have shown that the proposed algorithm outperforms the suffix tree clustering method and other traditional clustering methods.
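    The first step, extracting longest common substrings from snippets, can be illustrated with a dynamic-programming stand-in for the generalised suffix tree (which the authors use for efficiency); the two snippets are invented.

```python
def longest_common_substring(a, b):
    """DP stand-in for the suffix-tree step: the longest run of
    characters shared by two snippets."""
    best, best_end = 0, 0
    prev = [0] * (len(b) + 1)
    for i in range(1, len(a) + 1):
        cur = [0] * (len(b) + 1)
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                cur[j] = prev[j - 1] + 1
                if cur[j] > best:
                    best, best_end = cur[j], i
        prev = cur
    return a[best_end - best:best_end]

s1 = "semantic clustering of web search results"
s2 = "clustering of web search snippets"
phrase = longest_common_substring(s1, s2)
```

    Shared phrases like this become candidate cluster labels; the paper's later steps then merge phrases whose words HowNet judges semantically similar.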

  5. Development of Pulsar Detection Methods for a Galactic Center Search

    Science.gov (United States)

    Thornton, Stephen; Wharton, Robert; Cordes, James; Chatterjee, Shami

    2018-01-01

    Finding pulsars within the inner parsec of the galactic center would be incredibly beneficial: for pulsars sufficiently close to Sagittarius A*, extremely precise tests of general relativity in the strong field regime could be performed through measurement of post-Keplerian parameters. Binary pulsar systems with sufficiently short orbital periods could provide the same laboratories with which to test existing theories. Fast and efficient methods are needed to parse large sets of time-domain data from different telescopes to search for periodicity in signals and differentiate radio frequency interference (RFI) from pulsar signals. Here we demonstrate several techniques to reduce red noise (low-frequency interference), generate signals from pulsars in binary orbits, and create plots that allow for fast detection of both RFI and pulsars.
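    Periodicity searches of this kind usually start from a power spectrum. As a hedged sketch (a naive DFT on an invented series, not the optimized pipelines described here), a sinusoid buried in noise shows up as the dominant spectral bin:

```python
import cmath
import math
import random

def power_spectrum(x):
    """Naive DFT power spectrum (fine for a short illustrative series)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(n // 2)]

rng = random.Random(0)
n, f0 = 256, 20                        # 20 cycles across the window
ts = [math.sin(2 * math.pi * f0 * t / n) + 0.5 * rng.gauss(0, 1)
      for t in range(n)]
spec = power_spectrum(ts)
peak = max(range(1, len(spec)), key=spec.__getitem__)   # skip the DC bin
```

    Red-noise reduction amounts to flattening the low-frequency end of `spec` (for example by dividing by a running median) before hunting for peaks, so that slow drifts and RFI do not masquerade as candidates.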

  6. ARSTEC, Nonlinear Optimization Program Using Random Search Method

    International Nuclear Information System (INIS)

    Rasmuson, D. M.; Marshall, N. H.

    1979-01-01

    1 - Description of problem or function: The ARSTEC program was written to solve nonlinear, mixed integer, optimization problems. An example of such a problem in the nuclear industry is the allocation of redundant parts in the design of a nuclear power plant to minimize plant unavailability. 2 - Method of solution: The technique used in ARSTEC is the adaptive random search method. The search is started from an arbitrary point in the search region and every time a point that improves the objective function is found, the search region is centered at that new point. 3 - Restrictions on the complexity of the problem: Presently, the maximum number of independent variables allowed is 10. This can be changed by increasing the dimension of the arrays
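    The adaptive step described in part 2 can be sketched directly; the quadratic stand-in objective is an assumption (ARSTEC's real objective was plant unavailability, and its mixed-integer handling is omitted).

```python
import random

def adaptive_random_search(f, x0, radius=2.0, iters=500, seed=3):
    """Adaptive random search: sample a candidate in a region around the
    current centre; whenever it improves the objective, re-centre there."""
    rng = random.Random(seed)
    centre, best = list(x0), f(x0)
    for _ in range(iters):
        cand = [c + rng.uniform(-radius, radius) for c in centre]
        val = f(cand)
        if val < best:
            centre, best = cand, val   # region now centred at the new point
    return centre, best

# hypothetical smooth objective standing in for plant unavailability
unavailability = lambda x: sum(v * v for v in x)
x_best, f_best = adaptive_random_search(unavailability, [3.0, 3.0])
```

    Re-centring on every improvement is what makes the search adaptive: the sampling region follows the best point found rather than staying fixed.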

  7. Development and experimental test of support vector machines virtual screening method for searching Src inhibitors from large compound libraries

    Directory of Open Access Journals (Sweden)

    Han Bucong

    2012-11-01

    Full Text Available Abstract Background Src plays various roles in tumour progression, invasion, metastasis, angiogenesis and survival. It is one of the multiple targets of multi-target kinase inhibitors in clinical uses and trials for the treatment of leukemia and other cancers. These successes and the appearance of drug resistance in some patients have raised significant interest and efforts in discovering new Src inhibitors. Various in-silico methods have been used in some of these efforts. It is desirable to explore additional in-silico methods, particularly those capable of searching large compound libraries at high yields and reduced false-hit rates. Results We evaluated support vector machines (SVM) as virtual screening tools for searching Src inhibitors from large compound libraries. SVM trained and tested by 1,703 inhibitors and 63,318 putative non-inhibitors correctly identified 93.53%–95.01% of inhibitors and 99.81%–99.90% of non-inhibitors in 5-fold cross validation studies. SVM trained by 1,703 inhibitors reported before 2011 and 63,318 putative non-inhibitors correctly identified 70.45% of the 44 inhibitors reported since 2011, and predicted as inhibitors 44,843 (0.33%) of 13.56M PubChem, 1,496 (0.89%) of 168K MDDR, and 719 (7.73%) of 9,305 MDDR compounds similar to the known inhibitors. Conclusions SVM showed comparable yield and reduced false-hit rates in searching large compound libraries compared to the similarity-based and other machine-learning VS methods developed from the same set of training compounds and molecular descriptors. We tested three virtual hits of the same novel scaffold from in-house chemical libraries not reported as Src inhibitors, one of which showed moderate activity. SVM may be potentially explored for searching Src inhibitors from large compound libraries at low false-hit rates.
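    The yield and false-hit-rate trade-off discussed here is simple arithmetic on screening counts. The counts below loosely mirror the abstract's PubChem figures but are illustrative, not a reanalysis.

```python
def screening_stats(n_actives, n_inactives, tp, fp):
    """Yield (fraction of known actives recovered), false-hit rate among
    predicted positives, and overall hit rate in the screened library."""
    return {
        "yield": tp / n_actives,
        "false_hit_rate": fp / (tp + fp),
        "hit_rate_in_library": (tp + fp) / (n_actives + n_inactives),
    }

# hypothetical screen: 44 known inhibitors hidden in a 13.56M library,
# 44,843 compounds predicted positive, 31 of them true inhibitors
stats = screening_stats(n_actives=44, n_inactives=13_560_000 - 44,
                        tp=31, fp=44_812)
```

    Flagging only ~0.3% of a multi-million-compound library while recovering most known actives is what makes a screen practical: the absolute number of candidates to test stays manageable.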

  8. Development and experimental test of support vector machines virtual screening method for searching Src inhibitors from large compound libraries.

    Science.gov (United States)

    Han, Bucong; Ma, Xiaohua; Zhao, Ruiying; Zhang, Jingxian; Wei, Xiaona; Liu, Xianghui; Liu, Xin; Zhang, Cunlong; Tan, Chunyan; Jiang, Yuyang; Chen, Yuzong

    2012-11-23

    Src plays various roles in tumour progression, invasion, metastasis, angiogenesis and survival. It is one of the multiple targets of multi-target kinase inhibitors in clinical use and trials for the treatment of leukemia and other cancers. These successes, and the appearance of drug resistance in some patients, have raised significant interest and efforts in discovering new Src inhibitors. Various in-silico methods have been used in some of these efforts. It is desirable to explore additional in-silico methods, particularly those capable of searching large compound libraries at high yields and reduced false-hit rates. We evaluated support vector machines (SVM) as virtual screening tools for searching Src inhibitors from large compound libraries. SVM trained and tested on 1,703 inhibitors and 63,318 putative non-inhibitors correctly identified 93.53%-95.01% of inhibitors and 99.81%-99.90% of non-inhibitors in 5-fold cross validation studies. SVM trained on 1,703 inhibitors reported before 2011 and 63,318 putative non-inhibitors correctly identified 70.45% of the 44 inhibitors reported since 2011, and predicted as inhibitors 44,843 (0.33%) of 13.56M PubChem compounds, 1,496 (0.89%) of 168K MDDR compounds, and 719 (7.73%) of 9,305 MDDR compounds similar to the known inhibitors. SVM showed comparable yield and reduced false-hit rates in searching large compound libraries compared to the similarity-based and other machine-learning VS methods developed from the same set of training compounds and molecular descriptors. We tested three virtual hits of the same novel scaffold from in-house chemical libraries not reported as Src inhibitors, one of which showed moderate activity. SVM may potentially be explored for searching Src inhibitors from large compound libraries at low false-hit rates.

  9. The search conference as a method in planning community health promotion actions

    Directory of Open Access Journals (Sweden)

    Eva Magnus

    2016-08-01

    Full Text Available Aims: The aim of this article is to describe and discuss how the search conference can be used as a method for planning health promotion actions in local communities. Design and methods: The article draws on experiences with using the method for an innovative project in health promotion in three Norwegian municipalities. The method is described both in general terms and as it was specifically adapted for the project. Results and conclusions: The search conference as a method was used to develop evidence-based health promotion action plans. With its use of both bottom-up and top-down approaches, this method is a relevant strategy for involving a community in the planning stages of health promotion actions in line with political expectations of participation, ownership, and evidence-based initiatives.

  10. Automated search method for AFM and profilers

    Science.gov (United States)

    Ray, Michael; Martin, Yves C.

    2001-08-01

    New automation software creates a search model as an initial setup and searches for a user-defined target in atomic force microscopes or stylus profilometers used in semiconductor manufacturing. The need for such automation has become critical in manufacturing lines. The new method starts with a survey map of a small area of a chip obtained from a chip-design database or an image of the area. The user interface requires a user to point to and define a precise location to be measured, and to select a macro function for an application such as line width or contact hole. The search algorithm automatically constructs a range of possible scan sequences within the survey, and provides increased speed and functionality compared to the methods used in instruments to date. Each sequence consists of a starting point relative to the target, a scan direction, and a scan length. The search algorithm stops when the location of a target is found and the criteria for certainty in positioning are met. With today's capability in high-speed processing and signal control, the tool can simultaneously scan and search for a target in a robotic and continuous manner. Examples are given that illustrate the key concepts.

  11. Development of intelligent semantic search system for rubber research data in Thailand

    Science.gov (United States)

    Kaewboonma, Nattapong; Panawong, Jirapong; Pianhanuruk, Ekkawit; Buranarach, Marut

    2017-10-01

    Thailand's rubber production has increased not only because of strong demand from the world market but also through the strong stimulus of the Thai Government's replanting program from 1961 onwards. With the continuous growth of rubber research data volume on the Web, the search for information has become a challenging task. Ontologies are used to improve the accuracy of information retrieval from the web by incorporating a degree of semantic analysis during the search. In this context, we propose an intelligent semantic search system for rubber research data in Thailand. The research methods included 1) analyzing domain knowledge, 2) ontology development, and 3) intelligent semantic search system development, so that research data curated in trusted digital repositories may be shared among the wider Thailand rubber research community.

  12. [Development of domain specific search engines].

    Science.gov (United States)

    Takai, T; Tokunaga, M; Maeda, K; Kaminuma, T

    2000-01-01

    As cyberspace expands at a pace nobody ever imagined, it becomes very important to search it efficiently and effectively. One solution to this problem is search engines. Many commercial search engines have already been put on the market; however, they return results so cumbersome that domain experts cannot tolerate them. Using dedicated hardware and commercial software called OpenText, we have developed several domain-specific search engines. These engines cover our institute's Web contents, drugs, chemical safety, endocrine disruptors, and emergency response to chemical hazards. They have been on our Web site for testing.

  13. Introducing PALETTE: an iterative method for conducting a literature search for a review in palliative care.

    Science.gov (United States)

    Zwakman, Marieke; Verberne, Lisa M; Kars, Marijke C; Hooft, Lotty; van Delden, Johannes J M; Spijker, René

    2018-06-02

    In the rapidly developing specialty of palliative care, literature reviews have become increasingly important to inform and improve the field. When applying the widely used review methods developed for intervention studies to palliative care, challenges are encountered, such as the heterogeneity of palliative care in practice (a wide range of patient characteristics, stages of illness and stakeholders), the explorative character of review questions, and poorly defined keywords and concepts. To overcome these challenges and to guide researchers in conducting a literature search for a review in palliative care, the Palliative cAre Literature rEview iTeraTive mEthod (PALETTE), a pragmatic framework, was developed. We present PALETTE with a detailed description. PALETTE consists of four phases: developing the review question, building the search strategy, validating the search strategy and performing the search. The framework incorporates different information retrieval techniques: contacting experts, pearl growing, citation tracking and Boolean searching in a transparent way to maximize the retrieval of literature relevant to the topic of interest. The different components and techniques are repeated until no new articles qualify for inclusion. The phases within PALETTE are interconnected by a recurrent process of validation on 'golden bullets' (articles that undoubtedly should be part of the review), citation tracking and concept terminology reflecting the review question. To give insight into the value of PALETTE, we compared it with the recommended search method for reviews of intervention studies. By using PALETTE on two palliative care literature reviews, we were able to improve our review questions and search strategies. Moreover, in comparison with the recommended search for intervention reviews, the number of articles that needed to be screened was decreased whereas more relevant articles were retrieved. Overall, PALETTE

  14. Fast radio burst search: cross spectrum vs. auto spectrum method

    Science.gov (United States)

    Liu, Lei; Zheng, Weimin; Yan, Zhen; Zhang, Juan

    2018-06-01

    The search for fast radio bursts (FRBs) is a hot topic in current radio astronomy studies. In this work, we carry out a single pulse search with a very long baseline interferometry (VLBI) pulsar observation data set using both auto spectrum and cross spectrum search methods. The cross spectrum method, first proposed in Liu et al., maximizes the signal power by fully utilizing the fringe phase information of the baseline cross spectrum. The auto spectrum search method is based on the popular pulsar software package PRESTO, which extracts single pulses from the auto spectrum of each station. According to our comparison, the cross spectrum method is able to enhance the signal power and therefore extract single pulses from data contaminated by high levels of radio frequency interference (RFI), which makes it possible to carry out a search for FRBs in regular VLBI observations when RFI is present.
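
    A toy numeric illustration (ours, not the authors' pipeline) of why the cross spectrum helps: a pulse common to both stations survives coherent averaging of the station product, while RFI present at only one station enters the product with random sign and averages toward zero, whereas it contaminates the positive-definite auto spectrum directly. The signal shapes and amplitudes below are invented.

```python
# Two-station toy: a common pulse plus station-1-only RFI. The smoothed
# cross product peaks at the pulse; single-station RFI averages out.
import numpy as np

rng = np.random.default_rng(1)
n = 4096
pulse = np.zeros(n)
pulse[2000:2010] = 5.0                                 # common astronomical pulse
rfi = np.zeros(n)
rfi[500:600] = 5.0 * rng.standard_normal(100)          # RFI seen by station 1 only

x1 = pulse + rfi + rng.standard_normal(n)              # station 1
x2 = pulse + rng.standard_normal(n)                    # station 2

kernel = np.ones(10) / 10                              # boxcar matched to pulse width
auto = np.convolve(0.5 * (x1**2 + x2**2), kernel, mode="same")   # incoherent
cross = np.convolve(x1 * x2, kernel, mode="same")                # coherent

peak = int(np.argmax(cross))
print(1995 <= peak <= 2015)    # cross-spectrum peak sits on the common pulse
```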

  15. Searching the ASRS Database Using QUORUM Keyword Search, Phrase Search, Phrase Generation, and Phrase Discovery

    Science.gov (United States)

    McGreevy, Michael W.; Connors, Mary M. (Technical Monitor)

    2001-01-01

    To support Search Requests and Quick Responses at the Aviation Safety Reporting System (ASRS), four new QUORUM methods have been developed: keyword search, phrase search, phrase generation, and phrase discovery. These methods build upon the core QUORUM methods of text analysis, modeling, and relevance-ranking. QUORUM keyword search retrieves ASRS incident narratives that contain one or more user-specified keywords in typical or selected contexts, and ranks the narratives on their relevance to the keywords in context. QUORUM phrase search retrieves narratives that contain one or more user-specified phrases, and ranks the narratives on their relevance to the phrases. QUORUM phrase generation produces a list of phrases from the ASRS database that contain a user-specified word or phrase. QUORUM phrase discovery finds phrases that are related to topics of interest. Phrase generation and phrase discovery are particularly useful for finding query phrases for input to QUORUM phrase search. The presentation of the new QUORUM methods includes: a brief review of the underlying core QUORUM methods; an overview of the new methods; numerous, concrete examples of ASRS database searches using the new methods; discussion of related methods; and, in the appendices, detailed descriptions of the new methods.
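
    The keyword search with relevance ranking can be illustrated with a toy TF-IDF ranker; this is our sketch, not the actual QUORUM implementation, and the narratives, tokenizer, and smoothing are invented.

```python
# Minimal keyword-search relevance ranking: score each narrative by the
# summed TF-IDF weight of the query keywords, rank descending.
import math
from collections import Counter

narratives = [
    "runway incursion during taxi after landing clearance confusion",
    "altitude deviation after autopilot disconnect in climb",
    "taxi route confusion at unfamiliar airport during night operations",
]

def tokenize(text):
    return text.lower().split()

docs = [Counter(tokenize(t)) for t in narratives]
n_docs = len(docs)

def idf(term):
    df = sum(1 for d in docs if term in d)
    return math.log((n_docs + 1) / (df + 1)) + 1   # smoothed inverse doc frequency

def rank(query):
    terms = tokenize(query)
    return sorted(range(n_docs),
                  key=lambda i: -sum(docs[i][t] * idf(t) for t in terms))

print(rank("taxi confusion"))   # narratives containing both keywords rank first
```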

  16. Efficient searching in meshfree methods

    Science.gov (United States)

    Olliff, James; Alford, Brad; Simkins, Daniel C.

    2018-04-01

    Meshfree methods such as the Reproducing Kernel Particle Method and the Element Free Galerkin method have proven to be excellent choices for problems involving complex geometry, evolving topology, and large deformation, owing to their ability to model the problem domain without the constraints imposed on Finite Element Method (FEM) meshes. However, meshfree methods carry an added computational cost over FEM that comes from at least two sources: increased cost of shape function evaluation and the determination of adjacency or connectivity. The focus of this paper is to formally address the types of adjacency information that arise in various uses of meshfree methods; to discuss available techniques for computing the various adjacency graphs; to propose a new search algorithm and data structure; and finally to compare the memory and run-time performance of the methods.
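
    One standard way to compute the particle adjacency a meshfree method needs is a uniform-grid cell list: bin points into cells of size equal to the support radius h, then check only the 3x3 block of cells around a query point. The sketch below is our 2-D illustration, not the paper's proposed algorithm.

```python
# Cell-list neighbor search: for compact support radius h, a point's
# neighbors can only lie in its own grid cell or the 8 adjacent cells.
from collections import defaultdict
import math

def build_grid(points, h):
    grid = defaultdict(list)
    for i, (x, y) in enumerate(points):
        grid[(int(x // h), int(y // h))].append(i)
    return grid

def neighbors(points, grid, h, i):
    x, y = points[i]
    cx, cy = int(x // h), int(y // h)
    out = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for j in grid[(cx + dx, cy + dy)]:
                if j != i and math.dist(points[i], points[j]) <= h:
                    out.append(j)
    return sorted(out)

points = [(0.0, 0.0), (0.5, 0.0), (2.0, 2.0), (0.0, 0.4)]
grid = build_grid(points, h=1.0)
print(neighbors(points, grid, 1.0, 0))   # → [1, 3]
```

    This brings the adjacency query from O(N) per point down to O(1) on average for roughly uniform point densities.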

  17. The commission errors search and assessment (CESA) method

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B.; Dang, V. N.

    2007-05-15

    Errors of Commission (EOCs) refer to the performance of inappropriate actions that aggravate a situation. In Probabilistic Safety Assessment (PSA) terms, they are human failure events that result from the performance of an action. This report presents the Commission Errors Search and Assessment (CESA) method and describes the method in the form of user guidance. The purpose of the method is to identify risk-significant situations with a potential for EOCs in a predictive analysis. The main idea underlying the CESA method is to catalog the key actions that are required in the procedural response to plant events and to identify specific scenarios in which these candidate actions could erroneously appear to be required. The catalog of required actions provides a basis for a systematic search of context-action combinations. To focus the search towards risk-significant scenarios, the actions that are examined in the CESA search are prioritized according to the importance of the systems and functions that are affected by these actions. The existing PSA provides this importance information; the Risk Achievement Worth or Risk Increase Factor values indicate the systems/functions for which an EOC contribution would be more significant. In addition, the contexts, i.e. PSA scenarios, for which the EOC opportunities are reviewed are also prioritized according to their importance (top sequences or cut sets). The search through these context-action combinations results in a set of EOC situations to be examined in detail. CESA has been applied in a plant-specific pilot study, which showed the method to be feasible and effective in identifying plausible EOC opportunities. This experience, as well as the experience with other EOC analyses, showed that the quantification of EOCs remains an issue. The quantification difficulties and the outlook for their resolution conclude the report. (author)

  18. The commission errors search and assessment (CESA) method

    International Nuclear Information System (INIS)

    Reer, B.; Dang, V. N.

    2007-05-01

    Errors of Commission (EOCs) refer to the performance of inappropriate actions that aggravate a situation. In Probabilistic Safety Assessment (PSA) terms, they are human failure events that result from the performance of an action. This report presents the Commission Errors Search and Assessment (CESA) method and describes the method in the form of user guidance. The purpose of the method is to identify risk-significant situations with a potential for EOCs in a predictive analysis. The main idea underlying the CESA method is to catalog the key actions that are required in the procedural response to plant events and to identify specific scenarios in which these candidate actions could erroneously appear to be required. The catalog of required actions provides a basis for a systematic search of context-action combinations. To focus the search towards risk-significant scenarios, the actions that are examined in the CESA search are prioritized according to the importance of the systems and functions that are affected by these actions. The existing PSA provides this importance information; the Risk Achievement Worth or Risk Increase Factor values indicate the systems/functions for which an EOC contribution would be more significant. In addition, the contexts, i.e. PSA scenarios, for which the EOC opportunities are reviewed are also prioritized according to their importance (top sequences or cut sets). The search through these context-action combinations results in a set of EOC situations to be examined in detail. CESA has been applied in a plant-specific pilot study, which showed the method to be feasible and effective in identifying plausible EOC opportunities. This experience, as well as the experience with other EOC analyses, showed that the quantification of EOCs remains an issue. The quantification difficulties and the outlook for their resolution conclude the report. (author)

  19. Job Search Methods: Consequences for Gender-based Earnings Inequality.

    Science.gov (United States)

    Huffman, Matt L.; Torres, Lisa

    2001-01-01

    Data from adults in Atlanta, Boston, and Los Angeles (n=1,942) who searched for work using formal (ads, agencies) or informal (networks) methods indicated that type of method used did not contribute to the gender gap in earnings. Results do not support formal job search as a way to reduce gender inequality. (Contains 55 references.) (SK)

  20. Development of a Computerized Visual Search Test

    Science.gov (United States)

    Reid, Denise; Babani, Harsha; Jon, Eugenia

    2009-01-01

    Visual attention and visual search are features of visual perception, essential for attending to and scanning one's environment while engaging in daily occupations. This study describes the development of a novel web-based test of visual search. The development information, including the format of the test, is described. The test was designed…

  1. Search and foraging behaviors from movement data: A comparison of methods.

    Science.gov (United States)

    Bennison, Ashley; Bearhop, Stuart; Bodey, Thomas W; Votier, Stephen C; Grecian, W James; Wakefield, Ewan D; Hamer, Keith C; Jessopp, Mark

    2018-01-01

    Search behavior is often used as a proxy for foraging effort within studies of animal movement, despite it being only one part of the foraging process, which also includes prey capture. While methods for validating prey capture exist, many studies rely solely on behavioral annotation of animal movement data to identify search and infer prey capture attempts. However, the degree to which search correlates with prey capture is largely untested. This study applied seven behavioral annotation methods to identify search behavior from GPS tracks of northern gannets (Morus bassanus), and compared outputs to the occurrence of dives recorded by simultaneously deployed time-depth recorders. We tested how behavioral annotation methods vary in their ability to identify search behavior leading to dive events. There was considerable variation in the number of dives occurring within search areas across methods. Hidden Markov models proved to be the most successful, with 81% of all dives occurring within areas identified as search. k-Means clustering and first passage time had the highest rates of dives occurring outside identified search behavior. First passage time and hidden Markov models had the lowest rates of false positives, identifying fewer search areas with no dives. All behavioral annotation methods had advantages and drawbacks in terms of the complexity of analysis and ability to reflect prey capture events while minimizing the number of false positives and false negatives. We used these results, with consideration of analytical difficulty, to provide advice on the most appropriate methods for use where prey capture behavior is not available. This study highlights a need to critically assess and carefully choose a behavioral annotation method suitable for the research question being addressed, or resulting species management frameworks established.
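
    A hand-rolled two-state Viterbi decoder over step lengths sketches the hidden-Markov-model style of behavioural annotation the study found most successful (state 0 = search, short steps; state 1 = transit, long steps). All parameters here (means, standard deviation, transition probabilities) are invented for illustration.

```python
# Two-state HMM decoding of movement steps with Gaussian emissions.
import math

def viterbi(obs, means, sd=0.5, p_stay=0.9):
    def logpdf(x, mu):   # Gaussian emission log-density
        return -0.5 * ((x - mu) / sd) ** 2 - math.log(sd * math.sqrt(2 * math.pi))
    log_t = [[math.log(p_stay), math.log(1 - p_stay)],
             [math.log(1 - p_stay), math.log(p_stay)]]
    v = [logpdf(obs[0], means[s]) + math.log(0.5) for s in (0, 1)]
    back = []
    for x in obs[1:]:
        row, ptr = [], []
        for s in (0, 1):
            prev = max((0, 1), key=lambda p: v[p] + log_t[p][s])
            row.append(v[prev] + log_t[prev][s] + logpdf(x, means[s]))
            ptr.append(prev)
        v = row
        back.append(ptr)
    state = 0 if v[0] >= v[1] else 1
    path = [state]
    for ptr in reversed(back):   # backtrace most likely state sequence
        state = ptr[state]
        path.append(state)
    return path[::-1]

step_lengths = [0.2, 0.3, 0.1, 3.0, 3.2, 2.9, 0.2, 0.1]
print(viterbi(step_lengths, means=(0.2, 3.0)))   # → [0, 0, 0, 1, 1, 1, 0, 0]
```

    In real analyses the emission and transition parameters are fitted to the data (e.g. via expectation-maximization) rather than fixed by hand.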

  2. Reporting Quality of Search Methods in Systematic Reviews of HIV Behavioral Interventions (2000–2010): Are the Searches Clearly Explained, Systematic and Reproducible?

    Science.gov (United States)

    Mullins, Mary M.; DeLuca, Julia B.; Crepaz, Nicole; Lyles, Cynthia M.

    2018-01-01

    Systematic reviews are an essential tool for researchers, prevention providers and policy makers who want to remain current with the evidence in the field. Systematic reviews must adhere to strict standards, as the results can provide a more objective appraisal of evidence for making scientific decisions than traditional narrative reviews. An integral component of a systematic review is the development and execution of a comprehensive systematic search to collect available and relevant information. A number of reporting guidelines have been developed to ensure quality publications of systematic reviews. These guidelines provide the essential elements to include in the review process and report in the final publication for complete transparency. We identified the common elements of reporting guidelines and examined the reporting quality of search methods in the HIV behavioral intervention literature. Consistent with the findings from previous evaluations of reporting of search methods in systematic reviews in other fields, our review shows a lack of full and transparent reporting within systematic reviews even though a plethora of guidelines exist. This review underscores the need for promoting the completeness of and adherence to transparent systematic search reporting within systematic reviews. PMID:26052651

  3. Low-Mode Conformational Search Method with Semiempirical Quantum Mechanical Calculations: Application to Enantioselective Organocatalysis.

    Science.gov (United States)

    Kamachi, Takashi; Yoshizawa, Kazunari

    2016-02-22

    A conformational search program for finding low-energy conformations of large noncovalent complexes has been developed. A quantitatively reliable semiempirical quantum mechanical PM6-DH+ method, which is able to accurately describe noncovalent interactions at a low computational cost, was employed, in contrast to conventional conformational search programs in which molecular mechanical methods are usually adopted. Our approach is based on the low-mode method, whereby an initial structure is perturbed along one of its low-mode eigenvectors to generate new conformations. This method was applied to determine the most stable conformation of the transition state for enantioselective alkylation by the Maruoka and cinchona alkaloid catalysts and for Hantzsch ester hydrogenation of imines by chiral phosphoric acid. Besides successfully reproducing the previously reported most stable DFT conformations, the conformational search with the semiempirical quantum mechanical calculations newly discovered a more stable conformation at a low computational cost.

  4. Fast optimization of binary clusters using a novel dynamic lattice searching method

    International Nuclear Information System (INIS)

    Wu, Xia; Cheng, Wen

    2014-01-01

    Global optimization of binary clusters has been a difficult task despite much effort and many efficient methods. To address the two types of elements in binary clusters (i.e., the homotop problem), two classes of virtual dynamic lattices are constructed and a modified dynamic lattice searching (DLS) method, the binary DLS (BDLS) method, is developed. However, it was found that BDLS can only be utilized for the optimization of small binary clusters because the homotop problem is hard to solve without an atomic exchange operation. Therefore, the iterated local search (ILS) method is adopted to solve the homotop problem, and an efficient method based on BDLS and ILS, named BDLS-ILS, is presented for global optimization of binary clusters. In order to assess the efficiency of the proposed method, binary Lennard-Jones clusters with up to 100 atoms are investigated. Results show the method to be efficient. Furthermore, the BDLS-ILS method is also adopted to study the geometrical structures of (AuPd)79 clusters with DFT-fitted parameters of the Gupta potential.
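
    The homotop problem can be illustrated in miniature: for a fixed geometry, different assignments of the two element types to the same sites give different energies, and the atomic exchange (swap) move used in iterated local search explores them. The geometry and pair-interaction parameters below are invented, not the paper's Lennard-Jones or Gupta parameters.

```python
# Homotop exploration on a fixed 4-site geometry with a toy Lennard-Jones
# pair energy whose well depth depends on the element pair.
import itertools
import math

sites = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.87), (0.5, -0.87)]   # fixed geometry
eps = {("A", "A"): 1.0, ("B", "B"): 1.0, ("A", "B"): 1.5, ("B", "A"): 1.5}

def energy(labels):
    e = 0.0
    for i, j in itertools.combinations(range(len(sites)), 2):
        r = math.dist(sites[i], sites[j])
        e += 4.0 * eps[(labels[i], labels[j])] * (r**-12 - r**-6)
    return e

def best_swap(labels):
    """One ILS-style exchange step: best single A<->B swap (or no move)."""
    best = (energy(labels), tuple(labels))
    for i, j in itertools.combinations(range(len(labels)), 2):
        if labels[i] != labels[j]:
            trial = list(labels)
            trial[i], trial[j] = trial[j], trial[i]
            best = min(best, (energy(trial), tuple(trial)))
    return best

e0 = energy(("A", "A", "B", "B"))
e1, labels1 = best_swap(("A", "A", "B", "B"))
print(e1 <= e0)   # the exchange move never worsens the best-found homotop
```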

  5. A method for the design and development of medical or health care information websites to optimize search engine results page rankings on Google.

    Science.gov (United States)

    Dunne, Suzanne; Cummins, Niamh Maria; Hannigan, Ailish; Shannon, Bill; Dunne, Colum; Cullen, Walter

    2013-08-27

    The Internet is a widely used source of information for patients searching for medical/health care information. While many studies have assessed existing medical/health care information on the Internet, relatively few have examined methods for design and delivery of such websites, particularly those aimed at the general public. This study describes a method of evaluating material for new medical/health care websites, or for assessing those already in existence, which is correlated with higher rankings on Google's Search Engine Results Pages (SERPs). A website quality assessment (WQA) tool was developed using criteria related to the quality of the information to be contained in the website in addition to an assessment of the readability of the text. This was retrospectively applied to assess existing websites that provide information about generic medicines. The reproducibility of the WQA tool and its predictive validity were assessed in this study. The WQA tool demonstrated very high reproducibility (intraclass correlation coefficient=0.95) between 2 independent users. A moderate to strong correlation was found between WQA scores and rankings on Google SERPs. Analogous correlations were seen between rankings and readability of websites as determined by Flesch Reading Ease and Flesch-Kincaid Grade Level scores. The use of the WQA tool developed in this study is recommended as part of the design phase of a medical or health care information provision website, along with assessment of readability of the material to be used. This may ensure that the website performs better on Google searches. The tool can also be used retrospectively to make improvements to existing websites, thus, potentially enabling better Google search result positions without incurring the costs associated with Search Engine Optimization (SEO) professionals or paid promotion.
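
    Flesch Reading Ease, one of the readability measures the study correlated with SERP rank, can be computed directly from sentence, word, and syllable counts. The syllable counter below is a rough vowel-group heuristic of our own, not a dictionary-based one.

```python
# Flesch Reading Ease: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
import re

def syllables(word):
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1            # crude silent-e correction
    return max(n, 1)

def flesch_reading_ease(text):
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z]+", text)
    syl = sum(syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syl / len(words))

easy = "The cat sat. The dog ran. We like pets."
hard = "Pharmaceutical bioequivalence necessitates comprehensive regulatory documentation."
print(flesch_reading_ease(easy) > flesch_reading_ease(hard))
```

    Higher scores mean easier text; health information aimed at the general public typically targets scores well above those of journal prose.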

  6. The method of search of tendencies

    International Nuclear Information System (INIS)

    Reuss, Paul.

    1981-08-01

    The search of tendencies is an application of the least squares method. Its objective is the best possible evaluation of the basic data used in the calculations, from the comparison between measurements of integral characteristics and the corresponding theoretical results. This report presents the minimization which allows the estimation of the basic data and, above all, the methods which are necessary for the critical analysis of the obtained results [fr]
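
    At its core, the adjustment described here is an overdetermined least-squares fit: adjust basic data x so that predicted integral quantities A x match measurements b. A minimal numerical sketch with invented sensitivities and measurements:

```python
# Least-squares adjustment of basic data from integral measurements:
# solve min ||A x - b||^2 via numpy's lstsq (normal equations A^T A x = A^T b).
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])       # sensitivities of 3 integral measurements
b = np.array([2.1, 2.9, 4.2])    # measured integral characteristics

x, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(x, 2))
```

    The critical analysis the report emphasizes would then examine the residuals A x - b against the measurement uncertainties.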

  7. Method of Improving Personal Name Search in Academic Information Service

    Directory of Open Access Journals (Sweden)

    Heejun Han

    2012-12-01

    Full Text Available All academic information on the web or elsewhere has a creator, that is, a subject who has created the information. The subject can be an individual, a group, an institution, or even a nation, depending on the nature of the relevant information. Most information is composed of a title, an author, and contents. A paper in the academic information category has metadata including title, author, keywords, abstract, publication data, place of publication, ISSN, and the like. A patent has metadata including title, applicant, inventor, attorney, IPC, application number, and claims of the invention. Most web-based academic information services enable users to search the information by processing this meta-information. An important element is searching by the author field, which corresponds to a personal name. This study suggests a method of efficient indexing and an adjacency-operation result-ranking algorithm with phrase-search-based boosting elements, thus improving the accuracy of personal name search results. It also describes a method for returning co-authors and related researchers when searching personal names. This method can be effectively applied to providing accurate and additional search results in academic information services.
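
    The phrase-boosting idea can be sketched as follows (our illustration, not the service's actual algorithm): candidate author fields get a base token-overlap score, and an exact adjacent-phrase match receives a boost so a true full-name match outranks records that merely contain both tokens somewhere.

```python
# Personal-name ranking with a phrase (adjacency) boost.
def score(query, author_field, phrase_boost=10.0):
    q = query.lower().split()
    f = author_field.lower().split()
    base = sum(t in f for t in q)                    # token-overlap score
    phrase = any(f[i:i + len(q)] == q                # contiguous phrase match
                 for i in range(len(f) - len(q) + 1))
    return base + (phrase_boost if phrase else 0.0)

records = ["Heejun Han", "Han Solo and Heejun Kim", "Jane Doe"]
ranked = sorted(records, key=lambda r: -score("Heejun Han", r))
print(ranked[0])   # → Heejun Han
```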

  8. Searching for Suicide Methods: Accessibility of Information About Helium as a Method of Suicide on the Internet.

    Science.gov (United States)

    Gunnell, David; Derges, Jane; Chang, Shu-Sen; Biddle, Lucy

    2015-01-01

    Helium gas suicides have increased in England and Wales; easy-to-access descriptions of this method on the Internet may have contributed to this rise. To investigate the availability of information on using helium as a method of suicide and trends in searching about this method on the Internet. We analyzed trends in (a) Google searching (2004-2014) and (b) hits on a Wikipedia article describing helium as a method of suicide (2013-2014). We also investigated the extent to which helium was described as a method of suicide on web pages and discussion forums identified via Google. We found no evidence of rises in Internet searching about suicide using helium. News stories about helium suicides were associated with increased search activity. The Wikipedia article may have been temporarily altered to increase awareness of suicide using helium around the time of a celebrity suicide. Approximately one third of the links retrieved using Google searches for suicide methods mentioned helium. Information about helium as a suicide method is readily available on the Internet; the Wikipedia article describing its use was highly accessed following celebrity suicides. Availability of online information about this method may contribute to rises in helium suicides.

  9. Hybrid Genetic Algorithm - Local Search Method for Ground-Water Management

    Science.gov (United States)

    Chiu, Y.; Nishikawa, T.; Martin, P.

    2008-12-01

    Ground-water management problems commonly are formulated as a mixed-integer, non-linear programming problem (MINLP). Relying only on conventional gradient-search methods to solve the management problem is computationally fast; however, the methods may become trapped in a local optimum. Global-optimization schemes can identify the global optimum, but convergence is very slow as the solution approaches the global optimum. In this study, we developed a hybrid optimization scheme, which includes a genetic algorithm and a gradient-search method, to solve the MINLP. The genetic algorithm identifies a near-optimal solution, and the gradient search uses the near optimum to identify the global optimum. Our methodology is applied to a conjunctive-use project in the Warren ground-water basin, California. Hi-Desert Water District (HDWD), the primary water manager in the basin, plans to construct a wastewater treatment plant to reduce future septic-tank effluent from reaching the ground-water system. The treated wastewater instead will recharge the ground-water basin via percolation ponds as part of a larger conjunctive-use strategy, subject to State regulations (e.g., minimum distances and travel times). HDWD wishes to identify the least-cost conjunctive-use strategies that control ground-water levels, meet regulations, and identify new production-well locations. As formulated, the MINLP objective is to minimize water-delivery costs subject to constraints including pump capacities, available recharge water, water-supply demand, water-level constraints, and potential new-well locations. The methodology was demonstrated by an enumerative search of the entire feasible solution space and by comparing the optimal solution with results from the branch-and-bound algorithm. The results also indicate that the hybrid method identifies the global optimum within an affordable computation time. Sensitivity analyses, which include testing different recharge-rate scenarios, pond
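
    The two-stage hybrid idea can be shown on a toy problem (our sketch, not the Warren-basin model): a small genetic algorithm locates a near-optimal region of a multimodal function, then a backtracking gradient descent polishes that point, so the local stage can only improve on the global stage's result.

```python
# Hybrid GA + gradient search on a multimodal 1-D objective.
import math
import random

def f(x):
    return x * x + 10.0 * (1.0 - math.cos(3.0 * x))   # global minimum at x = 0

def genetic_search(pop_size=30, gens=40, lo=-10.0, hi=10.0, seed=2):
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=f)                    # rank by fitness (lower is better)
        elite = pop[: pop_size // 2]
        # children: mutated copies of randomly chosen elite parents
        pop = elite + [rng.choice(elite) + rng.gauss(0.0, 0.5) for _ in elite]
    return min(pop, key=f)

def gradient_polish(x, lr=0.1, steps=200, eps=1e-6):
    fx = f(x)
    for _ in range(steps):
        grad = (f(x + eps) - f(x - eps)) / (2.0 * eps)   # numerical gradient
        step = lr
        while step > 1e-12 and f(x - step * grad) >= fx:
            step /= 2.0                    # backtrack until the step improves f
        if step <= 1e-12:
            break                          # no improving step: converged
        x -= step * grad
        fx = f(x)
    return x

x_ga = genetic_search()          # near-optimal point from the global stage
x_opt = gradient_polish(x_ga)    # polished by the local gradient stage
print(f(x_opt) <= f(x_ga))
```

    The real problem is a MINLP, so the local stage there handles constraints and integer decisions; this sketch only captures the division of labour between the two stages.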

  10. Considerations for the development of task-based search engines

    DEFF Research Database (Denmark)

    Petcu, Paula; Dragusin, Radu

    2013-01-01

    Based on previous experience from working on a task-based search engine, we present a list of suggestions and ideas for an Information Retrieval (IR) framework that could inform the development of next generation professional search systems. The specific task that we start from is the clinicians' information need in finding rare disease diagnostic hypotheses at the time and place where medical decisions are made. Our experience from the development of a search engine focused on supporting clinicians in completing this task has provided us valuable insights into what aspects should be considered by the developers of vertical search engines.

  11. LITERATURE SEARCH FOR METHODS FOR HAZARD ANALYSES OF AIR CARRIER OPERATIONS.

    Energy Technology Data Exchange (ETDEWEB)

    MARTINEZ - GURIDI,G.; SAMANTA,P.

    2002-07-01

    Representatives of the Federal Aviation Administration (FAA) and several air carriers under Title 14 of the Code of Federal Regulations (CFR) Part 121 developed a system-engineering model of the functions of air-carrier operations. Their analyses form the foundation or basic architecture upon which other task areas are based: hazard analyses, performance measures, and risk indicator design. To carry out these other tasks, models may need to be developed using the basic architecture of the Air Carrier Operations System Model (ACOSM). Since ACOSM encompasses various areas of air-carrier operations and can be used to address different task areas with differing but interrelated objectives, the modeling needs are broad. A literature search was conducted to identify and analyze the existing models that may be applicable for pursuing the task areas in ACOSM. The intent of the literature search was not necessarily to identify a specific model that can be directly used, but rather to identify relevant ones that have similarities with the processes and activities defined within ACOSM. Such models may provide useful inputs and insights in structuring ACOSM models. ACOSM simulates processes and activities in air-carrier operation, but, in a general framework, it has similarities with other industries where attention also has been paid to hazard analyses, emphasizing risk management, and in designing risk indicators. To assure that efforts in other industries are adequately considered, the literature search includes publications from other industries, e.g., chemical, nuclear, and process industries. This report discusses the literature search, the relevant methods identified and provides a preliminary assessment of their use in developing the models needed for the ACOSM task areas. A detailed assessment of the models has not been made. Defining those applicable for ACOSM will need further analyses of both the models and tools identified. The report is organized in four chapters.

  12. Evaluation of a new method for librarian-mediated literature searches for systematic reviews

    NARCIS (Netherlands)

    W.M. Bramer (Wichor); Rethlefsen, M.L. (Melissa L.); F. Mast (Frans); J. Kleijnen (Jos)

    2017-01-01

    textabstractObjective: To evaluate and validate the time of completion and results of a new method of searching for systematic reviews, the exhaustive search method (ESM), using a pragmatic comparison. Methods: Single-line search strategies were prepared in a text document. Term completeness was

  13. Development and tuning of an original search engine for patent libraries in medicinal chemistry.

    Science.gov (United States)

    Pasche, Emilie; Gobeill, Julien; Kreim, Olivier; Oezdemir-Zaech, Fatma; Vachon, Therese; Lovis, Christian; Ruch, Patrick

    2014-01-01

    The large increase in the size of patent collections has led to the need for efficient search strategies. But the development of advanced text-mining applications dedicated to patents of the biomedical field remains rare, in particular to address the needs of the pharmaceutical & biotech industry, which intensively uses patent libraries for competitive intelligence and drug development. We describe here the development of an advanced retrieval engine to search information in patent collections in the field of medicinal chemistry. We investigate and combine different strategies and evaluate their respective impact on the performance of the search engine applied to various search tasks, which cover the putatively most frequent search behaviours of intellectual property officers in medicinal chemistry: 1) a prior art search task; 2) a technical survey task; and 3) a variant of the technical survey task, sometimes called a known-item search task, where a single patent is targeted. The optimal tuning of our engine resulted in a top-precision of 6.76% for the prior art search task, 23.28% for the technical survey task and 46.02% for the variant of the technical survey task. We observed that co-citation boosting was an appropriate strategy to improve prior art search tasks, while IPC classification of queries improved retrieval effectiveness for technical survey tasks. Surprisingly, the use of the full body of the patent was always detrimental to search effectiveness. It was also observed that normalizing biomedical entities using curated dictionaries had simply no impact on the search tasks we evaluated. The search engine was finally implemented as a web-application within Novartis Pharma. The application is briefly described in the report. We have presented the development of a search engine dedicated to patent search, based on state-of-the-art methods applied to patent corpora. We have shown that a proper tuning of the system to adapt to the various search tasks

  14. Sliding surface searching method for slopes containing a potential weak structural surface

    Directory of Open Access Journals (Sweden)

    Aijun Yao

    2014-06-01

    A weak structural surface is one of the key factors controlling the stability of slopes. The stability of rock slopes is in general concerned with sets of discontinuities. However, in soft rocks, failure can occur along surfaces approaching a circular failure surface. To better understand the position of the potential sliding surface, a new method called the simplex-finite stochastic tracking method is proposed. This method basically divides the sliding surface into two parts: one is described by a smooth curve obtained by random searching; the other is a polyline formed by the weak structural surface. Single or multiple sliding surfaces can be considered, and consequently several types of combined sliding surfaces can be simulated. The paper adopts the arc-polyline to simulate the potential sliding surface and analyzes the searching process of the sliding surface. Accordingly, software for slope stability analysis using this method was developed and applied in real cases. The results show that, using the simplex-finite stochastic tracking method, it is possible to locate the position of a potential sliding surface in the slope.

  15. Bayesian methods in the search for MH370

    CERN Document Server

    Davey, Sam; Holland, Ian; Rutten, Mark; Williams, Jason

    2016-01-01

    This book demonstrates how nonlinear/non-Gaussian Bayesian time series estimation methods were used to produce a probability distribution of potential MH370 flight paths. It provides details of how the probabilistic models of aircraft flight dynamics, satellite communication system measurements, environmental effects and radar data were constructed and calibrated. The probability distribution was used to define the search zone in the southern Indian Ocean. The book describes particle-filter based numerical calculation of the aircraft flight-path probability distribution and validates the method using data from several of the involved aircraft’s previous flights. Finally it is shown how the Reunion Island flaperon debris find affects the search probability distribution.
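
    The particle-filter calculation the book describes can be illustrated with a minimal bootstrap filter on a one-dimensional toy problem. The random-walk dynamics and Gaussian measurement noise below are invented stand-ins for the actual flight-dynamics and satellite-measurement models:

```python
import math
import random

def particle_filter(observations, n=2000, q=0.5, r=1.0):
    """Bootstrap particle filter for a 1-D random-walk state observed
    with Gaussian noise: predict, weight, resample at each step."""
    particles = [random.gauss(0.0, 5.0) for _ in range(n)]
    for z in observations:
        # Predict: propagate every particle through the motion model.
        particles = [p + random.gauss(0.0, q) for p in particles]
        # Update: weight each particle by the measurement likelihood.
        weights = [math.exp(-0.5 * ((z - p) / r) ** 2) for p in particles]
        total = sum(weights)
        # Resample: draw a new particle set proportional to the weights.
        particles = random.choices(particles,
                                   weights=[w / total for w in weights], k=n)
    return sum(particles) / n  # posterior mean of the final state

random.seed(0)
true_path = [0.1 * t for t in range(50)]               # true state drifts to 4.9
obs = [x + random.gauss(0.0, 1.0) for x in true_path]  # noisy observations
est = particle_filter(obs)
print(round(est, 2))  # posterior mean, close to the final true state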

  16. Identifying complications of interventional procedures from UK routine healthcare databases: a systematic search for methods using clinical codes.

    Science.gov (United States)

    Keltie, Kim; Cole, Helen; Arber, Mick; Patrick, Hannah; Powell, John; Campbell, Bruce; Sims, Andrew

    2014-11-28

    Several authors have developed and applied methods to routine data sets to identify the nature and rate of complications following interventional procedures. But, to date, there has been no systematic search for such methods. The objective of this article was to find, classify and appraise published methods, based on analysis of clinical codes, which used routine healthcare databases in a United Kingdom setting to identify complications resulting from interventional procedures. A literature search strategy was developed to identify published studies that referred, in the title or abstract, to the name or acronym of a known routine healthcare database and to complications from procedures or devices. The following data sources were searched in February and March 2013: Cochrane Methods Register, Conference Proceedings Citation Index - Science, Econlit, EMBASE, Health Management Information Consortium, Health Technology Assessment database, MathSciNet, MEDLINE, MEDLINE in-process, OAIster, OpenGrey, Science Citation Index Expanded and ScienceDirect. Of the eligible papers, those which reported methods using clinical coding were classified and summarised in tabular form using the following headings: routine healthcare database; medical speciality; method for identifying complications; length of follow-up; method of recording comorbidity. The benefits and limitations of each approach were assessed. From 3688 papers identified from the literature search, 44 reported the use of clinical codes to identify complications, from which four distinct methods were identified: 1) searching the index admission for specified clinical codes, 2) searching a sequence of admissions for specified clinical codes, 3) searching for specified clinical codes for complications from procedures and devices within the International Classification of Diseases 10th revision (ICD-10) coding scheme, which is the methodology recommended by the NHS Classification Service, and 4) conducting manual clinical
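
    Methods 1 and 2 above (scanning one or more admissions for specified clinical codes) reduce to a simple set-intersection test. The record layout and the code list below are hypothetical illustrations, not the schema of any real routine database:

```python
# Hypothetical admission records; field names and code lists are
# illustrative only, not the layout of any real routine database.
COMPLICATION_CODES = {"T81.0", "T81.4"}  # ICD-10 'complications of procedures'

admissions = [
    {"patient": 1, "spell": "index",       "codes": ["K40.9", "T81.4"]},
    {"patient": 2, "spell": "index",       "codes": ["K40.9"]},
    {"patient": 2, "spell": "readmission", "codes": ["T81.0"]},
]

def flag_complications(records, spells=("index", "readmission")):
    """Scan the chosen admission spells for any specified clinical code
    (the spirit of methods 1 and 2 in the review)."""
    flagged = set()
    for rec in records:
        if rec["spell"] in spells and COMPLICATION_CODES & set(rec["codes"]):
            flagged.add(rec["patient"])
    return flagged

print(sorted(flag_complications(admissions)))                  # -> [1, 2]
print(sorted(flag_complications(admissions, spells=("index",))))  # -> [1]
```

    Restricting `spells` to the index admission reproduces method 1; widening it to later admissions reproduces method 2 with a longer follow-up window.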

  17. Distributed Cooperative Search Control Method of Multiple UAVs for Moving Target

    Directory of Open Access Journals (Sweden)

    Chang-jian Ru

    2015-01-01

    To reduce the impact of uncertainties caused by unknown motion parameters on the search plan for moving targets and to improve the efficiency of UAV searching, a novel distributed multi-UAV cooperative search control method for moving targets is proposed in this paper. Based on the detection results of onboard sensors, the target probability map is updated using Bayesian theory. A Gaussian distribution of the target transition probability density function is introduced to calculate the prediction probability of moving target existence, and then the target probability map can be further updated in real time. A performance index function combining target cost, environment cost, and cooperative cost is constructed, and the cooperative searching problem can be transformed into a central optimization problem. To improve computational efficiency, the distributed model predictive control method is presented, and thus the control command of each UAV can be obtained. The simulation results have verified that the proposed method can better avoid the blindness of UAV searching and effectively improve the overall efficiency of the team.
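
    The Bayesian update of the target probability map after a negative sensor reading can be sketched in a few lines. The uniform prior, the four-cell map, and the detection probability are illustrative assumptions:

```python
def update_map(prob, searched, p_detect=0.8):
    """Bayes update of a target-probability map after one cell is searched
    without a detection; prob is a list of per-cell probabilities."""
    posterior = []
    for i, p in enumerate(prob):
        # P(no detection | target in cell i) is (1 - p_detect) for the
        # searched cell and 1 for every other cell.
        likelihood = (1.0 - p_detect) if i == searched else 1.0
        posterior.append(likelihood * p)
    z = sum(posterior)            # normalizing constant
    return [p / z for p in posterior]

m = [0.25, 0.25, 0.25, 0.25]      # uniform prior over four cells
m = update_map(m, searched=0)     # a UAV searches cell 0, finds nothing
print([round(p, 3) for p in m])   # -> [0.062, 0.312, 0.312, 0.312]
```

    Probability mass drains out of the searched cell and redistributes over the rest of the map; the paper layers a motion (transition) model on top of this static update to handle moving targets.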

  18. New procedure for criticality search using coarse mesh nodal methods

    International Nuclear Information System (INIS)

    Pereira, Wanderson F.; Silva, Fernando C. da; Martinez, Aquilino S.

    2011-01-01

    The coarse mesh nodal methods have as their primary goal to calculate the neutron flux inside the reactor core. Many computer systems use a specific form of calculation, which is called the nodal method. In classical computing systems, the criticality search is made after the complete convergence of the iterative process of calculating the neutron flux. In this paper, we propose a new method for the calculation of criticality, in which the criticality condition is imposed during the very iterative process of calculating the neutron flux. Thus, the processing time for calculating the neutron flux was reduced by half compared with the procedure developed by the Nuclear Engineering Program of COPPE/UFRJ (PEN/COPPE/UFRJ). (author)

  19. New procedure for criticality search using coarse mesh nodal methods

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Wanderson F.; Silva, Fernando C. da; Martinez, Aquilino S., E-mail: wneto@con.ufrj.b, E-mail: fernando@con.ufrj.b, E-mail: Aquilino@lmp.ufrj.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear

    2011-07-01

    The coarse mesh nodal methods have as their primary goal to calculate the neutron flux inside the reactor core. Many computer systems use a specific form of calculation, which is called the nodal method. In classical computing systems, the criticality search is made after the complete convergence of the iterative process of calculating the neutron flux. In this paper, we propose a new method for the calculation of criticality, in which the criticality condition is imposed during the very iterative process of calculating the neutron flux. Thus, the processing time for calculating the neutron flux was reduced by half compared with the procedure developed by the Nuclear Engineering Program of COPPE/UFRJ (PEN/COPPE/UFRJ). (author)
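
    The idea of folding the criticality estimate into the flux iteration itself, rather than waiting for full flux convergence, can be illustrated with a toy power iteration. The 2x2 matrix below is an invented stand-in for the nodal transport operator, not the paper's model:

```python
def keff_iteration(M, flux, sweeps=200):
    """Toy source iteration in which k_eff is re-estimated after every
    flux sweep instead of after the flux has fully converged."""
    n = len(flux)
    k = 1.0
    for _ in range(sweeps):
        new = [sum(M[i][j] * flux[j] for j in range(n)) / k for i in range(n)]
        k = k * sum(new) / sum(flux)   # update the criticality estimate
        flux = new
    return k, flux

# 2x2 stand-in for the nodal operator; dominant eigenvalue is 2 + sqrt(1/2).
M = [[2.0, 0.5], [1.0, 2.0]]
k, flux = keff_iteration(M, [1.0, 0.5])
print(round(k, 6))  # -> 2.707107
```

    The eigenvalue estimate and the flux converge together, which is the interleaving the abstract credits with halving the processing time.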

  20. Learning search-driven application development with SharePoint 2013

    CERN Document Server

    Tordgeman, Johnny

    2013-01-01

    A fast-paced, practical guide filled with code examples and demonstrations of enterprise search using SharePoint 2013. This book is written for SharePoint and JavaScript developers who want to get started with SharePoint search and create search-driven applications. The book assumes working knowledge of previous versions of SharePoint and some experience with JavaScript and client-side development.

  1. Developing a Systematic Patent Search Training Program

    Science.gov (United States)

    Zhang, Li

    2009-01-01

    This study aims to develop a systematic patent training program using patent analysis and citation analysis techniques applied to patents held by the University of Saskatchewan. The results indicate that the target audience will be researchers in life sciences, and aggregated patent database searching and advanced search techniques should be…

  2. Applying systematic review search methods to the grey literature: a case study examining guidelines for school-based breakfast programs in Canada.

    Science.gov (United States)

    Godin, Katelyn; Stapleton, Jackie; Kirkpatrick, Sharon I; Hanning, Rhona M; Leatherdale, Scott T

    2015-10-22

    Grey literature is an important source of information for large-scale review syntheses. However, there are many characteristics of grey literature that make it difficult to search systematically. Further, there is no 'gold standard' for rigorous systematic grey literature search methods and few resources on how to conduct this type of search. This paper describes systematic review search methods that were developed and applied to complete a case study systematic review of grey literature that examined guidelines for school-based breakfast programs in Canada. A grey literature search plan was developed to incorporate four different searching strategies: (1) grey literature databases, (2) customized Google search engines, (3) targeted websites, and (4) consultation with contact experts. These complementary strategies were used to minimize the risk of omitting relevant sources. Since abstracts are often unavailable in grey literature documents, items' abstracts, executive summaries, or tables of contents (whichever was available) were screened. Screening of publications' full text followed. Data were extracted on the organization, year published, developer, intended audience, goal/objectives of the document, sources of evidence/resources cited, meals mentioned in the guidelines, and recommendations for program delivery. The search strategies for identifying and screening publications for inclusion in the case study review were found to be manageable, comprehensive, and intuitive when applied in practice. The four search strategies of the grey literature search plan yielded 302 potentially relevant items for screening. Following the screening process, 15 publications that met all eligibility criteria remained and were included in the case study systematic review. The high-level findings of the case study systematic review are briefly described. This article demonstrated a feasible and seemingly robust method for applying systematic search strategies to

  3. Remarks on search methods for stable, massive, elementary particles

    International Nuclear Information System (INIS)

    Perl, Martin L.

    2001-01-01

    This paper was presented at the 69th birthday celebration of Professor Eugene Commins, honoring his research achievements. These remarks are about the experimental techniques used in the search for new stable, massive particles, particles at least as massive as the electron. A variety of experimental methods, such as accelerator experiments, cosmic ray studies, searches for halo particles in the galaxy, and searches for exotic particles in bulk matter, are described. A summary is presented of the measured limits on the existence of new stable, massive particles.

  4. Developing a Grid-based search and categorization tool

    CERN Document Server

    Haya, Glenn; Vigen, Jens

    2003-01-01

    Grid technology has the potential to improve the accessibility of digital libraries. The participants in Project GRACE (Grid Search And Categorization Engine) are in the process of developing a search engine that will allow users to search through heterogeneous resources stored in geographically distributed digital collections. What differentiates this project from current search tools is that GRACE will be run on the European Data Grid, a large distributed network, and will not have a single centralized index as current web search engines do. In some cases, the distributed approach offers advantages over the centralized approach since it is more scalable, can be used on otherwise inaccessible material, and can provide advanced search options customized for each data source.

  5. The Search Conference as a Method in Planning Community Health Promotion Actions

    Science.gov (United States)

    Magnus, Eva; Knudtsen, Margunn Skjei; Wist, Guri; Weiss, Daniel; Lillefjell, Monica

    2016-01-01

    Aims: The aim of this article is to describe and discuss how the search conference can be used as a method for planning health promotion actions in local communities. Design and methods: The article draws on experiences with using the method for an innovative project in health promotion in three Norwegian municipalities. The method is described both in general and as it was specifically adopted for the project. Results and conclusions: The search conference as a method was used to develop evidence-based health promotion action plans. With its use of both bottom-up and top-down approaches, this method is a relevant strategy for involving a community in the planning stages of health promotion actions in line with political expectations of participation, ownership, and evidence-based initiatives. Significance for public health: This article describes and discusses how the search conference can be used as a method when working with knowledge-based health promotion actions in local communities. The article describes the sequences of the conference and shows how these were adapted when planning and prioritizing health promotion actions in three Norwegian municipalities. The significance of the article is that it shows how central elements in the planning of health promotion actions, such as participation and involvement as well as evidence, were fundamental to how the conference was accomplished. The article continues by discussing how the method functions as both a top-down and a bottom-up strategy, and in what way working evidence-based can be in conflict with a bottom-up strategy. The experiences described can be used as guidance for planning knowledge-based health promotion actions in communities. PMID:27747199

  6. Developing a Process Model for the Forensic Extraction of Information from Desktop Search Applications

    Directory of Open Access Journals (Sweden)

    Timothy Pavlic

    2008-03-01

    Desktop search applications can contain cached copies of files that were deleted from the file system. Forensic investigators see this as a potential source of evidence, as documents deleted by suspects may still exist in the cache. Whilst there have been attempts at recovering data collected by desktop search applications, there is no methodology governing the process, nor discussion on the most appropriate means to do so. This article seeks to address this issue by developing a process model that can be applied when developing an information extraction application for desktop search applications, discussing preferred methods and the limitations of each. This work represents a more structured approach than other forms of current research.

  7. Harmony Search Method: Theory and Applications

    Directory of Open Access Journals (Sweden)

    X. Z. Gao

    2015-01-01

    The Harmony Search (HS) method is an emerging metaheuristic optimization algorithm, which has been employed to cope with numerous challenging tasks during the past decade. In this paper, the essential theory and applications of the HS algorithm are first described and reviewed. Several typical variants of the original HS are next briefly explained. As a case study, a modified HS method inspired by the idea of Pareto-dominance-based ranking is also presented. It is further applied to handle a practical wind generator optimal design problem.
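
    A minimal implementation of the original HS algorithm (harmony memory, memory consideration, pitch adjustment, random consideration) might look as follows. The parameter values and the sphere test function are illustrative, not those of the paper's wind-generator design problem:

```python
import random

def harmony_search(f, lo, hi, dim=2, hms=20, hmcr=0.9, par=0.3,
                   bw=0.05, iters=5000, seed=0):
    """Plain Harmony Search: improvise a new harmony from memory (rate
    hmcr), pitch-adjust it (rate par, bandwidth bw), and replace the
    worst memory member whenever the new harmony is better."""
    rng = random.Random(seed)
    memory = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:              # memory consideration
                x = rng.choice(memory)[d]
                if rng.random() < par:           # pitch adjustment
                    x += rng.uniform(-bw, bw)
            else:                                # random consideration
                x = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, x)))
        worst = max(memory, key=f)
        if f(new) < f(worst):
            memory[memory.index(worst)] = new
    return min(memory, key=f)

sphere = lambda v: sum(x * x for x in v)         # simple test objective
best = harmony_search(sphere, -5.0, 5.0)
print([round(x, 3) for x in best], round(sphere(best), 5))
```

    The three improvisation rules are the whole algorithm; the variants the paper surveys mostly adapt `hmcr`, `par`, and `bw` during the run.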

  8. Social Work Literature Searching: Current Issues with Databases and Online Search Engines

    Science.gov (United States)

    McGinn, Tony; Taylor, Brian; McColgan, Mary; McQuilkan, Janice

    2016-01-01

    Objectives: To compare the performance of a range of search facilities; and to illustrate the execution of a comprehensive literature search for qualitative evidence in social work. Context: Developments in literature search methods and comparisons of search facilities help facilitate access to the best available evidence for social workers.…

  9. Development of power change maneuvering method for BWR

    International Nuclear Information System (INIS)

    Fukuzaki, Takaharu; Yamada, Naoyuki; Kiguchi, Takashi; Sakurai, Mikio.

    1985-01-01

    A power change maneuvering method for BWR has been proposed to generate an optimal power control maneuver, which realizes the power change operation closest to a power change demand pattern under operating constraints. The method searches for the maneuver as an optimization problem, where the variables are thermal power levels sampled from the demand pattern, the performance index is defined to express the power mismatch between demand and feasible patterns, and the constraints are limit lines on the thermal power-core flow rate map and limits on keeping fuel integrity. The usable feasible direction method is utilized as the optimization algorithm, with newly developed techniques for initial value generation and step length determination, which apply one-dimensional search and inverse-interpolation methods, respectively, to realize the effective search of the optimal solution. Simulation results show that a typical computing time is about 5 min by a general purpose computer and the method has been verified to be practical even for on-line use. (author)

  10. Self-learning search engines

    NARCIS (Netherlands)

    Schuth, A.

    2015-01-01

    How does a search engine such as Google know which search results to display? There are many competing algorithms that generate search results, but which one works best? We developed a new probabilistic method for quickly comparing large numbers of search algorithms by examining the results users

  11. Building maps to search the web: the method Sewcom

    Directory of Open Access Journals (Sweden)

    Corrado Petrucco

    2002-01-01

    Seeking information on the Internet is becoming a necessity at school, at work, and in every social sphere. Unfortunately, the difficulties inherent in the use of search engines, together with the unconscious use of inefficient cognitive approaches, limit their effectiveness. In this respect, a method called SEWCOM is presented that lets you create conceptual maps through interaction with search engines.

  12. Search method optimization technique for thermal design of high power RFQ structure

    International Nuclear Information System (INIS)

    Sharma, N.K.; Joshi, S.C.

    2009-01-01

    RRCAT has taken up the development of a 3 MeV RFQ structure for the low energy part of the 100 MeV H- ion injector linac. The RFQ is a precision-machined resonating structure designed for a high rf duty factor. RFQ structural stability during high rf power operation is an important design issue. The thermal analysis of the RFQ has been performed using the ANSYS finite element analysis software, and optimization of various parameters is attempted using the Search Method optimization technique. It is an effective optimization technique for systems governed by a large number of independent variables. The method involves examining a number of combinations of values of the independent variables and drawing conclusions from the magnitude of the objective function at these combinations. In these methods there is a continuous improvement in the objective function throughout the course of the search, and hence these methods are very efficient. The method has been employed in the optimization of various parameters (called independent variables) of the RFQ, such as cooling water flow rate, cooling water inlet temperature, and cavity thickness, involved in the RFQ thermal design. The temperature rise within the RFQ structure is the objective function during the thermal design. Using the ANSYS Parametric Design Language (APDL), various multiple iterative programmes are written and the analyses are performed to minimize the objective function. The dependency of the objective function on the various independent variables is established and the optimum values of the parameters are evaluated. The results of the analysis are presented in the paper. (author)
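
    The Search Method as described (evaluate combinations of the independent variables, keep the combination that minimizes the objective) is essentially an exhaustive combination search. The thermal model and the candidate values below are invented stand-ins for the ANSYS analysis:

```python
import itertools

def temperature_rise(flow, inlet, wall):
    """Stand-in thermal model (illustrative only): lower coolant flow,
    hotter inlet water, and thicker cavity walls all raise the peak
    temperature rise of the structure."""
    return 50.0 / flow + 0.2 * (inlet - 20.0) + 0.8 * wall

# Candidate values of the independent variables.
flows  = [5.0, 10.0, 15.0]    # cooling water flow rate (l/min)
inlets = [20.0, 25.0, 30.0]   # cooling water inlet temperature (deg C)
walls  = [6.0, 8.0, 10.0]     # cavity wall thickness (mm)

# Examine every combination and keep the one minimizing the objective.
best = min(itertools.product(flows, inlets, walls),
           key=lambda c: temperature_rise(*c))
print(best, round(temperature_rise(*best), 2))  # -> (15.0, 20.0, 6.0) 8.13
```

    In the actual workflow each `temperature_rise` evaluation is a full finite-element run driven from APDL, so the refinement of the candidate grids matters far more than in this toy.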

  13. The Use of Resistivity Methods in Terrestrial Forensic Searches

    Science.gov (United States)

    Wolf, R. C.; Raisuddin, I.; Bank, C.

    2013-12-01

    The increasing use of near-surface geophysical methods in forensic searches has demonstrated the need for further studies to identify the ideal physical, environmental and temporal settings for each geophysical method. Previous studies using resistivity methods have shown promising results, but additional work is required to more accurately interpret and analyze survey findings. The Ontario Provincial Police's UCRT (Urban Search and Rescue; Chemical, Biological, Radiological, Nuclear and Explosives; Response Team) is collaborating with the University of Toronto and two additional universities in a multi-year study investigating the applications of near-surface geophysical methods to terrestrial forensic searches. In the summer of 2012, on a test site near Bolton, Ontario, the OPP buried weapons, drums and pigs (naked, tarped, and clothed) to simulate clandestine graves and caches. Our study aims to conduct repeat surveys using an IRIS Syscal Junior resistivity meter with a 48-electrode switching system. These surveys will monitor changes in resistivity reflecting decomposition of the objects since burial, and identify the strengths and weaknesses of resistivity when used in a rural, clandestine burial setting. Our initial findings indicate the usefulness of this method, as prominent resistivity changes have been observed. We anticipate our results will help to assist law enforcement agencies in determining the type of resistivity results to expect based on time since burial, depth of burial and state of dress of the body.

  14. New hybrid conjugate gradient methods with the generalized Wolfe line search.

    Science.gov (United States)

    Xu, Xiao; Kong, Fan-Yu

    2016-01-01

    The conjugate gradient method is an efficient technique for solving the unconstrained optimization problem. In this paper, we make a linear combination, with parameter βk, of the DY method and the HS method, and put forward the hybrid method of DY and HS. We also propose the hybrid of FR and PRP by the same means. Additionally, to support the two hybrid methods, we generalize the Wolfe line search to compute the step size αk of the two hybrid methods. With the new Wolfe line search, the two hybrid methods have the descent property, and their global convergence can also be proved.
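
    A sketch of the DY/HS hybrid idea is given below. The equal-weight combination of the two betas and the bisection-based weak Wolfe search are simplifying assumptions for illustration, not the paper's exact generalized line search:

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def wolfe_step(f, g, x, d, c1=1e-4, c2=0.9, tmax=1e3):
    """Weak Wolfe line search by bisection (a simple stand-in for the
    paper's generalized Wolfe conditions)."""
    a, b, t = 0.0, tmax, 1.0
    fx, slope = f(x), dot(g(x), d)
    for _ in range(60):
        xt = [xi + t * di for xi, di in zip(x, d)]
        if f(xt) > fx + c1 * t * slope:        # sufficient decrease fails
            b = t
        elif dot(g(xt), d) < c2 * slope:       # curvature condition fails
            a = t
        else:
            return t
        t = 0.5 * (a + b) if b < tmax else 2.0 * a
    return t

def hybrid_cg(f, g, x, iters=200, tol=1e-6):
    """Conjugate gradient with beta taken as an equal-weight hybrid of
    the Dai-Yuan (DY) and Hestenes-Stiefel (HS) formulas."""
    d = [-gi for gi in g(x)]
    for _ in range(iters):
        gk = g(x)
        if math.sqrt(dot(gk, gk)) < tol:
            break
        t = wolfe_step(f, g, x, d)
        x_new = [xi + t * di for xi, di in zip(x, d)]
        g_new = g(x_new)
        y = [a - b for a, b in zip(g_new, gk)]
        denom = dot(d, y)
        beta = 0.5 * (dot(g_new, g_new) + dot(g_new, y)) / denom
        x = x_new
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
    return x

# Convex quadratic test problem with minimum at (1, -2).
f = lambda v: (v[0] - 1.0) ** 2 + 10.0 * (v[1] + 2.0) ** 2
g = lambda v: [2.0 * (v[0] - 1.0), 20.0 * (v[1] + 2.0)]
x_star = hybrid_cg(f, g, [0.0, 0.0])
print([round(xi, 4) for xi in x_star])  # near [1.0, -2.0]
```

    The DY and HS betas share the denominator d·y, so the hybrid is just a weighted average of the two numerators; the paper's βk weighting generalizes the fixed 0.5 used here.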

  15. Implementation Of Haversine Formula And Best First Search Method In Searching Of Tsunami Evacuation Route

    Science.gov (United States)

    Anisya; Yoga Swara, Ganda

    2017-12-01

    Padang is one of the cities prone to earthquake and tsunami disasters due to its position at the meeting of two active plates, a source of potentially powerful earthquakes and tsunamis. The central government and most offices are located in the red zone (vulnerable areas), which will also affect the evacuation of the population during an earthquake and tsunami disaster. In this study, the researchers produced a system for finding the nearest shelter using the best-first-search method. This method uses a heuristic function combining the cost already incurred and an estimated value based on travel time, path length, and population density. To calculate the length of the path, the researchers used the haversine formula. The values obtained from the calculation process are implemented in a web-based system. Some alternative paths and some of the closest shelters will be displayed in the system.
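
    The two ingredients, the haversine distance and a greedy best-first search using that distance as the heuristic, combine in a few lines. The node coordinates and street graph below are invented for illustration, not real Padang data:

```python
import heapq
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def best_first(start, goal, graph, coords):
    """Greedy best-first search: always expand the frontier node whose
    haversine distance to the goal (the heuristic) is smallest."""
    frontier = [(haversine_km(*coords[start], *coords[goal]), start, [start])]
    seen = set()
    while frontier:
        _, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt in graph[node]:
            if nxt not in seen:
                h = haversine_km(*coords[nxt], *coords[goal])
                heapq.heappush(frontier, (h, nxt, path + [nxt]))
    return None

# Hypothetical street nodes near Padang (coordinates are illustrative).
coords = {"A": (-0.95, 100.35), "B": (-0.94, 100.36),
          "C": (-0.93, 100.37), "S": (-0.92, 100.38)}  # S = shelter
graph = {"A": ["B"], "B": ["A", "C"], "C": ["B", "S"], "S": ["C"]}
print(best_first("A", "S", graph, coords))  # -> ['A', 'B', 'C', 'S']
```

    The paper's heuristic additionally folds travel time and population density into the priority; here only the haversine term is used to keep the sketch short.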

  16. Validation of a search strategy to identify nutrition trials in PubMed using the relative recall method.

    Science.gov (United States)

    Durão, Solange; Kredo, Tamara; Volmink, Jimmy

    2015-06-01

    To develop, assess, and maximize the sensitivity of a search strategy to identify diet and nutrition trials in PubMed using relative recall. We developed a search strategy to identify diet and nutrition trials in PubMed. We then constructed a gold standard reference set to validate the identified trials using the relative recall method. Relative recall was calculated by dividing the number of references from the gold standard our search strategy identified by the total number of references in the gold standard. Our gold standard comprised 298 trials, derived from 16 included systematic reviews. The initial search strategy identified 242 of 298 references, with a relative recall of 81.2% [95% confidence interval (CI): 76.3%, 85.5%]. We analyzed titles and abstracts of the 56 missed references for possible additional terms. We then modified the search strategy accordingly. The relative recall of the final search strategy was 88.6% (95% CI: 84.4%, 91.9%). We developed a search strategy to identify diet and nutrition trials in PubMed with a high relative recall (sensitivity). This could be useful for establishing a nutrition trials register to support the conduct of future research, including systematic reviews. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
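
    Relative recall itself is a one-line calculation. The confidence interval below uses a simple normal approximation, so it differs slightly from the interval reported in the abstract, which was presumably computed with a different formula:

```python
import math

def relative_recall(found, gold_total, z=1.96):
    """Relative recall = gold-standard references retrieved by the search
    strategy divided by all gold-standard references, with a
    normal-approximation confidence interval."""
    r = found / gold_total
    se = math.sqrt(r * (1.0 - r) / gold_total)
    return r, (r - z * se, r + z * se)

# Figures from the abstract: 242 of 298 gold-standard trials retrieved.
r, (lo, hi) = relative_recall(242, 298)
print(round(100 * r, 1), round(100 * lo, 1), round(100 * hi, 1))
```

    The same calculation applied to the final strategy's counts reproduces the reported 88.6% point estimate; the gold standard comes from the included references of existing systematic reviews, which is what makes the recall "relative".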

  17. A method of searching LDAP directories using XQuery

    International Nuclear Information System (INIS)

    Hesselroth, Ted

    2011-01-01

    A method by which an LDAP directory can be searched using XQuery is described. The strategy behind the tool consists of four steps. First the XQuery script is examined and relevant XPath expressions are extracted, determined to be sufficient to define all information needed to perform the query. Then the XPath expressions are converted into their equivalent LDAP search filters by use of the published LDAP schema of the service, and search requests are made to the LDAP host. The search results are then merged and converted to an XML document that conforms to the hierarchy of the LDAP schema. Finally, the XQuery script is executed on the working XML document by conventional means. Examples are given of application of the tool in the Open Science Grid, which for discovery purposes operates an LDAP server that contains Glue schema-based information on site configuration and authorization policies. The XQuery scripts compactly replace hundreds of lines of custom python code that relied on the unix ldapsearch utility. Installation of the tool is available through the Virtual Data Toolkit.
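
    The record gives no formal mapping rules, so the converter below is purely illustrative: it handles only a trivial XPath step with attribute predicates, and the Glue-style element and attribute names are hypothetical, inspired by the schema mentioned above.

```python
import re

# Illustrative only: convert a trivial XPath step with attribute predicates,
# e.g. //GlueSite[@GlueSiteName='MySite'], into an LDAP search filter such as
# (&(objectClass=GlueSite)(GlueSiteName=MySite)).
def xpath_to_ldap(xpath):
    m = re.fullmatch(r"//(\w+)((?:\[@\w+='[^']*'\])*)", xpath)
    if not m:
        raise ValueError("unsupported XPath expression: " + xpath)
    element, preds = m.groups()
    clauses = ["(objectClass=%s)" % element]
    clauses += ["(%s=%s)" % (a, v) for a, v in re.findall(r"\[@(\w+)='([^']*)'\]", preds)]
    return clauses[0] if len(clauses) == 1 else "(&" + "".join(clauses) + ")"

f = xpath_to_ldap("//GlueSite[@GlueSiteName='MySite']")
```

    A real implementation would, as the record describes, derive the attribute mapping from the service's published LDAP schema rather than a regular expression.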

  18. Study on boundary search method for DFM mesh generation

    Directory of Open Access Journals (Sweden)

    Li Ri

    2012-08-01

    Full Text Available The boundary mesh of the casting model was determined by direct calculation on the triangular facets extracted from the STL file of the 3D model. The inner and outer grids of the model were then identified by an algorithm we named the Inner Seed Grid Method. Finally, a program to automatically generate a 3D FDM mesh was compiled. In the paper, a method named the Triangle Contraction Search Method (TCSM) is put forward to ensure that no boundary grids are lost, and an algorithm that searches for inner seed grids to distinguish the inner and outer grids of the casting model is also presented. The algorithm is simple, clear, and easy to implement as a program. Three examples of casting mesh generation testified to the validity of the program.

  19. Search Method Based on Figurative Indexation of Folksonomic Features of Graphic Files

    Directory of Open Access Journals (Sweden)

    Oleg V. Bisikalo

    2013-11-01

    Full Text Available This paper describes a search method based on the figurative indexation of the folksonomic characteristics of graphic files. The method takes extralinguistic information into account and is based on a model of human figurative thinking. The paper presents the construction of a method for searching image files based on their formal, including folksonomic, clues.

  20. Development of health information search engine based on metadata and ontology.

    Science.gov (United States)

    Song, Tae-Min; Park, Hyeoun-Ae; Jin, Dal-Lae

    2014-04-01

    The aim of the study was to develop a metadata and ontology-based health information search engine ensuring semantic interoperability to collect and provide health information using different application programs. Health information metadata ontology was developed using a distributed semantic Web content publishing model based on vocabularies used to index the contents generated by the information producers as well as those used to search the contents by the users. Vocabulary for health information ontology was mapped to the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT), and a list of about 1,500 terms was proposed. The metadata schema used in this study was developed by adding an element describing the target audience to the Dublin Core Metadata Element Set. A metadata schema and an ontology ensuring interoperability of health information available on the internet were developed. The metadata and ontology-based health information search engine developed in this study produced a better search result compared to existing search engines. Health information search engine based on metadata and ontology will provide reliable health information to both information producer and information consumers.

  1. Ethiopian Journal of Development Research: Advanced Search

    African Journals Online (AJOL)

    PROMOTING ACCESS TO AFRICAN RESEARCH ... Ethiopian Journal of Development Research: Advanced Search ... containing either term; e.g., education OR research; Use parentheses to create more complex queries; ... Ethiopian Journal of Business and Economics (The), Ethiopian Journal of Development Research ...

  2. Recent developments in MrBUMP: better search-model preparation, graphical interaction with search models, and solution improvement and assessment.

    Science.gov (United States)

    Keegan, Ronan M; McNicholas, Stuart J; Thomas, Jens M H; Simpkin, Adam J; Simkovic, Felix; Uski, Ville; Ballard, Charles C; Winn, Martyn D; Wilson, Keith S; Rigden, Daniel J

    2018-03-01

    Increasing sophistication in molecular-replacement (MR) software and the rapid expansion of the PDB in recent years have allowed the technique to become the dominant method for determining the phases of a target structure in macromolecular X-ray crystallography. In addition, improvements in bioinformatic techniques for finding suitable homologous structures for use as MR search models, combined with developments in refinement and model-building techniques, have pushed the applicability of MR to lower sequence identities and made weak MR solutions more amenable to refinement and improvement. MrBUMP is a CCP4 pipeline which automates all stages of the MR procedure. Its scope covers everything from the sourcing and preparation of suitable search models right through to rebuilding of the positioned search model. Recent improvements to the pipeline include the adoption of more sensitive bioinformatic tools for sourcing search models, enhanced model-preparation techniques including better ensembling of homologues, and the use of phase improvement and model building on the resulting solution. The pipeline has also been deployed as an online service through CCP4 online, which allows its users to exploit large bioinformatic databases and coarse-grained parallelism to speed up the determination of a possible solution. Finally, the molecular-graphics application CCP4mg has been combined with MrBUMP to provide an interactive visual aid to the user during the process of selecting and manipulating search models for use in MR. Here, these developments in MrBUMP are described with a case study to explore how some of the enhancements to the pipeline and to CCP4mg can help to solve a difficult case.

  3. A review of the scientific rationale and methods used in the search for other planetary systems

    Science.gov (United States)

    Black, D. C.

    1985-01-01

    Planetary systems appear to be one of the crucial links in the chain leading from simple molecules to living systems, particularly complex (intelligent?) living systems. Although there is currently no observational proof of the existence of any planetary system other than our own, techniques are now being developed which will permit a comprehensive search for other planetary systems. The scientific rationale for and methods used in such a search effort are reviewed here.

  4. A Fast Radio Burst Search Method for VLBI Observation

    Science.gov (United States)

    Liu, Lei; Tong, Fengxian; Zheng, Weimin; Zhang, Juan; Tong, Li

    2018-02-01

    We introduce a cross-spectrum-based fast radio burst (FRB) search method for very long baseline interferometry (VLBI) observations. This method optimizes the fringe-fitting scheme in geodetic VLBI data post-processing: it fully utilizes the cross-spectrum fringe phase information and therefore maximizes the power of single-pulse signals. Working with the cross-spectrum greatly reduces the effect of radio-frequency interference compared with using the auto-power spectrum. Single-pulse detection confidence increases by cross-identifying detections from multiple baselines, and by combining the power of multiple baselines we may improve the detection sensitivity. Our method is similar to coherent beam forming, but without the computational expense of forming a great number of beams to cover the whole field of view of our telescopes. The data-processing pipeline designed for this method is easy to implement and parallelize, and can be deployed in various kinds of VLBI observations. In particular, we point out that VGOS observations are very suitable for FRB searches.

  5. Fuzzy Search Method for Hi Education Information Security

    Directory of Open Access Journals (Sweden)

    Grigory Grigorevich Novikov

    2016-03-01

    Full Text Available The main question of this research is how to use a fuzzy search method for the information security of higher education or similar purposes. Many sensitive information leaks occur through the legal publication of non-classified documents, which is why many intelligence services favour the «mosaic» method of information collection. This article is about how to prevent it.

  6. Performance comparison of a new hybrid conjugate gradient method under exact and inexact line searches

    Science.gov (United States)

    Ghani, N. H. A.; Mohamed, N. S.; Zull, N.; Shoid, S.; Rivaie, M.; Mamat, M.

    2017-09-01

    Conjugate gradient (CG) method is one of the iterative techniques prominently used in solving unconstrained optimization problems due to its simplicity, low memory storage, and good convergence analysis. This paper presents a new hybrid conjugate gradient method, named the NRM1 method. The method is analyzed under exact and inexact line searches in given conditions. Theoretically, proofs show that the NRM1 method satisfies the sufficient descent condition with both line searches. The computational results indicate that the NRM1 method is capable of solving the standard unconstrained optimization problems used. On the other hand, the NRM1 method performs better under the inexact line search than under the exact line search.
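
    As a generic illustration of a CG iteration under an inexact (Armijo backtracking) line search — using a standard Polak-Ribière-plus coefficient, not the paper's NRM1 hybrid, whose formula is not given in the abstract:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cg_minimize(f, grad, x0, iters=500, tol=1e-8):
    """Nonlinear CG with an Armijo (inexact) backtracking line search."""
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]
    for _ in range(iters):
        if dot(g, g) ** 0.5 < tol:
            break
        if dot(g, d) >= 0:                  # safeguard: restart along -g
            d = [-gi for gi in g]
        t, slope = 1.0, dot(g, d)           # Armijo backtracking line search
        while f([xi + t * di for xi, di in zip(x, d)]) > f(x) + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5
        x = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x)
        y = [gn - gi for gn, gi in zip(g_new, g)]
        beta = max(0.0, dot(g_new, y) / dot(g, g))   # Polak-Ribiere-plus
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

quad = lambda v: v[0] ** 2 + 10 * v[1] ** 2
quad_grad = lambda v: [2 * v[0], 20 * v[1]]
xmin = cg_minimize(quad, quad_grad, [3.0, -2.0])
```

    Comparing exact versus inexact line searches, as the paper does, amounts to swapping the backtracking loop for an exact minimizer of f along d.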

  7. A new greedy search method for the design of digital IIR filter

    Directory of Open Access Journals (Sweden)

    Ranjit Kaur

    2015-07-01

    Full Text Available A new greedy search method is applied in this paper to design the optimal digital infinite impulse response (IIR) filter. The greedy search method is based on binary successive approximation (BSA) and evolutionary search (ES). The suggested greedy search method optimizes the magnitude response and the phase response simultaneously and also finds the lowest order of the filter. The order of the filter is controlled by a control gene whose value is also optimized along with the filter coefficients to obtain the optimum order of the designed IIR filter. The stability constraints of the IIR filter are taken care of during the design procedure. To determine the trade-off relationship between conflicting objectives in the non-inferior domain, the weighting method is exploited. The proposed approach is effectively applied to solve the multiobjective optimization problems of designing digital low-pass (LP), high-pass (HP), bandpass (BP), and bandstop (BS) filters. It has been demonstrated that this technique not only fulfills all types of filter performance requirements, but also finds the lowest order of the filter. The computational experiments show that the proposed approach gives better digital IIR filters than the existing evolutionary algorithm (EA) based methods.

  8. Comparing the Precision of Information Retrieval of MeSH-Controlled Vocabulary Search Method and a Visual Method in the Medline Medical Database.

    Science.gov (United States)

    Hariri, Nadjla; Ravandi, Somayyeh Nadi

    2014-01-01

    Medline is one of the most important databases in the biomedical field. One of the most important hosts for Medline is Elton B. Stephens Co. (EBSCO), which offers different search methods that can be used based on the needs of the users. The visual search and MeSH-controlled search methods are among the most common. The goal of this research was to compare the precision of the sources retrieved from the EBSCO Medline host using the MeSH-controlled and visual search methods. This research was a semi-empirical study. By holding training workshops in 2012, 70 students in different educational departments of Kashan University of Medical Sciences were taught the MeSH-controlled and visual search methods. Then, the precision of 300 searches made by these students was calculated based on the Best Precision, Useful Precision, and Objective Precision formulas and analyzed in SPSS software using the independent-sample t test, and the three precisions obtained with the three formulas were compared for the two search methods. The mean precision of the visual method was greater than that of the MeSH-controlled search for all three types of precision, i.e. Best Precision, Useful Precision, and Objective Precision, and the mean precisions differed significantly across the searches. Fifty-three percent of the participants in the research also mentioned that using a combination of the two methods produced better results. For users, it is more appropriate to use a natural-language-based method, such as the visual method, in the EBSCO Medline host than the controlled method, which requires users to use special keywords. A likely reason for this preference is that the visual method allowed them more freedom of action.

  9. Multi-Agent Based Beam Search for Real-Time Production Scheduling and Control Method, Software and Industrial Application

    CERN Document Server

    Kang, Shu Gang

    2013-01-01

    The Multi-Agent Based Beam Search (MABBS) method systematically integrates four major requirements of manufacturing production - representation capability, solution quality, computation efficiency, and implementation difficulty - within a unified framework to deal with the many challenges of complex real-world production planning and scheduling problems. Multi-agent Based Beam Search for Real-time Production Scheduling and Control introduces this method, together with its software implementation and industrial applications.  This book connects academic research with industrial practice, and develops a practical solution to production planning and scheduling problems. To simplify implementation, a reusable software platform is developed to build the MABBS method into a generic computation engine.  This engine is integrated with a script language, called the Embedded Extensible Application Script Language (EXASL), to provide a flexible and straightforward approach to representing complex real-world problems. ...

  10. The development of search filters for adverse effects of surgical interventions in medline and Embase.

    Science.gov (United States)

    Golder, Su; Wright, Kath; Loke, Yoon Kong

    2018-03-31

    Search filter development for adverse effects has tended to focus on retrieving studies of drug interventions. However, a different approach is required for surgical interventions. To develop and validate search filters for medline and Embase for the adverse effects of surgical interventions. Systematic reviews of surgical interventions where the primary focus was to evaluate adverse effect(s) were sought. The included studies within these reviews were divided randomly into a development set, evaluation set and validation set. Using word frequency analysis we constructed a sensitivity maximising search strategy and this was tested in the evaluation and validation set. Three hundred and fifty eight papers were included from 19 surgical intervention reviews. Three hundred and fifty two papers were available on medline and 348 were available on Embase. Generic adverse effects search strategies in medline and Embase could achieve approximately 90% relative recall. Recall could be further improved with the addition of specific adverse effects terms to the search strategies. We have derived and validated a novel search filter that has reasonable performance for identifying adverse effects of surgical interventions in medline and Embase. However, we appreciate the limitations of our methods, and recommend further research on larger sample sizes and prospective systematic reviews. © 2018 The Authors Health Information and Libraries Journal published by John Wiley & Sons Ltd on behalf of Health Libraries Group.

  11. Optimal generation and reserve dispatch in a multi-area competitive market using a hybrid direct search method

    International Nuclear Information System (INIS)

    Chun Lung Chen

    2005-01-01

    With restructuring of the power industry, competitive bidding for energy and ancillary services are increasingly recognized as an important part of electricity markets. It is desirable to optimize not only the generator's bid prices for energy and for providing minimized ancillary services but also the transmission congestion costs. In this paper, a hybrid approach of combining sequential dispatch with a direct search method is developed to deal with the multi-product and multi-area electricity market dispatch problem. The hybrid direct search method (HDSM) incorporates sequential dispatch into the direct search method to facilitate economic sharing of generation and reserve across areas and to minimize the total market cost in a multi-area competitive electricity market. The effects of tie line congestion and area spinning reserve requirement are also consistently reflected in the marginal price in each area. Numerical experiments are included to understand the various constraints in the market cost analysis and to provide valuable information for market participants in a pool oriented electricity market. (author)

  13. A comparison of two search methods for determining the scope of systematic reviews and health technology assessments.

    Science.gov (United States)

    Forsetlund, Louise; Kirkehei, Ingvild; Harboe, Ingrid; Odgaard-Jensen, Jan

    2012-01-01

    This study aims to compare two different search methods for determining the scope of a requested systematic review or health technology assessment. The first method (called the Direct Search Method) included performing direct searches in the Cochrane Database of Systematic Reviews (CDSR), the Database of Abstracts of Reviews of Effects (DARE) and the Health Technology Assessments (HTA) database. Using the comparison method (called the NHS Search Engine) we performed searches by means of the search engine of the British National Health Service, NHS Evidence. We used an adapted cross-over design with a random allocation of fifty-five requests for systematic reviews. The main analyses were based on repeated measurements adjusted for the order in which the searches were conducted. The Direct Search Method generated on average fewer hits (48 percent [95 percent confidence interval (CI), 6 percent to 72 percent]), had a higher precision (0.22 [95 percent CI, 0.13 to 0.30]) and more unique hits than searching by means of the NHS Search Engine (50 percent [95 percent CI, 7 percent to 110 percent]). On the other hand, the Direct Search Method took longer (14.58 minutes [95 percent CI, 7.20 to 21.97]) and was perceived as somewhat less user-friendly than the NHS Search Engine (-0.60 [95 percent CI, -1.11 to -0.09]). Although the Direct Search Method had some drawbacks, such as being more time-consuming and less user-friendly, it generated more unique hits than the NHS Search Engine, retrieved on average fewer references, and returned fewer irrelevant results.

  14. Searching for Truth: Internet Search Patterns as a Method of Investigating Online Responses to a Russian Illicit Drug Policy Debate

    Science.gov (United States)

    Gillespie, James A; Quinn, Casey

    2012-01-01

    Background This is a methodological study investigating the online responses to a national debate over an important health and social problem in Russia. Russia is the largest Internet market in Europe, exceeding Germany in the absolute number of users. However, Russia is unusual in that the main search provider is not Google, but Yandex. Objective This study had two main objectives. First, to validate Yandex search patterns against those provided by Google, and second, to test this method's adequacy for investigating online interest in a 2010 national debate over Russian illicit drug policy. We hoped to learn what search patterns and specific search terms could reveal about the relative importance and geographic distribution of interest in this debate. Methods A national drug debate, centering on the anti-drug campaigner Egor Bychkov, was one of the main Russian domestic news events of 2010. Public interest in this episode was accompanied by increased Internet search. First, we measured the search patterns for 13 search terms related to the Bychkov episode and concurrent domestic events by extracting data from Google Insights for Search (GIFS) and Yandex WordStat (YaW). We conducted Spearman rank correlation of the GIFS and YaW search data series. Second, we coded all 420 primary posts from Bychkov's personal blog between March 2010 and March 2012 to identify the main themes. Third, we compared GIFS and Yandex policies concerning the public release of search volume data. Finally, we established the relationship between salient drug issues and the Bychkov episode. Results We found a consistent pattern of strong to moderate positive correlations between Google and Yandex for the terms “Egor Bychkov” (rs = 0.88, P < .001), “Bychkov” (rs = 0.78, P < .001) and “Khimki” (rs = 0.92, P < .001). Peak search volumes for the Bychkov episode were comparable to other prominent domestic political events during 2010. Monthly search counts were 146,689 for “Bychkov” and

  15. Research on Large-Scale Road Network Partition and Route Search Method Combined with Traveler Preferences

    Directory of Open Access Journals (Sweden)

    De-Xin Yu

    2013-01-01

    Full Text Available Combined with an improved Pallottino parallel algorithm, this paper proposes a large-scale route search method that considers travelers' route choice preferences, and the urban road network is decomposed effectively into multiple layers. Utilizing generalized travel time as the road impedance function, the method builds a new multilayer, multitasking road network data storage structure with object-oriented class definitions. The proposed path search algorithm is then verified using the real road network of Guangzhou city as an example. Through sensitivity experiments, we make a comparative analysis of the proposed path search method against current advanced optimal path algorithms. The results demonstrate that the proposed method can increase road network search efficiency by more than 16% under different search proportion requests, node numbers, and computing process numbers, respectively. Therefore, this method is a great breakthrough in the field of urban road network guidance.
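
    The generalized travel time mentioned above can serve directly as the edge impedance in a standard shortest-path search. A minimal Dijkstra sketch follows; the tiny network and the alpha/beta weights are illustrative, not the paper's calibrated values:

```python
import heapq

# Dijkstra over a road graph whose impedance is a "generalized travel time":
# a weighted sum of travel time and path length (weights are illustrative).
def shortest_path(graph, src, dst, alpha=1.0, beta=0.5):
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, time, length in graph.get(u, []):
            nd = d + alpha * time + beta * length
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    return [src] + path[::-1], dist[dst]

# Edges are (neighbour, travel_time, length).
g = {"A": [("B", 2.0, 1.0), ("C", 1.0, 1.0)],
     "B": [("D", 1.0, 1.0)],
     "C": [("D", 3.0, 1.0)],
     "D": []}
route, cost = shortest_path(g, "A", "D")
```

    The paper's contribution layers network partitioning and parallelism on top of this kind of search; the impedance function is what encodes traveler preferences.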

  16. Search Results | Page 797 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Search Results. Showing 7961 - 7970 of 8491 results. Story. Health Natural Resources Mining ECOHEALTH ... CASE STUDY: Latin America — An engine for rural development in Latin America. Research in Action. Biodiversity Gender Crops ...

  17. (Re)interpreting LHC New Physics Search Results : Tools and Methods, 3rd Workshop

    CERN Document Server

    The quest for new physics beyond the SM is arguably the driving topic for LHC Run2. LHC collaborations are pursuing searches for new physics in a vast variety of channels. Although collaborations provide various interpretations for their search results, the full understanding of these results requires a much wider interpretation scope involving all kinds of theoretical models. This is a very active field, with close theory-experiment interaction. In particular, development of dedicated methodologies and tools is crucial for such scale of interpretation. Recently, a Forum was initiated to host discussions among LHC experimentalists and theorists on topics related to the BSM (re)interpretation of LHC data, and especially on the development of relevant interpretation tools and infrastructure: https://twiki.cern.ch/twiki/bin/view/LHCPhysics/InterpretingLHCresults Two meetings were held at CERN, where active discussions and concrete work on (re)interpretation methods and tools took place, with valuable cont...

  18. Improving e-book access via a library-developed full-text search tool.

    Science.gov (United States)

    Foust, Jill E; Bergen, Phillip; Maxeiner, Gretchen L; Pawlowski, Peter N

    2007-01-01

    This paper reports on the development of a tool for searching the contents of licensed full-text electronic book (e-book) collections. The Health Sciences Library System (HSLS) provides services to the University of Pittsburgh's medical programs and large academic health system. The HSLS has developed an innovative tool for federated searching of its e-book collections. Built using the XML-based Vivísimo development environment, the tool enables a user to perform a full-text search of over 2,500 titles from the library's seven most highly used e-book collections. From a single "Google-style" query, results are returned as an integrated set of links pointing directly to relevant sections of the full text. Results are also grouped into categories that enable more precise retrieval without reformulation of the search. A heuristic evaluation demonstrated the usability of the tool and a web server log analysis indicated an acceptable level of usage. Based on its success, there are plans to increase the number of online book collections searched. This library's first foray into federated searching has produced an effective tool for searching across large collections of full-text e-books and has provided a good foundation for the development of other library-based federated searching products.

  19. A human-machine interface evaluation method: A difficulty evaluation method in information searching (DEMIS)

    International Nuclear Information System (INIS)

    Ha, Jun Su; Seong, Poong Hyun

    2009-01-01

    A human-machine interface (HMI) evaluation method, named the 'difficulty evaluation method in information searching' (DEMIS), is proposed and demonstrated with an experimental study. The DEMIS is based on a human performance model and two measures of attentional-resource effectiveness in monitoring and detection tasks in nuclear power plants (NPPs). Operator competence and HMI design are modeled as the most significant factors in human performance. One of the two effectiveness measures is the fixation-to-importance ratio (FIR), which represents the attentional resource (eye fixations) spent on an information source relative to the importance of that source. The other measure is selective attention effectiveness (SAE), which incorporates the FIRs of all information sources. The underlying principle of the measures is that an information source should be selectively attended to according to its informational importance. In this study, poor performance in information searching tasks is modeled as being coupled with difficulties caused by poor operator mental models and/or poor HMI design. Human performance in information searching tasks is evaluated by analyzing the FIR and the SAE. Operator mental models are evaluated by a questionnaire-based method. Difficulties caused by a poor HMI design are then evaluated by a focused interview based on the FIR evaluation, and root causes leading to poor performance are identified in a systematic way.
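
    A sketch of the FIR computation follows. The record defines FIR as fixations on a source relative to the source's importance; it does not give the SAE formula, so the importance-weighted aggregation below is only one plausible reading, and the monitoring data are hypothetical:

```python
def fir(fixation_share, importance_share):
    """Fixation-to-importance ratio: 1.0 means attention matches importance."""
    return fixation_share / importance_share

sources = {  # hypothetical monitoring-task data (each column sums to 1.0)
    "reactor_power":  {"fixations": 0.50, "importance": 0.40},
    "coolant_flow":   {"fixations": 0.30, "importance": 0.40},
    "aux_indicators": {"fixations": 0.20, "importance": 0.20},
}
firs = {name: fir(s["fixations"], s["importance"]) for name, s in sources.items()}

# One plausible aggregate: penalize importance-weighted deviation of FIR from 1.
sae = 1.0 - sum(s["importance"] * abs(firs[n] - 1.0) for n, s in sources.items())
```

    Here reactor power is over-attended (FIR > 1) and coolant flow under-attended (FIR < 1), which is exactly the kind of mismatch the focused interview would probe.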

  20. Development of a search system of NRDF on WWW

    International Nuclear Information System (INIS)

    Masui, Hiroshi; Ohbayashi, Yoshihide; Aoyama, Shigeyoshi; Ohnishi, Akira; Kato, Kiyoshi; Chiba, Masaki

    2000-01-01

    We develop a data search system and a data entry system for the Nuclear Reaction Data File (NRDF), one of the charged-particle reaction databases, compiled by the Japan Charged Particle Reaction Group (JCPRG). Using a WWW browser, we can easily search, retrieve and utilize the data of NRDF. (author)

  1. Beam angle optimization for intensity-modulated radiation therapy using a guided pattern search method

    International Nuclear Information System (INIS)

    Rocha, Humberto; Dias, Joana M; Ferreira, Brígida C; Lopes, Maria C

    2013-01-01

    Generally, the inverse planning of radiation therapy consists mainly of the fluence optimization. The beam angle optimization (BAO) in intensity-modulated radiation therapy (IMRT) consists of selecting appropriate radiation incidence directions and may influence the quality of the IMRT plans, both to enhance better organ sparing and to improve tumor coverage. However, in clinical practice, most of the time, beam directions continue to be manually selected by the treatment planner without objective and rigorous criteria. The goal of this paper is to introduce a novel approach that uses beam’s-eye-view dose ray tracing metrics within a pattern search method framework in the optimization of the highly non-convex BAO problem. Pattern search methods are derivative-free optimization methods that require a few function evaluations to progress and converge and have the ability to better avoid local entrapment. The pattern search method framework is composed of a search step and a poll step at each iteration. The poll step performs a local search in a mesh neighborhood and ensures the convergence to a local minimizer or stationary point. The search step provides the flexibility for a global search since it allows searches away from the neighborhood of the current iterate. Beam’s-eye-view dose metrics assign a score to each radiation beam direction and can be used within the pattern search framework furnishing a priori knowledge of the problem so that directions with larger dosimetric scores are tested first. A set of clinical cases of head-and-neck tumors treated at the Portuguese Institute of Oncology of Coimbra is used to discuss the potential of this approach in the optimization of the BAO problem. (paper)
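
    The poll step described above can be sketched as a minimal coordinate pattern search: poll the mesh neighbours along each coordinate direction, and halve the mesh size when no neighbour improves. This is a generic illustration of the framework, not the BAO-specific implementation with beam's-eye-view metrics:

```python
# Minimal derivative-free pattern search (poll step only; a search step
# could try arbitrary points before polling).
def pattern_search(f, x0, step=1.0, tol=1e-6, max_iter=1000):
    x = list(x0)
    fx = f(x)
    n = len(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for i in range(n):                    # poll the mesh neighbours
            for s in (step, -step):
                y = list(x)
                y[i] += s
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5                       # refine the mesh and re-poll
    return x, fx

xbest, fbest = pattern_search(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2, [0.0, 0.0])
```

    In the paper's setting, the dosimetric scores order which beam directions are polled first, so promising directions are evaluated before the rest of the mesh neighbourhood.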

  2. The development of organized visual search

    Science.gov (United States)

    Woods, Adam J.; Goksun, Tilbe; Chatterjee, Anjan; Zelonis, Sarah; Mehta, Anika; Smith, Sabrina E.

    2013-01-01

    Visual search plays an important role in guiding behavior. Children have more difficulty performing conjunction search tasks than adults. The present research evaluates whether developmental differences in children's ability to organize serial visual search (i.e., search organization skills) contribute to performance limitations in a typical conjunction search task. We evaluated 134 children between the ages of 2 and 17 on separate tasks measuring search for targets defined by a conjunction of features or by distinct features. Our results demonstrated that children organize their visual search better as they get older. As children's skills at organizing visual search improve they become more accurate at locating targets with conjunction of features amongst distractors, but not for targets with distinct features. Developmental limitations in children's abilities to organize their visual search of the environment are an important component of poor conjunction search in young children. In addition, our findings provide preliminary evidence that, like other visuospatial tasks, exposure to reading may influence children's spatial orientation to the visual environment when performing a visual search. PMID:23584560

  3. Search Results | Page 642 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Search Results. Showing 6411 - 6420 of 9601 results. Project ... STI for Development in Asia : a Platform for Information Sharing and Learning. This project is based on the premise that science, technology and innovation (STI) are critical ...

  4. The Development of Information Search Expertise of Research Students

    Science.gov (United States)

    Kai-Wah Chu, Samuel; Law, Nancy

    2008-01-01

    This study identifies the development of information search expertise of 12 beginning research students (six in education and six in engineering) who were provided with a set of systematic search training sessions over a period of one year. The study adopts a longitudinal approach in investigating whether there were different stages in the…

  5. Developing optimal search strategies for detecting clinically sound prognostic studies in MEDLINE: an analytic survey

    Directory of Open Access Journals (Sweden)

    Haynes R Brian

    2004-06-01

    Full Text Available Abstract Background Clinical end users of MEDLINE have a difficult time retrieving articles that are both scientifically sound and directly relevant to clinical practice. Search filters have been developed to assist end users in increasing the success of their searches. Many filters have been developed for the literature on therapy and reviews but little has been done in the area of prognosis. The objective of this study is to determine how well various methodologic textwords, Medical Subject Headings, and their Boolean combinations retrieve methodologically sound literature on the prognosis of health disorders in MEDLINE. Methods An analytic survey was conducted, comparing hand searches of journals with retrievals from MEDLINE for candidate search terms and combinations. Six research assistants read all issues of 161 journals for the publishing year 2000. All articles were rated using purpose and quality indicators and categorized into clinically relevant original studies, review articles, general papers, or case reports. The original and review articles were then categorized as 'pass' or 'fail' for methodologic rigor in the areas of prognosis and other clinical topics. Candidate search strategies were developed for prognosis and run in MEDLINE – the retrievals being compared with the hand search data. The sensitivity, specificity, precision, and accuracy of the search strategies were calculated. Results 12% of studies classified as prognosis met basic criteria for scientific merit for testing clinical applications. Combinations of terms reached peak sensitivities of 90%. Compared with the best single term, multiple terms increased sensitivity for sound studies by 25.2% (absolute increase, and increased specificity, but by a much smaller amount (1.1% when sensitivity was maximized. Combining terms to optimize both sensitivity and specificity achieved sensitivities and specificities of approximately 83% for each. 
Conclusion Empirically derived
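    The sensitivity, specificity, precision, and accuracy reported above follow from the standard 2×2 comparison of a search strategy's retrievals against the hand-search gold standard; a minimal sketch with hypothetical counts:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard retrieval metrics for a search filter evaluated against
    a hand-search gold standard (2x2 table: tp = sound studies retrieved,
    fp = unsound retrieved, fn = sound missed, tn = unsound excluded)."""
    sensitivity = tp / (tp + fn)            # fraction of sound studies retrieved
    specificity = tn / (tn + fp)            # fraction of unsound studies excluded
    precision   = tp / (tp + fp)            # fraction of retrievals that are sound
    accuracy    = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, precision, accuracy

# Hypothetical counts for a candidate prognosis filter (not the study's data)
se, sp, prec, acc = diagnostic_metrics(tp=90, fp=400, fn=10, tn=4500)
```

    Maximizing sensitivity and maximizing specificity pull the term combination in opposite directions, which is why the abstract reports separate sensitivity-maximizing and balanced (≈83%/83%) strategies.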

  6. Best, Useful and Objective Precisions for Information Retrieval of Three Search Methods in PubMed and iPubMed

    Directory of Open Access Journals (Sweden)

    Somayyeh Nadi Ravandi

    2016-10-01

    Full Text Available MEDLINE is one of the most valuable sources of medical information on the Internet. Among the different open-access sites of MEDLINE, PubMed is the best-known site. In 2010, iPubMed was established with an interaction-fuzzy search method for MEDLINE access. In the present work, we aimed to compare the precision of the retrieved sources (Best, Useful, and Objective precision) in PubMed and iPubMed using three search methods (simple and MeSH search in PubMed, and the interaction-fuzzy method in iPubMed). During our semi-empirical study period, we held training workshops for 61 higher-education students to teach them the simple, MeSH, and fuzzy-interaction search methods. Then, the precision of 305 searches for each method prepared by the students was calculated on the basis of the Best, Useful, and Objective precision formulas. Analyses were done in SPSS version 11.5 using the Friedman and Wilcoxon tests, and the three precisions obtained with the three formulas were studied for the three search methods. The mean precision of the interaction-fuzzy search method was higher than that of the simple and MeSH searches for all three types of precision (Best, Useful, and Objective), with the simple search method ranking next; their mean precisions were significantly different (P < 0.001). The precision of the interaction-fuzzy search method in iPubMed was investigated for the first time. Also for the first time, three types of precision were evaluated in PubMed and iPubMed. The results showed that the interaction-fuzzy search method is more precise than natural-language searching (simple and MeSH search), and users of this method found papers more closely related to their queries; even though searching in PubMed is useful, it is important that users apply new search methods to obtain the best results.
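    The abstract does not give the Best, Useful, and Objective precision formulas, but each is a variant of basic retrieval precision (relevant retrieved / retrieved) under a different relevance judgment; a generic sketch with hypothetical retrieval sets and judgments:

```python
def precision(retrieved, relevant):
    """Generic retrieval precision: fraction of retrieved records judged
    relevant. The paper's Best/Useful/Objective variants refine how
    relevance is judged, which the abstract does not define."""
    if not retrieved:
        return 0.0
    relevant = set(relevant)
    return sum(1 for r in retrieved if r in relevant) / len(retrieved)

# Hypothetical retrieval sets for the three methods on one query
judged_relevant = {"a", "b", "c", "e"}
simple = ["a", "x", "b", "y", "z"]      # simple (natural-language) search
mesh   = ["a", "b", "x", "c"]           # MeSH search
fuzzy  = ["a", "b", "c", "e", "x"]      # interaction-fuzzy search
scores = {m: precision(r, judged_relevant)
          for m, r in [("simple", simple), ("mesh", mesh), ("fuzzy", fuzzy)]}
```

    Per-query scores like these would then be compared across methods with the Friedman and Wilcoxon tests, as in the study.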

  7. Improving e-book access via a library-developed full-text search tool*

    Science.gov (United States)

    Foust, Jill E.; Bergen, Phillip; Maxeiner, Gretchen L.; Pawlowski, Peter N.

    2007-01-01

    Purpose: This paper reports on the development of a tool for searching the contents of licensed full-text electronic book (e-book) collections. Setting: The Health Sciences Library System (HSLS) provides services to the University of Pittsburgh's medical programs and large academic health system. Brief Description: The HSLS has developed an innovative tool for federated searching of its e-book collections. Built using the XML-based Vivísimo development environment, the tool enables a user to perform a full-text search of over 2,500 titles from the library's seven most highly used e-book collections. From a single “Google-style” query, results are returned as an integrated set of links pointing directly to relevant sections of the full text. Results are also grouped into categories that enable more precise retrieval without reformulation of the search. Results/Evaluation: A heuristic evaluation demonstrated the usability of the tool and a web server log analysis indicated an acceptable level of usage. Based on its success, there are plans to increase the number of online book collections searched. Conclusion: This library's first foray into federated searching has produced an effective tool for searching across large collections of full-text e-books and has provided a good foundation for the development of other library-based federated searching products. PMID:17252065

  8. Searching for truth: internet search patterns as a method of investigating online responses to a Russian illicit drug policy debate.

    Science.gov (United States)

    Zheluk, Andrey; Gillespie, James A; Quinn, Casey

    2012-12-13

    This is a methodological study investigating the online responses to a national debate over an important health and social problem in Russia. Russia is the largest Internet market in Europe, exceeding Germany in the absolute number of users. However, Russia is unusual in that the main search provider is not Google, but Yandex. This study had two main objectives: first, to validate Yandex search patterns against those provided by Google, and second, to test this method's adequacy for investigating online interest in a 2010 national debate over Russian illicit drug policy. We hoped to learn what search patterns and specific search terms could reveal about the relative importance and geographic distribution of interest in this debate. A national drug debate, centering on the anti-drug campaigner Egor Bychkov, was one of the main Russian domestic news events of 2010. Public interest in this episode was accompanied by increased Internet search. First, we measured the search patterns for 13 search terms related to the Bychkov episode and concurrent domestic events by extracting data from Google Insights for Search (GIFS) and Yandex WordStat (YaW). We conducted Spearman rank correlation of the GIFS and YaW search data series. Second, we coded all 420 primary posts from Bychkov's personal blog between March 2010 and March 2012 to identify the main themes. Third, we compared Google and Yandex policies concerning the public release of search volume data. Finally, we established the relationship between salient drug issues and the Bychkov episode. We found a consistent pattern of strong to moderate positive correlations between Google and Yandex for the terms "Egor Bychkov" (r(s) = 0.88, P < .001), "Bychkov" (r(s) = 0.78, P < .001) and "Khimki" (r(s) = 0.92, P < .001). Peak search volumes for the Bychkov episode were comparable to other prominent domestic political events during 2010. Monthly search counts were 146,689 for "Bychkov" and 48,084 for "Egor Bychkov", compared to 53
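    The Spearman rank correlation used to validate Yandex against Google search patterns can be computed directly on the two monthly series; a self-contained sketch with hypothetical volumes (ties receive average ranks):

```python
def spearman(x, y):
    """Spearman rank correlation between two search-volume series
    (e.g. monthly Google vs Yandex counts for the same query),
    computed as the Pearson correlation of the rank vectors."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1                  # extend over a group of tied values
            avg = (i + j) / 2 + 1       # average rank for the tie group
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical monthly volumes for one query on the two engines
google = [10, 40, 35, 90, 60, 15]
yandex = [12, 38, 30, 95, 70, 10]
rs = spearman(google, yandex)
```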

  9. A comparison of methods for gravitational wave burst searches from LIGO and Virgo

    International Nuclear Information System (INIS)

    Beauville, F; Buskulic, D; Grosjean, D; Bizouard, M-A; Cavalier, F; Clapson, A-C; Hello, P; Blackburn, L; Katsavounidis, E; Bosi, L; Brocco, L; Brown, D A; Chatterji, S; Christensen, N; Knight, M; Fairhurst, S; Guidi, G; Heng, S; Hewitson, M; Klimenko, S

    2008-01-01

    The search procedure for burst gravitational waves has been studied using 24 h of simulated data in a network of three interferometers (Hanford 4 km, Livingston 4 km and Virgo 3 km are the example interferometers). Several methods to detect burst events developed in the LIGO Scientific Collaboration (LSC) and Virgo Collaboration have been studied and compared. We have performed coincidence analysis of the triggers obtained in the different interferometers with and without simulated signals added to the data. The benefits of having multiple interferometers of similar sensitivity are demonstrated by comparing the detection performance of the joint coincidence analysis with LSC and Virgo only burst searches. Adding Virgo to the LIGO detector network can increase by 50% the detection efficiency for this search. Another advantage of a joint LIGO-Virgo network is the ability to reconstruct the source sky position. The reconstruction accuracy depends on the timing measurement accuracy of the events in each interferometer, and is displayed in this paper with a fixed source position example

  10. A comparison of methods for gravitational wave burst searches from LIGO and Virgo

    Energy Technology Data Exchange (ETDEWEB)

    Beauville, F; Buskulic, D; Grosjean, D [Laboratoire d' Annecy-le-Vieux de Physique des Particules, Chemin de Bellevue, BP 110, 74941 Annecy-le-Vieux Cedex (France); Bizouard, M-A; Cavalier, F; Clapson, A-C; Hello, P [Laboratoire de l' Accelerateur Lineaire, IN2P3/CNRS-Universite de Paris XI, BP 34, 91898 Orsay Cedex (France); Blackburn, L; Katsavounidis, E [LIGO-Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Bosi, L [INFN Sezione di Perugia and/or Universita di Perugia, Via A Pascoli, I-06123 Perugia (Italy); Brocco, L [INFN Sezione di Roma and/or Universita ' La Sapienza' , P le A Moro 2, I-00185 Roma (Italy); Brown, D A; Chatterji, S [LIGO-California Institute of Technology, Pasadena, CA 91125 (United States); Christensen, N; Knight, M [Carleton College, Northfield, MN 55057 (United States); Fairhurst, S [University of Wisconsin-Milwaukee, Milwaukee, WI 53201 (United States); Guidi, G [INFN Sezione Firenze/Urbino Via G Sansone 1, I-50019 Sesto Fiorentino (Italy); and/or Universita di Firenze, Largo E Fermi 2, I-50125 Firenze and/or Universita di Urbino, Via S Chiara 27, I-61029 Urbino (Italy); Heng, S; Hewitson, M [University of Glasgow, Glasgow, G12 8QQ (United Kingdom); Klimenko, S [University of Florida-Gainesville, FL 32611 (United States)] (and others)

    2008-02-21

    The search procedure for burst gravitational waves has been studied using 24 h of simulated data in a network of three interferometers (Hanford 4 km, Livingston 4 km and Virgo 3 km are the example interferometers). Several methods to detect burst events developed in the LIGO Scientific Collaboration (LSC) and Virgo Collaboration have been studied and compared. We have performed coincidence analysis of the triggers obtained in the different interferometers with and without simulated signals added to the data. The benefits of having multiple interferometers of similar sensitivity are demonstrated by comparing the detection performance of the joint coincidence analysis with LSC and Virgo only burst searches. Adding Virgo to the LIGO detector network can increase by 50% the detection efficiency for this search. Another advantage of a joint LIGO-Virgo network is the ability to reconstruct the source sky position. The reconstruction accuracy depends on the timing measurement accuracy of the events in each interferometer, and is displayed in this paper with a fixed source position example.

  11. In Search of a Sustainable Economic Development Agenda in ...

    African Journals Online (AJOL)

    In Search of a Sustainable Economic Development Agenda in Ghana since ... for a sustainable economic development agenda to better the lives of her citizens. ... that could surpass all interests to guide the country's development course.

  12. The development of PubMed search strategies for patient preferences for treatment outcomes

    Directory of Open Access Journals (Sweden)

    Ralph van Hoorn

    2016-07-01

    Full Text Available Abstract Background The importance of respecting patients’ preferences when making treatment decisions is increasingly recognized. Efficiently retrieving papers from the scientific literature reporting on the presence and nature of such preferences can help to achieve this goal. The objective of this study was to create a search filter for PubMed to help retrieve evidence on patient preferences for treatment outcomes. Methods A total of 27 journals were hand-searched for articles on patient preferences for treatment outcomes published in 2011. Selected articles served as a reference set. To develop optimal search strategies to retrieve this set, all articles in the reference set were randomly split into a development and a validation set. MeSH-terms and keywords retrieved using PubReMiner were tested individually and as combinations in PubMed and evaluated for retrieval performance (e.g. sensitivity (Se) and specificity (Sp)). Results Of 8238 articles, 22 were considered to report empirical evidence on patient preferences for specific treatment outcomes. The best search filters reached Se of 100 % [95 % CI 100-100] with Sp of 95 % [94–95 %] and Sp of 97 % [97–98 %] with 75 % Se [74–76 %]. In the validation set these queries reached values of Se of 90 % [89–91 %] with Sp 94 % [93–95 %] and Se of 80 % [79–81 %] with Sp of 97 % [96–96 %], respectively. Conclusions Narrow and broad search queries were developed which can help in retrieving literature on patient preferences for treatment outcomes. Identifying such evidence may in turn enhance the incorporation of patient preferences in clinical decision making and health technology assessment.

  13. New Internet search volume-based weighting method for integrating various environmental impacts

    Energy Technology Data Exchange (ETDEWEB)

    Ji, Changyoon, E-mail: changyoon@yonsei.ac.kr; Hong, Taehoon, E-mail: hong7@yonsei.ac.kr

    2016-01-15

    Weighting is one of the steps in life cycle impact assessment that integrates various characterized environmental impacts as a single index. Weighting factors should be based on the society's preferences. However, most previous studies consider only the opinion of some people. Thus, this research proposes a new weighting method that determines the weighting factors of environmental impact categories by considering public opinion on environmental impacts using the Internet search volumes for relevant terms. To validate the new weighting method, the weighting factors for six environmental impacts calculated by the new weighting method were compared with the existing weighting factors. The resulting Pearson's correlation coefficient between the new and existing weighting factors was from 0.8743 to 0.9889. It turned out that the new weighting method presents reasonable weighting factors. It also requires less time and lower cost compared to existing methods and likewise meets the main requirements of weighting methods such as simplicity, transparency, and reproducibility. The new weighting method is expected to be a good alternative for determining the weighting factor. - Highlight: • A new weighting method using Internet search volume is proposed in this research. • The new weighting method reflects the public opinion using Internet search volume. • The correlation coefficient between new and existing weighting factors is over 0.87. • The new weighting method can present the reasonable weighting factors. • The proposed method can be a good alternative for determining the weighting factors.
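    A minimal sketch of the idea, assuming the weighting factors are obtained by normalizing per-category Internet search volumes (the paper's exact procedure is not given in the abstract); the category volumes and comparison weights below are hypothetical:

```python
def volume_weights(search_volumes):
    """Hypothetical sketch: derive weighting factors for environmental
    impact categories by normalizing Internet search volumes so the
    factors sum to one."""
    total = sum(search_volumes.values())
    return {category: v / total for category, v in search_volumes.items()}

def pearson(x, y):
    """Pearson correlation, used here to compare new vs existing factors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical monthly search volumes for six impact categories
volumes = {"global warming": 60000, "ozone depletion": 9000,
           "acidification": 4000, "eutrophication": 3000,
           "photochemical smog": 2000, "abiotic depletion": 1000}
weights = volume_weights(volumes)
existing = [0.70, 0.12, 0.06, 0.05, 0.04, 0.03]  # hypothetical panel-based weights
r = pearson(list(weights.values()), existing)
```

    A high Pearson correlation between the two weight vectors is the kind of agreement (0.87–0.99 in the paper) used to argue that search-volume weighting reflects societal preferences.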

  14. New Internet search volume-based weighting method for integrating various environmental impacts

    International Nuclear Information System (INIS)

    Ji, Changyoon; Hong, Taehoon

    2016-01-01

    Weighting is one of the steps in life cycle impact assessment that integrates various characterized environmental impacts as a single index. Weighting factors should be based on the society's preferences. However, most previous studies consider only the opinion of some people. Thus, this research proposes a new weighting method that determines the weighting factors of environmental impact categories by considering public opinion on environmental impacts using the Internet search volumes for relevant terms. To validate the new weighting method, the weighting factors for six environmental impacts calculated by the new weighting method were compared with the existing weighting factors. The resulting Pearson's correlation coefficient between the new and existing weighting factors was from 0.8743 to 0.9889. It turned out that the new weighting method presents reasonable weighting factors. It also requires less time and lower cost compared to existing methods and likewise meets the main requirements of weighting methods such as simplicity, transparency, and reproducibility. The new weighting method is expected to be a good alternative for determining the weighting factor. - Highlight: • A new weighting method using Internet search volume is proposed in this research. • The new weighting method reflects the public opinion using Internet search volume. • The correlation coefficient between new and existing weighting factors is over 0.87. • The new weighting method can present the reasonable weighting factors. • The proposed method can be a good alternative for determining the weighting factors.

  15. Far-infrared contraband-detection-system development for personnel-search applications

    International Nuclear Information System (INIS)

    Schellenbaum, R.L.

    1982-09-01

    Experiments have been conducted toward the development of an active near-millimeter-wave, far-infrared personnel search system for the detection of contraband. These experiments employed a microwave hybrid tee interferometer/radiometer scanning system and quasi-optical techniques at 3.3-mm wavelength to illuminate and detect the reflection from target objects against a human body background. Clothing and other common concealing materials are transparent at this wavelength. Retroreflector arrays, in conjunction with a Gunn diode radiation source, were investigated to provide all-angle illumination and detection of specular reflections from unaligned and irregular-shaped objects. Results indicate that, under highly controlled search conditions, metal objects ≥25 cm² can be detected in an enclosure lined with retroreflectors. Further development is required to produce a practical personnel search system. The investigation and feasibility of alternate far-infrared search techniques are presented. 23 figures, 2 tables

  16. Development and Validation of a Self-reported Questionnaire for Measuring Internet Search Dependence

    OpenAIRE

    Wang, Yifan; Wu, Lingdan; Zhou, Hongli; Xu, Jiaojing; Dong, Guangheng

    2016-01-01

    Internet search has become the most common way that people deal with issues and problems in everyday life. The wide use of Internet search has largely changed the way people search for and store information. There is a growing interest in the impact of Internet search on users’ affect, cognition, and behavior. Thus, it is essential to develop a tool to measure the changes in psychological characteristics as a result of long-term use of Internet search. The aim of this study is to develop a Qu...

  17. Search Results | Page 818 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Results 8171 - 8180 of 9601 ... Does immigration promote innovation in developing countries? ... Corporate political activities and firm growth in emerging economies (part of annex 18 of ... Managerial interpretation of environment dynamism, nonlocal search ... Business model of a public intermediary : a case study of China ...

  18. Search method for long-duration gravitational-wave transients from neutron stars

    International Nuclear Information System (INIS)

    Prix, R.; Giampanis, S.; Messenger, C.

    2011-01-01

    We introduce a search method for a new class of gravitational-wave signals, namely, long-duration O(hours-weeks) transients from spinning neutron stars. We discuss the astrophysical motivation from glitch relaxation models and we derive a rough estimate for the maximal expected signal strength based on the superfluid excess rotational energy. The transient signal model considered here extends the traditional class of infinite-duration continuous-wave signals by a finite start-time and duration. We derive a multidetector Bayes factor for these signals in Gaussian noise using F-statistic amplitude priors, which simplifies the detection statistic and allows for an efficient implementation. We consider both a fully coherent statistic, which is computationally limited to directed searches for known pulsars, and a cheaper semicoherent variant, suitable for wide parameter-space searches for transients from unknown neutron stars. We have tested our method by Monte-Carlo simulation, and we find that it outperforms orthodox maximum-likelihood approaches both in sensitivity and in parameter-estimation quality.

  19. Finding Your Voice: Talent Development Centers and the Academic Talent Search

    Science.gov (United States)

    Rushneck, Amy S.

    2012-01-01

    Talent Development Centers are just one of many tools every family, teacher, and gifted advocate should have in their tool box. To understand the importance of Talent Development Centers, it is essential to also understand the Academic Talent Search Program. Talent Search participants who obtain scores comparable to college-bound high school…

  20. Search | Page 4 | IDRC - International Development Research Centre

    International Development Research Centre (IDRC) Digital Library (Canada)

    Hammou Lammrani has been working for IDRC in the Middle East and North Africa since 2007. Specialising in agriculture, water, and knowledge management, ...

  1. Hooke–Jeeves Method-used Local Search in a Hybrid Global Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    V. D. Sulimov

    2014-01-01

    Full Text Available Modern methods for the optimization investigation of complex systems are based on developing and updating mathematical models of the systems by solving the appropriate inverse problems. The input data required for a solution are obtained from the analysis of experimentally determined consequence characteristics of a system or process. The causal characteristics, which include the equation coefficients of the object's mathematical model, boundary conditions, etc., are the sought quantities. The optimization approach is one of the main ways to solve such inverse problems. In the general case it is necessary to find the global extremum of a criterion function that is not everywhere differentiable. Global optimization methods are widely used in identification and computational diagnosis problems, as well as in optimal control, computing tomography, image restoration, neural network training, and other intelligent technologies. The increasingly complicated systems being optimized over recent decades lead to more complicated mathematical models, thereby making the solution of the corresponding extreme problems significantly more difficult. In many practical applications the problem conditions can restrict modeling. As a consequence, in inverse problems the criterion functions can be noisy and not everywhere differentiable. The presence of noise means that calculating the derivatives is difficult and unreliable, which motivates optimization methods that do not calculate derivatives. The efficiency of deterministic global optimization algorithms is significantly restricted by their dependence on the dimension of the extreme problem. When the number of variables is large, stochastic global optimization algorithms are used; however, their solutions can be too expensive to compute, which restricts their application. Developing hybrid algorithms that combine a stochastic algorithm for scanning the variable space with deterministic local search
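    The Hooke–Jeeves local search named in the title alternates exploratory moves along each coordinate with a pattern move through the improved point; a minimal derivative-free sketch (the test function and step schedule are illustrative, not the paper's hybrid algorithm):

```python
def hooke_jeeves(f, x0, step=0.5, tol=1e-6, max_iter=1000):
    """Hooke-Jeeves direct search: exploratory moves along each axis
    followed by a pattern move through the improved point. Being
    derivative-free, it tolerates non-smooth or noisy objectives."""
    def explore(base, fbase, h):
        x = list(base)
        for i in range(len(x)):
            for d in (h, -h):           # try +h then -h along axis i
                trial = x[:]
                trial[i] += d
                ft = f(trial)
                if ft < fbase:
                    x, fbase = trial, ft
                    break
        return x, fbase

    x, fx = list(x0), f(x0)
    for _ in range(max_iter):
        if step < tol:
            break
        xn, fn = explore(x, fx, step)
        if fn < fx:
            # pattern move: extrapolate along the successful direction
            pattern = [2 * b - a for a, b in zip(x, xn)]
            xp, fp = explore(pattern, f(pattern), step)
            x, fx = (xp, fp) if fp < fn else (xn, fn)
        else:
            step *= 0.5                 # no improvement: reduce the step
    return x, fx

best, fbest = hooke_jeeves(lambda v: (v[0] - 3)**2 + (v[1] + 1)**2, [0.0, 0.0])
```

    In the hybrid scheme described above, a stochastic global algorithm would propose starting points and a routine like this would refine each of them locally.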

  2. Search Strategy Development in a Flipped Library Classroom: A Student-Focused Assessment

    Science.gov (United States)

    Goates, Michael C.; Nelson, Gregory M.; Frost, Megan

    2017-01-01

    Librarians at Brigham Young University compared search statement development between traditional lecture and flipped instruction sessions. Students in lecture sessions scored significantly higher on developing search statements than those in flipped sessions. However, student evaluations show a strong preference for pedagogies that incorporate…

  3. Development of a multivariate tool to reject background in a WZ diboson search for the CDF experiment

    Energy Technology Data Exchange (ETDEWEB)

    Cremonesi, Matteo [Univ. of of Rome Tor Vergata (Italy)

    2015-08-27

    As part of the ongoing data analysis effort of the CDF collaboration at Fermilab, the candidate developed a method to improve the background rejection efficiency in the search for associated pair production of electroweak W, Z bosons. The performances of the method for vetoing the tt background in a WZ/ZZ → ℓνq$\bar{q}$ diboson search are reported. The method was developed in the inclusive 2-jets sample and applied to the "tag-2 jets" region, the subsample defined by the request that the two jets carry beauty flavor. In this region, tt production is one of the largest backgrounds. The tt veto proceeds in two steps: first, a set of pre-selection cuts is applied in a candidate sample where up to two leptons are accepted in addition to a jet pair, so that the ZZ component of the signal is preserved; next, a Neural Network is trained to indicate the probability that the event is top-pair production. To validate the method as developed in the inclusive 2-jets sample, it is applied to the veto region, providing a significant rejection of this important background.

  4. A fast tomographic method for searching the minimum free energy path

    International Nuclear Information System (INIS)

    Chen, Changjun; Huang, Yanzhao; Xiao, Yi; Jiang, Xuewei

    2014-01-01

    Minimum Free Energy Path (MFEP) provides a lot of important information about the chemical reactions, like the free energy barrier, the location of the transition state, and the relative stability between reactant and product. With MFEP, one can study the mechanisms of the reaction in an efficient way. Due to a large number of degrees of freedom, searching the MFEP is a very time-consuming process. Here, we present a fast tomographic method to perform the search. Our approach first calculates the free energy surfaces in a sequence of hyperplanes perpendicular to a transition path. Based on an objective function and the free energy gradient, the transition path is optimized in the collective variable space iteratively. Applications of the present method to model systems show that our method is practical. It can be an alternative approach for finding the state-to-state MFEP

  5. Geometrical Fuzzy Search Method for the Business Information Security Systems

    Directory of Open Access Journals (Sweden)

    Grigory Grigorievich Novikov

    2014-12-01

    Full Text Available The main purpose of this article is to show how a new fuzzy search method can be used for the information security of a business or for other purposes. Many sensitive information leaks occur through the legal publication of non-classified documents, which is why intelligence services are so fond of the "mosaic" information collection method. This article is about how to prevent it.

  6. A Novel Method Using Abstract Convex Underestimation in Ab-Initio Protein Structure Prediction for Guiding Search in Conformational Feature Space.

    Science.gov (United States)

    Hao, Xiao-Hu; Zhang, Gui-Jun; Zhou, Xiao-Gen; Yu, Xu-Feng

    2016-01-01

    To address the searching problem of protein conformational space in ab-initio protein structure prediction, a novel method using abstract convex underestimation (ACUE) within an evolutionary algorithm framework was proposed. Computing such conformations, essential to associate structural and functional information with gene sequences, is challenging due to the high dimensionality and rugged energy surface of the protein conformational space. As a consequence, the dimension of the conformational space should be reduced to a proper level. In this paper, the original high-dimensional conformational space was converted into a feature space whose dimension is considerably reduced by a feature extraction technique, and the underestimate space could be constructed according to abstract convex theory. Thus, the entropy effect caused by searching in the high-dimensional conformational space could be avoided through such conversion. The tight lower-bound estimate information was obtained to guide the searching direction, and the invalid searching area, in which the global optimal solution is not located, could be eliminated in advance. Moreover, instead of expensively calculating the energy of conformations in the original conformational space, the estimate value is employed to judge whether a conformation is worth exploring, thereby lowering computational cost and making the searching process more efficient. Additionally, fragment assembly and the Monte Carlo method are combined to generate a series of metastable conformations by sampling in the conformational space. The proposed method provides a novel technique to solve the searching problem of protein conformational space. Twenty small-to-medium structurally diverse proteins were tested, and the proposed ACUE method was compared with ItFix, HEA, Rosetta, and the developed method LEDE without underestimate information. Test results show that the ACUE method can more rapidly and more

  7. Development and use of a content search strategy for retrieving studies on patients' views and preferences.

    Science.gov (United States)

    Selva, Anna; Solà, Ivan; Zhang, Yuan; Pardo-Hernandez, Hector; Haynes, R Brian; Martínez García, Laura; Navarro, Tamara; Schünemann, Holger; Alonso-Coello, Pablo

    2017-08-30

    Identifying scientific literature addressing patients' views and preferences is complex due to the wide range of studies that can be informative and the poor indexing of this evidence. Given the lack of guidance, we developed a search strategy to retrieve this type of evidence. We assembled an initial list of terms from several sources, including a review of the terms and indexing of topic-related studies, the methods research literature, and other relevant projects and systematic reviews. We used the relative recall approach, evaluating the capacity of the designed search strategy to retrieve studies included in systematic reviews relevant to the topic. We then implemented the final version of the search strategy in practice for conducting systematic reviews and guidelines, and calculated the search's precision and the number of references needed to read (NNR). The initial version of the search strategy had a relative recall of 87.4% (132 out of 151 studies). We then added terms from the studies not initially identified and re-tested this improved version against the studies included in a new set of systematic reviews, reaching a relative recall of 85.8% (151 out of 176 studies, 95% CI 79.9 to 90.2). This final version of the strategy includes two sets of terms related to two domains: "Patient Preferences and Decision Making" and "Health State Utilities Values". When we used the search strategy for the development of systematic reviews and clinical guidelines, we obtained low precision values (ranging from 2% to 5%) and NNRs from 20 to 50. This search strategy fills an important research gap in this field. It will help systematic reviewers, clinical guideline developers, and policy-makers to retrieve published research on patients' views and preferences. In turn, this will facilitate the inclusion of this critical aspect when formulating health care decisions, including recommendations.
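    The metrics used above are simple ratios; a minimal sketch using the figures quoted in the abstract makes the relationships explicit. Relative recall is the share of included studies that the strategy retrieves, and the number needed to read (NNR) is the reciprocal of precision.

```python
def relative_recall(retrieved_included, total_included):
    # Fraction of the studies included in reference reviews that the search found.
    return retrieved_included / total_included

def number_needed_to_read(precision):
    # Average number of retrieved records screened per relevant one.
    return 1.0 / precision

# Initial version: 132 of 151 included studies retrieved.
print(round(100 * relative_recall(132, 151), 1))   # 87.4

# Precision of 2%-5% implies reading 20-50 records per relevant study.
print(round(number_needed_to_read(0.05)), round(number_needed_to_read(0.02)))  # 20 50
```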

  8. Genetic search feature selection for affective modeling

    DEFF Research Database (Denmark)

    Martínez, Héctor P.; Yannakakis, Georgios N.

    2010-01-01

    Automatic feature selection is a critical step towards the generation of successful computational models of affect. This paper presents a genetic search-based feature selection method which is developed as a global-search algorithm for improving the accuracy of the affective models built....... The method is tested and compared against sequential forward feature selection and random search in a dataset derived from a game survey experiment which contains bimodal input features (physiological and gameplay) and expressed pairwise preferences of affect. Results suggest that the proposed method...

  9. Active Search on Carcasses versus Pitfall Traps: a Comparison of Sampling Methods.

    Science.gov (United States)

    Zanetti, N I; Camina, R; Visciarelli, E C; Centeno, N D

    2016-04-01

    The study of insect succession in cadavers and the classification of arthropods have mostly been done by placing a carcass in a cage, protected from vertebrate scavengers, which is then visited periodically. An alternative is to use specific traps. Few studies on carrion ecology and forensic entomology involving the carcasses of large vertebrates have employed pitfall traps. The aims of this study were to compare both sampling methods (active search on a carcass and pitfall trapping) for each coleopteran family, and to establish whether there is a discrepancy (underestimation and/or overestimation) in the presence of each family by either method. A great discrepancy was found for almost all families, with some being more abundant in samples obtained through active search on carcasses and others in samples from traps, whereas two families did not show any bias towards a given sampling method. The fact that families may be underestimated or overestimated by the type of sampling technique highlights the importance of combining both methods, active search on carcasses and pitfall traps, in order to obtain more complete information on decomposition, carrion habitat, and cadaveric families or species. Furthermore, a hypothesis is advanced on the reasons why either sampling method shows biases towards certain families. Information is also provided on which sampling technique would be more appropriate for detecting a particular family.

  10. The contemporary art of cost management methods during product development

    NARCIS (Netherlands)

    Wouters, M.; Morales, S.; Epstein, M.J.; Lee, J.Y.

    2014-01-01

    Purpose To provide an overview of research published in the management accounting literature on methods for cost management in new product development, such as target costing, life cycle costing, component commonality, and modular design. Methodology/approach The structured literature search

  11. Efficient protein structure search using indexing methods.

    Science.gov (United States)

    Kim, Sungchul; Sael, Lee; Yu, Hwanjo

    2013-01-01

    Understanding the functions of proteins is one of the most important challenges in many studies of biological processes. The function of a protein can be predicted by analyzing the functions of structurally similar proteins, thus finding structurally similar proteins accurately and efficiently from a large set of proteins is crucial. A protein structure can be represented as a vector by the 3D-Zernike Descriptor (3DZD), which compactly represents the surface shape of the protein tertiary structure. This simplified representation accelerates the searching process. However, computing the similarity of two protein structures is still computationally expensive, thus it is hard to efficiently process many simultaneous requests of structurally similar protein search. This paper proposes indexing techniques which substantially reduce the search time to find structurally similar proteins. In particular, we first exploit two indexing techniques, i.e., iDistance and iKernel, on the 3DZDs. After that, we extend the techniques to further improve the search speed for protein structures. The extended indexing techniques build and utilize a reduced index constructed from the first few attributes of the 3DZDs of protein structures. To retrieve the top-k similar structures, the top-10 × k similar structures are first found using the reduced index, and the top-k structures are selected among them. We also modify the indexing techniques to support θ-based nearest neighbor search, which returns data points within distance θ of the query point. The results show that both iDistance and iKernel significantly enhance the searching speed. In top-k nearest neighbor search, the searching time is reduced by 69.6%, 77%, 77.4% and 87.9%, respectively, using iDistance, iKernel, the extended iDistance, and the extended iKernel. In θ-based nearest neighbor search, the searching time is reduced by 80%, 81%, 95.6% and 95.6% using iDistance, iKernel, the extended iDistance, and the extended iKernel, respectively.
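    The two-stage retrieval described above (shortlist with a reduced index built from the first few attributes, then re-rank with full descriptors) can be sketched as follows. This is an illustrative simplification, not the paper's iDistance/iKernel implementation; the data and dimensionalities are made up.

```python
import math, random

rng = random.Random(0)
db = [[rng.gauss(0, 1) for _ in range(121)] for _ in range(1000)]  # 3DZD-like vectors
query = [rng.gauss(0, 1) for _ in range(121)]
k, m = 5, 8                     # m = leading attributes kept in the reduced index

# Stage 1: shortlist 10*k candidates using only the first m attributes.
by_reduced = sorted(range(len(db)), key=lambda i: math.dist(db[i][:m], query[:m]))
shortlist = by_reduced[:10 * k]

# Stage 2: exact distances on the shortlist only, keep the top-k.
topk = sorted(shortlist, key=lambda i: math.dist(db[i], query))[:k]

# θ-based variant: return every shortlisted structure within distance θ.
theta = 14.0
within_theta = [i for i in shortlist if math.dist(db[i], query) <= theta]
print(len(shortlist), len(topk))
```

    The final answer is only guaranteed relative to the shortlist, which is why the method over-fetches (10 × k) in the cheap first stage before paying for exact comparisons.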

  12. Assessment of the effectiveness of uranium deposit searching methods

    International Nuclear Information System (INIS)

    Suran, J.

    1998-01-01

    The following groups of uranium deposit searching methods are described: radiometric review of foreign work; aerial radiometric survey; automobile radiometric survey; emanation survey up to 1 m; emanation survey up to 2 m; ground radiometric survey; radiometric survey in pits; deep radiometric survey; combination of the above methods; and other methods (drilling survey). For vein-type deposits, the majority of Czech deposits were discovered in 1945-1965 by radiometric review of foreign work, automobile radiometric survey, and emanation survey up to 1 m. The first significant indications of sandstone-type uranium deposits were observed in the mid-1960s by aerial radiometric survey and confirmed later by drilling. (P.A.)

  13. Harbourscape Aalborg - Design Based Methods in Waterfront Development

    DEFF Research Database (Denmark)

    Kiib, Hans

    2012-01-01

    How can city planners and developers gain knowledge and develop new sustainable concepts for water front developments? The waterfront is far too often threatened by new privatisation, lack of public access and bad architecture. And in a time where low growth rates and crises in the building...... industry is leaving great parts of the harbour as urban voids planners are in search of new tools for bridging the time gap until new projects can be a reality. This chapter presents the development of waterfront regeneration concepts that resulted from design based workshops, Harbourscape Aalborg in 2005...... and Performative Architecture Workshop in 2008, and evaluates the method and the thinking behind this. The design workshops provide different design-based development methods which can be tested with the purpose of developing new concepts for the relationship between the city and its harbour, and in addition...

  14. Large Neighborhood Search

    DEFF Research Database (Denmark)

    Pisinger, David; Røpke, Stefan

    2010-01-01

    Heuristics based on large neighborhood search have recently shown outstanding results in solving various transportation and scheduling problems. Large neighborhood search methods explore a complex neighborhood by use of heuristics. Using large neighborhoods makes it possible to find better...... candidate solutions in each iteration and hence traverse a more promising search path. Starting from the large neighborhood search method, we give an overview of very large scale neighborhood search methods and discuss recent variants and extensions like variable depth search and adaptive large neighborhood...
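    The destroy/repair cycle at the core of large neighborhood search can be sketched on a toy Euclidean TSP (an illustration of the general scheme, not the authors' code): each iteration removes a few cities from the incumbent tour, greedily reinserts them, and accepts the candidate if it improves.

```python
import math, random

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def destroy(tour, n_remove, rng):          # remove n_remove random cities
    removed = rng.sample(tour, n_remove)
    return [c for c in tour if c not in removed], removed

def repair(partial, removed, pts):         # greedy cheapest insertion
    for c in removed:
        costs = [tour_length(partial[:i] + [c] + partial[i:], pts)
                 for i in range(len(partial) + 1)]
        i = costs.index(min(costs))
        partial = partial[:i] + [c] + partial[i:]
    return partial

rng = random.Random(1)
pts = [(rng.random(), rng.random()) for _ in range(15)]
tour = list(range(15))
best = tour_length(tour, pts)
for _ in range(200):                       # LNS main loop
    cand = repair(*destroy(tour, 3, rng), pts)
    if tour_length(cand, pts) < best:      # accept improving candidates
        tour, best = cand, tour_length(cand, pts)
print(sorted(tour) == list(range(15)))     # True: still a valid tour
```

    Adaptive variants mentioned above additionally keep several destroy/repair heuristics and reweight them by past success.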

  15. Law, Democracy & Development: Advanced Search

    African Journals Online (AJOL)

    Search tips: Search terms are case-insensitive; Common words are ignored; By default only articles containing all terms in the query are returned (i.e., AND is implied); Combine multiple words with OR to find articles containing either term; e.g., education OR research; Use parentheses to create more complex queries; e.g., ...

  16. Madagascar Conservation & Development: Advanced Search

    African Journals Online (AJOL)

    Search tips: Search terms are case-insensitive; Common words are ignored; By default only articles containing all terms in the query are returned (i.e., AND is implied); Combine multiple words with OR to find articles containing either term; e.g., education OR research; Use parentheses to create more complex queries; e.g., ...

  17. MRS algorithm: a new method for searching myocardial region in SPECT myocardial perfusion images.

    Science.gov (United States)

    He, Yuan-Lie; Tian, Lian-Fang; Chen, Ping; Li, Bin; Mao, Zhong-Yuan

    2005-10-01

    First, the necessity of automatically segmenting the myocardium from myocardial SPECT images is discussed in Section 1. To eliminate the influence of the background, the optimal threshold segmentation method modified for the MRS algorithm is explained in Section 2. Then, the image erosion structure is applied to identify the myocardium region and the liver region. The contour tracing method is introduced to extract the myocardial contour. To locate the centroid of the myocardium, the myocardial centroid searching method is developed. The protocol of the MRS algorithm is summarized in Section 6. The performance of the MRS algorithm is investigated and conclusions are drawn in Section 7. Finally, the importance of the MRS algorithm and improvements to the MRS algorithm are discussed.

  18. Searching for Truth: Internet Search Patterns as a Method of Investigating Online Responses to a Russian Illicit Drug Policy Debate

    OpenAIRE

    Zheluk, Andrey; Gillespie, James A; Quinn, Casey

    2012-01-01

    Background This is a methodological study investigating the online responses to a national debate over an important health and social problem in Russia. Russia is the largest Internet market in Europe, exceeding Germany in the absolute number of users. However, Russia is unusual in that the main search provider is not Google, but Yandex. Objective This study had two main objectives. First, to validate Yandex search patterns against those provided by Google, and second, to test this method's a...

  19. Short Term Gain, Long Term Pain: Informal Job Search Methods and Post-Displacement Outcomes

    OpenAIRE

    Green, Colin

    2012-01-01

    This paper examines the role of informal job search methods on the labour market outcomes of displaced workers. Informal job search methods could alleviate short-term labour market difficulties of displaced workers by providing information on job opportunities, allowing them to signal their productivity and may mitigate wage losses through better post-displacement job matching. However if displacement results from reductions in demand for specific sectors/skills, the use of informal job searc...

  20. Evolutionary Policy Transfer and Search Methods for Boosting Behavior Quality: RoboCup Keep-Away Case Study

    Directory of Open Access Journals (Sweden)

    Geoff Nitschke

    2017-11-01

    Full Text Available This study evaluates various evolutionary search methods to direct neural controller evolution in company with policy (behavior) transfer across increasingly complex collective robotic (RoboCup) keep-away tasks. Robot behaviors are first evolved in a source task and then transferred for further evolution to more complex target tasks. Evolutionary search methods tested include objective-based search (fitness function), behavioral and genotypic diversity maintenance, and hybrids of such diversity maintenance and objective-based search. Evolved behavior quality is evaluated according to effectiveness and efficiency. Effectiveness is the average task performance of transferred and evolved behaviors, where task performance is the average time the ball is controlled by a keeper team. Efficiency is the average number of generations taken for the fittest evolved behaviors to reach a minimum task performance threshold given policy transfer. Results indicate that policy transfer coupled with hybridized evolution (behavioral diversity maintenance and objective-based search) addresses the bootstrapping problem for increasingly complex keep-away tasks. That is, this hybrid method (coupled with policy transfer) evolves behaviors that could not otherwise be evolved. Also, this hybrid evolutionary search was demonstrated as consistently evolving topologically simple neural controllers that elicited high-quality behaviors.

  1. Identifying nurse staffing research in Medline: development and testing of empirically derived search strategies with the PubMed interface.

    Science.gov (United States)

    Simon, Michael; Hausner, Elke; Klaus, Susan F; Dunton, Nancy E

    2010-08-23

    The identification of health services research in databases such as PubMed/Medline is a cumbersome task. This task becomes even more difficult if the field of interest involves the use of diverse methods and data sources, as is the case with nurse staffing research. This type of research investigates the association between nurse staffing parameters and nursing and patient outcomes. A comprehensively developed search strategy may help identify nurse staffing research in PubMed/Medline. A set of relevant references in PubMed/Medline was identified by means of three systematic reviews. This development set was used to detect candidate free-text and MeSH terms. The frequency of these terms was compared to a random sample from PubMed/Medline in order to identify terms specific to nurse staffing research, which were then used to develop a sensitive, precise, and balanced search strategy. To determine their precision, the newly developed search strategies were tested against a) the pool of relevant references extracted from the systematic reviews, b) a reference set identified from an electronic journal screening, and c) a sample from PubMed/Medline. Finally, all newly developed strategies were compared to PubMed's Health Services Research Queries (PubMed's HSR Queries). The sensitivities of the newly developed search strategies were almost 100% in all three test sets; precision ranged from 6.1% to 32.0%. PubMed's HSR Queries were less sensitive (83.3% to 88.2%) than the new search strategies. Only minor differences in precision were found (5.0% to 32.0%). As with other literature on health services research, nurse staffing studies are difficult to identify in PubMed/Medline. Depending on the purpose of the search, researchers can choose between high sensitivity, i.e. retrieval of a large number of references, and high precision, i.e. an increased risk of missing relevant references. More standardized terminology (e.g. by consistent use of the

  2. Development and Validation of a Self-reported Questionnaire for Measuring Internet Search Dependence.

    Science.gov (United States)

    Wang, Yifan; Wu, Lingdan; Zhou, Hongli; Xu, Jiaojing; Dong, Guangheng

    2016-01-01

    Internet search has become the most common way that people deal with issues and problems in everyday life. The wide use of Internet search has largely changed the way people search for and store information. There is a growing interest in the impact of Internet search on users' affect, cognition, and behavior. Thus, it is essential to develop a tool to measure the changes in psychological characteristics as a result of long-term use of Internet search. The aim of this study is to develop a Questionnaire on Internet Search Dependence (QISD) and test its reliability and validity. We first proposed a preliminary structure and items of the QISD based on literature review, supplemental investigations, and interviews. And then, we assessed the psychometric properties and explored the factor structure of the initial version via exploratory factor analysis (EFA). The EFA results indicated that four dimensions of the QISD were very reliable, i.e., habitual use of Internet search, withdrawal reaction, Internet search trust, and external storage under Internet search. Finally, we tested the factor solution obtained from EFA through confirmatory factor analysis (CFA). The results of CFA confirmed that the four dimensions model fits the data well. In all, this study suggests that the 12-item QISD is of high reliability and validity and can serve as a preliminary tool to measure the features of Internet search dependence.

  3. Development and Validation of a Self-reported Questionnaire for Measuring Internet Search Dependence

    Directory of Open Access Journals (Sweden)

    Yifan Wang

    2016-12-01

    Full Text Available Internet search has become the most common way that people deal with issues and problems in everyday life. The wide use of Internet search has largely changed the way people search for and store information. There is a growing interest in the impact of Internet search on users’ affect, cognition and behavior. Thus, it is essential to develop a tool to measure the changes in psychological characteristics as a result of long-term use of Internet search. The present study aimed to develop a Questionnaire on Internet Search Dependence (QISD), and test its reliability and validity. We first proposed a preliminary structure and items of the QISD based on literature review, supplemental investigations, and interviews. And then, we assessed the psychometric properties and explored the factor structure of the initial version via exploratory factor analysis (EFA). The EFA results indicated that four dimensions of the QISD were very reliable, i.e., habitual use of Internet search, withdrawal reaction, Internet search trust and external storage under Internet search. Lastly, we tested the factor solution obtained from EFA through confirmatory factor analysis (CFA). The results of CFA confirmed that the four dimensions model fits the data well. In all, the present study suggests that the 12-item QISD is of high reliability and validity, and can serve as a preliminary tool to measure the features of Internet search dependence.

  4. An Efficient Hybrid Conjugate Gradient Method with the Strong Wolfe-Powell Line Search

    Directory of Open Access Journals (Sweden)

    Ahmad Alhawarat

    2015-01-01

    Full Text Available The conjugate gradient (CG) method is an interesting tool for solving optimization problems in many fields, such as design, economics, physics, and engineering. In this paper, we present a new hybrid CG method related to the famous Polak-Ribière-Polyak (PRP) formula. It provides a remedy for the case in which the PRP method is not globally convergent under the strong Wolfe-Powell (SWP) line search. The new formula possesses the sufficient descent condition and global convergence properties. In addition, we further explain the cases in which the PRP method fails with the SWP line search. Furthermore, we provide numerical computations for the new hybrid CG method, which in most cases performs better than other related PRP formulas in both the number of iterations and the CPU time under some standard test functions.
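    The strong Wolfe-Powell conditions referred to above are easy to state in code. The sketch below checks them for a given step length on a toy quadratic; the constants c1 and c2 are conventional choices, not values taken from the paper.

```python
def satisfies_strong_wolfe(f, g, x, d, alpha, c1=1e-4, c2=0.1):
    """Sufficient decrease (Armijo) plus the strong curvature condition."""
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    gxd = dot(g(x), d)                                   # directional derivative at x
    x_new = [xi + alpha * di for xi, di in zip(x, d)]
    armijo = f(x_new) <= f(x) + c1 * alpha * gxd
    curvature = abs(dot(g(x_new), d)) <= c2 * abs(gxd)
    return armijo and curvature

f = lambda x: sum(xi * xi for xi in x)                   # f(x) = ||x||^2
g = lambda x: [2 * xi for xi in x]
x, d = [1.0, -2.0], [-2.0, 4.0]                          # d = -grad f(x), a descent direction
print(satisfies_strong_wolfe(f, g, x, d, 0.5))           # True: alpha = 0.5 hits the exact minimizer
print(satisfies_strong_wolfe(f, g, x, d, 1e-8))          # False: step too small, curvature condition fails
```

    The strong version bounds the absolute value of the new directional derivative, ruling out steps that overshoot as well as steps that barely move.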

  5. Review of areas of search for renewable energy developments

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-07-01

    This report addresses planning policy issues related to the development of wind energy, small scale hydro power, energy from waste, landfill gas, and biomass fuels with the aim of improving planning policies in the development plans so that the benefits from renewable energy are recognised in the individual planning applications. The background to the project is traced, and renewable energy technologies and current renewable energy policies are reviewed. The relevance of 'Area of Search' and criteria-based policies, and the renewable energy policies are examined, and key findings relating to the ongoing reviews of planning policies, the national policy guidance, and required targets for renewable energy, the appropriateness of areas of search policies, community benefits and perceptions, local energy strategies, and consistency of renewable energy policies are discussed. (UK)

  6. Surfing for suicide methods and help: content analysis of websites retrieved with search engines in Austria and the United States.

    Science.gov (United States)

    Till, Benedikt; Niederkrotenthaler, Thomas

    2014-08-01

    The Internet provides a variety of resources for individuals searching for suicide-related information. Structured content-analytic approaches to assess intercultural differences in web contents retrieved with method-related and help-related searches are scarce. We used the 2 most popular search engines (Google and Yahoo/Bing) to retrieve US-American and Austrian search results for the term suicide, method-related search terms (e.g., suicide methods, how to kill yourself, painless suicide, how to hang yourself), and help-related terms (e.g., suicidal thoughts, suicide help) on February 11, 2013. In total, 396 websites retrieved with US search engines and 335 websites from Austrian searches were analyzed with content analysis on the basis of current media guidelines for suicide reporting. We assessed the quality of websites and compared findings across search terms and between the United States and Austria. In both countries, protective outweighed harmful website characteristics by approximately 2:1. Websites retrieved with method-related search terms (e.g., how to hang yourself) contained more harmful (United States: P search engines generally had more protective characteristics (P search engines. Resources with harmful characteristics were better ranked than those with protective characteristics (United States: P < .01, Austria: P < .05). The quality of suicide-related websites obtained depends on the search terms used. Preventive efforts to improve the ranking of preventive web content, particularly regarding method-related search terms, seem necessary. © Copyright 2014 Physicians Postgraduate Press, Inc.

  7. Novel citation-based search method for scientific literature: application to meta-analyses.

    Science.gov (United States)

    Janssens, A Cecile J W; Gwinn, M

    2015-10-13

    Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of co-citation with one or more "known" articles before reviewing their eligibility. In two independent studies, we aimed to reproduce the results of literature searches for sets of published meta-analyses (n = 10 and n = 42). For each meta-analysis, we extracted co-citations for the randomly selected 'known' articles from the Web of Science database, counted their frequencies and screened all articles with a score above a selection threshold. In the second study, we extended the method by retrieving direct citations for all selected articles. In the first study, we retrieved 82% of the studies included in the meta-analyses while screening only 11% as many articles as were screened for the original publications. Articles that we missed were published in non-English languages, published before 1975, published very recently, or available only as conference abstracts. In the second study, we retrieved 79% of included studies while screening half the original number of articles. Citation searching appears to be an efficient and reasonably accurate method for finding articles similar to one or more articles of interest for meta-analysis and reviews.
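    The co-citation ranking idea above can be sketched as follows (an illustration of the scoring principle, not the authors' Web of Science pipeline): given the reference lists of many citing papers, score each candidate article by how often it appears alongside one or more "known" seed articles, then screen the top-ranked candidates for eligibility. The paper identifiers below are hypothetical.

```python
from collections import Counter

def cocitation_scores(citing_papers, seeds):
    """Count how often each non-seed article is co-cited with a seed article."""
    scores = Counter()
    for refs in citing_papers:           # refs = reference list of one citing paper
        if any(s in refs for s in seeds):
            for r in refs:
                if r not in seeds:
                    scores[r] += 1
    return scores

citing_papers = [
    ["seed1", "A", "B"],
    ["seed1", "A", "C"],
    ["seed2", "A"],
    ["B", "C"],                          # cites no seed article: ignored
]
ranked = cocitation_scores(citing_papers, {"seed1", "seed2"}).most_common()
print(ranked)  # [('A', 3), ('B', 1), ('C', 1)]
```

    Articles most frequently co-cited with the seeds rise to the top, which is what lets a reviewer screen far fewer records than a keyword search would return.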

  8. Developing a distributed HTML5-based search engine for geospatial resource discovery

    Science.gov (United States)

    ZHOU, N.; XIA, J.; Nebert, D.; Yang, C.; Gui, Z.; Liu, K.

    2013-12-01

    With the explosive growth of data, Geospatial Cyberinfrastructure (GCI) components are developed to manage geospatial resources, such as data discovery and data publishing. However, the efficiency of geospatial resource discovery is still challenging in that: (1) existing GCIs are usually developed for users of specific domains, so users may have to visit a number of GCIs to find appropriate resources; (2) the complexity of the decentralized network environment usually results in slow response and poor user experience; (3) users who use different browsers and devices may have very different user experiences because of the diversity of front-end platforms (e.g. Silverlight, Flash or HTML). To address these issues, we developed a distributed and HTML5-based search engine. Specifically, (1) the search engine adopts a brokering approach to retrieve geospatial metadata from various distributed GCIs; (2) the asynchronous record retrieval mode enhances search performance and user interactivity; (3) the search engine, based on HTML5, is able to provide unified access capabilities for users with different devices (e.g. tablet and smartphone).

  9. An ontology-based search engine for protein-protein interactions.

    Science.gov (United States)

    Park, Byungkyu; Han, Kyungsook

    2010-01-18

    Keyword matching or ID matching is the most common searching method in a large database of protein-protein interactions. They are purely syntactic methods, and retrieve the records in the database that contain a keyword or ID specified in a query. Such syntactic search methods often retrieve too few search results or no results despite many potential matches present in the database. We have developed a new method for representing protein-protein interactions and the Gene Ontology (GO) using modified Gödel numbers. This representation is hidden from users but enables a search engine using the representation to efficiently search protein-protein interactions in a biologically meaningful way. Given a query protein with optional search conditions expressed in one or more GO terms, the search engine finds all the interaction partners of the query protein by unique prime factorization of the modified Gödel numbers representing the query protein and the search conditions. Representing the biological relations of proteins and their GO annotations by modified Gödel numbers makes a search engine efficiently find all protein-protein interactions by prime factorization of the numbers. Keyword matching or ID matching search methods often miss the interactions involving a protein that has no explicit annotations matching the search condition, but our search engine retrieves such interactions as well if they satisfy the search condition with a more specific term in the ontology.
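    The idea of turning annotation search into arithmetic can be illustrated with plain Gödel-style prime products (the paper's "modified Gödel numbers" are more elaborate): assign each GO term a prime, represent a protein's annotation set as the product of its primes, and test whether a protein satisfies a query by divisibility. The term-to-prime assignments below are hypothetical.

```python
PRIMES = {"GO:0005515": 2, "GO:0005634": 3, "GO:0016301": 5}  # hypothetical assignment

def encode(terms):
    """Product of the primes assigned to a set of GO terms."""
    n = 1
    for t in terms:
        n *= PRIMES[t]
    return n

def annotated_with(protein_code, query_terms):
    """A protein satisfies the query iff the query's product divides its code."""
    return protein_code % encode(query_terms) == 0

p = encode(["GO:0005515", "GO:0016301"])     # 2 * 5 = 10
print(annotated_with(p, ["GO:0016301"]))     # True
print(annotated_with(p, ["GO:0005634"]))     # False
```

    Because factorization into primes is unique, divisibility exactly captures subset containment of annotation sets, with no string matching involved.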

  10. Cooperative mobile agents search using beehive partitioned structure and Tabu Random search algorithm

    Science.gov (United States)

    Ramazani, Saba; Jackson, Delvin L.; Selmic, Rastko R.

    2013-05-01

    In search and surveillance operations, deploying a team of mobile agents provides a robust solution that has multiple advantages over using a single agent in efficiency and minimizing exploration time. This paper addresses the challenge of identifying a target in a given environment when using a team of mobile agents by proposing a novel method of mapping and movement of agent teams in a cooperative manner. The approach consists of two parts. First, the region is partitioned into a hexagonal beehive structure in order to provide equidistant movements in every direction and to allow for more natural and flexible environment mapping. Additionally, in search environments that are partitioned into hexagons, mobile agents have an efficient travel path while performing searches due to this partitioning approach. Second, we use a team of mobile agents that move in a cooperative manner and utilize the Tabu Random algorithm to search for the target. Due to the ever-increasing use of robotics and Unmanned Aerial Vehicle (UAV) platforms, many recent applications in the field of cooperative multi-agent search would benefit from the approach presented in this work, including search and rescue operations, surveillance, data collection, and border patrol. In this paper, the increased efficiency of the Tabu Random Search algorithm in combination with hexagonal partitioning is simulated and analyzed, and the advantages of this approach are presented and discussed.
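    A toy version of the two ingredients above, hexagonal partitioning and Tabu Random movement, can be sketched with axial hex coordinates, where every cell has six equidistant neighbours. This is an illustrative reconstruction, not the authors' simulation; the region bounds and tabu length are assumptions.

```python
import random
from collections import deque

HEX_MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]  # 6 axial neighbours

def in_region(c):                      # bounded search region (assumed size)
    return abs(c[0]) <= 4 and abs(c[1]) <= 4

def tabu_random_search(start, target, rng, tabu_len=10, max_steps=10000):
    pos, tabu = start, deque(maxlen=tabu_len)
    for step in range(max_steps):
        if pos == target:
            return step                # moves taken to reach the target cell
        tabu.append(pos)
        moves = [(pos[0] + dq, pos[1] + dr) for dq, dr in HEX_MOVES]
        legal = [m for m in moves if in_region(m)]
        allowed = [m for m in legal if m not in tabu] or legal  # skip tabu cells unless stuck
        pos = rng.choice(allowed)      # random move among the remaining cells
    return None

steps = tabu_random_search((-4, 4), (3, -3), random.Random(0))
print(steps is not None)               # True: target cell was found
```

    The tabu list discourages revisiting recently searched cells, pushing the random walk outward; with several cooperating agents, each would keep (or share) such a list over disjoint parts of the beehive.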

  11. Utilizing mixed methods research in analyzing Iranian researchers’ informarion search behaviour in the Web and presenting current pattern

    Directory of Open Access Journals (Sweden)

    Maryam Asadi

    2015-12-01

    Full Text Available Using a mixed methods research design, the current study analyzed Iranian researchers' information searching behaviour on the Web; then, based on the extracted concepts, a model of their information searching behavior was revealed. Forty-four participants, including academic staff from universities and research centers, were recruited for this study, selected by purposive sampling. Data were gathered from a questionnaire comprising ten questions and semi-structured interviews. Each participant's memos were analyzed using grounded theory methods adapted from Strauss & Corbin (1998). Results showed that the subjects' main objectives in using the Web were doing research, writing papers, studying, doing assignments, downloading files, and acquiring public information. The most common ways of learning how to search and retrieve information were trial and error and getting help from friends. Information resources are identified through information resources (e.g. search engines, references in papers, and online databases), communication facilities and tools (e.g. contact with colleagues, seminars and workshops, and social networking), and information services (e.g. RSS, alerting, and SDI). Findings also indicated that searching with search engines, reviewing references, searching in online databases, contacting colleagues, and studying the latest issues of electronic journals were the most important means of searching. The most important strategies were using search engines and scientific tools such as Google Scholar. In addition, the simple (quick) search method was the most common among subjects. Topic, keywords, and paper title were the most important elements for retrieving information. Analysis of the interviews showed that there were nine stages in researchers' information searching behaviour: topic selection, initiating the search, formulating the search query, information retrieval, access to information

  12. A three-term conjugate gradient method under the strong-Wolfe line search

    Science.gov (United States)

    Khadijah, Wan; Rivaie, Mohd; Mamat, Mustafa

    2017-08-01

    Recently, numerous studies have been concerned with conjugate gradient methods for solving large-scale unconstrained optimization problems. In this paper, a three-term conjugate gradient method, named Three-Term Rivaie-Mustafa-Ismail-Leong (TTRMIL), is proposed for unconstrained optimization; it always satisfies the sufficient descent condition. Under standard conditions, the TTRMIL method is proved to be globally convergent under the strong-Wolfe line search. Finally, numerical results are provided for the purpose of comparison.

  13. An adaptive bin framework search method for a beta-sheet protein homopolymer model

    Directory of Open Access Journals (Sweden)

    Hoos Holger H

    2007-04-01

    Full Text Available Background: The problem of protein structure prediction consists of predicting the functional or native structure of a protein given its linear sequence of amino acids. This problem has played a prominent role in the fields of biomolecular physics and algorithm design for over 50 years. Additionally, its importance increases continually as a result of an exponential growth over time in the number of known protein sequences, in contrast to a linear increase in the number of determined structures. Our work focuses on the problem of searching an exponentially large space of possible conformations as efficiently as possible, with the goal of finding a global optimum with respect to a given energy function. This problem plays an important role in the analysis of systems with complex search landscapes, and particularly in the context of ab initio protein structure prediction. Results: In this work, we introduce a novel approach for solving this conformation search problem based on the use of a bin framework for adaptively storing and retrieving promising locally optimal solutions. Our approach provides a rich and general framework within which a broad range of adaptive or reactive search strategies can be realized. Here, we introduce adaptive mechanisms for choosing which conformations should be stored, based on the set of conformations already stored in memory, and for biasing choices when retrieving conformations from memory in order to overcome search stagnation. Conclusion: We show that our bin framework combined with a widely used optimization method, Monte Carlo search, achieves significantly better performance than state-of-the-art generalized ensemble methods for a well-known protein-like homopolymer model on the face-centered cubic lattice.

  14. Hybridization of Sensing Methods of the Search Domain and Adaptive Weighted Sum in the Pareto Approximation Problem

    Directory of Open Access Journals (Sweden)

    A. P. Karpenko

    2015-01-01

    Full Text Available We consider a relatively new and rapidly developing class of methods for solving multi-objective optimization problems, based on a preliminarily built finite-dimensional approximation of the Pareto set, and thereby of the Pareto front, of the problem. The work investigates the efficiency of several modifications of the adaptive weighted sum (AWS) method. This method, proposed in the paper of Ryu, Kim and Wan (J.-H. Ryu, S. Kim, H. Wan), is intended to build a Pareto approximation of the multi-objective optimization problem. The AWS method uses quadratic approximation of the objective functions in the current sub-domain of the search space (the trust region) based on the gradients and Hessian matrices of the objective functions. To build the (quadratic) meta objective functions, this work uses methods from the theory of experimental design, which involve calculating the values of these functions at the nodes of a grid covering the trust region (a sensing method of the search domain). Two groups of sensing methods are considered: hypercube-based and hypersphere-based methods. For each of these groups, a number of test multi-objective optimization tasks have been used to study the efficiency of the following grids: the "Latin Hypercube"; a grid that is uniformly random in each dimension; and a grid based on the LPτ sequences.

  15. Research on perturbation based Monte Carlo reactor criticality search

    International Nuclear Information System (INIS)

    Li Zeguang; Wang Kan; Li Yangliu; Deng Jingkang

    2013-01-01

    Criticality search is a very important aspect of reactor physics analysis. Owing to the advantages of the Monte Carlo method and the development of computer technologies, Monte Carlo criticality search is becoming more and more necessary and feasible. The traditional Monte Carlo criticality search method suffers from the large number of individual criticality runs required and from the uncertainty and fluctuation of Monte Carlo results. A new Monte Carlo criticality search method based on perturbation calculation is put forward in this paper to overcome the disadvantages of the traditional method. By using only one criticality run to get the initial k_eff and the differential coefficients of the concerned parameter, a polynomial estimator of the k_eff changing function is solved to get the critical value of the concerned parameter. The feasibility of this method was tested. The results show that the accuracy and efficiency of the perturbation-based criticality search method are quite inspiring, and the method overcomes the disadvantages of the traditional one. (authors)
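The core idea, solving a polynomial estimator of the k_eff changing function for the critical parameter value, can be sketched in a few lines. The function, the Taylor form, and the example numbers below are all our own illustration, not the authors' implementation:

```python
import numpy as np

def critical_parameter(x0, k0, k1, k2):
    """Illustrative sketch: given k_eff = k0 at reference parameter x0,
    plus first and second differential coefficients k1, k2 from a single
    perturbation-tallied run, use the Taylor-polynomial estimator
        k_eff(x0 + dx) ~= k0 + k1*dx + (k2/2)*dx**2
    and solve k_eff = 1 for the critical parameter value."""
    # numpy.roots takes coefficients from highest to lowest degree.
    roots = np.roots([0.5 * k2, k1, k0 - 1.0])
    # Keep real roots and pick the smallest perturbation in magnitude.
    real = [r.real for r in roots if abs(r.imag) < 1e-12]
    dx = min(real, key=abs)
    return x0 + dx

# Hypothetical numbers: k_eff = 1.02 at a boron concentration of 500 ppm,
# with dk/dx = -1e-4 per ppm and a tiny second-order term.
print(critical_parameter(500.0, 1.02, -1e-4, 1e-9))
```

The estimated critical concentration lands slightly above 700 ppm here, dominated by the linear term 0.02 / 1e-4 = 200 ppm of perturbation.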

  16. Modification of the Armijo line search to satisfy the convergence properties of HS method

    Directory of Open Access Journals (Sweden)

    Mohammed Belloufi

    2013-07-01

    Full Text Available The Hestenes-Stiefel (HS) conjugate gradient algorithm is a useful tool for unconstrained numerical optimization; it has good numerical performance but no global convergence result under traditional line searches. This paper proposes a line search technique that guarantees the global convergence of the Hestenes-Stiefel (HS) conjugate gradient method. Numerical tests are presented to validate the different approaches.
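For context, the classical Armijo line search that the paper modifies can be sketched as a backtracking loop. This is the textbook version under our own parameter choices, not the specific modification proposed for the HS method:

```python
import numpy as np

def armijo_backtracking(f, grad, x, d, alpha0=1.0, rho=0.5, c1=1e-4):
    """Standard Armijo backtracking line search (a generic sketch, not
    the modification proposed in the paper)."""
    fx, slope = f(x), grad(x).dot(d)
    alpha = alpha0
    while f(x + alpha * d) > fx + c1 * alpha * slope:
        alpha *= rho  # shrink the step until sufficient decrease holds
    return alpha

# Example: ill-conditioned quadratic with a steepest-descent direction.
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * x[1] ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * x[1]])
x = np.array([0.0, 1.0])
d = -grad(x)
alpha = armijo_backtracking(f, grad, x, d)
print(alpha, f(x + alpha * d) < f(x))
```

Starting from the unit step, the loop halves alpha until the sufficient-decrease condition holds, guaranteeing the accepted step reduces f.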

  17. Experience of Developing a Meta-Semantic Search Engine

    OpenAIRE

    Mukhopadhyay, Debajyoti; Sharma, Manoj; Joshi, Gajanan; Pagare, Trupti; Palwe, Adarsha

    2013-01-01

    Today's web search scenario, which is mainly keyword based, leads to the need for the effective and meaningful search provided by the Semantic Web. Existing search engines often fail to provide relevant answers to user queries because of their dependency on the simple data available in web pages. On the other hand, semantic search engines provide efficient and relevant results, as the Semantic Web manages information with well-defined meaning using ontologies. A meta-search engine is a search tool that ...

  18. A dynamic lattice searching method with rotation operation for optimization of large clusters

    International Nuclear Information System (INIS)

    Wu Xia; Cai Wensheng; Shao Xueguang

    2009-01-01

    Global optimization of large clusters has been a difficult task, though much effort has been devoted to it and many efficient methods have been proposed. In our work, a rotation operation (RO) is designed to realize the structural transformation from decahedra to icosahedra for the optimization of large clusters, by rotating the atoms below the central atom through a definite angle around the fivefold axis. Based on the RO, a development of the previous dynamic lattice searching with constructed core (DLSc), named DLSc-RO, is presented. In an investigation of the method for the optimization of Lennard-Jones (LJ) clusters, i.e., LJ_500, LJ_561, LJ_600, LJ_665-667, LJ_670, LJ_685, and LJ_923, Morse clusters, silver clusters with the Gupta potential, and aluminum clusters with the NP-B potential, it was found that global minima with both icosahedral and decahedral motifs can be obtained, and the method is proved to be efficient and universal.

  19. Rapid Automatic Lighting Control of a Mixed Light Source for Image Acquisition using Derivative Optimum Search Methods

    Directory of Open Access Journals (Sweden)

    Kim HyungTae

    2015-01-01

    Full Text Available Automatic lighting (auto-lighting) is a function that maximizes the image quality of a vision inspection system by adjusting the light intensity and color. In most inspection systems, a single-color light source is used, and an equal-step search is employed to determine the maximum image quality. However, when a mixed light source is used, the number of iterations becomes large; therefore, a rapid search method must be applied to reduce it. Derivative optimum search methods follow the tangential direction of a function and are usually faster than other methods. In this study, multi-dimensional forms of derivative optimum search methods are applied to obtain the maximum image quality with a mixed light source. The auto-lighting algorithms were derived from the steepest descent and conjugate gradient methods, which have N driving-voltage inputs and one image-quality output. Experiments in which the proposed algorithm was applied to semiconductor patterns showed that a reduced number of iterations is required to determine the locally maximized image quality.
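A minimal sketch of the steepest-descent variant, with an N-dimensional voltage input and a scalar image-quality output, might look as follows. The finite-difference gradient, step sizes, and the toy quality surface are our own assumptions standing in for a real camera metric:

```python
import numpy as np

def auto_light(quality, v0, step=0.1, h=1e-3, iters=50):
    """Steepest-ascent tuning of N channel voltages to maximize image
    quality, using a forward-difference gradient estimate (illustrative
    sketch; 'quality' stands in for a camera/image-metric callback)."""
    v = np.array(v0, dtype=float)
    for _ in range(iters):
        base = quality(v)
        g = np.zeros_like(v)
        for i in range(v.size):  # forward-difference gradient estimate
            e = np.zeros_like(v)
            e[i] = h
            g[i] = (quality(v + e) - base) / h
        v += step * g  # move along the tangential (gradient) direction
    return v

# Toy quality surface peaking at red=3.0 V, green=2.0 V, blue=1.0 V.
peak = np.array([3.0, 2.0, 1.0])
quality = lambda v: -np.sum((v - peak) ** 2)
v = auto_light(quality, [1.0, 1.0, 1.0])
print(np.round(v, 2))
```

On this toy surface, the voltages converge to the peak within a few dozen iterations; a conjugate gradient variant would reuse previous directions to converge in fewer steps.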

  20. The Search Performance Evaluation and Prediction in Exploratory Search

    OpenAIRE

    LIU, FEI

    2016-01-01

    The exploratory search for complex search tasks requires an effective search behavior model to evaluate and predict user search performance. Few studies have investigated the relationship between user search behavior and search performance in exploratory search. This research adopts a mixed approach combining search system development, user search experiment, search query log analysis, and multivariate regression analysis to resolve the knowledge gap. Through this study, it is shown that expl...

  1. Protein structural similarity search by Ramachandran codes

    Directory of Open Access Journals (Sweden)

    Chang Chih-Hung

    2007-08-01

    Full Text Available Background: Protein structural data have increased exponentially, such that fast and accurate tools are necessary for structure similarity search. To improve search speed, several methods have been designed to reduce three-dimensional protein structures to one-dimensional text strings that are then analyzed by traditional sequence alignment methods; however, accuracy is usually sacrificed and the speed is still unable to match that of sequence similarity search tools. Here, we aimed to improve the linear encoding methodology and develop efficient search tools that can rapidly retrieve structural homologs from large protein databases. Results: We propose a new linear encoding method, SARST (Structural similarity search Aided by Ramachandran Sequential Transformation). SARST transforms protein structures into text strings through a Ramachandran map organized by nearest-neighbor clustering and uses a regenerative approach to produce substitution matrices. Classical sequence similarity search methods can then be applied to the structural similarity search. Its accuracy is similar to that of Combinatorial Extension (CE), and it works over 243,000 times faster, searching 34,000 proteins in 0.34 sec with a 3.2-GHz CPU. SARST provides statistically meaningful expectation values to assess the retrieved information. It has been implemented as a web service and a stand-alone Java program that is able to run on many different platforms. Conclusion: As a database search method, SARST can rapidly distinguish high from low similarities and efficiently retrieve homologous structures. It demonstrates that the easily accessible linear encoding methodology has the potential to serve as a foundation for efficient protein structural similarity search tools. These search tools should be applicable to automated and high-throughput functional annotations or predictions for the ever-increasing number of published protein structures in this post-genomic era.

  2. IMPROVING NEAREST NEIGHBOUR SEARCH IN 3D SPATIAL ACCESS METHOD

    Directory of Open Access Journals (Sweden)

    A. Suhaibaha

    2016-10-01

    Full Text Available Nearest Neighbour (NN) search is one of the important queries and analyses for spatial applications. In normal practice, a spatial access method structure is used during Nearest Neighbour query execution to retrieve information from the database. However, most spatial access method structures still face unresolved issues such as overlap among nodes and repetitive data entries. This situation leads to excessive Input/Output (IO) operations, which is inefficient for data retrieval. The situation becomes more crucial when dealing with 3D data, which is usually large owing to its detailed geometry and other attached information. In this research, a clustered 3D hierarchical structure is introduced as a 3D spatial access method structure. The structure is expected to improve the retrieval of Nearest Neighbour information for 3D objects. Several tests are performed for single Nearest Neighbour search and k Nearest Neighbour (kNN) search. The tests indicate that the clustered hierarchical structure is more efficient in handling Nearest Neighbour queries than its competitor: it reduced the repetitive data entries and the number of accessed pages, produced minimal Input/Output operations, and outperformed the competitor in query response time. For the future outlook of this research, several possible applications are discussed and summarized.
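As a baseline for what such access structures are meant to accelerate, a brute-force kNN query over 3D points can be sketched in a few lines (our own illustration; the clustered hierarchical structure itself is not reproduced here):

```python
import numpy as np

def knn(points, q, k):
    """Brute-force k-nearest-neighbour query in 3D: the naive baseline
    that hierarchical access structures are designed to beat (sketch)."""
    d = np.linalg.norm(points - q, axis=1)   # Euclidean distances to q
    idx = np.argsort(d)[:k]                  # indices of k smallest
    return idx, d[idx]

# Toy 3D point set and a query at the origin.
pts = np.array([[0.0, 0.0, 1.0],
                [2.0, 2.0, 2.0],
                [0.5, 0.0, 0.0],
                [5.0, 1.0, 0.0]])
idx, dist = knn(pts, np.array([0.0, 0.0, 0.0]), k=2)
print(idx.tolist())
```

The brute-force version touches every point on every query; a hierarchical structure prunes whole subtrees, which is where the IO savings reported above come from.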

  3. Perturbation based Monte Carlo criticality search in density, enrichment and concentration

    International Nuclear Information System (INIS)

    Li, Zeguang; Wang, Kan; Deng, Jingkang

    2015-01-01

    Highlights: • A new perturbation-based Monte Carlo criticality search method is proposed. • The method can get accurate results with only one individual criticality run. • The method is used to solve density, enrichment and concentration search problems. • Results show the feasibility and good performance of this method. • The relationship between the results’ accuracy and the perturbation order is discussed. - Abstract: Criticality search is a very important aspect of reactor physics analysis. Owing to the advantages of the Monte Carlo method and the development of computer technologies, Monte Carlo criticality search is becoming more and more necessary and feasible. Existing Monte Carlo criticality search methods need a large number of individual criticality runs and may give unstable results because of the uncertainties of criticality results. In this paper, a new perturbation-based Monte Carlo criticality search method is proposed and discussed. This method needs only one individual criticality calculation with perturbation tallies to estimate the k_eff changing function from the initial k_eff and differential-coefficient results, and solves polynomial equations to get the criticality search results. The new perturbation-based Monte Carlo criticality search method is implemented in the Monte Carlo code RMC, and criticality searches in density, enrichment and concentration are carried out. Results show that this method is quite inspiring in accuracy and efficiency, and has advantages compared with other criticality search methods

  4. Developing energy forecasting model using hybrid artificial intelligence method

    Institute of Scientific and Technical Information of China (English)

    Shahram Mollaiy-Berneti

    2015-01-01

    An important problem in demand planning for energy consumption is developing an accurate energy forecasting model. In fact, it is not possible to allocate energy resources in an optimal manner without accurate demand values. A new energy forecasting model was proposed based on a back-propagation (BP) neural network and the imperialist competitive algorithm. The proposed method combines the local search ability of the BP technique with the global search ability of the imperialist competitive algorithm. Two types of empirical data, regarding energy demand (gross domestic product (GDP), population, import, export and energy demand) in Turkey from 1979 to 2005 and electricity demand (population, GDP, total revenue from exporting industrial products and electricity consumption) in Thailand from 1986 to 2010, were investigated to demonstrate the applicability and merits of the present method. The performance of the proposed model is found to be better than that of a conventional back-propagation neural network, with low mean absolute error.

  5. Searching in the Context of a Task: A Review of Methods and Tools

    Directory of Open Access Journals (Sweden)

    Ana Maguitman

    2018-04-01

    Full Text Available Contextual information extracted from the user task can help to better target retrieval to task-relevant content. In particular, topical context can be exploited to identify the subject of the information needs, helping to reduce the information overload problem. A great number of methods exist to extract raw context data and contextual interaction patterns from the user task and to model this information using higher-level representations. Context can then be used as a source for automatic query generation, or as a means to refine or disambiguate user-generated queries. It can also be used to filter and rank results, as well as to select domain-specific search engines with better capabilities to satisfy specific information requests. This article reviews methods that have been applied to deal with the problem of reflecting the current and long-term interests of a user in the search process. It discusses major difficulties encountered in the research area of context-based information retrieval and presents an overview of tools proposed since the mid-nineties to deal with the problem of context-based search.

  6. Exploring genomic dark matter: A critical assessment of the performance of homology search methods on noncoding RNA

    DEFF Research Database (Denmark)

    Freyhult, E.; Bollback, J. P.; Gardner, P. P.

    2006-01-01

    Homology search is one of the most ubiquitous bioinformatic tasks, yet it is unknown how effective the currently available tools are for identifying noncoding RNAs (ncRNAs). In this work, we use reliable ncRNA data sets to assess the effectiveness of methods such as BLAST, FASTA, HMMer, and Infernal. Surprisingly, the most popular homology search methods are often the least accurate. As a result, many studies have used inappropriate tools for their analyses. On the basis of our results, we suggest homology search strategies using the currently available tools and some directions for future...

  7. Project SEARCH UK--Evaluating Its Employment Outcomes

    Science.gov (United States)

    Kaehne, Axel

    2016-01-01

    Background: The study reports the findings of an evaluation of Project SEARCH UK. The programme develops internships for young people with intellectual disabilities who are about to leave school or college. The aim of the evaluation was to investigate at what rate Project SEARCH provided employment opportunities to participants. Methods: The…

  8. A method for the design and development of medical or health care information websites to optimize search engine results page rankings on Google.

    LENUS (Irish Health Repository)

    Dunne, Suzanne

    2013-01-01

    The Internet is a widely used source of information for patients searching for medical/health care information. While many studies have assessed existing medical/health care information on the Internet, relatively few have examined methods for the design and delivery of such websites, particularly those aimed at the general public.

  9. Visual search for features and conjunctions in development.

    Science.gov (United States)

    Lobaugh, N J; Cole, S; Rovet, J F

    1998-12-01

    Visual search performance was examined in three groups of children 7 to 12 years of age and in young adults. Colour and orientation feature searches and a conjunction search were conducted. Reaction time (RT) showed the expected improvements in processing speed with age. Comparisons of RTs on target-present and target-absent trials were consistent with parallel search in the two feature conditions and with serial search in the conjunction condition. The RT results indicated that feature and conjunction searches were treated similarly by children and adults. However, the youngest children missed more targets at the largest array sizes, most strikingly in conjunction search. Based on an analysis of speed/accuracy trade-offs, we suggest that low target-distractor discriminability leads to an undersampling of array elements and is responsible for the high number of misses in the youngest children.

  10. Application of pattern search method to power system security constrained economic dispatch with non-smooth cost function

    International Nuclear Information System (INIS)

    Al-Othman, A.K.; El-Naggar, K.M.

    2008-01-01

    Direct search (DS) methods are evolutionary algorithms used to solve optimization problems. DS methods do not require any information about the gradient of the objective function at hand while searching for an optimum solution. One such method is the Pattern Search (PS) algorithm. This paper presents a new approach based on a constrained pattern search algorithm to solve a security-constrained power system economic dispatch (SCED) problem with a non-smooth cost function. The operation of power systems demands a high degree of security to keep the system operating satisfactorily when subjected to disturbances, while at the same time attention must be paid to the economic aspects. A pattern recognition technique is first used to assess dynamic security. Linear classifiers that determine the stability of the electric power system are presented and added to the other system stability and operational constraints. The problem is formulated as a constrained optimization problem in a way that ensures secure and economic system operation. The pattern search method is then applied to solve the constrained optimization formulation. In particular, the method is tested on three different test systems. Simulation results of the proposed approach are compared with those reported in the literature. The outcome is very encouraging and proves that pattern search (PS) is very applicable for solving the security-constrained power system economic dispatch (SCED) problem. In addition, valve-point loading effects and total system losses are considered to further investigate the potential of the PS technique. Based on the results, it can be concluded that PS has demonstrated ability in handling the highly nonlinear, discontinuous, non-smooth cost function of the SCED. (author)
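The underlying derivative-free idea can be sketched as a basic coordinate pattern search. This is a generic Hooke-Jeeves-style poll-and-shrink loop under our own parameter choices, not the constrained SCED formulation of the paper; note that it needs no gradients, so a non-smooth objective is fine:

```python
import numpy as np

def pattern_search(f, x0, step=1.0, tol=1e-6, shrink=0.5):
    """Basic coordinate pattern search: poll a 2N-point pattern around
    the current iterate; if no trial point improves, refine the mesh.
    A generic sketch of the derivative-free idea only."""
    x = np.array(x0, dtype=float)
    fx = f(x)
    while step > tol:
        improved = False
        for i in range(x.size):          # poll the 2N coordinate moves
            for s in (+step, -step):
                trial = x.copy()
                trial[i] += s
                ft = f(trial)
                if ft < fx:              # accept the first improving move
                    x, fx, improved = trial, ft, True
                    break
        if not improved:
            step *= shrink               # no improvement: shrink the mesh
    return x, fx

# Non-smooth test objective (pattern search needs no gradients).
f = lambda x: abs(x[0] - 2.0) + abs(x[1] + 1.0)
x, fx = pattern_search(f, [0.0, 0.0])
print(np.round(x, 3), round(fx, 3))
```

On this non-differentiable objective the search walks to the minimizer (2, -1) in a few polls and then shrinks the mesh to the tolerance.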

  11. The Development of Visual Search Strategies in Biscriptal Readers.

    Science.gov (United States)

    Liow, Susan Rikard; Green, David; Tam, Melissa

    1999-01-01

    To test whether cognitive processing in bilinguals depends on script combinations and language proficiency, this study investigated the development of alphabetic and logographic visual search strategies in two kinds of biscriptal readers: (1) Malay-English and (2) Chinese-English readers. Results support the view that there are script implications of…

  12. Methodological developments in searching for studies for systematic reviews: past, present and future?

    Science.gov (United States)

    Lefebvre, Carol; Glanville, Julie; Wieland, L Susan; Coles, Bernadette; Weightman, Alison L

    2013-09-25

    The Cochrane Collaboration was established in 1993, following the opening of the UK Cochrane Centre in 1992, at a time when searching for studies for inclusion in systematic reviews was not well developed. Review authors largely conducted their own searches or depended on medical librarians, who often possessed limited awareness and experience of systematic reviews. Guidance on the conduct and reporting of searches was limited. When work began to identify reports of randomized controlled trials (RCTs) for inclusion in Cochrane Reviews in 1992, there were only approximately 20,000 reports indexed as RCTs in MEDLINE and none indexed as RCTs in Embase. No search filters had been developed with the aim of identifying all RCTs in MEDLINE or other major databases. This presented The Cochrane Collaboration with a considerable challenge in identifying relevant studies. Over time, the number of studies indexed as RCTs in the major databases has grown considerably, and the Cochrane Central Register of Controlled Trials (CENTRAL) has become the best single source of published controlled trials, with approximately 700,000 records, including records identified by the Collaboration from Embase and MEDLINE. Search filters for various study types, including systematic reviews and the Cochrane Highly Sensitive Search Strategies for RCTs, have been developed. There have been considerable advances in the evidence base for methodological aspects of information retrieval. The Cochrane Handbook for Systematic Reviews of Interventions now provides detailed guidance on the conduct and reporting of searches. Initiatives across The Cochrane Collaboration to improve, inter alia, the quality of information retrieval include the recently introduced Methodological Expectations for Cochrane Intervention Reviews (MECIR) programme, which stipulates 'mandatory' and 'highly desirable' standards for various aspects of review conduct and reporting, including searching, and the development of Standard Training

  13. Ultrasonic inspection technology development and search units design examples of practical applications

    CERN Document Server

    Brook, Mark V

    2012-01-01

    "Ultrasonic testing is a relatively new branch of science and industry. The development of ultrasonic testing started in the late 1920s. At the beginning, the fundamentals of this method were borrowed from basic physics, geometrical and wave optics, acoustics and seismology. Later it became clear that some of these theories and calculation methods could not always explain the phenomena observed in many specific cases of ultrasonic testing. Without knowing the nuances of ultrasonic wave propagation in the test object, it is impossible to design an effective inspection technique and the search units for its realization. This book clarifies the theoretical differences of ultrasonics from the other wave propagation theories, presenting both the basics of the physics of wave propagation, with elementary mathematics, and advanced practical applications. Almost every specific technique presented in this book is supported by actual experimental data and examples of calculations"--

  14. Discovery of Nine Gamma-Ray Pulsars in Fermi-Lat Data Using a New Blind Search Method

    Science.gov (United States)

    Celik-Tinmaz, Ozlem; Ferrara, E. C.; Pletsch, H. J.; Allen, B.; Aulbert, C.; Fehrmann, H.; Kramer, M.; Barr, E. D.; Champion, D. J.; Eatough, R. P.; hide

    2011-01-01

    We report the discovery of nine previously unknown gamma-ray pulsars in a blind search of data from the Fermi Large Area Telescope (LAT). The pulsars were found with a novel hierarchical search method originally developed for detecting continuous gravitational waves from rapidly rotating neutron stars. Designed to find isolated pulsars spinning at up to kHz frequencies, the new method is computationally efficient and incorporates several advances, including a metric-based gridding of the search parameter space (frequency, frequency derivative and sky location) and the use of photon probability weights. The nine pulsars have spin frequencies between 3 and 12 Hz, and characteristic ages ranging from 17 kyr to 3 Myr. Two of them, PSRs J1803-2149 and J2111+4606, are young and energetic Galactic-plane pulsars (spin-down power above 6 x 10^35 erg per second and ages below 100 kyr). The seven remaining pulsars, PSRs J0106+4855, J0622+3749, J1620-4927, J1746-3239, J2028+3332, J2030+4415 and J2139+4716, are older and less energetic; two of them are located at higher Galactic latitudes (|b| greater than 10 degrees). PSR J0106+4855 has the largest characteristic age (3 Myr) and the smallest surface magnetic field (2 x 10^11 G) of all LAT blind-search pulsars. PSR J2139+4716 has the lowest spin-down power (3 x 10^33 erg per second) among all non-recycled gamma-ray pulsars ever found. Despite extensive multi-frequency observations, only PSR J0106+4855 has detectable pulsations in the radio band. The other eight pulsars belong to the increasing population of radio-quiet gamma-ray pulsars.

  15. Sundanese ancient manuscripts search engine using probability approach

    Science.gov (United States)

    Suryani, Mira; Hadi, Setiawan; Paulus, Erick; Nurma Yulita, Intan; Supriatna, Asep K.

    2017-10-01

    Today, Information and Communication Technology (ICT) has become a regular part of every aspect of life, including the cultural and heritage aspect. Sundanese ancient manuscripts, as Sundanese heritage, are in a damaged condition, and so is the information they contain. So, in order to preserve the information in Sundanese ancient manuscripts and make them easier to search, a search engine has been developed. The search engine must have good computing ability. In order to get the best computation in the developed search engine, three types of probabilistic approaches were compared in this study: the Bayesian Networks Model, Divergence from Randomness with the PL2 distribution, and DFR-PL2F as a derivative form of DFR-PL2. The three probabilistic approaches are supported by an index of documents and three different weighting methods: term occurrence, term frequency, and TF-IDF. The experiment involved 12 Sundanese ancient manuscripts containing 474 distinct terms. The developed search engine was tested with 50 random queries for each of three types of query. The experimental results showed that for both single and multiple queries, the best search performance was given by the combination of the PL2F approach and the TF-IDF weighting method. The performance was evaluated using the average response time, with a value of about 0.08 second, and Mean Average Precision (MAP) of about 0.33.
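The TF-IDF weighting compared in the study can be sketched as a minimal ranking function. The toy documents and the scoring function below are our own illustration, not the actual Sundanese search engine:

```python
import math
from collections import Counter

# Minimal TF-IDF ranking sketch: term frequency per document times the
# inverse document frequency of each term across the collection.
docs = {
    "d1": "carita pantun carita",
    "d2": "pantun sunda",
    "d3": "naskah kuno sunda sunda",
}
N = len(docs)
tf = {d: Counter(text.split()) for d, text in docs.items()}
df = Counter(term for counts in tf.values() for term in counts)
idf = {t: math.log(N / df[t]) for t in df}

def score(query, doc):
    """Sum of TF-IDF weights of the query terms in a document."""
    return sum(tf[doc][t] * idf.get(t, 0.0) for t in query.split())

query = "carita sunda"
ranked = sorted(docs, key=lambda d: score(query, d), reverse=True)
print(ranked[0])
```

Here "carita" is rare (one document), so d1 outranks the documents that merely match the common term "sunda"; this discrimination by rarity is what distinguishes TF-IDF from plain term-occurrence or term-frequency weighting.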

  16. How to perform a systematic search

    DEFF Research Database (Denmark)

    Bartels, Else Marie

    2013-01-01

    All medical practice and research must be evidence-based, as far as this is possible. With medical knowledge constantly growing, it has become necessary to possess a high level of information literacy to stay competent and professional. Furthermore, as patients can now search for information on the Internet, clinicians must be able to respond to this type of information in a professional way when needed. Here, the development of viable systematic search strategies for journal articles, books, book chapters and other sources, the selection of appropriate databases, and search tools and selection methods...

  17. Missing Links in Middle School: Developing Use of Disciplinary Relatedness in Evaluating Internet Search Results.

    Directory of Open Access Journals (Sweden)

    Frank C Keil

    Full Text Available In the "digital native" generation, internet search engines are a commonly used source of information. However, adolescents may fail to recognize relevant search results when they are related in discipline to the search topic but lack other cues. Middle school students, high school students, and adults rated simulated search results for relevance to the search topic. The search results were designed to contrast deep discipline-based relationships with lexical similarity to the search topic. Results suggest that the ability to recognize disciplinary relatedness without supporting cues may continue to develop into high school. Despite frequent search engine usage, younger adolescents may require additional support to make the most of the information available to them.

  18. Social Search: A Taxonomy of, and a User-Centred Approach to, Social Web Search

    Science.gov (United States)

    McDonnell, Michael; Shiri, Ali

    2011-01-01

    Purpose: The purpose of this paper is to introduce the notion of social search as a new concept, drawing upon the patterns of web search behaviour. It aims to: define social search; present a taxonomy of social search; and propose a user-centred social search method. Design/methodology/approach: A mixed method approach was adopted to investigate…

  19. Job Search, Networks, and Labor Market Performance of Immigrants

    OpenAIRE

    Arceo-Gómez, Eva Olimpia

    2012-01-01

    We develop an on-the-job search model in which immigrants search for jobs through formal channels or networks, and the quality of job offers differs across search methods. The model predicts that networks unambiguously lead to a larger share of network jobs in job-to-job transitions, whereas the effect is ambiguous in unemployment-to-job transitions.

  20. Searching for evidence or approval? A commentary on database search in systematic reviews and alternative information retrieval methodologies.

    Science.gov (United States)

    Delaney, Aogán; Tamás, Peter A

    2018-03-01

    Despite recognition that database search alone is inadequate even within the health sciences, it appears that reviewers in fields that have adopted systematic review are choosing to rely primarily, or only, on database search for information retrieval. This commentary reminds readers of factors that call into question the appropriateness of default reliance on database searches particularly as systematic review is adapted for use in new and lower consensus fields. It then discusses alternative methods for information retrieval that require development, formalisation, and evaluation. Our goals are to encourage reviewers to reflect critically and transparently on their choice of information retrieval methods and to encourage investment in research on alternatives. Copyright © 2017 John Wiley & Sons, Ltd.

  1. Analytical Methods in Search Theory

    Science.gov (United States)

    1979-11-01

    Pick g(x,t;E), φ(x,t;E) and find the b necessary to satisfy the search equation. SOLUTION: This is an audience participation problem.

  2. Search for neutral leptons

    International Nuclear Information System (INIS)

    Perl, M.L.

    1984-12-01

    At present we know of three kinds of neutral leptons: the electron neutrino, the muon neutrino, and the tau neutrino. This paper reviews the search for additional neutral leptons. The method and significance of a search depend upon the model used for the neutral lepton being sought. Some models for the properties and decay modes of proposed neutral leptons are described. Past and present searches are reviewed. The limits obtained by some completed searches are given, and the methods of searches in progress are described. Future searches are discussed. 41 references

  3. A conjugate gradient method with descent properties under strong Wolfe line search

    Science.gov (United States)

    Zull, N.; ‘Aini, N.; Shoid, S.; Ghani, N. H. A.; Mohamed, N. S.; Rivaie, M.; Mamat, M.

    2017-09-01

    The conjugate gradient (CG) method is one of the optimization methods that are often used in practical applications. The continuous and numerous studies conducted on the CG method have led to vast improvements in its convergence properties and efficiency. In this paper, a new CG method possessing the sufficient descent and global convergence properties is proposed. The efficiency of the new CG algorithm relative to the existing CG methods is evaluated by testing them all on a set of test functions using MATLAB. The tests are measured in terms of iteration numbers and CPU time under the strong Wolfe line search. Overall, the new method performs efficiently and is comparable to the other well-known methods.
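As a hedged illustration of the CG idea, the sketch below implements Fletcher-Reeves CG for a quadratic objective, where the exact step length zeroes the directional derivative and therefore satisfies the strong Wolfe curvature condition trivially. It is not the new CG method proposed in the paper; the matrix and vectors are invented test data.

```python
def fletcher_reeves_cg(A, b, x0, tol=1e-10, max_iter=50):
    """Minimise f(x) = 0.5 x^T A x - b^T x with Fletcher-Reeves CG.

    The exact line search used here zeroes the directional derivative,
    so it satisfies the strong Wolfe curvature condition for any c2.
    A is a list of rows (symmetric positive definite); b, x0 are lists.
    """
    n = len(b)
    matvec = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    x = list(x0)
    g = [gi - bi for gi, bi in zip(matvec(A, x), b)]   # gradient Ax - b
    p = [-gi for gi in g]                              # initial descent direction
    for _ in range(max_iter):
        if dot(g, g) < tol:
            break
        Ap = matvec(A, p)
        alpha = -dot(g, p) / dot(p, Ap)                # exact step for quadratics
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        g_new = [gi - bi for gi, bi in zip(matvec(A, x), b)]
        beta = dot(g_new, g_new) / dot(g, g)           # Fletcher-Reeves coefficient
        p = [-gn + beta * pi for gn, pi in zip(g_new, p)]
        g = g_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = fletcher_reeves_cg(A, b, [0.0, 0.0])
# converges to the solution of Ax = b, here (1/11, 7/11)
```

On an n-dimensional quadratic, CG with exact line search terminates in at most n iterations, which is why the two-dimensional example converges in two steps.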

  4. A cross-correlation method to search for gravitational wave bursts with AURIGA and Virgo

    NARCIS (Netherlands)

    Bignotto, M.; Bonaldi, M.; Camarda, M.; Cerdonio, M.; Conti, L.; Drago, M.; Falferi, P.; Liguori, N.; Longo, S.; Mezzena, R.; Mion, A.; Ortolan, A.; Prodi, G. A.; Re, V.; Salemi, F.; Taffarello, L.; Vedovato, G.; Vinante, A.; Vitale, S.; Zendri, J. -P.; Acernese, F.; Alshourbagy, Mohamed; Amico, Paolo; Antonucci, Federica; Aoudia, S.; Astone, P.; Avino, Saverio; Baggio, L.; Ballardin, G.; Barone, F.; Barsotti, L.; Barsuglia, M.; Bauer, Th. S.; Bigotta, Stefano; Birindelli, Simona; Boccara, Albert-Claude; Bondu, F.; Bosi, Leone; Braccini, Stefano; Bradaschia, C.; Brillet, A.; Brisson, V.; Buskulic, D.; Cagnoli, G.; Calloni, E.; Campagna, Enrico; Carbognani, F.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cesarini, E.; Chassande-Mottin, E.; Clapson, A-C; Cleva, F.; Coccia, E.; Corda, C.; Corsi, A.; Cottone, F.; Coulon, J. -P.; Cuoco, E.; D'Antonio, S.; Dari, A.; Dattilo, V.; Davier, M.; Rosa, R.; Del Prete, M.; Di Fiore, L.; Di Lieto, A.; Emilio, M. Di Paolo; Di Virgilio, A.; Evans, M.; Fafone, V.; Ferrante, I.; Fidecaro, F.; Fiori, I.; Flaminio, R.; Fournier, J. -D.; Frasca, S.; Frasconi, F.; Gammaitoni, L.; Garufi, F.; Genin, E.; Gennai, A.; Giazotto, A.; Giordano, L.; Granata, V.; Greverie, C.; Grosjean, D.; Guidi, G.; Hamdani, S.U.; Hebri, S.; Heitmann, H.; Hello, P.; Huet, D.; Kreckelbergh, S.; La Penna, P.; Laval, M.; Leroy, N.; Letendre, N.; Lopez, B.; Lorenzini, M.; Loriette, V.; Losurdo, G.; Mackowski, J. -M.; Majorana, E.; Man, C. 
N.; Mantovani, M.; Marchesoni, F.; Marion, F.; Marque, J.; Martelli, F.; Masserot, A.; Menzinger, F.; Milano, L.; Minenkov, Y.; Moins, C.; Moreau, J.; Morgado, N.; Mosca, S.; Mours, B.; Neri, I.; Nocera, F.; Pagliaroli, G.; Palomba, C.; Paoletti, F.; Pardi, S.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Piergiovanni, F.; Pinard, L.; Poggiani, R.; Punturo, M.; Puppo, P.; Rapagnani, P.; Regimbau, T.; Remillieux, A.; Ricci, F.; Ricciardi, I.; Rocchi, A.; Rolland, L.; Romano, R.; Ruggi, P.; Russo, G.; Solimeno, S.; Spallicci, A.; Swinkels, B. L.; Tarallo, M.; Terenzi, R.; Toncelli, A.; Tonelli, M.; Tournefier, E.; Travasso, F.; Vajente, G.; van den Brand, J. F. J.; van der Putten, S.; Verkindt, D.; Vetrano, F.; Vicere, A.; Vinet, J. -Y.; Vocca, H.; Yvert, M.

    2008-01-01

    We present a method to search for transient gravitational waves using a network of detectors with different spectral and directional sensitivities: the interferometer Virgo and the bar detector AURIGA. The data analysis method is based on the measurements of the correlated energy in the network by

  5. Search for extraterrestrial life: recent developments. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Papagiannis, M D [ed.

    1985-01-01

    Seventy experts from 20 different countries discuss the many interrelated aspects of the search for extraterrestrial life, including the search for other planetary systems where life may originate and evolve, the widespread presence of complex prebiotic molecules in our Solar System and in interstellar space which could be precursors of life, and the universal aspects of the biological evolution on Earth. They also discuss the nearly 50 radio searches that were undertaken in the last 25 years, the technological progress that has occurred in this period, and the plans for the future, including the comprehensive SETI search program that NASA is now preparing for the 1990's. Extensive introductions by the Editor to each of the 8 sections make this volume accessible even to the non-specialist with a genuine interest in this new field. 549 refs.; 84 figs.; 21 tabs.

  6. Frequency domain optical tomography using a conjugate gradient method without line search

    International Nuclear Information System (INIS)

    Kim, Hyun Keol; Charette, Andre

    2007-01-01

    A conjugate gradient method without line search (CGMWLS) is presented. This method is used to retrieve the local maps of absorption and scattering coefficients inside the tissue-like test medium, with the synthetic data. The forward problem is solved with a discrete-ordinates finite-difference method based on the frequency domain formulation of radiative transfer equation. The inversion results demonstrate that the CGMWLS can retrieve simultaneously the spatial distributions of optical properties inside the medium within a reasonable accuracy, by reducing cross-talk between absorption and scattering coefficients

  7. A study of certain Monte Carlo search and optimisation methods

    International Nuclear Information System (INIS)

    Budd, C.

    1984-11-01

    Studies are described which might lead to the development of a search and optimisation facility for the Monte Carlo criticality code MONK. The facility envisaged could be used to maximise a function of k-effective with respect to certain parameters of the system or, alternatively, to find the system (in a given range of systems) for which that function takes a given value. (UK)

  8. Efficient and accurate Greedy Search Methods for mining functional modules in protein interaction networks.

    Science.gov (United States)

    He, Jieyue; Li, Chaojun; Ye, Baoliu; Zhong, Wei

    2012-06-25

    Most computational algorithms mainly focus on detecting highly connected subgraphs in PPI networks as protein complexes but ignore their inherent organization. Furthermore, many of these algorithms are computationally expensive. However, recent analysis indicates that experimentally detected protein complexes generally contain Core/attachment structures. In this paper, a Greedy Search Method based on Core-Attachment structure (GSM-CA) is proposed. The GSM-CA method detects densely connected regions in large protein-protein interaction networks based on the edge weight and two criteria for determining core nodes and attachment nodes. The GSM-CA method improves the prediction accuracy compared to other similar module detection approaches; however, it is computationally expensive. Many module detection approaches are based on traditional hierarchical methods, which are also computationally inefficient because the hierarchical tree structure produced by these approaches cannot provide adequate information to identify whether a network belongs to a module structure or not. In order to speed up the computational process, the Greedy Search Method based on Fast Clustering (GSM-FC) is proposed in this work. The edge weight based GSM-FC method uses a greedy procedure to traverse all edges just once to separate the network into the suitable set of modules. The proposed methods are applied to the protein interaction network of S. cerevisiae. Experimental results indicate that many significant functional modules are detected, most of which match the known complexes. Results also demonstrate that the GSM-FC algorithm is faster and more accurate as compared to other competing algorithms. Based on the new edge weight definition, the proposed algorithm takes advantage of the greedy search procedure to separate the network into the suitable set of modules. Experimental analysis shows that the identified modules are statistically significant. The algorithm can reduce the
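The single-pass, edge-weight-driven greedy idea can be sketched with a union-find structure: visit the edges once in order of decreasing weight and merge the endpoints of every edge above a threshold. This is an illustrative simplification, not the authors' GSM-FC algorithm or its edge-weight definition; the node names, weights, and threshold are invented.

```python
class DisjointSet:
    """Union-find with path halving, for merging nodes into modules."""
    def __init__(self, nodes):
        self.parent = {v: v for v in nodes}
    def find(self, v):
        while self.parent[v] != v:
            self.parent[v] = self.parent[self.parent[v]]  # path halving
            v = self.parent[v]
        return v
    def union(self, u, v):
        ru, rv = self.find(u), self.find(v)
        if ru != rv:
            self.parent[ru] = rv

def greedy_modules(nodes, weighted_edges, threshold):
    """Single pass over edges sorted by weight: merge the endpoints of
    every edge whose weight reaches the threshold. Returns the modules."""
    ds = DisjointSet(nodes)
    for u, v, w in sorted(weighted_edges, key=lambda e: -e[2]):
        if w >= threshold:
            ds.union(u, v)
    modules = {}
    for v in nodes:
        modules.setdefault(ds.find(v), set()).add(v)
    return sorted(sorted(m) for m in modules.values())

nodes = ["a", "b", "c", "d"]
edges = [("a", "b", 0.9), ("b", "c", 0.8), ("c", "d", 0.1)]
modules = greedy_modules(nodes, edges, 0.5)
# with threshold 0.5, {a, b, c} forms one module and d stays alone
```

Because each edge is examined exactly once, the pass is near-linear in the number of edges, which is the efficiency argument the abstract makes for a fast-clustering variant over hierarchical methods.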

  9. Pep-3D-Search: a method for B-cell epitope prediction based on mimotope analysis.

    Science.gov (United States)

    Huang, Yan Xin; Bao, Yong Li; Guo, Shu Yan; Wang, Yan; Zhou, Chun Guang; Li, Yu Xin

    2008-12-16

    The prediction of conformational B-cell epitopes is one of the most important goals in immunoinformatics. The solution to this problem, even if approximate, would help in designing experiments to precisely map the residues of interaction between an antigen and an antibody. Consequently, this area of research has received considerable attention from immunologists, structural biologists and computational biologists. Phage-displayed random peptide libraries are powerful tools used to obtain mimotopes that are selected by binding to a given monoclonal antibody (mAb) in a similar way to the native epitope. These mimotopes can be considered as functional epitope mimics. Mimotope analysis based methods can predict not only linear but also conformational epitopes and this has been the focus of much research in recent years. Though some algorithms based on mimotope analysis have been proposed, the precise localization of the interaction site mimicked by the mimotopes is still a challenging task. In this study, we propose a method for B-cell epitope prediction based on mimotope analysis called Pep-3D-Search. Given the 3D structure of an antigen and a set of mimotopes (or a motif sequence derived from the set of mimotopes), Pep-3D-Search can be used in two modes: mimotope or motif. To evaluate the performance of Pep-3D-Search to predict epitopes from a set of mimotopes, 10 epitopes defined by crystallography were compared with the predicted results from a Pep-3D-Search: the average Matthews correlation coefficient (MCC), sensitivity and precision were 0.1758, 0.3642 and 0.6948. Compared with other available prediction algorithms, Pep-3D-Search showed comparable MCC, specificity and precision, and could provide novel, rational results. To verify the capability of Pep-3D-Search to align a motif sequence to a 3D structure for predicting epitopes, 6 test cases were used. 
The predictive performance of Pep-3D-Search was demonstrated to be superior to that of other similar programs.

  10. Evidence-based Medicine Search: a customizable federated search engine.

    Science.gov (United States)

    Bracke, Paul J; Howse, David K; Keim, Samuel M

    2008-04-01

    This paper reports on the development of a tool by the Arizona Health Sciences Library (AHSL) for searching clinical evidence that can be customized for different user groups. The AHSL provides services to the University of Arizona's (UA's) health sciences programs and to the University Medical Center. Librarians at AHSL collaborated with UA College of Medicine faculty to create an innovative search engine, Evidence-based Medicine (EBM) Search, that provides users with a simple search interface to EBM resources and presents results organized according to an evidence pyramid. EBM Search was developed with a web-based configuration component that allows the tool to be customized for different specialties. Informal and anecdotal feedback from physicians indicates that EBM Search is a useful tool with potential in teaching evidence-based decision making. While formal evaluation is still being planned, a tool such as EBM Search, which can be configured for specific user populations, may help lower barriers to information resources in an academic health sciences center.

  11. An Efficient Method to Search Real-Time Bulk Data for an Information Processing System

    International Nuclear Information System (INIS)

    Kim, Seong Jin; Kim, Jong Myung; Suh, Yong Suk; Keum, Jong Yong; Park, Heui Youn

    2005-01-01

    The Man Machine Interface System (MMIS) of the System-integrated Modular Advanced ReacTor (SMART) is designed with fully digitalized features. The Information Processing System (IPS) of the MMIS acquires and processes plant data from other systems. In addition, the IPS provides plant operation information to operators in the control room. The IPS is required to process bulky data in real time, so it is necessary to consider a special processing method with regard to flexibility and performance, because more than a few thousand plant information points converge on the IPS. Among other things, the processing time for searching the bulk data is much longer than the other processing times. Thus, this paper explores an efficient method for the search and examines its feasibility
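A common way to cut search time over bulk data is to trade a one-time indexing pass for constant-time lookups. The sketch below is a generic illustration of that idea, not the IPS implementation; the tag naming scheme and record layout are invented.

```python
def build_index(records):
    """Index data points by tag id so each lookup is O(1) on average,
    instead of scanning the whole record list on every request."""
    return {rec["tag"]: rec for rec in records}

def linear_search(records, tag):
    """Baseline O(n) scan: touch every record until the tag matches."""
    for rec in records:
        if rec["tag"] == tag:
            return rec
    return None

# invented example data: 10,000 tagged plant data points
records = [{"tag": "PT-%04d" % i, "value": i * 0.1} for i in range(10000)]

index = build_index(records)      # O(n) once, at startup
rec = index.get("PT-0042")        # O(1) average per query thereafter
```

For a real-time system the point is not the single lookup but the aggregate: thousands of queries per cycle each pay the hash-lookup cost rather than a full scan.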

  12. Fast and accurate protein substructure searching with simulated annealing and GPUs

    Directory of Open Access Journals (Sweden)

    Stivala Alex D

    2010-09-01

    Full Text Available Abstract Background Searching a database of protein structures for matches to a query structure, or occurrences of a structural motif, is an important task in structural biology and bioinformatics. While there are many existing methods for structural similarity searching, faster and more accurate approaches are still required, and few current methods are capable of substructure (motif) searching. Results We developed an improved heuristic for tableau-based protein structure and substructure searching using simulated annealing that is as fast as or faster than, and comparable in accuracy with, some widely used existing methods. Furthermore, we created a parallel implementation on a modern graphics processing unit (GPU). Conclusions The GPU implementation achieves up to 34 times speedup over the CPU implementation of tableau-based structure search with simulated annealing, making it one of the fastest available methods. To the best of our knowledge, this is the first application of a GPU to the protein structural search problem.

  13. Adaptive order search and tangent-weighted trade-off for motion estimation in H.264

    Directory of Open Access Journals (Sweden)

    Srinivas Bachu

    2018-04-01

    Full Text Available Motion estimation and compensation play a major role in video compression to reduce the temporal redundancies of the input videos. A variety of block search patterns have been developed for matching the blocks with reduced computational complexity, without affecting the visual quality. In this paper, block motion estimation is achieved by integrating the square and hexagonal search patterns with adaptive order. The proposed algorithm, called AOSH (Adaptive Order Square Hexagonal Search), finds the best matching block with a reduced number of search points. The searching function is formulated as a trade-off criterion here. Hence, the tangent-weighted function is newly developed to evaluate the matching point. The proposed AOSH search algorithm and the tangent-weighted trade-off criterion are effectively applied to the block estimation process to enhance the visual quality and the compression performance. The proposed method is validated using three videos, namely football, garden and tennis. The quantitative performance of the proposed method and the existing methods is analysed using the Structural Similarity Index (SSIM) and the Peak Signal to Noise Ratio (PSNR). The results prove that the proposed method offers better visual quality than the existing methods. Keywords: Block motion estimation, Square search, Hexagon search, H.264, Video coding
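A plain hexagon-based block search, one ingredient of the square/hexagon integration described above, can be sketched as follows. This is the classic large/small hexagon pattern, not the AOSH algorithm or its tangent-weighted criterion; the synthetic frames and block position are invented for the example.

```python
def sad(frame_a, frame_b, ax, ay, bx, by, n):
    """Sum of absolute differences between the n x n block of frame_a
    at (ax, ay) and the n x n block of frame_b at (bx, by)."""
    return sum(abs(frame_a[ay + r][ax + c] - frame_b[by + r][bx + c])
               for r in range(n) for c in range(n))

def hexagon_search(ref, cur, bx, by, n):
    """Estimate the motion vector of the n x n block at (bx, by) in cur
    by matching it into ref: repeat the large hexagon pattern until its
    centre wins, then refine once with the small (cross) pattern."""
    large = [(0, 0), (-2, 0), (2, 0), (-1, -2), (1, -2), (-1, 2), (1, 2)]
    small = [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]
    h, w = len(ref), len(ref[0])

    def cost(x, y):
        if 0 <= x <= w - n and 0 <= y <= h - n:
            return sad(cur, ref, bx, by, x, y, n)
        return float("inf")   # candidate block falls outside the frame

    cx, cy = bx, by
    while True:                                            # coarse stage
        dx, dy = min(large, key=lambda d: cost(cx + d[0], cy + d[1]))
        if (dx, dy) == (0, 0):
            break
        cx, cy = cx + dx, cy + dy
    dx, dy = min(small, key=lambda d: cost(cx + d[0], cy + d[1]))
    return cx + dx - bx, cy + dy - by                      # motion vector

# synthetic frames: ref is cur shifted two pixels to the right
size = 16
cur = [[(x - 8) ** 2 + (y - 8) ** 2 for x in range(size)] for y in range(size)]
ref = [[(x - 10) ** 2 + (y - 8) ** 2 for x in range(size)] for y in range(size)]
mv = hexagon_search(ref, cur, 5, 5, 4)   # -> (2, 0)
```

The large hexagon covers distance quickly with seven checks per step; the final cross step recovers the one-pixel precision the coarse pattern skips.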

  14. System and method for improving video recorder performance in a search mode

    NARCIS (Netherlands)

    2000-01-01

    A method and apparatus wherein video images are recorded on a plurality of tracks of a tape such that, for playback in a search mode at a speed higher than the recording speed the displayed image will consist of a plurality of contiguous parts, some of the parts being read out from tracks each

  15. System and method for improving video recorder performance in a search mode

    NARCIS (Netherlands)

    1991-01-01

    A method and apparatus wherein video images are recorded on a plurality of tracks of a tape such that, for playback in a search mode at a speed higher than the recording speed the displayed image will consist of a plurality of contiguous parts, some of the parts being read out from tracks each

  16. Web-based information search and retrieval: effects of strategy use and age on search success.

    Science.gov (United States)

    Stronge, Aideen J; Rogers, Wendy A; Fisk, Arthur D

    2006-01-01

    The purpose of this study was to investigate the relationship between strategy use and search success on the World Wide Web (i.e., the Web) for experienced Web users. An additional goal was to extend understanding of how the age of the searcher may influence strategy use. Current investigations of information search and retrieval on the Web have provided an incomplete picture of Web strategy use because participants have not been given the opportunity to demonstrate their knowledge of Web strategies while also searching for information on the Web. Using both behavioral and knowledge-engineering methods, we investigated searching behavior and system knowledge for 16 younger adults (M = 20.88 years of age) and 16 older adults (M = 67.88 years). Older adults were less successful than younger adults in finding correct answers to the search tasks. Knowledge engineering revealed that the age-related effect resulted from ineffective search strategies and amount of Web experience rather than age per se. Our analysis led to the development of a decision-action diagram representing search behavior for both age groups. Older adults had more difficulty than younger adults when searching for information on the Web. However, this difficulty was related to the selection of inefficient search strategies, which may have been attributable to a lack of knowledge about available Web search strategies. Actual or potential applications of this research include training Web users to search more effectively and suggestions to improve the design of search engines.

  17. Development and evaluation of a biomedical search engine using a predicate-based vector space model.

    Science.gov (United States)

    Kwak, Myungjae; Leroy, Gondy; Martinez, Jesse D; Harwell, Jeffrey

    2013-10-01

    Although biomedical information available in articles and patents is increasing exponentially, we continue to rely on the same information retrieval methods and use very few keywords to search millions of documents. We are developing a fundamentally different approach for finding much more precise and complete information with a single query using predicates instead of keywords for both query and document representation. Predicates are triples that are more complex data structures than keywords and contain more structured information. To make optimal use of them, we developed a new predicate-based vector space model and query-document similarity function with adjusted tf-idf and boost function. Using a test bed of 107,367 PubMed abstracts, we evaluated the first essential function: retrieving information. Cancer researchers provided 20 realistic queries, for which the top 15 abstracts were retrieved using a predicate-based (new) and keyword-based (baseline) approach. Each abstract was evaluated, double-blind, by cancer researchers on a 0-5 point scale to calculate precision (0 versus higher) and relevance (0-5 score). Precision was significantly higher for predicate-based searching than for keyword-based searching, laying the foundation for rich and sophisticated information search. Copyright © 2013 Elsevier Inc. All rights reserved.
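The idea of treating each predicate triple as a vector-space dimension can be sketched with a bag-of-predicates representation and cosine similarity. This is a minimal illustration, not the authors' adjusted tf-idf and boost function; the triples are invented.

```python
import math

def predicate_vector(triples):
    """Bag-of-predicates vector: each (subject, relation, object)
    triple is one dimension, weighted by its frequency."""
    vec = {}
    for t in triples:
        vec[t] = vec.get(t, 0.0) + 1.0
    return vec

def cosine(u, v):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(w * v.get(k, 0.0) for k, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

doc = predicate_vector([("p53", "inhibits", "mdm2"), ("p53", "binds", "dna")])
query = predicate_vector([("p53", "inhibits", "mdm2")])
score = cosine(query, doc)   # matches only if the whole triple matches
```

A keyword model would match any document mentioning "p53"; the triple dimension only fires when subject, relation, and object all agree, which is the precision gain the abstract reports.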

  18. [Development and current situation of reconstruction methods following total sacrectomy].

    Science.gov (United States)

    Huang, Siyi; Ji, Tao; Guo, Wei

    2018-05-01

    To review the development of reconstruction methods following total sacrectomy, and to provide reference for finding a better reconstruction method following total sacrectomy. Case reports and biomechanical and finite element studies of reconstruction following total sacrectomy, at home and abroad, were searched, and the development and current situation were summarized. After developing for nearly 30 years, great progress has been made in reconstruction concepts and fixation techniques. The fixation methods can be summarized as the following three strategies: spinopelvic fixation (SPF), posterior pelvic ring fixation (PPRF), and anterior spinal column fixation (ASCF). SPF has undergone technical progress from intrapelvic rod and hook constructs to pedicle and iliac screw-rod systems. PPRF and ASCF can improve the stability of the reconstruction system. Reconstruction following total sacrectomy remains a challenge. Reconstruction combining SPF, PPRF, and ASCF is the developmental direction to achieve mechanical stability. How to gain biological fixation to improve long-term stability is an urgent problem to be solved.

  19. Technical development of PubMed Interact: an improved interface for MEDLINE/PubMed searches

    OpenAIRE

    Muin, Michael; Fontelo, Paul

    2006-01-01

    Abstract Background The project aims to create an alternative search interface for MEDLINE/PubMed that may provide assistance to the novice user and added convenience to the advanced user. An earlier version of the project was the 'Slider Interface for MEDLINE/PubMed searches' (SLIM) which provided JavaScript slider bars to control search parameters. In this new version, recent developments in Web-based technologies were implemented. These changes may prove to be even more valuable in enhanci...

  20. INTERFACING GOOGLE SEARCH ENGINE TO CAPTURE USER WEB SEARCH BEHAVIOR

    OpenAIRE

    Fadhilah Mat Yamin; T. Ramayah

    2013-01-01

    The behaviour of the searcher when using a search engine, especially during query formulation, is crucial. Search engines capture users’ activities in the search log, which is stored at the search engine server. Due to the difficulty of obtaining this search log, this paper proposes and develops an interface framework for the Google search engine. This interface captures users’ queries before redirecting them to Google. The analysis of the search log will show that users are utili...

  1. Top-k Keyword Search Over Graphs Based On Backward Search

    Directory of Open Access Journals (Sweden)

    Zeng Jia-Hui

    2017-01-01

    Full Text Available Keyword search is one of the most user-friendly and intuitive information retrieval methods. Using keyword search to retrieve a connected subgraph has many applications in graph-based cognitive computation, and it is a basic technology. This paper focuses on top-k keyword search over graphs. We implemented a keyword search algorithm which applies the backward search idea. The algorithm first locates the keyword vertices and then applies backward search to find rooted trees that contain the query keywords. The experiment shows that query time is affected by the number of iterations of the algorithm.
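A minimal sketch of the backward search idea: run BFS along reversed edges from each keyword vertex, then choose as root the vertex that reaches all keywords with the smallest total distance. This is a simplification that returns a single best root rather than a ranked top-k list; the example graph is invented.

```python
from collections import deque

def backward_search(edges, keyword_nodes):
    """Backward keyword search sketch over a directed graph.

    edges: list of (u, v) directed edges; keyword_nodes: keyword vertices.
    Returns (root, {keyword: distance from root to that keyword}).
    """
    reverse = {}
    nodes = set()
    for u, v in edges:
        reverse.setdefault(v, []).append(u)   # walk edges backwards
        nodes.update((u, v))
    # dist[k][n] = length of a shortest path n -> ... -> k in the graph
    dist = {}
    for k in keyword_nodes:
        d = {k: 0}
        queue = deque([k])
        while queue:
            cur = queue.popleft()
            for prev in reverse.get(cur, []):
                if prev not in d:
                    d[prev] = d[cur] + 1
                    queue.append(prev)
        dist[k] = d
    # root of the answer tree: reaches every keyword, minimal total distance
    candidates = [n for n in nodes if all(n in dist[k] for k in dist)]
    root = min(candidates, key=lambda n: sum(dist[k][n] for k in dist))
    return root, {k: dist[k][root] for k in dist}

edges = [("r", "a"), ("r", "b"), ("a", "k1"), ("b", "k2")]
root, dists = backward_search(edges, ["k1", "k2"])
# root "r" reaches both keywords, two hops each
```

Searching backwards from the keyword vertices keeps the frontier small when keywords are rare, which is the usual argument for this strategy over forward exploration from every possible root.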

  2. QCD processes and search for supersymmetry at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Schum, Torben

    2012-07-15

    In this thesis, a data-driven method to estimate the number of QCD background events in a multijet search for supersymmetry at the LHC was developed. The method makes use of two models which predict the correlation of two key search variables, the missing transverse momentum and an angular variable, in order to extrapolate from a QCD dominated control region to the signal region. A good performance of the method was demonstrated by its application to 36 pb{sup -1} data, taken by the CMS experiment in 2010, and by the comparison with an alternative method. Comparing the number of data events to a combined background expectation of QCD and data-driven estimates of the electroweak and top background, no statistically significant excess was observed for three pre-defined search regions. Limits were calculated for the (m{sub 0},m{sub 1/2}) parameter space of the cMSSM, exceeding previous measurements. The expected sensitivity for further refined search regions was investigated.

  3. QCD processes and search for supersymmetry at the LHC

    International Nuclear Information System (INIS)

    Schum, Torben

    2012-07-01

    In this thesis, a data-driven method to estimate the number of QCD background events in a multijet search for supersymmetry at the LHC was developed. The method makes use of two models which predict the correlation of two key search variables, the missing transverse momentum and an angular variable, in order to extrapolate from a QCD dominated control region to the signal region. A good performance of the method was demonstrated by its application to 36 pb -1 data, taken by the CMS experiment in 2010, and by the comparison with an alternative method. Comparing the number of data events to a combined background expectation of QCD and data-driven estimates of the electroweak and top background, no statistically significant excess was observed for three pre-defined search regions. Limits were calculated for the (m 0 ,m 1/2 ) parameter space of the cMSSM, exceeding previous measurements. The expected sensitivity for further refined search regions was investigated.

  4. Developing a search engine for pharmacotherapeutic information that is not published in biomedical journals.

    Science.gov (United States)

    Do Pazo-Oubiña, F; Calvo Pita, C; Puigventós Latorre, F; Periañez-Párraga, L; Ventayol Bosch, P

    2011-01-01

    To identify publishers of pharmacotherapeutic information not found in biomedical journals that focuses on evaluating and providing advice on medicines and to develop a search engine to access this information. Compiling web sites that publish information on the rational use of medicines and have no commercial interests. Free-access web sites in Spanish, Galician, Catalan or English. Designing a search engine using the Google "custom search" application. Overall 159 internet addresses were compiled and were classified into 9 labels. We were able to recover the information from the selected sources using a search engine, which is called "AlquimiA" and available from http://www.elcomprimido.com/FARHSD/AlquimiA.htm. The main sources of pharmacotherapeutic information not published in biomedical journals were identified. The search engine is a useful tool for searching and accessing "grey literature" on the internet. Copyright © 2010 SEFH. Published by Elsevier Espana. All rights reserved.

  5. A systematic review and appraisal of methods of developing and validating lifestyle cardiovascular disease risk factors questionnaires.

    Science.gov (United States)

    Nse, Odunaiya; Quinette, Louw; Okechukwu, Ogah

    2015-09-01

    Well-developed and validated lifestyle cardiovascular disease (CVD) risk factor questionnaires are the key to obtaining accurate information for planning CVD prevention programs, which is a necessity in developing countries. We conducted this review to assess the methods and processes used for the development and content validation of lifestyle CVD risk factor questionnaires, and possibly to develop an evidence-based guideline for their development and content validation. Relevant databases at the Stellenbosch University library were searched for studies conducted between 2008 and 2012, in the English language and among humans, using the following databases: PubMed, CINAHL, PsycINFO and ProQuest. Search terms used were CVD risk factors, questionnaires, smoking, alcohol, physical activity and diet. Methods identified for the development of lifestyle CVD risk factor questionnaires were: review of literature, either systematic or traditional; involvement of experts and/or the target population using focus group discussions/interviews; clinical experience of the authors; and deductive reasoning of the authors. For validation, the methods used were the involvement of an expert panel, the use of the target population, and factor analysis. A combination of methods produces questionnaires with good content validity and other sound psychometric properties.

  6. Assessing the search for information on Three Rs methods, and their subsequent implementation: a national survey among scientists in the Netherlands.

    Science.gov (United States)

    van Luijk, Judith; Cuijpers, Yvonne; van der Vaart, Lilian; Leenaars, Marlies; Ritskes-Hoitinga, Merel

    2011-10-01

    A local survey conducted among scientists into the current practice of searching for information on Three Rs (i.e. Replacement, Reduction and Refinement) methods has highlighted the gap between the statutory requirement to apply Three Rs methods and the lack of criteria to search for them. To verify these findings on a national level, we conducted a survey among scientists throughout The Netherlands. Due to the low response rate, the results give an impression of opinions, rather than being representative of The Netherlands as a whole. The findings of both surveys complement each other, and indicate that there is room for improvement. Scientists perceive searching the literature for information on Three Rs methods to be a difficult task, and specific Three Rs search skills and knowledge of Three Rs databases are limited. Rather than using a literature search, many researchers obtain information on these methods through personal communication, which means that published information on possible Three Rs methods often remains unfound and unused. A solution might be to move beyond the direct search for information on Three Rs methods and choose another approach. One approach that seems rather appropriate is that of systematic review. This provides insight into the necessity for any new animal studies, as well as optimal implementation of available data and the prevention of unnecessary animal use in the future. 2011 FRAME.

  7. Nigerian Journal of Technological Development: Advanced Search

    African Journals Online (AJOL)

    Search tips: Search terms are case-insensitive; Common words are ignored; By default only articles containing all terms in the query are returned (i.e., AND is implied); Combine multiple words with OR to find articles containing either term; e.g., education OR research; Use parentheses to create more complex queries; e.g., ...

  8. Ghana Journal of Development Studies: Advanced Search

    African Journals Online (AJOL)

    Search tips: Search terms are case-insensitive; Common words are ignored; By default only articles containing all terms in the query are returned (i.e., AND is implied); Combine multiple words with OR to find articles containing either term; e.g., education OR research; Use parentheses to create more complex queries; e.g., ...

  9. Tanzania Journal of Development Studies: Advanced Search

    African Journals Online (AJOL)

    Search tips: Search terms are case-insensitive; Common words are ignored; By default only articles containing all terms in the query are returned (i.e., AND is implied); Combine multiple words with OR to find articles containing either term; e.g., education OR research; Use parentheses to create more complex queries; e.g., ...

  10. The development of PubMed search strategies for patient preferences for treatment outcomes.

    Science.gov (United States)

    van Hoorn, Ralph; Kievit, Wietske; Booth, Andrew; Mozygemba, Kati; Lysdahl, Kristin Bakke; Refolo, Pietro; Sacchini, Dario; Gerhardus, Ansgar; van der Wilt, Gert Jan; Tummers, Marcia

    2016-07-29

    The importance of respecting patients' preferences when making treatment decisions is increasingly recognized. Efficiently retrieving papers from the scientific literature reporting on the presence and nature of such preferences can help to achieve this goal. The objective of this study was to create a search filter for PubMed to help retrieve evidence on patient preferences for treatment outcomes. A total of 27 journals were hand-searched for articles on patient preferences for treatment outcomes published in 2011. Selected articles served as a reference set. To develop optimal search strategies to retrieve this set, all articles in the reference set were randomly split into a development and a validation set. MeSH-terms and keywords retrieved using PubReMiner were tested individually and as combinations in PubMed and evaluated for retrieval performance (e.g. sensitivity (Se) and specificity (Sp)). Of 8238 articles, 22 were considered to report empirical evidence on patient preferences for specific treatment outcomes. The best search filters reached Se of 100 % [95 % CI 100-100] with Sp of 95 % [94-95 %] and Sp of 97 % [97-98 %] with 75 % Se [74-76 %]. In the validation set these queries reached values of Se of 90 % [89-91 %] with Sp 94 % [93-95 %] and Se of 80 % [79-81 %] with Sp of 97 % [96-96 %], respectively. Narrow and broad search queries were developed which can help in retrieving literature on patient preferences for treatment outcomes. Identifying such evidence may in turn enhance the incorporation of patient preferences in clinical decision making and health technology assessment.
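    The sensitivity and specificity figures reported above can be reproduced for any filter given a gold-standard reference set. A minimal sketch with purely illustrative counts (not the study's data):

    ```python
    def evaluate_filter(retrieved, relevant, total):
        """Sensitivity (recall) and specificity of a search filter.

        retrieved: set of record IDs returned by the filter
        relevant:  set of record IDs in the gold-standard reference set
        total:     set of all record IDs searched
        """
        tp = len(retrieved & relevant)          # relevant records found
        fn = len(relevant - retrieved)          # relevant records missed
        fp = len(retrieved - relevant)          # irrelevant records returned
        tn = len(total - retrieved - relevant)  # irrelevant records excluded
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        return sensitivity, specificity

    # Hypothetical example: 10 000 records, 20 relevant, filter returns 300
    total = set(range(10_000))
    relevant = set(range(20))
    retrieved = set(range(15)) | set(range(100, 385))
    se, sp = evaluate_filter(retrieved, relevant, total)
    ```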

  11. Search for minimal paths in modified networks

    International Nuclear Information System (INIS)

    Yeh, W.-C.

    2002-01-01

    The problem of searching for all minimal paths (MPs) in a network obtained by modifying the original network, e.g. for network expansion or reinforcement, is discussed and solved in this study. The existing best-known method to solve this problem was a straightforward approach. It needed extensive comparison and verification, and failed to solve some special but important cases. Therefore, a more efficient, intuitive and generalized method to search for all MPs without an extensive research procedure is proposed. In this presentation, we first develop an intuitive algorithm based upon the reformation of all MPs in the original network to search for all MPs in a modified network. Next, the computational complexity of the proposed algorithm is analyzed and compared with that of the existing methods. Finally, examples illustrate how all MPs are generated in a modified network based upon the reformation of all of the MPs in the corresponding original network.
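    The paper derives the MPs of the modified network from those of the original one; the baseline task of enumerating all minimal paths (simple source-to-sink paths) in a small network can be sketched with a depth-first search (the bridge network below is a generic illustration, not an example from the paper):

    ```python
    def all_minimal_paths(edges, source, sink):
        """Enumerate all minimal paths (simple s-t paths) by depth-first search.

        edges: iterable of (u, v) pairs describing an undirected network
        """
        from collections import defaultdict
        adj = defaultdict(set)
        for u, v in edges:
            adj[u].add(v)
            adj[v].add(u)
        paths, stack = [], [(source, [source])]
        while stack:
            node, path = stack.pop()
            if node == sink:
                paths.append(path)
                continue
            for nxt in adj[node]:
                if nxt not in path:  # keep the path simple (no repeated node)
                    stack.append((nxt, path + [nxt]))
        return paths

    # Classic bridge network: s-1, s-2, bridge 1-2, 1-t, 2-t
    mps = all_minimal_paths([("s", 1), ("s", 2), (1, 2), (1, "t"), (2, "t")], "s", "t")
    ```

    For the bridge network this yields four MPs: s-1-t, s-2-t, s-1-2-t and s-2-1-t.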

  12. Personalized Search

    CERN Document Server

    AUTHOR|(SzGeCERN)749939

    2015-01-01

    As the volume of electronically available information grows, relevant items become harder to find. This work presents an approach to personalizing search results in scientific publication databases, focusing on re-ranking search results from existing search engines such as Solr or Elasticsearch. It also includes the development of Obelix, a new recommendation system used to re-rank search results. The project was proposed and performed at CERN, using the scientific publications available on the CERN Document Server (CDS). Re-ranking was evaluated through offline and online experiments with users and documents in CDS. The experiments conclude that personalized search results outperform both latest-first ordering and word similarity in terms of click position in the results for global search in CDS.

  13. A summary report on the search for current technologies and developers to develop depth profiling/physical parameter end effectors

    International Nuclear Information System (INIS)

    Nguyen, Q.H.

    1994-01-01

    This report documents the search strategies and results for available technologies and developers to develop tank waste depth profiling/physical parameter sensors. Sources searched include worldwide research reports, technical papers, journals, private industries, and work at the Westinghouse Hanford Company (WHC) Richland site. Tank waste physical parameters of interest are: abrasiveness, compressive strength, corrosiveness, density, pH, particle size/shape, porosity, radiation, settling velocity, shear strength, shear wave velocity, tensile strength, temperature, viscosity, and viscoelasticity. A list of related articles or sources for each physical parameter is provided.

  14. A summary report on the search for current technologies and developers to develop depth profiling/physical parameter end effectors

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Q.H.

    1994-09-12

    This report documents the search strategies and results for available technologies and developers to develop tank waste depth profiling/physical parameter sensors. Sources searched include worldwide research reports, technical papers, journals, private industries, and work at the Westinghouse Hanford Company (WHC) Richland site. Tank waste physical parameters of interest are: abrasiveness, compressive strength, corrosiveness, density, pH, particle size/shape, porosity, radiation, settling velocity, shear strength, shear wave velocity, tensile strength, temperature, viscosity, and viscoelasticity. A list of related articles or sources for each physical parameter is provided.

  15. Topology optimization based on the harmony search method

    International Nuclear Information System (INIS)

    Lee, Seung-Min; Han, Seog-Young

    2017-01-01

    A new topology optimization scheme based on a Harmony search (HS) as a metaheuristic method was proposed and applied to static stiffness topology optimization problems. To apply the HS to topology optimization, the variables in HS were transformed to those in topology optimization. Compliance was used as an objective function, and harmony memory was defined as the set of the optimized topology. Also, a parametric study for Harmony memory considering rate (HMCR), Pitch adjusting rate (PAR), and Bandwidth (BW) was performed to find the appropriate range for topology optimization. Various techniques were employed such as a filtering scheme, simple average scheme and harmony rate. To provide a robust optimized topology, the concept of the harmony rate update rule was also implemented. Numerical examples are provided to verify the effectiveness of the HS by comparing the optimal layouts of the HS with those of Bidirectional evolutionary structural optimization (BESO) and Artificial bee colony algorithm (ABCA). The following conclusions could be made: (1) The proposed topology scheme is very effective for static stiffness topology optimization problems in terms of stability, robustness and convergence rate. (2) The suggested method provides a symmetric optimized topology despite the fact that the HS is a stochastic method like the ABCA. (3) The proposed scheme is applicable and practical in manufacturing since it produces a solid-void design of the optimized topology. (4) The suggested method appears to be very effective for large scale problems like topology optimization.
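    A generic Harmony search loop showing the roles of HMCR, PAR and BW (a minimal sketch on a toy continuous problem, not the authors' topology-optimization code):

    ```python
    import random

    def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.1, iters=2000, seed=1):
        """Minimize f over box `bounds` with a basic Harmony search (HS).

        hms:  harmony memory size
        hmcr: harmony memory considering rate
        par:  pitch adjusting rate
        bw:   bandwidth of the pitch adjustment
        """
        rng = random.Random(seed)
        hm = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
        hm.sort(key=f)
        for _ in range(iters):
            new = []
            for j, (lo, hi) in enumerate(bounds):
                if rng.random() < hmcr:            # take a value from memory ...
                    x = rng.choice(hm)[j]
                    if rng.random() < par:         # ... possibly pitch-adjusted
                        x += rng.uniform(-bw, bw)
                else:                              # or re-initialize at random
                    x = rng.uniform(lo, hi)
                new.append(min(max(x, lo), hi))
            if f(new) < f(hm[-1]):                 # replace the worst harmony
                hm[-1] = new
                hm.sort(key=f)
        return hm[0]

    # Toy example: minimize the 2-D sphere function on [-5, 5]^2
    best = harmony_search(lambda v: sum(x * x for x in v), [(-5, 5)] * 2)
    ```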

  16. Topology optimization based on the harmony search method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung-Min; Han, Seog-Young [Hanyang University, Seoul (Korea, Republic of)

    2017-06-15

    A new topology optimization scheme based on a Harmony search (HS) as a metaheuristic method was proposed and applied to static stiffness topology optimization problems. To apply the HS to topology optimization, the variables in HS were transformed to those in topology optimization. Compliance was used as an objective function, and harmony memory was defined as the set of the optimized topology. Also, a parametric study for Harmony memory considering rate (HMCR), Pitch adjusting rate (PAR), and Bandwidth (BW) was performed to find the appropriate range for topology optimization. Various techniques were employed such as a filtering scheme, simple average scheme and harmony rate. To provide a robust optimized topology, the concept of the harmony rate update rule was also implemented. Numerical examples are provided to verify the effectiveness of the HS by comparing the optimal layouts of the HS with those of Bidirectional evolutionary structural optimization (BESO) and Artificial bee colony algorithm (ABCA). The following conclusions could be made: (1) The proposed topology scheme is very effective for static stiffness topology optimization problems in terms of stability, robustness and convergence rate. (2) The suggested method provides a symmetric optimized topology despite the fact that the HS is a stochastic method like the ABCA. (3) The proposed scheme is applicable and practical in manufacturing since it produces a solid-void design of the optimized topology. (4) The suggested method appears to be very effective for large scale problems like topology optimization.

  17. Competing intelligent search agents in global optimization

    Energy Technology Data Exchange (ETDEWEB)

    Streltsov, S.; Vakili, P. [Boston Univ., MA (United States); Muchnik, I. [Rutgers Univ., Piscataway, NJ (United States)

    1996-12-31

    In this paper we present a new search methodology that we view as a development of the intelligent agent approach to the analysis of complex systems. The main idea is to treat the search process as a competition between concurrent adaptive intelligent agents. Agents cooperate in achieving a common search goal and at the same time compete with each other for computational resources. We propose a statistical selection approach to resource allocation between agents that leads to simple and efficient on-average index allocation policies. We use global optimization as the most general setting that encompasses many types of search problems, and show how the proposed selection policies can be used to improve and combine various global optimization methods.

  18. Pressurized water reactor in-core nuclear fuel management by tabu search

    International Nuclear Information System (INIS)

    Hill, Natasha J.; Parks, Geoffrey T.

    2015-01-01

    Highlights: • We develop a tabu search implementation for PWR reload core design. • We conduct computational experiments to find optimal parameter values. • We test the performance of the algorithm on two representative PWR geometries. • We compare this performance with that given by established optimization methods. • Our tabu search implementation outperforms these methods in all cases. - Abstract: Optimization of the arrangement of fuel assemblies and burnable poisons when reloading pressurized water reactors has, in the past, been performed with many different algorithms in an attempt to make reactors more economic and fuel efficient. The use of the tabu search algorithm in tackling reload core design problems is investigated further here after limited, but promising, previous investigations. The performance of the tabu search implementation developed was compared with established genetic algorithm and simulated annealing optimization routines. Tabu search outperformed these existing programs for a number of different objective functions on two different representative core geometries
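    The core tabu search loop, best admissible neighbor with a recency-based tabu list and an aspiration criterion, can be sketched generically (the bit-flip toy objective below is purely illustrative and stands in for a reload core evaluation, which in practice requires a core physics code):

    ```python
    from collections import deque

    def tabu_search(cost, start, neighbors, tenure=7, iters=200):
        """Basic tabu search: move to the best admissible neighbor each step.

        cost: objective to minimize; neighbors(x) yields (move, candidate) pairs;
        a move stays tabu for `tenure` iterations unless it improves the best
        solution found so far (aspiration criterion).
        """
        best = current = start
        tabu = deque(maxlen=tenure)
        for _ in range(iters):
            candidates = [
                (cost(cand), move, cand)
                for move, cand in neighbors(current)
                if move not in tabu or cost(cand) < cost(best)  # aspiration
            ]
            if not candidates:
                break
            c, move, current = min(candidates)
            tabu.append(move)
            if c < cost(best):
                best = current
        return best

    # Toy stand-in for a loading pattern: flip one bit at a time,
    # aiming for the alternating pattern 0,1,0,1,...
    target = tuple(i % 2 for i in range(8))
    def flip_neighbors(x):
        for i in range(len(x)):
            yield i, x[:i] + (1 - x[i],) + x[i + 1:]
    best = tabu_search(lambda x: sum(a != b for a, b in zip(x, target)),
                       (0,) * 8, flip_neighbors)
    ```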

  19. How doctors search

    DEFF Research Database (Denmark)

    Lykke, Marianne; Price, Susan; Delcambre, Lois

    2012-01-01

    Professional, workplace searching is different from general searching, because it is typically limited to specific facets and targeted to a single answer. We have developed the semantic component (SC) model, which is a search feature that allows searchers to structure and specify the search to co...

  20. A Teaching Approach from the Exhaustive Search Method to the Needleman-Wunsch Algorithm

    Science.gov (United States)

    Xu, Zhongneng; Yang, Yayun; Huang, Beibei

    2017-01-01

    The Needleman-Wunsch algorithm has become one of the core algorithms in bioinformatics; however, this programming requires more suitable explanations for students with different major backgrounds. In supposing sample sequences and using a simple store system, the connection between the exhaustive search method and the Needleman-Wunsch algorithm…
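    For reference, the dynamic-programming recurrence at the heart of the Needleman-Wunsch algorithm can be written compactly (a standard textbook formulation, not material from the article itself):

    ```python
    def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-1):
        """Global alignment score of sequences a and b (Needleman-Wunsch)."""
        n, m = len(a), len(b)
        # score[i][j] = best score aligning a[:i] with b[:j]
        score = [[0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            score[i][0] = i * gap          # a[:i] aligned against gaps
        for j in range(1, m + 1):
            score[0][j] = j * gap          # gaps aligned against b[:j]
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                diag = score[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
                score[i][j] = max(diag,                 # (mis)match
                                  score[i-1][j] + gap,  # gap in b
                                  score[i][j-1] + gap)  # gap in a
        return score[n][m]

    # Classic example pair with match +1, mismatch -1, gap -1
    s = needleman_wunsch("GATTACA", "GCATGCU")
    ```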

  1. An Iterated Tabu Search Approach for the Clique Partitioning Problem

    Directory of Open Access Journals (Sweden)

    Gintaras Palubeckis

    2014-01-01

    all cliques induced by the subsets is as small as possible. We develop an iterated tabu search (ITS) algorithm for solving this problem. The proposed algorithm incorporates tabu search, local search, and solution perturbation procedures. We report computational results on CPP instances of size up to 2000 vertices. Performance comparisons of ITS against state-of-the-art methods from the literature demonstrate the competitiveness of our approach.

  2. In vitro detection of circulating tumor cells compared by the CytoTrack and CellSearch methods

    DEFF Research Database (Denmark)

    Hillig, T.; Horn, P.; Nygaard, Ann-Britt

    2015-01-01

    Comparison of two methods to detect circulating tumor cells (CTC), CytoTrack and CellSearch, through recovery of MCF-7 breast cancer cells spiked into blood collected from healthy donors. Spiking of a fixed number of EpCAM- and pan-cytokeratin-positive MCF-7 cells into 7.5 mL donor blood was performed by FACSAria flow sorting. The samples were shipped to either CytoTrack or CellSearch research facilities within 48 h, where evaluation of MCF-7 recovery was performed. CytoTrack and CellSearch analyses were performed simultaneously. Recoveries of MCF-7 single cells, cells in clusters, and clusters … (0.23/p = 0.09). Overall, the recovery of CytoTrack and CellSearch was 68.8 +/- 3.9 %/71.1 +/- 2.9 %, respectively (p = 0.58). In spite of different methodologies, CytoTrack and CellSearch found similar numbers of CTCs when spiking was performed with the EpCAM- and pan-cytokeratin-positive cell line MCF-7.

  3. DISCOVERY OF NINE GAMMA-RAY PULSARS IN FERMI LARGE AREA TELESCOPE DATA USING A NEW BLIND SEARCH METHOD

    Energy Technology Data Exchange (ETDEWEB)

    Pletsch, H. J.; Allen, B.; Aulbert, C.; Fehrmann, H. [Albert-Einstein-Institut, Max-Planck-Institut fuer Gravitationsphysik, D-30167 Hannover (Germany); Guillemot, L.; Kramer, M.; Barr, E. D.; Champion, D. J.; Eatough, R. P.; Freire, P. C. C. [Max-Planck-Institut fuer Radioastronomie, Auf dem Huegel 69, D-53121 Bonn (Germany); Ray, P. S. [Space Science Division, Naval Research Laboratory, Washington, DC 20375-5352 (United States); Belfiore, A.; Dormody, M. [Santa Cruz Institute for Particle Physics, Department of Physics and Department of Astronomy and Astrophysics, University of California at Santa Cruz, Santa Cruz, CA 95064 (United States); Camilo, F. [Columbia Astrophysics Laboratory, Columbia University, New York, NY 10027 (United States); Caraveo, P. A. [INAF-Istituto di Astrofisica Spaziale e Fisica Cosmica, I-20133 Milano (Italy); Celik, Oe.; Ferrara, E. C. [NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Hessels, J. W. T. [Astronomical Institute ' Anton Pannekoek' , University of Amsterdam, Postbus 94249, 1090 GE Amsterdam (Netherlands); Keith, M. [CSIRO Astronomy and Space Science, Australia Telescope National Facility, Epping NSW 1710 (Australia); Kerr, M., E-mail: holger.pletsch@aei.mpg.de, E-mail: guillemo@mpifr-bonn.mpg.de [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States); and others

    2012-01-10

    We report the discovery of nine previously unknown gamma-ray pulsars in a blind search of data from the Fermi Large Area Telescope (LAT). The pulsars were found with a novel hierarchical search method originally developed for detecting continuous gravitational waves from rapidly rotating neutron stars. Designed to find isolated pulsars spinning at up to kHz frequencies, the new method is computationally efficient and incorporates several advances, including a metric-based gridding of the search parameter space (frequency, frequency derivative, and sky location) and the use of photon probability weights. The nine pulsars have spin frequencies between 3 and 12 Hz, and characteristic ages ranging from 17 kyr to 3 Myr. Two of them, PSRs J1803-2149 and J2111+4606, are young and energetic Galactic-plane pulsars (spin-down power above 6 × 10^35 erg s^-1 and ages below 100 kyr). The seven remaining pulsars, PSRs J0106+4855, J0622+3749, J1620-4927, J1746-3239, J2028+3332, J2030+4415, and J2139+4716, are older and less energetic; two of them are located at higher Galactic latitudes (|b| > 10°). PSR J0106+4855 has the largest characteristic age (3 Myr) and the smallest surface magnetic field (2 × 10^11 G) of all LAT blind-search pulsars. PSR J2139+4716 has the lowest spin-down power (3 × 10^33 erg s^-1) among all non-recycled gamma-ray pulsars ever found. Despite extensive multi-frequency observations, only PSR J0106+4855 has detectable pulsations in the radio band. The other eight pulsars belong to the increasing population of radio-quiet gamma-ray pulsars.

  4. Combining of Direct Search and Signal-to-Noise Ratio for economic dispatch optimization

    International Nuclear Information System (INIS)

    Lin, Whei-Min; Gow, Hong-Jey; Tsai, Ming-Tang

    2011-01-01

    This paper integrates the ideas of Direct Search and Signal-to-Noise Ratio (SNR) to develop a Novel Direct Search (NDS) method for solving non-convex economic dispatch problems. NDS consists of three stages: Direct Search (DS), Global SNR (GSNR) and Marginal Compensation (MC). DS provides a basic solution, GSNR searches for the optimal point, and MC fulfills the power balance requirement. With NDS, the infinite solution space becomes finite. Furthermore, the same optimum solution can be reached repeatedly. The effectiveness of NDS is demonstrated with three examples, and the solutions are compared with previously published results. Test results show that the proposed method is simple, robust, and more effective than many other previously developed algorithms.

  5. Methods for the guideline-based development of quality indicators--a systematic review

    Science.gov (United States)

    2012-01-01

    Background Quality indicators (QIs) are used in many healthcare settings to measure, compare, and improve quality of care. For the efficient development of high-quality QIs, rigorous, approved, and evidence-based development methods are needed. Clinical practice guidelines are a suitable source to derive QIs from, but no gold standard for guideline-based QI development exists. This review aims to identify, describe, and compare methodological approaches to guideline-based QI development. Methods We systematically searched medical literature databases (Medline, EMBASE, and CINAHL) and grey literature. Two researchers selected publications reporting methodological approaches to guideline-based QI development. In order to describe and compare methodological approaches used in these publications, we extracted detailed information on common steps of guideline-based QI development (topic selection, guideline selection, extraction of recommendations, QI selection, practice test, and implementation) to predesigned extraction tables. Results From 8,697 hits in the database search and several grey literature documents, we selected 48 relevant references. The studies were of heterogeneous type and quality. We found no randomized controlled trial or other studies comparing the ability of different methodological approaches to guideline-based development to generate high-quality QIs. The relevant publications featured a wide variety of methodological approaches to guideline-based QI development, especially regarding guideline selection and extraction of recommendations. Only a few studies reported patient involvement. Conclusions Further research is needed to determine which elements of the methodological approaches identified, described, and compared in this review are best suited to constitute a gold standard for guideline-based QI development. For this research, we provide a comprehensive groundwork. PMID:22436067

  6. Protein structure database search and evolutionary classification.

    Science.gov (United States)

    Yang, Jinn-Moon; Tung, Chi-Hua

    2006-01-01

    As more protein structures become available and structural genomics efforts provide structural models in a genome-wide strategy, there is a growing need for fast and accurate methods for discovering homologous proteins and evolutionary classifications of newly determined structures. We have developed 3D-BLAST, in part, to address these issues. 3D-BLAST is as fast as BLAST and calculates the statistical significance (E-value) of an alignment to indicate the reliability of the prediction. Using this method, we first identified 23 states of the structural alphabet that represent pattern profiles of the backbone fragments and then used them to represent protein structure databases as structural alphabet sequence databases (SADB). Our method enhanced BLAST as a search method, using a new structural alphabet substitution matrix (SASM) to find the longest common substructures with high-scoring structured segment pairs from an SADB database. Using personal computers with Intel Pentium4 (2.8 GHz) processors, our method searched more than 10 000 protein structures in 1.3 s and achieved a good agreement with search results from detailed structure alignment methods. [3D-BLAST is available at http://3d-blast.life.nctu.edu.tw].

  7. Neural Based Tabu Search method for solving unit commitment problem with cooling-banking constraints

    Directory of Open Access Journals (Sweden)

    Rajan Asir Christober Gnanakkan Charles

    2009-01-01

    This paper presents a new approach to solving the short-term unit commitment problem (UCP) using Neural Based Tabu Search (NBTS) with cooling and banking constraints. The objective is to find a generation schedule that minimizes the total operating cost subject to a variety of constraints; that is, to find the optimal commitment of generating units in the power system for the next H hours. A 7-unit utility power system in India demonstrates the effectiveness of the proposed approach; extensive studies have also been performed for IEEE test systems consisting of 10, 26 and 34 units. Numerical results compare the cost solutions with those obtained using the Tabu Search (TS), Dynamic Programming (DP) and Lagrangian Relaxation (LR) methods, demonstrating the superiority of the proposed approach in reaching a proper unit commitment.

  8. Classical algorithms for automated parameter-search methods in compartmental neural models - A critical survey based on simulations using neuron

    International Nuclear Information System (INIS)

    Mutihac, R.; Mutihac, R.C.; Cicuttin, A.

    2001-09-01

    Parameter-search methods are problem-sensitive. All methods depend on some meta-parameters of their own, which must be determined experimentally in advance. A better choice of these intrinsic parameters for a certain parameter-search method may improve its performance. Moreover, there are various implementations of the same method, which may also affect its performance. The choice of the matching (error) function has a great impact on the search process in terms of finding the optimal parameter set and minimizing the computational cost. An initial assessment of the matching function ability to distinguish between good and bad models is recommended, before launching exhaustive computations. However, different runs of a parameter search method may result in the same optimal parameter set or in different parameter sets (the model is insufficiently constrained to accurately characterize the real system). Robustness of the parameter set is expressed by the extent to which small perturbations in the parameter values are not affecting the best solution. A parameter set that is not robust is unlikely to be physiologically relevant. Robustness can also be defined as the stability of the optimal parameter set to small variations of the inputs. When trying to estimate things like the minimum, or the least-squares optimal parameters of a nonlinear system, the existence of multiple local minima can cause problems with the determination of the global optimum. Techniques such as Newton's method, the Simplex method and Least-squares Linear Taylor Differential correction technique can be useful provided that one is lucky enough to start sufficiently close to the global minimum. All these methods suffer from the inability to distinguish a local minimum from a global one because they follow the local gradients towards the minimum, even if some methods are resetting the search direction when it is likely to get stuck in presumably a local minimum. Deterministic methods based on

  9. The medline UK filter: development and validation of a geographic search filter to retrieve research about the UK from OVID medline.

    Science.gov (United States)

    Ayiku, Lynda; Levay, Paul; Hudson, Tom; Craven, Jenny; Barrett, Elizabeth; Finnegan, Amy; Adams, Rachel

    2017-07-13

    A validated geographic search filter for the retrieval of research about the United Kingdom (UK) from bibliographic databases had not previously been published. To develop and validate a geographic search filter to retrieve research about the UK from OVID medline with high recall and precision. Three gold standard sets of references were generated using the relative recall method. The sets contained references to studies about the UK which had informed National Institute for Health and Care Excellence (NICE) guidance. The first and second sets were used to develop and refine the medline UK filter. The third set was used to validate the filter. Recall, precision and number-needed-to-read (NNR) were calculated using a case study. The validated medline UK filter demonstrated 87.6% relative recall against the third gold standard set. In the case study, the medline UK filter demonstrated 100% recall, 11.4% precision and a NNR of nine. A validated geographic search filter to retrieve research about the UK with high recall and precision has been developed. The medline UK filter can be applied to systematic literature searches in OVID medline for topics with a UK focus. © 2017 Crown copyright. Health Information and Libraries Journal © 2017 Health Libraries Group. This article is published with the permission of the Controller of HMSO and the Queen's Printer for Scotland.
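    Recall, precision and NNR as used in such case studies are related by NNR = 1/precision. A minimal sketch with hypothetical counts chosen only to mirror the shape of the reported figures:

    ```python
    def filter_metrics(relevant_retrieved, total_retrieved, total_relevant):
        """Recall, precision and number-needed-to-read (NNR) of a search filter."""
        recall = relevant_retrieved / total_relevant
        precision = relevant_retrieved / total_retrieved
        nnr = 1 / precision  # records screened per relevant record found
        return recall, precision, nnr

    # Hypothetical counts: filter returns 350 records, all 40 relevant ones among them
    recall, precision, nnr = filter_metrics(40, 350, 40)
    ```

    With these counts the filter shows 100% recall, about 11.4% precision, and an NNR of about nine.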

  10. Search methods that people use to find owners of lost pets.

    Science.gov (United States)

    Lord, Linda K; Wittum, Thomas E; Ferketich, Amy K; Funk, Julie A; Rajala-Schultz, Päivi J

    2007-06-15

    To characterize the process by which people who find lost pets search for the owners. Cross-sectional study. Sample Population: 188 individuals who found a lost pet in Dayton, Ohio, between March 1 and June 30, 2006. Procedures: Potential participants were identified as a result of contact with a local animal agency or placement of an advertisement in the local newspaper. A telephone survey was conducted to identify methods participants used to find the pets' owners. 156 of 188 (83%) individuals completed the survey. Fifty-nine of the 156 (38%) pets were reunited with their owners; median time to reunification was 2 days (range, 0.5 to 45 days). Only 1 (3%) cat owner was found, compared with 58 (46%) dog owners. Pet owners were found as a result of information provided by an animal agency (25%), placement of a newspaper advertisement (24%), walking the neighborhood (19%), signs in the neighborhood (15%), information on a pet tag (10%), and other methods (7%). Most finders (87%) considered it extremely important to find the owner, yet only 13 (8%) initially surrendered the found pet to an animal agency. The primary reason people did not surrender found pets was fear of euthanasia (57%). Only 97 (62%) individuals were aware they could run a found-pet advertisement in the newspaper at no charge, and only 1 person who was unaware of the no-charge policy placed an advertisement. Veterinarians and shelters can help educate people who find lost pets about methods to search for the pets' owners.

  11. Development of a PubMed Based Search Tool for Identifying Sex and Gender Specific Health Literature.

    Science.gov (United States)

    Song, Michael M; Simonsen, Cheryl K; Wilson, Joanna D; Jenkins, Marjorie R

    2016-02-01

    An effective literature search strategy is critical to achieving the aims of Sex and Gender Specific Health (SGSH): to understand sex and gender differences through research and to effectively incorporate the new knowledge into the clinical decision making process to benefit both male and female patients. The goal of this project was to develop and validate an SGSH literature search tool that is readily and freely available to clinical researchers and practitioners. PubMed, a freely available search engine for the Medline database, was selected as the platform to build the SGSH literature search tool. Combinations of Medical Subject Heading terms, text words, and title words were evaluated for optimal specificity and sensitivity. The search tool was then validated against reference bases compiled for two disease states, diabetes and stroke. Key sex and gender terms and limits were bundled to create a search tool to facilitate PubMed SGSH literature searches. During validation, the search tool retrieved 50 of 94 (53.2%) stroke and 62 of 95 (65.3%) diabetes reference articles selected for validation. A general keyword search of stroke or diabetes combined with sex difference retrieved 33 of 94 (35.1%) stroke and 22 of 95 (23.2%) diabetes reference base articles, with lower sensitivity and specificity for SGSH content. The Texas Tech University Health Sciences Center SGSH PubMed Search Tool provides higher sensitivity and specificity to sex and gender specific health literature. The tool will facilitate research, clinical decision-making, and guideline development relevant to SGSH.

  12. Searching for rigour in the reporting of mixed methods population health research: a methodological review.

    Science.gov (United States)

    Brown, K M; Elliott, S J; Leatherdale, S T; Robertson-Wilson, J

    2015-12-01

    The environments in which population health interventions occur shape both their implementation and outcomes. Hence, when evaluating these interventions, we must explore both intervention content and context. Mixed methods (integrating quantitative and qualitative methods) provide this opportunity. However, although criteria exist for establishing rigour in quantitative and qualitative research, there is poor consensus regarding rigour in mixed methods. Using the empirical example of school-based obesity interventions, this methodological review examined how mixed methods have been used and reported, and how rigour has been addressed. Twenty-three peer-reviewed mixed methods studies were identified through a systematic search of five databases and appraised using the guidelines for Good Reporting of a Mixed Methods Study. In general, more detailed description of data collection and analysis, integration, inferences and justifying the use of mixed methods is needed. Additionally, improved reporting of methodological rigour is required. This review calls for increased discussion of practical techniques for establishing rigour in mixed methods research, beyond those for quantitative and qualitative criteria individually. A guide for reporting mixed methods research in population health should be developed to improve the reporting quality of mixed methods studies. Through improved reporting, mixed methods can provide strong evidence to inform policy and practice. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  13. Optimal search behavior and classic foraging theory

    International Nuclear Information System (INIS)

    Bartumeus, F; Catalan, J

    2009-01-01

    Random walk methods and diffusion theory have pervaded the ecological sciences as tools to analyze and describe animal movement. Consequently, statistical physics was mostly seen as a toolbox rather than as a conceptual framework that could contribute to theory on evolutionary biology and ecology. However, the existence of mechanistic relationships and feedbacks between behavioral processes and statistical patterns of movement suggests that, beyond movement quantification, statistical physics may prove to be an adequate framework to understand animal behavior across scales from an ecological and evolutionary perspective. Recently developed random search theory has served to critically re-evaluate classic ecological questions on animal foraging. For instance, during the last few years, there has been a growing debate on whether search behavior can include traits that improve success by optimizing random (stochastic) searches. Here, we stress the need to bring together the general encounter problem within foraging theory, as a means of making progress in the biological understanding of random searching. By sketching the assumptions of optimal foraging theory (OFT) and by summarizing recent results on random search strategies, we pinpoint ways to extend classic OFT, and to integrate the study of search strategies and their main results into the more general theory of optimal foraging.

  14. A peak value searching method of the MCA based on digital logic devices

    International Nuclear Information System (INIS)

    Sang Ziru; Huang Shanshan; Chen Lian; Jin Ge

    2010-01-01

    Digital multi-channel analyzers play an increasingly important role in multi-channel pulse height analysis. The move to digital processing is characterized by powerful pulse processing ability, high throughput, and improved stability and flexibility. This paper introduces a method for searching for the peak value of a waveform based on digital logic in an FPGA. The method reduces dead time, and offline data correction then improves the non-linearity of the MCA. The α energy spectrum of 241Am is given as an example. (authors)
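
    The record gives no implementation details, but the single-pass pulse-peak search that such digital logic performs is easy to sketch in software. The threshold, waveform samples, and channel count below are illustrative only; a real FPGA would carry out the same compare-and-latch operation per clock cycle on the ADC stream.

```python
def peak_search(samples, threshold):
    """Streaming peak-value search, mimicking the single-pass logic an
    FPGA pipeline would implement: while the waveform is above the
    threshold, keep the running maximum; when it drops back below,
    emit the peak height of that pulse."""
    peaks = []
    in_pulse = False
    current_max = 0
    for s in samples:
        if s > threshold:
            in_pulse = True
            if s > current_max:
                current_max = s
        elif in_pulse:          # falling edge: the pulse is over
            peaks.append(current_max)
            in_pulse = False
            current_max = 0
    return peaks

def histogram(peaks, n_channels):
    """Bin the peak heights into MCA channels to build the spectrum."""
    spectrum = [0] * n_channels
    for p in peaks:
        spectrum[min(p, n_channels - 1)] += 1
    return spectrum

# Two synthetic pulses riding on a zero baseline.
wave = [0, 0, 3, 120, 260, 240, 80, 0, 0, 5, 310, 330, 150, 0]
print(peak_search(wave, threshold=10))   # [260, 330]
```

    On the synthetic waveform the two pulse maxima (260 and 330) are found in one pass, with no dead time spent re-scanning a buffer.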

  15. Recent developments in imaging system assessment methodology, FROC analysis and the search model.

    Science.gov (United States)

    Chakraborty, Dev P

    2011-08-21

    A frequent problem in imaging is assessing whether a new imaging system is an improvement over an existing standard. Observer performance methods, in particular the receiver operating characteristic (ROC) paradigm, are widely used in this context. In ROC analysis lesion location information is not used and consequently scoring ambiguities can arise in tasks, such as nodule detection, involving finding localized lesions. This paper reviews progress in the free-response ROC (FROC) paradigm in which the observer marks and rates suspicious regions and the location information is used to determine whether lesions were correctly localized. Reviewed are FROC data analysis, a search model for simulating FROC data, predictions of the model and a method for estimating the parameters. The search model parameters are physically meaningful quantities that can guide system optimization.

  16. Recent developments in imaging system assessment methodology, FROC analysis and the search model

    International Nuclear Information System (INIS)

    Chakraborty, Dev P.

    2011-01-01

    A frequent problem in imaging is assessing whether a new imaging system is an improvement over an existing standard. Observer performance methods, in particular the receiver operating characteristic (ROC) paradigm, are widely used in this context. In ROC analysis lesion location information is not used and consequently scoring ambiguities can arise in tasks, such as nodule detection, involving finding localized lesions. This paper reviews progress in the free-response ROC (FROC) paradigm in which the observer marks and rates suspicious regions and the location information is used to determine whether lesions were correctly localized. Reviewed are FROC data analysis, a search model for simulating FROC data, predictions of the model and a method for estimating the parameters. The search model parameters are physically meaningful quantities that can guide system optimization.

  17. A proposed heuristic methodology for searching reloading pattern

    International Nuclear Information System (INIS)

    Choi, K. Y.; Yoon, Y. K.

    1993-01-01

    A new heuristic method for loading pattern search has been developed to overcome shortcomings of the algorithmic approach. To reduce the size of the vast solution space, general shuffling rules, a regionwise shuffling method, and a pattern grouping method were introduced. Entropy theory was applied to classify possible loading patterns into groups by their mutual similarity. The pattern search program was implemented in the PROLOG language. A two-group nodal code, MEDIUM-2D, was used for analysis of the power distribution in the core. The methodology has been tested and shown to be effective in reducing the solution space down to a few hundred pattern groups. Burnable poison rods were then arranged in each pattern group in accordance with burnable poison distribution rules, which further reduced the solution space to several scores of acceptable pattern groups. The methods of maximizing cycle length (MCL) and minimizing power-peaking factor (MPF) were applied to search for specific useful loading patterns from the acceptable pattern groups. Thus, several loading patterns that have a low power-peaking factor and a long cycle length were successfully found among the selected pattern groups. (Author)

  18. Electricity price forecast using Combinatorial Neural Network trained by a new stochastic search method

    International Nuclear Information System (INIS)

    Abedinia, O.; Amjady, N.; Shafie-khah, M.; Catalão, J.P.S.

    2015-01-01

    Highlights: • Presenting a Combinatorial Neural Network. • Suggesting a new stochastic search method. • Adapting the suggested method as a training mechanism. • Proposing a new forecast strategy. • Testing the proposed strategy on real-world electricity markets. - Abstract: Electricity price forecast is key information for successful operation of electricity market participants. However, the time series of electricity price has nonlinear, non-stationary and volatile behaviour and so its forecast method should have high learning capability to extract the complex input/output mapping function of electricity price. In this paper, a Combinatorial Neural Network (CNN) based forecasting engine is proposed to predict the future values of price data. The CNN-based forecasting engine is equipped with a new training mechanism for optimizing the weights of the CNN. This training mechanism is based on an efficient stochastic search method, which is a modified version of chemical reaction optimization algorithm, giving high learning ability to the CNN. The proposed price forecast strategy is tested on the real-world electricity markets of Pennsylvania–New Jersey–Maryland (PJM) and mainland Spain and its obtained results are extensively compared with the results obtained from several other forecast methods. These comparisons illustrate effectiveness of the proposed strategy.
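
    The paper's training mechanism (a modified chemical reaction optimization) is not spelled out in the abstract. As a stand-in, the sketch below fits the weights of a tiny linear "forecaster" by a generic derivative-free stochastic search (random-perturbation hill climbing); the toy price series and model form are invented for illustration and are not the paper's CNN.

```python
import random

def stochastic_train(predict, w0, data, iters=3000, sigma=0.3, seed=7):
    """Generic derivative-free stochastic search over model weights
    (random-perturbation hill climbing): propose a Gaussian tweak of the
    weights and keep it if the squared forecast error drops. A stand-in
    for the paper's modified chemical reaction optimization."""
    rng = random.Random(seed)
    def loss(w):
        return sum((predict(w, x) - y) ** 2 for x, y in data)
    best, best_loss = list(w0), loss(w0)
    for _ in range(iters):
        cand = [wi + rng.gauss(0.0, sigma) for wi in best]
        cl = loss(cand)
        if cl < best_loss:              # accept only improvements
            best, best_loss = cand, cl
    return best, best_loss

# Invented toy series: forecast the next price from the last two.
predict = lambda w, x: w[0] * x[0] + w[1] * x[1] + w[2]
series = [30, 32, 35, 33, 36, 38, 37, 40]
data = [((series[i - 2], series[i - 1]), series[i]) for i in range(2, len(series))]
w, err = stochastic_train(predict, [0.0, 0.0, 0.0], data)
print(round(err, 1))   # far below the zero-model error of 8023
```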

  19. Citation searches are more sensitive than keyword searches to identify studies using specific measurement instruments.

    Science.gov (United States)

    Linder, Suzanne K; Kamath, Geetanjali R; Pratt, Gregory F; Saraykar, Smita S; Volk, Robert J

    2015-04-01

    To compare the effectiveness of two search methods in identifying studies that used the Control Preferences Scale (CPS), a health care decision-making instrument commonly used in clinical settings. We searched the literature using two methods: (1) keyword searching using variations of "Control Preferences Scale" and (2) cited reference searching using two seminal CPS publications. We searched three bibliographic databases [PubMed, Scopus, and Web of Science (WOS)] and one full-text database (Google Scholar). We report precision and sensitivity as measures of effectiveness. Keyword searches in bibliographic databases yielded high average precision (90%) but low average sensitivity (16%). PubMed was the most precise, followed closely by Scopus and WOS. The Google Scholar keyword search had low precision (54%) but provided the highest sensitivity (70%). Cited reference searches in all databases yielded moderate sensitivity (45-54%), but precision ranged from 35% to 75% with Scopus being the most precise. Cited reference searches were more sensitive than keyword searches, making it a more comprehensive strategy to identify all studies that use a particular instrument. Keyword searches provide a quick way of finding some but not all relevant articles. Goals, time, and resources should dictate the combination of which methods and databases are used. Copyright © 2015 Elsevier Inc. All rights reserved.
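
    The precision and sensitivity figures quoted above are simple set ratios over the retrieved records and a reference base. A minimal sketch (the record identifiers are made up):

```python
def precision_and_sensitivity(retrieved, relevant):
    """Precision: fraction of retrieved records that are relevant.
    Sensitivity (recall): fraction of relevant records that were retrieved."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = retrieved & relevant
    return len(hits) / len(retrieved), len(hits) / len(relevant)

# Toy case mirroring the pattern reported above: a keyword search returns
# 10 records, 9 of which sit in a 50-article reference base, so precision
# is high while sensitivity is low.
keyword_hits = [f"rec{i}" for i in range(10)]          # rec0 .. rec9
reference_base = [f"rec{i}" for i in range(1, 51)]     # rec1 .. rec50
p, s = precision_and_sensitivity(keyword_hits, reference_base)
print(p, s)   # 0.9 0.18
```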

  20. MSblender: A probabilistic approach for integrating peptide identifications from multiple database search engines.

    Science.gov (United States)

    Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I; Marcotte, Edward M

    2011-07-01

    Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for every possible PSM and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. Increased identifications increment spectral counts for most proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that enhanced quantification contributes to improve sensitivity in differential expression analyses.
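
    MSblender's actual model accounts for the correlation between engines' scores; purely as a simplified illustration, the sketch below combines per-engine posterior probabilities under an independence assumption and estimates the false discovery rate among accepted PSMs as the average posterior error.

```python
def combine_psm_probabilities(per_engine_probs):
    """Combine per-engine posterior probabilities for one PSM assuming
    the engines err independently: P = 1 - prod(1 - p_i). (MSblender
    itself models the correlation between search scores; independence
    is a simplifying assumption for this sketch.)"""
    miss = 1.0
    for p in per_engine_probs:
        miss *= (1.0 - p)
    return 1.0 - miss

def estimate_fdr(probs, threshold):
    """Estimated FDR among PSMs accepted at the probability threshold:
    the mean posterior error (1 - p) over the accepted set."""
    accepted = [p for p in probs if p >= threshold]
    if not accepted:
        return 0.0
    return sum(1.0 - p for p in accepted) / len(accepted)

# A PSM scored 0.8 and 0.7 by two engines combines to 0.94, higher than
# either engine alone -- the mechanism behind the extra identifications.
print(round(combine_psm_probabilities([0.8, 0.7]), 2))             # 0.94
print(round(estimate_fdr([0.99, 0.95, 0.90, 0.50], 0.9), 3))       # 0.053
```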

  1. Phylogenetic search through partial tree mixing

    Science.gov (United States)

    2012-01-01

    Background: Recent advances in sequencing technology have created large data sets upon which phylogenetic inference can be performed. Current research is limited by the prohibitive time necessary to perform tree search on a reasonable number of individuals. This research develops new phylogenetic algorithms that can operate on tens of thousands of species in a reasonable amount of time through several innovative search techniques. Results: When compared to popular phylogenetic search algorithms, better trees are found much more quickly for large data sets. These algorithms are incorporated in the PSODA application, available at http://dna.cs.byu.edu/psoda. Conclusions: The use of Partial Tree Mixing in a partition based tree space allows the algorithm to quickly converge on near optimal tree regions. These regions can then be searched in a methodical way to determine the overall optimal phylogenetic solution. PMID:23320449

  2. Study of Fuze Structure and Reliability Design Based on the Direct Search Method

    Science.gov (United States)

    Lin, Zhang; Ning, Wang

    2017-03-01

    Redundant design is one of the important methods of improving system reliability, but the design often involves the mutual coupling of multiple factors. In this study, the Direct Search Method is introduced into optimum redundancy configuration, in which reliability, cost, structural weight, and other factors can be taken into account simultaneously, and the redundancy allocation and reliability design of an aircraft-critical system are computed. The results show that this method is convenient and workable and, with appropriate modifications, is applicable to the redundancy configuration and optimization of various designs, giving it good practical value.
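
    The abstract does not specify which direct search variant is used; a compass (pattern) search, one classic member of the direct search family, conveys the idea: probe each design variable in both directions, accept any improvement, and shrink the step when none is found. The toy objective below is illustrative, not a fuze reliability model.

```python
def direct_search(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=10_000):
    """Minimal compass/pattern search (a classic 'direct search' method):
    probe each coordinate in both directions; if no probe improves the
    objective, shrink the step until it falls below the tolerance."""
    x = list(x0)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
        if not improved:
            step *= shrink
            if step < tol:
                break
    return x, fx

# Toy trade-off cost in two design variables, minimum at (3, -1).
cost = lambda v: (v[0] - 3.0) ** 2 + (v[1] + 1.0) ** 2
x, fx = direct_search(cost, [0.0, 0.0])
print(x, fx)   # [3.0, -1.0] 0.0
```

    No derivatives are needed, which is why direct search suits coupled design factors (reliability, cost, weight) that are only available as black-box evaluations.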

  3. An automated and efficient conformation search of L-cysteine and L,L-cystine using the scaled hypersphere search method

    Science.gov (United States)

    Kishimoto, Naoki; Waizumi, Hiroki

    2017-10-01

    Stable conformers of L-cysteine and L,L-cystine were explored using an automated and efficient conformational searching method. The Gibbs energies of the stable conformers of L-cysteine and L,L-cystine were calculated with G4 and MP2 methods, respectively, at 450, 298.15, and 150 K. By assuming thermodynamic equilibrium and the barrier energies for the conformational isomerization pathways, the estimated ratios of the stable conformers of L-cysteine were compared with those determined by microwave spectroscopy in a previous study. Equilibrium structures of 1:1 and 2:1 cystine-Fe complexes were also calculated, and the energy of insertion of Fe into the disulfide bond was obtained.

  4. Earthquake effect on volcano and the geological structure in central java using tomography travel time method and relocation hypocenter by grid search method

    International Nuclear Information System (INIS)

    Suharsono; Nurdian, S. W; Palupi, I. R.

    2016-01-01

    Relocating hypocenters is one way to improve the velocity model of the subsurface, and grid search is one method for doing so. The relocated hypocenters are then used as a reference in travel time tomography to analyze the subsurface beneath volcanoes and the major structural patterns, as in Central Java. The main data of this study are earthquakes recorded from 1952 to 2012 by 30 stations located in the vicinity of Central Java, comprising 2426 events and 9162 P-wave arrival times. The grid search method can relocate hypocenters more accurately because it divides the model space into grid blocks, each of which can be occupied by only one hypocenter. Tomography is then performed on the relocated travel time data using the pseudo-bending inversion method. The grid search relocation shows that the hypocenters are shallower than before and shifted toward the south; their distribution delineates the subduction zone between the Eurasian and Indo-Australian plates with an average dip of 14°. The tomography results show low velocity values of -8% to -10% beneath the volcanoes, while the main fault structures of Central Java are delineated by high velocity values of 8% to 10% trending northwest and northeast-southwest. (paper)
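
    A stripped-down version of the grid search relocation is easy to sketch: every grid node is a candidate hypocenter, and the node minimizing the travel time misfit wins. The homogeneous 6 km/s medium and station geometry below are invented for illustration; the study itself works with a layered velocity model and pseudo-bending ray tracing.

```python
import math

def travel_time(src, sta, v=6.0):
    """Straight-ray travel time in a homogeneous medium (km, km/s)."""
    return math.dist(src, sta) / v

def relocate(stations, observed, grid, v=6.0):
    """Grid search relocation: each grid node is a candidate hypocenter
    (one hypocenter per block); keep the node whose predicted travel
    times best fit the observations in a least-squares sense."""
    best, best_misfit = None, float("inf")
    for node in grid:
        misfit = sum((travel_time(node, s, v) - t) ** 2
                     for s, t in zip(stations, observed))
        if misfit < best_misfit:
            best, best_misfit = node, misfit
    return best, best_misfit

# Synthetic test: 4 surface stations, true source at (10, 10, 15) km.
stations = [(0, 0, 0), (30, 0, 0), (0, 30, 0), (30, 30, 0)]
true_src = (10, 10, 15)
observed = [travel_time(true_src, s) for s in stations]
grid = [(x, y, z) for x in range(0, 31, 5)
                  for y in range(0, 31, 5)
                  for z in range(0, 31, 5)]
print(relocate(stations, observed, grid)[0])   # (10, 10, 15)
```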

  5. Integration of first-principles methods and crystallographic database searches for new ferroelectrics: Strategies and explorations

    International Nuclear Information System (INIS)

    Bennett, Joseph W.; Rabe, Karin M.

    2012-01-01

    In this concept paper, the development of strategies for the integration of first-principles methods with crystallographic database mining for the discovery and design of novel ferroelectric materials is discussed, drawing on the results and experience derived from exploratory investigations on three different systems: (1) the double perovskite Sr(Sb 1/2 Mn 1/2 )O 3 as a candidate semiconducting ferroelectric; (2) polar derivatives of schafarzikite MSb 2 O 4 ; and (3) ferroelectric semiconductors with formula M 2 P 2 (S,Se) 6 . A variety of avenues for further research and investigation are suggested, including automated structure type classification, low-symmetry improper ferroelectrics, and high-throughput first-principles searches for additional representatives of structural families with desirable functional properties. - Graphical abstract: Integration of first-principles methods with crystallographic database mining, for the discovery and design of novel ferroelectric materials, could potentially lead to new classes of multifunctional materials. Highlights: ► Integration of first-principles methods and database mining. ► Minor structural families with desirable functional properties. ► Survey of polar entries in the Inorganic Crystal Structural Database.

  6. An improved harmony search algorithm for power economic load dispatch

    Energy Technology Data Exchange (ETDEWEB)

    Santos Coelho, Leandro dos [Pontifical Catholic University of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, PPGEPS, Imaculada Conceicao, 1155, 80215-901 Curitiba, PR (Brazil)], E-mail: leandro.coelho@pucpr.br; Mariani, Viviana Cocco [Pontifical Catholic University of Parana, PUCPR, Department of Mechanical Engineering, PPGEM, Imaculada Conceicao, 1155, 80215-901 Curitiba, PR (Brazil)], E-mail: viviana.mariani@pucpr.br

    2009-10-15

    A meta-heuristic algorithm called harmony search (HS), mimicking the improvisation process of music players, has been recently developed. The HS algorithm has been successful in several optimization problems. The HS algorithm does not require derivative information and uses stochastic random search instead of a gradient search. In addition, the HS algorithm is simple in concept, few in parameters, and easy in implementation. This paper presents an improved harmony search (IHS) algorithm based on exponential distribution for solving economic dispatch problems. A 13-unit test system with incremental fuel cost function taking into account the valve-point loading effects is used to illustrate the effectiveness of the proposed IHS method. Numerical results show that the IHS method has good convergence property. Furthermore, the generation costs of the IHS method are lower than those of the classical HS and other optimization algorithms reported in recent literature.

  7. An improved harmony search algorithm for power economic load dispatch

    Energy Technology Data Exchange (ETDEWEB)

    Coelho, Leandro dos Santos [Pontifical Catholic Univ. of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, PPGEPS, Imaculada Conceicao, 1155, 80215-901 Curitiba, PR (Brazil); Mariani, Viviana Cocco [Pontifical Catholic Univ. of Parana, PUCPR, Dept. of Mechanical Engineering, PPGEM, Imaculada Conceicao, 1155, 80215-901 Curitiba, PR (Brazil)

    2009-10-15

    A meta-heuristic algorithm called harmony search (HS), mimicking the improvisation process of music players, has been recently developed. The HS algorithm has been successful in several optimization problems. The HS algorithm does not require derivative information and uses stochastic random search instead of a gradient search. In addition, the HS algorithm is simple in concept, few in parameters, and easy in implementation. This paper presents an improved harmony search (IHS) algorithm based on exponential distribution for solving economic dispatch problems. A 13-unit test system with incremental fuel cost function taking into account the valve-point loading effects is used to illustrate the effectiveness of the proposed IHS method. Numerical results show that the IHS method has good convergence property. Furthermore, the generation costs of the IHS method are lower than those of the classical HS and other optimization algorithms reported in recent literature. (author)

  8. An improved harmony search algorithm for power economic load dispatch

    International Nuclear Information System (INIS)

    Santos Coelho, Leandro dos; Mariani, Viviana Cocco

    2009-01-01

    A meta-heuristic algorithm called harmony search (HS), mimicking the improvisation process of music players, has been recently developed. The HS algorithm has been successful in several optimization problems. The HS algorithm does not require derivative information and uses stochastic random search instead of a gradient search. In addition, the HS algorithm is simple in concept, few in parameters, and easy in implementation. This paper presents an improved harmony search (IHS) algorithm based on exponential distribution for solving economic dispatch problems. A 13-unit test system with incremental fuel cost function taking into account the valve-point loading effects is used to illustrate the effectiveness of the proposed IHS method. Numerical results show that the IHS method has good convergence property. Furthermore, the generation costs of the IHS method are lower than those of the classical HS and other optimization algorithms reported in recent literature.
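
    The HS improvisation loop described in these records (memory consideration, pitch adjustment, random selection, worst-member replacement) can be sketched directly. The parameter values and the smooth toy cost below are illustrative, not the 13-unit dispatch problem with valve-point effects.

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, iters=8000, seed=1):
    """Minimal harmony search: each new 'harmony' draws every variable
    either from harmony memory (prob. hmcr, with optional pitch
    adjustment, prob. par) or uniformly at random from its bounds, and
    replaces the worst memory member whenever it improves on it."""
    rng = random.Random(seed)
    memory = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    scores = [f(h) for h in memory]
    for _ in range(iters):
        new = []
        for j, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                 # memory consideration
                x = memory[rng.randrange(hms)][j]
                if rng.random() < par:              # pitch adjustment
                    x += rng.uniform(-1.0, 1.0) * 0.05 * (hi - lo)
                x = min(max(x, lo), hi)
            else:                                   # random selection
                x = rng.uniform(lo, hi)
            new.append(x)
        fn = f(new)
        worst = max(range(hms), key=scores.__getitem__)
        if fn < scores[worst]:
            memory[worst], scores[worst] = new, fn
    best = min(range(hms), key=scores.__getitem__)
    return memory[best], scores[best]

# Toy smooth 'cost' with known minimum value 5 at (2, -1), standing in
# for the nonconvex dispatch cost just to exercise the algorithm.
cost = lambda v: (v[0] - 2.0) ** 2 + (v[1] + 1.0) ** 2 + 5.0
x, fx = harmony_search(cost, [(-10.0, 10.0), (-10.0, 10.0)])
print(round(fx, 3))   # close to the known minimum of 5
```

    Note the algorithm never uses a derivative: only function values drive acceptance, which is why HS handles non-smooth costs such as valve-point loading.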

  9. Mathematical programming models for solving unequal-sized facilities layout problems. A genetic search method

    International Nuclear Information System (INIS)

    Tavakkoli-Moghaddam, R.

    1999-01-01

    This paper presents unequal-sized facilities layout solutions generated by a genetic search program named Layout Design using a Genetic Algorithm. The generalized quadratic assignment problem, which requires pre-determined distance and material flow matrices as input data, and the continuous plane model, which employs a dynamic distance measure and a material flow matrix, are discussed. Computational results on test problems are reported and compared with layout solutions generated by a branch-and-bound algorithm, a hybrid method merging simulated annealing and local search techniques, and an optimization process of an enveloped block.

  10. Mapping online consumer search

    NARCIS (Netherlands)

    Bronnenberg, B.J.; Kim, J.; Albuquerque, P.

    2011-01-01

    The authors propose a new method to visualize browsing behavior in so-called product search maps. Manufacturers can use these maps to understand how consumers search for competing products before choice, including how information acquisition and product search are organized along brands, product

  11. A new family of Polak-Ribiere-Polyak conjugate gradient method with the strong-Wolfe line search

    Science.gov (United States)

    Ghani, Nur Hamizah Abdul; Mamat, Mustafa; Rivaie, Mohd

    2017-08-01

    The conjugate gradient (CG) method is an important technique in unconstrained optimization, due to its effectiveness and low memory requirements. The focus of this paper is to introduce a new CG method for solving large scale unconstrained optimization problems. Theoretical proofs show that the new method fulfills the sufficient descent condition if the strong Wolfe-Powell inexact line search is used. Besides, computational results show that our proposed method outperforms other existing CG methods.
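
    A minimal PRP conjugate gradient iteration looks as follows. One hedge: the paper's analysis assumes the strong Wolfe-Powell line search, while this sketch substitutes a short Armijo backtracking loop (plus a PRP+ restart safeguard) to stay self-contained; it is not the paper's new CG family.

```python
def prp_cg(f, grad, x0, iters=200, tol=1e-8):
    """Polak-Ribiere-Polyak conjugate gradient with a PRP+ safeguard and
    a simple Armijo backtracking line search (the paper's convergence
    theory assumes strong Wolfe-Powell conditions instead)."""
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]
    for _ in range(iters):
        gnorm2 = sum(gi * gi for gi in g)
        if gnorm2 < tol:
            break
        slope = sum(gi * di for gi, di in zip(g, d))
        if slope >= 0:                  # not a descent direction: restart
            d = [-gi for gi in g]
            slope = -gnorm2
        t, fx = 1.0, f(x)               # backtracking (Armijo) line search
        while t > 1e-12 and f([xi + t * di for xi, di in zip(x, d)]) > fx + 1e-4 * t * slope:
            t *= 0.5
        x = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # PRP coefficient: beta = g_new . (g_new - g) / ||g||^2, clipped at 0
        beta = sum(gn * (gn - go) for gn, go in zip(g_new, g)) / gnorm2
        d = [-gn + max(beta, 0.0) * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

# Convex quadratic test problem with minimum at (3, -1).
f = lambda v: (v[0] - 3.0) ** 2 + 10.0 * (v[1] + 1.0) ** 2
grad = lambda v: [2.0 * (v[0] - 3.0), 20.0 * (v[1] + 1.0)]
x = prp_cg(f, grad, [0.0, 0.0])
print([round(c, 3) for c in x])   # [3.0, -1.0]
```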

  12. A novel optimization method, Gravitational Search Algorithm (GSA), for PWR core optimization

    International Nuclear Information System (INIS)

    Mahmoudi, S.M.; Aghaie, M.; Bahonar, M.; Poursalehi, N.

    2016-01-01

    Highlights: • The Gravitational Search Algorithm (GSA) is introduced. • The advantage of GSA is verified on Shekel's Foxholes. • Reload optimization for WWER-1000 and WWER-440 cases is performed. • Maximizing Keff, minimizing PPFs and flattening the power density are considered. - Abstract: In-core fuel management optimization (ICFMO) is one of the most challenging concepts of nuclear engineering. In recent decades several meta-heuristic algorithms or computational intelligence methods have been developed to optimize the reactor core loading pattern. This paper presents a new method of using the Gravitational Search Algorithm (GSA) for in-core fuel management optimization. The GSA is constructed based on the law of gravity and the notion of mass interactions. It uses the theory of Newtonian physics, and its searcher agents are a collection of masses. In this work, as a first step, the GSA method is compared with other meta-heuristic algorithms on Shekel's Foxholes problem. In the second step, to find the best core, the GSA algorithm is performed for three PWR test cases including WWER-1000 and WWER-440 reactors. In these cases, multi-objective optimization with the following goals is considered: increasing the multiplication factor (Keff), decreasing the power peaking factor (PPF), and flattening the power density. Notably, the PARCS (Purdue Advanced Reactor Core Simulator) code is used for the neutronic calculations. The results demonstrate that the GSA algorithm has promising performance and could be proposed for other optimization problems in the nuclear engineering field.

  13. Assessing the search for information on three Rs methods, and their subsequent implementation: a national survey among scientists in the Netherlands

    NARCIS (Netherlands)

    Luijk, J. van; Cuijpers, Y.M.; Vaart, L. van der; Leenaars, M; Ritskes-Hoitinga, M.

    2011-01-01

    A local survey conducted among scientists into the current practice of searching for information on Three Rs (i.e. Replacement, Reduction and Refinement) methods has highlighted the gap between the statutory requirement to apply Three Rs methods and the lack of criteria to search for them. To verify

  14. Assessing the Search for Information on Three Rs Methods, and their Subsequent Implementation: A National Survey among Scientists in The Netherlands.

    NARCIS (Netherlands)

    Luijk, J. van; Cuijpers, Y.M.; Vaart, L. van der; Leenaars, M.; Ritskes-Hoitinga, M.

    2011-01-01

    A local survey conducted among scientists into the current practice of searching for information on Three Rs (i.e. Replacement, Reduction and Refinement) methods has highlighted the gap between the statutory requirement to apply Three Rs methods and the lack of criteria to search for them. To verify

  15. Faceted Search

    CERN Document Server

    Tunkelang, Daniel

    2009-01-01

    We live in an information age that requires us, more than ever, to represent, access, and use information. Over the last several decades, we have developed a modern science and technology for information retrieval, relentlessly pursuing the vision of a "memex" that Vannevar Bush proposed in his seminal article, "As We May Think." Faceted search plays a key role in this program. Faceted search addresses weaknesses of conventional search approaches and has emerged as a foundation for interactive information retrieval. User studies demonstrate that faceted search provides more

  16. The Development of a Combined Search for a Heterogeneous Chemistry Database

    Directory of Open Access Journals (Sweden)

    Lulu Jiang

    2015-05-01

    Full Text Available A combined search, which joins a slow molecule structure search with a fast compound property search, results in more accurate search results and has been applied in several chemistry databases. However, the problems of search speed differences and combining the two separate search results are two major challenges. In this paper, two kinds of search strategies, synchronous search and asynchronous search, are proposed to solve these problems in the heterogeneous structure database and the property database found in ChemDB, a chemistry database owned by the Institute of Process Engineering, CAS. Their advantages and disadvantages under different conditions are discussed in detail. Furthermore, we applied these two searches to ChemDB and used them to screen for potential molecules that can work as CO2 absorbents. The results reveal that this combined search discovers reasonable target molecules within an acceptable time frame.
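
    The asynchronous strategy can be illustrated with two stub engines: the fast property search and the slow structure search run in parallel, and their hit sets are merged once the slower one finishes. All identifiers and timings below are hypothetical, not ChemDB internals.

```python
import threading, time

def property_search(query, results):
    """Fast compound-property lookup (stub; identifiers are invented)."""
    time.sleep(0.01)
    results["property"] = {"mol-001", "mol-002", "mol-003"}

def structure_search(query, results):
    """Slow molecular-structure match (stub)."""
    time.sleep(0.05)
    results["structure"] = {"mol-002", "mol-003", "mol-004"}

def combined_search(query):
    """Asynchronous combined search: launch both engines in parallel and
    intersect their hit sets when the slower one finishes, rather than
    running them back to back."""
    results = {}
    threads = [threading.Thread(target=fn, args=(query, results))
               for fn in (property_search, structure_search)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results["property"] & results["structure"]

print(sorted(combined_search("CO2 absorbent candidates")))   # ['mol-002', 'mol-003']
```

    The synchronous alternative would run the fast search first and feed only its hits to the slow engine; the trade-off is latency versus wasted structure comparisons.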

  17. Search times and probability of detection in time-limited search

    Science.gov (United States)

    Wilson, David; Devitt, Nicole; Maurer, Tana

    2005-05-01

    When modeling the search and target acquisition process, probability of detection as a function of time is important to war games and physical entity simulations. Recent US Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate modeling of search and detection has focused on time-limited search. The relationship between detection probability and time of search is developed as a differential equation. One of the parameters in the current formula for probability of detection in time-limited search corresponds to the mean time to detect in time-unlimited search. However, the mean time to detect in time-limited search is shorter than the mean time to detect in time-unlimited search, and a simple mathematical relationship between these two mean times is derived.
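
    Assuming the classic exponential detection law P(t) = 1 - exp(-t/tau), where tau is the mean time to detect in time-unlimited search, the mean time to detect within a time limit T follows by integrating t over the truncated density. The sketch below computes it and checks it by Monte Carlo; it is indeed shorter than tau, as the abstract states. (The exponential law is a standard modeling assumption here, not a formula quoted from the paper.)

```python
import math, random

def mean_time_limited(tau, T):
    """Mean time to detect among detections occurring within the limit T,
    under the exponential law P(t) = 1 - exp(-t/tau):
        E[t | t <= T] = tau - T*exp(-T/tau) / (1 - exp(-T/tau))"""
    q = math.exp(-T / tau)
    return tau - T * q / (1.0 - q)

tau, T = 10.0, 15.0
analytic = mean_time_limited(tau, T)

# Monte Carlo check: draw exponential detection times, keep those <= T.
rng = random.Random(0)
draws = [rng.expovariate(1.0 / tau) for _ in range(200_000)]
detected = [t for t in draws if t <= T]
empirical = sum(detected) / len(detected)

print(round(analytic, 2))    # 5.69, shorter than tau = 10
print(round(empirical, 2))
```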

  18. Methods to filter out spurious disturbances in continuous-wave searches from gravitational-wave detectors

    International Nuclear Information System (INIS)

    Leaci, Paola

    2015-01-01

    Semicoherent all-sky searches over year-long observation times for continuous gravitational wave signals produce many thousands of potential periodic source candidates. Efficient methods able to discard false candidate events are crucial in order to put all the efforts into a computationally intensive follow-up analysis for the remaining most promising candidates (Shaltev et al 2014 Phys. Rev. D 89 124030). In this paper we present a set of techniques able to fulfill such requirements, identifying and eliminating false candidate events and thus reducing the bulk of candidate sets that need to be further investigated. Some of these techniques were also used to streamline the candidate sets returned by the Einstein@Home hierarchical searches presented in (Aasi J et al (The LIGO Scientific Collaboration and the Virgo Collaboration) 2013 Phys. Rev. D 87 042001). These powerful methods and the benefits originating from their application to both simulated data and detector data from the fifth LIGO science run are illustrated and discussed. (paper)

  19. Searching for Life with Rovers: Exploration Methods & Science Results from the 2004 Field Campaign of the "Life in the Atacama" Project and Applications to Future Mars Missions

    Science.gov (United States)

    Cabrol, N. A.; Wettergreen, D. S.; Whittaker, R.; Grin, E. A.; Moersch, J.; Diaz, G. Chong; Cockell, C.; Coppin, P.; Dohm, J. M.; Fisher, G.

    2005-01-01

    The Life In The Atacama (LITA) project develops and field-tests a long-range, solar-powered, automated rover platform (Zoë) and a science payload assembled to search for microbial life in the Atacama Desert. Life is barely detectable over most of the driest desert on Earth. Its unique geological, climatic, and biological evolution has created a unique training site for designing and testing exploration strategies and life-detection methods for the robotic search for life on Mars.

  20. Development of a Google-based search engine for data mining radiology reports.

    Science.gov (United States)

    Erinjeri, Joseph P; Picus, Daniel; Prior, Fred W; Rubin, David A; Koppel, Paul

    2009-08-01

    The aim of this study is to develop a secure, Google-based data-mining tool for radiology reports using free and open source technologies and to explore its use within an academic radiology department. A Health Insurance Portability and Accountability Act (HIPAA)-compliant data repository, search engine and user interface were created to facilitate treatment, operations, and reviews preparatory to research. The Institutional Review Board waived review of the project, and informed consent was not required. A total of 2.9 million text reports, comprising 7.9 GB of disk space, were downloaded from our radiology information system to a fileserver. Extensible markup language (XML) representations of the reports were indexed using Google Desktop Enterprise search engine software. A hypertext markup language (HTML) form allowed users to submit queries to Google Desktop, and Google's XML response was interpreted by a practical extraction and report language (PERL) script, which presented ranked results in a web browser window. The query, reason for search, results, and documents visited were logged to maintain HIPAA compliance. Indexing averaged approximately 25,000 reports per hour. A keyword search for a common term like "pneumothorax" yielded the first ten most relevant of 705,550 total results in 1.36 s. A keyword search for a rare term like "hemangioendothelioma" yielded the first ten most relevant of 167 total results in 0.23 s; retrieval of all 167 results took 0.26 s. Data mining tools for radiology reports will improve the productivity of academic radiologists in clinical, educational, research, and administrative tasks. By leveraging existing knowledge of Google's interface, radiologists can quickly perform useful searches.

  1. Searching for Suicide Information on Web Search Engines in Chinese

    Directory of Open Access Journals (Sweden)

    Yen-Feng Lee

    2017-01-01

    Full Text Available Introduction: Suicide prevention has recently become an important public health issue. However, with growing access to information in cyberspace, harmful information is easily reached online. To draw attention to the issue, we investigated the accessibility of potentially harmful suicide-related information on the internet. Methods: We used five search engines (Google, Yahoo, Bing, Yam, and Sina) and four suicide-related search queries (suicide, how to suicide, suicide methods, and want to die) in traditional Chinese in April 2016. A psychiatrist classified the first thirty links of the search results on each search engine as suicide prevention, pro-suicide, neutral, unrelated to suicide, or error websites. Results: Among the 352 unique websites retrieved, suicide prevention websites were the most frequent (37.8%), followed by websites unrelated to suicide (25.9%) and neutral websites (23.0%). However, pro-suicide websites were still easily accessible (9.7%). Moreover, compared with those originating in the USA and China, the search engine originating in Taiwan had the lowest accessibility to pro-suicide information. ANOVA showed a significant difference between the groups, F = 8.772, P < 0.001. Conclusions: These results suggest a need for further restrictions and regulation of pro-suicide information on the internet. Providing more supportive information online may be an effective plan for suicide prevention.

  2. Journal of Social Development in Africa: Advanced Search

    African Journals Online (AJOL)

    Search tips: Search terms are case-insensitive; Common words are ignored; By default only articles containing all terms in the query are returned (i.e., AND is implied); Combine multiple words with OR to find articles containing either term; e.g., education OR research; Use parentheses to create more complex queries; e.g., ...

  3. Journal of Research in National Development: Advanced Search

    African Journals Online (AJOL)

    Search tips: Search terms are case-insensitive; Common words are ignored; By default only articles containing all terms in the query are returned (i.e., AND is implied); Combine multiple words with OR to find articles containing either term; e.g., education OR research; Use parentheses to create more complex queries; e.g., ...

  4. Journal of Agricultural Research and Development: Advanced Search

    African Journals Online (AJOL)

    Search tips: Search terms are case-insensitive; Common words are ignored; By default only articles containing all terms in the query are returned (i.e., AND is implied); Combine multiple words with OR to find articles containing either term; e.g., education OR research; Use parentheses to create more complex queries; e.g., ...

  5. African Journal of Governance and Development: Advanced Search

    African Journals Online (AJOL)

    Search tips: Search terms are case-insensitive; Common words are ignored; By default only articles containing all terms in the query are returned (i.e., AND is implied); Combine multiple words with OR to find articles containing either term; e.g., education OR research; Use parentheses to create more complex queries; e.g., ...

  6. Journal of Science and Sustainable Development: Advanced Search

    African Journals Online (AJOL)

    Search tips: Search terms are case-insensitive; Common words are ignored; By default only articles containing all terms in the query are returned (i.e., AND is implied); Combine multiple words with OR to find articles containing either term; e.g., education OR research; Use parentheses to create more complex queries; e.g., ...

  7. Eastern Africa Journal of Rural Development: Advanced Search

    African Journals Online (AJOL)

    Search tips: Search terms are case-insensitive; Common words are ignored; By default only articles containing all terms in the query are returned (i.e., AND is implied); Combine multiple words with OR to find articles containing either term; e.g., education OR research; Use parentheses to create more complex queries; e.g., ...

  8. Learning Search Algorithms: An Educational View

    Directory of Open Access Journals (Sweden)

    Ales Janota

    2014-12-01

    Full Text Available Artificial intelligence methods find practical use in many applications, including the maritime industry. The paper concentrates on methods of uninformed and informed search, potentially usable for solving complex problems based on the state-space representation. Introducing search algorithms to newcomers has both technical and psychological dimensions. The authors show how it is possible to cope with both through the design and use of specialized authoring systems. A typical example, searching for a path through a maze, is used to demonstrate how to test, observe and compare the properties of various search strategies. The performance of the search methods is evaluated on common criteria.
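    As an illustration of the kind of exercise the paper describes, here is a minimal sketch (not the authors' authoring system; the maze and all names are illustrative) comparing an uninformed strategy, breadth-first search, with an informed one, A* using a Manhattan-distance heuristic, on the same maze:

    ```python
    import heapq
    from collections import deque

    MAZE = ["S..#....",
            ".#.#.##.",
            ".#......",
            ".####.#.",
            "......#G"]

    def find(maze, ch):
        for r, row in enumerate(maze):
            if ch in row:
                return (r, row.index(ch))

    def neighbors(maze, pos):
        r, c = pos
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < len(maze) and 0 <= nc < len(maze[0]) and maze[nr][nc] != '#':
                yield (nr, nc)

    def bfs(maze):
        # Uninformed search: expands cells in rings of increasing distance.
        start, goal = find(maze, 'S'), find(maze, 'G')
        frontier, seen, expanded = deque([(start, 0)]), {start}, 0
        while frontier:
            pos, d = frontier.popleft()
            expanded += 1
            if pos == goal:
                return d, expanded
            for nxt in neighbors(maze, pos):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, d + 1))

    def astar(maze):
        # Informed search: same maze, guided by an admissible heuristic.
        start, goal = find(maze, 'S'), find(maze, 'G')
        h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
        heap, best, expanded = [(h(start), 0, start)], {start: 0}, 0
        while heap:
            f, negd, pos = heapq.heappop(heap)   # ties broken toward deeper nodes
            d = -negd
            if d > best.get(pos, float('inf')):
                continue
            expanded += 1
            if pos == goal:
                return d, expanded
            for nxt in neighbors(maze, pos):
                if d + 1 < best.get(nxt, float('inf')):
                    best[nxt] = d + 1
                    heapq.heappush(heap, (d + 1 + h(nxt), -(d + 1), nxt))
    ```

    Both strategies return the same shortest path length; comparing the expansion counts is exactly the kind of observation exercise the paper proposes.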

  9. A Method for Estimating View Transformations from Image Correspondences Based on the Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    Erik Cuevas

    2015-01-01

    Full Text Available In this paper, a new method for robustly estimating multiple view relations from point correspondences is presented. The approach combines the popular random sample consensus (RANSAC) algorithm and the evolutionary method harmony search (HS). With this combination, the proposed method adopts a different sampling strategy than RANSAC to generate putative solutions. Under the new mechanism, at each iteration, new candidate solutions are built taking into account the quality of the models generated by previous candidate solutions, rather than purely at random as is the case with RANSAC. The rules for the generation of candidate solutions (samples) are motivated by the improvisation process that occurs when a musician searches for a better state of harmony. As a result, the proposed approach can substantially reduce the number of iterations while still preserving the robust capabilities of RANSAC. The method is generic and its use is illustrated by the estimation of homographies, considering synthetic and real images. Additionally, in order to demonstrate the performance of the proposed approach within a real engineering application, it is employed to solve the problem of position estimation in a humanoid robot. Experimental results validate the efficiency of the proposed method in terms of accuracy, speed, and robustness.
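    The improvisation mechanism borrowed from HS can be sketched in isolation. The following is a generic harmony search minimizer, not the authors' RANSAC hybrid, and all parameter values (memory size, rates, step width) are illustrative:

    ```python
    import random

    def harmony_search(objective, bounds, hms=10, hmcr=0.9, par=0.3,
                       iters=2000, seed=1):
        # hms: harmony memory size; hmcr: memory-consideration rate;
        # par: pitch-adjustment rate.
        rng = random.Random(seed)
        rand_sol = lambda: [rng.uniform(lo, hi) for lo, hi in bounds]
        memory = [rand_sol() for _ in range(hms)]
        scores = [objective(x) for x in memory]
        for _ in range(iters):
            new = []
            for j, (lo, hi) in enumerate(bounds):
                if rng.random() < hmcr:
                    # Memory consideration: reuse a value from a stored harmony...
                    v = memory[rng.randrange(hms)][j]
                    if rng.random() < par:
                        # ...optionally pitch-adjusted by a small random step.
                        v += rng.uniform(-1, 1) * 0.05 * (hi - lo)
                else:
                    # Otherwise improvise a completely random value.
                    v = rng.uniform(lo, hi)
                new.append(min(max(v, lo), hi))
            s = objective(new)
            worst = max(range(hms), key=lambda i: scores[i])
            if s < scores[worst]:
                memory[worst], scores[worst] = new, s   # replace worst harmony
        best = min(range(hms), key=lambda i: scores[i])
        return memory[best], scores[best]
    ```

    In the paper's hybrid, the "objective" would score a candidate view relation by its consensus support, so each improvisation is biased toward previously high-quality samples rather than drawn uniformly as in plain RANSAC.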

  10. Panoramic Search: The Interaction of Memory and Vision in Search through a Familiar Scene

    Science.gov (United States)

    Oliva, Aude; Wolfe, Jeremy M.; Arsenio, Helga C.

    2004-01-01

    How do observers search through familiar scenes? A novel panoramic search method is used to study the interaction of memory and vision in natural search behavior. In panoramic search, observers see part of an unchanging scene larger than their current field of view. A target object can be visible, present in the display but hidden from view, or…

  11. Development and Pilot Implementation of a Search Protocol to Improve Patient Safety on a Psychiatric Inpatient Unit.

    Science.gov (United States)

    Abela-Dimech, Frances; Johnston, Kim; Strudwick, Gillian

    A mental health organization in Ontario, Canada, noted an increase in unsafe items entering locked inpatient units. The purpose of this project was to develop and implement a search protocol to improve patient, staff, and visitor safety by preventing unsafe items from entering a locked inpatient unit. Under the guidance of a clinical nurse specialist, an interprofessional team used the Failure Mode and Effects Analysis framework to identify what items were considered unsafe, how these unsafe items were entering the unit, and what strategies could be used to prevent these items from entering the unit. A standardized search protocol was identified as a strategy to prevent items from entering the unit. The standardized search protocol was developed and piloted on 1 unit. To support the search protocol, an interprofessional team created a poster using a mnemonic aid to educate patients, staff, and visitors about which items could not be brought onto the unit. Educational sessions on the search protocol were provided for staff. The difference between the number of incidents before and after the implementation of the search protocol was statistically significant. Safety on an inpatient unit was increased as incidents of unsafe items entering the unit decreased.

  12. Theoretical Investigation of Combined Use of PSO, Tabu Search and Lagrangian Relaxation methods to solve the Unit Commitment Problem

    Directory of Open Access Journals (Sweden)

    Sahbi Marrouchi

    2018-02-01

    Full Text Available Solving the unit commitment problem (UCP) optimizes the combination of production unit operations and determines the appropriate operational schedule of each production unit to satisfy the expected consumption, which varies from one day to one month. Moreover, each production unit is subject to constraints that render this problem complex, combinatorial and nonlinear. In this paper, we propose a new strategy based on the combination of three optimization methods: Tabu search, particle swarm optimization and Lagrangian relaxation, in order to develop a proper unit commitment schedule for the production units while reducing the production cost during a definite period. The proposed strategy has been implemented on the IEEE 9-bus test system containing 3 production units, and the results were promising compared to strategies based on meta-heuristic and deterministic methods.

  13. Getting satisfied with "satisfaction of search": How to measure errors during multiple-target visual search.

    Science.gov (United States)

    Biggs, Adam T

    2017-07-01

    Visual search studies are common in cognitive psychology, and the results generally focus upon accuracy, response times, or both. Most research has focused upon search scenarios where no more than 1 target will be present for any single trial. However, if multiple targets can be present on a single trial, it introduces an additional source of error because the found target can interfere with subsequent search performance. These errors have been studied thoroughly in radiology for decades, although their emphasis in cognitive psychology studies has been more recent. One particular issue with multiple-target search is that these subsequent search errors (i.e., specific errors which occur following a found target) are measured differently by different studies. There is currently no guidance as to which measurement method is best or what impact different measurement methods could have upon various results and conclusions. The current investigation provides two efforts to address these issues. First, the existing literature is reviewed to clarify the appropriate scenarios where subsequent search errors could be observed. Second, several different measurement methods are used with several existing datasets to contrast and compare how each method would have affected the results and conclusions of those studies. The evidence is then used to provide appropriate guidelines for measuring multiple-target search errors in future studies.

  14. Budget constraints and optimization in sponsored search auctions

    CERN Document Server

    Yang, Yanwu

    2013-01-01

    The Intelligent Systems Series publishes reference works and handbooks in three core sub-topic areas: Intelligent Automation, Intelligent Transportation Systems, and Intelligent Computing. They include theoretical studies, design methods, and real-world implementations and applications. The series' readership is broad, but focuses on engineering, electronics, and computer science. Budget Constraints and Optimization in Sponsored Search Auctions takes into consideration the entire life cycle of campaigns, for researchers and developers working on search systems and ROI maximization.

  15. SpEnD: Linked Data SPARQL Endpoints Discovery Using Search Engines

    Science.gov (United States)

    Yumusak, Semih; Dogdu, Erdogan; Kodaz, Halife; Kamilaris, Andreas; Vandenbussche, Pierre-Yves

    In this study, a novel metacrawling method is proposed for discovering and monitoring linked data sources on the Web. We implemented the method in a prototype system, named SPARQL Endpoints Discovery (SpEnD). SpEnD starts with a "search keyword" discovery process for finding relevant keywords for the linked data domain and specifically SPARQL endpoints. Then, these search keywords are utilized to find linked data sources via popular search engines (Google, Bing, Yahoo, Yandex). By using this method, most of the currently listed SPARQL endpoints in existing endpoint repositories, as well as a significant number of new SPARQL endpoints, have been discovered. Finally, we have developed a new SPARQL endpoint crawler (SpEC) for crawling and link analysis.

  16. The use of atmogeochemistry in search for uranium

    International Nuclear Information System (INIS)

    Oleksiak, J.

    1985-01-01

    Surface geophysical methods hitherto used in the search for uranium are presented. The prospecting potential of individual methods, which involve recording radiation or emanations emitted in the course of uranium decay, is analysed, and their advantages and disadvantages are discussed. The so-called atmogeochemical method is then presented; its prospecting potential, range of usability and drawbacks are discussed, and the results obtained with it on some experimental objects are compared with those of geochemical mapping of creek alluvia. The atmogeochemical method is shown to be highly promising in the further search for uranium deposits occurring at depths down to several hundred meters. It therefore deserves to be further developed and more widely used in prospecting. This should be accompanied by improvement of laboratory methods, to make the identification of uranium in atmogeochemical samples less time-consuming. (author)

  17. Phase boundary estimation in electrical impedance tomography using the Hooke and Jeeves pattern search method

    International Nuclear Information System (INIS)

    Khambampati, Anil Kumar; Kim, Kyung Youn; Ijaz, Umer Zeeshan; Lee, Jeong Seong; Kim, Sin

    2010-01-01

    In industrial processes, monitoring of heterogeneous phases is crucial to the safety and operation of engineering structures; in particular, the visualization of voids and air bubbles is advantageous. As a result, many studies offering varying degrees of functionality have appeared in the literature. Electrical impedance tomography (EIT) has already proved to be a hallmark for process monitoring, offering not only visualization of the resistivity profile for a given flow mixture but also detection of phase boundaries. Iterative image reconstruction algorithms, such as the modified Newton–Raphson (mNR) method, are commonly used as inverse solvers. However, their utility is problematic in the sense that they require an initial solution in close proximity to the ground truth, and they rely on gradient information of the objective function to be minimized. Therefore, in this paper, we address these issues by employing a direct search algorithm, namely the Hooke and Jeeves pattern search method, to estimate the phase boundaries; it directly minimizes the cost function and does not require gradient information. It is assumed that the resistivity profile is known a priori, so the unknown information is the size and location of the object. The boundary coefficients are parameterized using a truncated Fourier series and are estimated using the relationship between the measured voltages and injected currents. Through extensive simulation and experimental results, and by comparison with mNR, we show that the Hooke and Jeeves pattern search method offers a promising prospect for process monitoring.
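    The exploratory-move/pattern-move cycle of the Hooke and Jeeves method is simple enough to sketch directly. This is a textbook version of the algorithm; the paper applies it to Fourier boundary coefficients, whereas here a toy quadratic stands in for the EIT cost function:

    ```python
    def hooke_jeeves(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=10000):
        # Derivative-free minimization: exploratory moves probe each
        # coordinate around a base point; pattern moves extrapolate along
        # the improving direction; the mesh shrinks when nothing improves.
        def explore(x, fx, s):
            x = list(x)
            for i in range(len(x)):
                for d in (s, -s):
                    trial = list(x)
                    trial[i] += d
                    ft = f(trial)
                    if ft < fx:
                        x, fx = trial, ft
                        break
            return x, fx

        base, fbase = list(x0), f(x0)
        it = 0
        while step > tol and it < max_iter:
            it += 1
            new, fnew = explore(base, fbase, step)
            if fnew < fbase:
                while True:
                    # Pattern move: jump to 2*new - base, then explore there.
                    pattern = [2 * n - b for n, b in zip(new, base)]
                    cand, fcand = explore(pattern, f(pattern), step)
                    if fcand < fnew:
                        base, new, fbase, fnew = new, cand, fnew, fcand
                    else:
                        break
                base, fbase = new, fnew
            else:
                step *= shrink   # no improvement: refine the mesh
        return base, fbase
    ```

    Note that, as the abstract emphasizes, no gradient of `f` is ever evaluated; only cost-function values are compared, which is what makes the method attractive for non-smooth inverse problems.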

  18. A modified harmony search method for environmental/economic load dispatch of real-world power systems

    International Nuclear Information System (INIS)

    Jeddi, Babak; Vahidinasab, Vahid

    2014-01-01

    Highlights: • A combined economic and emission load dispatch (CEELD) model is proposed. • The proposed model considers practical constraints of real-world power systems. • A new modified harmony search algorithm is proposed to solve non-convex CEELD. • The proposed algorithm is tested by applying it to seven test systems. - Abstract: The economic load dispatch (ELD) problem is one of the basic and important optimization problems in a power system. However, considering practical constraints of real-world power systems, such as ramp rate limits, prohibited operating zones, valve loading effects, multi-fuel options, spinning reserve and transmission system losses, makes ELD a non-convex optimization problem, which is challenging and cannot be solved by traditional methods. Moreover, considering environmental issues results in the combined economic and emission load dispatch (CEELD) problem, a multiobjective optimization model with two non-commensurable and contradictory objectives. In this paper, a modified harmony search algorithm (MHSA) is proposed and applied to solve the ELD and CEELD problems considering the abovementioned constraints. In the proposed MHSA, a new improvising method based on wavelet mutation, together with a new memory consideration scheme based on the roulette wheel mechanism, improves the accuracy, convergence speed, and robustness of the classical HSA. The performance of the proposed algorithm is investigated by applying it to various test systems having non-convex solution spaces. To show the effectiveness of the proposed method, the obtained results are compared with the classical harmony search algorithm (HSA) and some of the most recently published papers in the area.

  19. The HMMER Web Server for Protein Sequence Similarity Search.

    Science.gov (United States)

    Prakash, Ananth; Jeffryes, Matt; Bateman, Alex; Finn, Robert D

    2017-12-08

    Protein sequence similarity search is one of the most commonly used bioinformatics methods for identifying evolutionarily related proteins. In general, sequences that are evolutionarily related share some degree of similarity, and sequence-search algorithms use this principle to identify homologs. The requirement for a fast and sensitive sequence search method led to the development of the HMMER software, which in the latest version (v3.1) uses a combination of sophisticated acceleration heuristics and mathematical and computational optimizations to enable the use of profile hidden Markov models (HMMs) for sequence analysis. The HMMER Web server provides a common platform by linking the HMMER algorithms to databases, thereby enabling the search for homologs, as well as providing sequence and functional annotation by linking to external databases. This unit describes three basic protocols and two alternate protocols that explain how to use the HMMER Web server using various input formats and user-defined parameters. © 2017 by John Wiley & Sons, Inc.

  20. Millennial Students’ Online Search Strategies are Associated With Their Mental Models of Search. A Review of: Holman, L. (2011). Millennial students’ mental models of search: Implications for academic librarians and database developers. Journal of Academic Librarianship, 37(1), 19-27. doi:10.1016/j.acalib.2010.10.003

    Directory of Open Access Journals (Sweden)

    Leslie Bussert

    2011-09-01

    Full Text Available Objective – To examine first-year college students’ information seeking behaviours and determine whether their mental models of the search process influence their ability to effectively search for and find scholarly materials. Design – Mixed methods including contextual inquiry, concept mapping, observation, and interviews. Setting – University of Baltimore, a public institution in Maryland, United States of America, offering undergraduate, graduate, and professional degrees. Subjects – A total of 21 first-year undergraduate students, ages 16 to 19 years, undertaking research assignments for which they chose to use online resources. Methods – First-year students were recruited in the fall of 2008 and met with the researcher in a university usability lab for about one hour over a three-week period. The researcher observed and videotaped the students as they conducted research in their chosen search engines or article databases. The searches were captured using software, and students were encouraged to think aloud about their research process, search strategies, and anticipated search results. Observation sessions concluded with a 10-question interview incorporating a review of the keywords the student used, the student’s reflection on the success of his or her searches, and possible alternate keywords. The interview also offered prompts to help the researcher learn about students’ conceptualizations of search tools’ utilization of keywords to generate results. The researcher then asked the students to provide a visual diagram of the relationship between their search terms and the items retrieved in the search tool. Data were analyzed by identifying the 21 different search tools used by the students and categorizing all 210 searches and student diagrams for further analysis. A scheme similar to Guinee, Eagleton, and Hall’s (2003) characterized the student searches into four categories: simple single-term searches, topic plus focus ...

  1. Developing a Test Collection for the Evaluation of Integrated Search

    DEFF Research Database (Denmark)

    Lykke, Marianne; Larsen, Birger; Lund, Haakon

    2010-01-01

    The poster discusses the characteristics needed in an information retrieval (IR) test collection to facilitate the evaluation of integrated search, i.e. search across a range of different sources but with one search box and one ranked result list, and describes and analyses a new test collection constructed for this purpose. The test collection consists of approx. 18,000 monographic records, 160,000 papers and journal articles in PDF and 275,000 abstracts with a varied set of metadata and vocabularies from the physics domain, 65 topics based on real work tasks and corresponding graded relevance assessments. The test collection may be used for systems- as well as user-oriented evaluation.

  2. Cooperative method development

    DEFF Research Database (Denmark)

    Dittrich, Yvonne; Rönkkö, Kari; Eriksson, Jeanette

    2008-01-01

    The development of methods, tools and process improvements is best based on an understanding of the development practice to be supported. Qualitative research has been proposed as a method for understanding the social and cooperative aspects of software development. However, qualitative research is not easily combined with the improvement orientation of an engineering discipline. During the last 6 years, we have applied an approach we call 'cooperative method development', which combines qualitative social science fieldwork with problem-oriented method, technique and process improvement. The action research based approach focusing on shop floor software development practices allows an understanding of how contextual contingencies influence the deployment and applicability of methods, processes and techniques. This article summarizes the experiences and discusses the further development ...

  3. Pathway Detection from Protein Interaction Networks and Gene Expression Data Using Color-Coding Methods and A* Search Algorithms

    Directory of Open Access Journals (Sweden)

    Cheng-Yu Yeh

    2012-01-01

    Full Text Available With the wide availability of protein interaction networks and supporting microarray data, identifying linear paths of biological significance in search of a potential pathway is a challenging issue. We propose a color-coding method based on the characteristics of biological network topology, and apply heuristic search to speed up the color-coding method. In the experiments, we tested our methods on two datasets: yeast and human prostate cancer networks with gene expression data. Comparisons of our method with other existing methods on known yeast MAPK pathways, in terms of precision and recall, show that we find the maximum number of proteins and perform comparably well. On the other hand, our method is more efficient than previous ones and detects paths of length 10 within 40 seconds on a 1.73 GHz Intel CPU with 1 GB main memory under the Windows operating system.
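    The core color-coding idea, random k-colorings plus dynamic programming over color subsets, can be sketched without the authors' topology-based refinements or heuristic speed-up. The graphs and trial counts below are illustrative:

    ```python
    import random

    def colorful_path_exists(adj, k, coloring):
        # DP: reach[v] holds the color sets of simple paths ending at v;
        # after k-1 extension rounds, any surviving set has k distinct
        # colors, i.e. a "colorful" k-vertex path exists.
        reach = {v: {frozenset([coloring[v]])} for v in adj}
        for _ in range(k - 1):
            new = {v: set() for v in adj}
            for v in adj:
                for u in adj[v]:
                    for cs in reach[u]:
                        if coloring[v] not in cs:
                            new[v].add(cs | {coloring[v]})
            reach = new
        return any(len(cs) == k for sets in reach.values() for cs in sets)

    def has_simple_path(adj, k, trials=200, seed=7):
        # A fixed simple k-vertex path becomes colorful under a random
        # k-coloring with probability k!/k^k, so repeated independent
        # trials detect it with high probability.
        rng = random.Random(seed)
        for _ in range(trials):
            coloring = {v: rng.randrange(k) for v in adj}
            if colorful_path_exists(adj, k, coloring):
                return True
        return False
    ```

    Restricting the DP to color sets (rather than vertex sets) is what keeps the cost exponential only in k, not in the network size, which is why the technique scales to protein interaction networks.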

  4. Query transformations and their role in Web searching by the members of the general public

    Directory of Open Access Journals (Sweden)

    Martin Whittle

    2006-01-01

    Full Text Available Introduction. This paper reports preliminary research in a primarily experimental study of how the general public search for information on the Web. The focus is on the query transformation patterns that characterise searching. Method. In this work, we have used transaction logs from the Excite search engine to develop methods for analysing query transformations that should aid the analysis of our ongoing experimental work. Our methods involve the use of similarity techniques to link queries with the most similar previous query in a train. The resulting query transformations are represented as a list of codes representing a whole search. Analysis. It is shown how query transformation sequences can be represented as graphical networks and some basic statistical results are shown. A correlation analysis is performed to examine the co-occurrence of Boolean and quotation mark changes with the syntactic changes. Results. A frequency analysis of the occurrence of query transformation codes is presented. The connectivity of graphs obtained from the query transformation is investigated and found to follow an exponential scaling law. The correlation analysis reveals a number of patterns that provide some interesting insights into Web searching by the general public. Conclusion. We have developed analytical methods based on query similarity that can be applied to our current experimental work with volunteer subjects. The results of these will form part of a database with the aim of developing an improved understanding of how the public search the Web.
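    The similarity-linking idea can be sketched with set overlap on query terms. The threshold value and transformation labels below are illustrative, not the authors' coding scheme:

    ```python
    def jaccard(a, b):
        # Term-set overlap between two queries.
        a, b = set(a.split()), set(b.split())
        return len(a & b) / len(a | b) if a | b else 0.0

    def code_transformations(session, threshold=0.2):
        # Link each query to its most similar predecessor in the train,
        # then code the transformation relative to that predecessor.
        codes = []
        for i, q in enumerate(session):
            if i == 0:
                codes.append('START')
                continue
            prev = max(session[:i], key=lambda p: jaccard(p, q))
            if jaccard(prev, q) < threshold:
                codes.append('NEW')        # no sufficiently similar ancestor
                continue
            pt, qt = set(prev.split()), set(q.split())
            if pt < qt:
                codes.append('ADD')        # terms added (specialisation)
            elif qt < pt:
                codes.append('REMOVE')     # terms removed (generalisation)
            else:
                codes.append('REPLACE')    # terms swapped or reordered
        return codes
    ```

    A whole search session then reduces to a list of codes, which is the representation the paper analyses as graphical networks and frequency tables.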

  5. 2nd International Conference on Harmony Search Algorithm

    CERN Document Server

    Geem, Zong

    2016-01-01

    The Harmony Search Algorithm (HSA) is one of the most well-known techniques in the field of soft computing, an important paradigm in the science and engineering community.  This volume, the proceedings of the 2nd International Conference on Harmony Search Algorithm 2015 (ICHSA 2015), brings together contributions describing the latest developments in the field of soft computing with a special focus on HSA techniques. It includes coverage of new methods that have potentially immense application in various fields. Contributed articles cover aspects of the following topics related to the Harmony Search Algorithm: analytical studies; improved, hybrid and multi-objective variants; parameter tuning; and large-scale applications.  The book also contains papers discussing recent advances on the following topics: genetic algorithms; evolutionary strategies; the firefly algorithm and cuckoo search; particle swarm optimization and ant colony optimization; simulated annealing; and local search techniques.   This book ...

  6. Hybrid Differential Dynamic Programming with Stochastic Search

    Science.gov (United States)

    Aziz, Jonathan; Parker, Jeffrey; Englander, Jacob

    2016-01-01

    Differential dynamic programming (DDP) has been demonstrated as a viable approach to low-thrust trajectory optimization, most notably with the recent success of NASA's Dawn mission. The Dawn trajectory was designed with the DDP-based Static Dynamic Optimal Control algorithm used in the Mystic software. Another recently developed method, Hybrid Differential Dynamic Programming (HDDP), is a variant of the standard DDP formulation that leverages both first-order and second-order state transition matrices in addition to nonlinear programming (NLP) techniques. Areas of improvement over standard DDP include constraint handling, convergence properties, continuous dynamics, and multi-phase capability. DDP is a gradient-based method and will converge to a solution near an initial guess. In this study, monotonic basin hopping (MBH) is employed as a stochastic search method to overcome this limitation by augmenting the HDDP algorithm for a wider search of the solution space.
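    The MBH outer loop is generic and can be sketched with a stand-in local optimizer. Here a crude derivative-free descent plays the role of the HDDP inner solve, and the test function and parameters are illustrative:

    ```python
    import random

    def local_descent(f, x, step=0.1, iters=200):
        # Crude 1-D coordinate descent standing in for the HDDP inner loop:
        # converges to whichever local minimum attracts the starting point.
        fx = f(x)
        for _ in range(iters):
            for d in (step, -step):
                if f(x + d) < fx:
                    x, fx = x + d, f(x + d)
            step *= 0.95
        return x, fx

    def basin_hopping(f, x0, hops=60, perturb=3.0, seed=3):
        # Monotonic basin hopping: randomly perturb the incumbent, run the
        # local optimizer, and accept a hop only if the new local optimum
        # strictly improves on the best found so far (the monotonic rule).
        rng = random.Random(seed)
        best_x, best_f = local_descent(f, x0)
        for _ in range(hops):
            cand = best_x + rng.uniform(-perturb, perturb)
            x, fx = local_descent(f, cand)
            if fx < best_f:
                best_x, best_f = x, fx
        return best_x, best_f
    ```

    On a multimodal objective such as f(x) = (x² − 4)² + x, a purely local method started near the poorer basin at x ≈ 2 stays there, while the hopping loop can escape to the better basin near x ≈ −2, mirroring how MBH widens the search around a gradient-based HDDP solve.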

  7. Search for a planet

    International Nuclear Information System (INIS)

    Tokovinin, A.A.

    1986-01-01

The problem of searching for planets of other stars is discussed in popular form. Two methods of search for planets are considered: astrometric and spectral. The two methods complement one another. An assumption is made that the most likely possessors of planets are yellow and red dwarfs with slow axial rotation. These stars are the most numerous representatives of the Galaxy's population.

  8. TX-Kw: An Effective Temporal XML Keyword Search

    OpenAIRE

Rasha Bin-Thalab; Neamat El-Tazi; Mohamed E. El-Sharkawi

    2013-01-01

    Inspired by the great success of information retrieval (IR) style keyword search on the web, keyword search on XML has emerged recently. Existing methods cannot resolve challenges addressed by using keyword search in Temporal XML documents. We propose a way to evaluate temporal keyword search queries over Temporal XML documents. Moreover, we propose a new ranking method based on the time-aware IR ranking methods to rank temporal keyword search queries results. Extensive experiments have been ...

  9. Untargeted metabolomic profiling plasma samples of patients with lung cancer for searching significant metabolites by HPLC-MS method

    Science.gov (United States)

    Dementeva, N.; Ivanova, K.; Kokova, D.; Kurzina, I.; Ponomaryova, A.; Kzhyshkowska, J.

    2017-09-01

Lung cancer is one of the most common types of cancer leading to death. Consequently, the search for and identification of metabolites associated with the risk of developing cancer are very valuable. For this purpose, untargeted metabolic profiling of plasma samples collected from patients with lung cancer (n = 100) and a control group (n = 100) was conducted. After sample preparation, the plasma samples were analyzed by the LC-MS method. Biostatistics methods were applied to pre-process the data and elicit the dominant metabolites that account for the difference between the case and control groups. At least seven significant metabolites were evaluated and annotated. Most of the identified metabolites are connected with lipid metabolism, and their combination could be useful for follow-up studies of lung cancer pathogenesis.

  10. Global OpenSearch

    Science.gov (United States)

    Newman, D. J.; Mitchell, A. E.

    2015-12-01

At AGU 2014, NASA EOSDIS demonstrated a case study of an OpenSearch framework for Earth science data discovery. That framework leverages the IDN and CWIC OpenSearch API implementations to provide seamless discovery of data through the 'two-step' discovery process as outlined by the Federation of Earth Science Information Partners (ESIP) OpenSearch Best Practices. But how would an Earth scientist leverage this framework, and what are the benefits? Using a client that understands the OpenSearch specification and, for further clarity, the various best practices and extensions, a scientist can discover a plethora of data not normally accessible either by traditional methods (NASA Earth Data Search, Reverb, etc.) or by direct methods (going to the source of the data). We will demonstrate, via the CWICSmart web client, how an Earth scientist can access data on regional phenomena in a uniform and aggregated manner. We will demonstrate how an Earth scientist can 'globalize' their discovery. You want to find local data on 'sea surface temperature of the Indian Ocean'? We can help you with that. 'European meteorological data'? Yes. 'Brazilian rainforest satellite imagery'? That too. CWIC allows you to get Earth science data in a uniform fashion from a large number of disparate, worldwide agencies. This is what we mean by Global OpenSearch.

  11. Modified Cuckoo Search Algorithm for Solving Nonconvex Economic Load Dispatch Problems

    Directory of Open Access Journals (Sweden)

    Thang Trung Nguyen

    2016-01-01

Full Text Available This paper presents the application of a modified cuckoo search algorithm (MCSA) for solving economic load dispatch (ELD) problems. The MCSA method is developed to improve the search ability and solution quality of the conventional CSA method. In the MCSA, an evaluation step divides the initial eggs into two groups: a top group with good quality and an abandoned group with worse quality. Moreover, the step size used when generating new solutions for the abandoned and top groups via Lévy flights is adapted so that a large zone is searched at the beginning and a local zone is foraged as the maximum number of iterations is approached. The MCSA method has been tested on different systems with different thermal unit characteristics and constraints. Comparison with other methods in the literature indicates that the MCSA can be a powerful method for solving the ELD problem.
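For readers unfamiliar with the mechanics the abstract builds on, the standard cuckoo search ingredients — Lévy-flight steps (here generated with Mantegna's algorithm) and abandonment of the worst nests — can be sketched as follows. This is a plain CSA sketch on a toy objective, not the paper's MCSA with its adaptive step size; all parameter values are illustrative assumptions.

```python
import math
import random

rng = random.Random(42)
BETA = 1.5  # Levy exponent commonly used with cuckoo search

# Mantegna's algorithm: scale factor for a Levy-stable step length.
SIGMA = (math.gamma(1 + BETA) * math.sin(math.pi * BETA / 2)
         / (math.gamma((1 + BETA) / 2) * BETA * 2 ** ((BETA - 1) / 2))) ** (1 / BETA)

def levy_step():
    u = rng.gauss(0.0, SIGMA)
    v = rng.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / BETA)

def cuckoo_search(f, dim=2, nests=15, iters=300, pa=0.25, alpha=0.05):
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(nests)]
    best = min(xs, key=f)
    for _ in range(iters):
        for i in range(nests):
            # Levy flight around the current nest, scaled by distance to best.
            cand = [xs[i][d] + alpha * levy_step() * (xs[i][d] - best[d])
                    for d in range(dim)]
            if f(cand) < f(xs[i]):
                xs[i] = cand
        # Abandon a fraction pa of the worst nests (rebuild at random).
        xs.sort(key=f)
        for i in range(int(nests * (1 - pa)), nests):
            xs[i] = [rng.uniform(-5, 5) for _ in range(dim)]
        if f(xs[0]) < f(best):
            best = xs[0]
    return best

sphere = lambda x: sum(v * v for v in x)
best = cuckoo_search(sphere)
```

The heavy-tailed Lévy steps give occasional long jumps (global exploration) mixed with many short refinements (local foraging), which is the behavior the MCSA's adaptive step size further tunes.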

  12. Protocol: a systematic review of studies developing and/or evaluating search strategies to identify prognosis studies.

    Science.gov (United States)

    Corp, Nadia; Jordan, Joanne L; Hayden, Jill A; Irvin, Emma; Parker, Robin; Smith, Andrea; van der Windt, Danielle A

    2017-04-20

    Prognosis research is on the rise, its importance recognised because chronic health conditions and diseases are increasingly common and costly. Prognosis systematic reviews are needed to collate and synthesise these research findings, especially to help inform effective clinical decision-making and healthcare policy. A detailed, comprehensive search strategy is central to any systematic review. However, within prognosis research, this is challenging due to poor reporting and inconsistent use of available indexing terms in electronic databases. Whilst many published search filters exist for finding clinical trials, this is not the case for prognosis studies. This systematic review aims to identify and compare existing methodological filters developed and evaluated to identify prognosis studies of any of the three main types: overall prognosis, prognostic factors, and prognostic [risk prediction] models. Primary studies reporting the development and/or evaluation of methodological search filters to retrieve any type of prognosis study will be included in this systematic review. Multiple electronic bibliographic databases will be searched, grey literature will be sought from relevant organisations and websites, experts will be contacted, and citation tracking of key papers and reference list checking of all included papers will be undertaken. Titles will be screened by one person, and abstracts and full articles will be reviewed for inclusion independently by two reviewers. Data extraction and quality assessment will also be undertaken independently by two reviewers with disagreements resolved by discussion or by a third reviewer if necessary. Filters' characteristics and performance metrics reported in the included studies will be extracted and tabulated. To enable comparisons, filters will be grouped according to database, platform, type of prognosis study, and type of filter for which it was intended. This systematic review will identify all existing validated

  13. A hybrid search algorithm for swarm robots searching in an unknown environment.

    Science.gov (United States)

    Li, Shoutao; Li, Lina; Lee, Gordon; Zhang, Hao

    2014-01-01

    This paper proposes a novel method to improve the efficiency of a swarm of robots searching in an unknown environment. The approach focuses on the process of feeding and individual coordination characteristics inspired by the foraging behavior in nature. A predatory strategy was used for searching; hence, this hybrid approach integrated a random search technique with a dynamic particle swarm optimization (DPSO) search algorithm. If a search robot could not find any target information, it used a random search algorithm for a global search. If the robot found any target information in a region, the DPSO search algorithm was used for a local search. This particle swarm optimization search algorithm is dynamic as all the parameters in the algorithm are refreshed synchronously through a communication mechanism until the robots find the target position, after which, the robots fall back to a random searching mode. Thus, in this searching strategy, the robots alternated between two searching algorithms until the whole area was covered. During the searching process, the robots used a local communication mechanism to share map information and DPSO parameters to reduce the communication burden and overcome hardware limitations. If the search area is very large, search efficiency may be greatly reduced if only one robot searches an entire region given the limited resources available and time constraints. In this research we divided the entire search area into several subregions, selected a target utility function to determine which subregion should be initially searched and thereby reduced the residence time of the target to improve search efficiency.
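The DPSO component builds on the standard particle swarm update, in which each particle is pulled toward its own best-known position and the swarm's global best. A minimal sketch of that baseline update follows; it is not the paper's dynamic variant with its communication mechanism, and the objective and parameter values are illustrative assumptions.

```python
import random

def pso(f, dim=2, particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(particles)]
    vs = [[0.0] * dim for _ in range(particles)]
    pbest = [list(x) for x in xs]          # personal best positions
    gbest = min(pbest, key=f)              # global best position
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                # Inertia + cognitive pull (pbest) + social pull (gbest).
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = list(xs[i])
                if f(pbest[i]) < f(gbest):
                    gbest = list(pbest[i])
    return gbest

sphere = lambda x: sum(v * v for v in x)
best = pso(sphere)
```

In the hybrid scheme above, a robot would run a random walk until target information appears and only then switch to this kind of swarm update, with the swarm parameters refreshed over the local communication channel.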

  14. Job shop scheduling by local search

    NARCIS (Netherlands)

    Vaessens, R.J.M.; Aarts, E.H.L.; Lenstra, J.K.

    1994-01-01

We survey solution methods for the job shop scheduling problem with an emphasis on local search. We discuss both deterministic and randomized local search methods as well as the applied neighborhoods. We compare the computational performance of the various methods in terms of their effectiveness

  15. Beyond the search surface: visual search and attentional engagement.

    Science.gov (United States)

    Duncan, J; Humphreys, G

    1992-05-01

    Treisman (1991) described a series of visual search studies testing feature integration theory against an alternative (Duncan & Humphreys, 1989) in which feature and conjunction search are basically similar. Here the latter account is noted to have 2 distinct levels: (a) a summary of search findings in terms of stimulus similarities, and (b) a theory of how visual attention is brought to bear on relevant objects. Working at the 1st level, Treisman found that even when similarities were calibrated and controlled, conjunction search was much harder than feature search. The theory, however, can only really be tested at the 2nd level, because the 1st is an approximation. An account of the findings is developed at the 2nd level, based on the 2 processes of input-template matching and spreading suppression. New data show that, when both of these factors are controlled, feature and conjunction search are equally difficult. Possibilities for unification of the alternative views are considered.

  16. A meta-heuristic method for solving scheduling problem: crow search algorithm

    Science.gov (United States)

    Adhi, Antono; Santosa, Budi; Siswanto, Nurhadi

    2018-04-01

Scheduling is one of the most important processes in industry, both in manufacturing and in services. Scheduling is the process of assigning resources to perform operations on tasks; resources can be machines, people, tasks, jobs, or operations. Selecting the optimum sequence of jobs from a permutation is an essential issue in scheduling research, since the optimum sequence is the optimum solution to the scheduling problem. Scheduling becomes an NP-hard problem once the number of jobs in the sequence exceeds what an exact algorithm can process. To obtain optimum results, a method is therefore needed that can solve complex scheduling problems in acceptable time. Meta-heuristics are the methods usually used to solve scheduling problems. The recently published Crow Search Algorithm (CSA) is adopted in this research to solve a scheduling problem. CSA is an evolutionary meta-heuristic method based on the behavior of flocks of crows. The results of CSA on the scheduling problem are compared with those of other algorithms; from the comparison, it is found that CSA performs better in terms of solution quality and computation time.
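The crow search metaphor described above — each crow remembers its best food cache and either follows another crow toward its cache or, if detected, diverts the follower to a random position — can be sketched generically. This is a textbook CSA sketch on a toy continuous objective, not the authors' scheduling-specific adaptation; the awareness probability and flight length values are illustrative assumptions.

```python
import random

def crow_search(f, dim=2, crows=20, iters=300, ap=0.1, fl=2.0, seed=3):
    rng = random.Random(seed)
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(crows)]
    mem = [list(x) for x in xs]          # each crow's best-known hiding place
    for _ in range(iters):
        for i in range(crows):
            j = rng.randrange(crows)     # crow i follows a random crow j
            if rng.random() >= ap:       # j unaware: move toward j's memory
                new = [xs[i][d] + rng.random() * fl * (mem[j][d] - xs[i][d])
                       for d in range(dim)]
            else:                        # j aware: crow i is sent elsewhere
                new = [rng.uniform(-5, 5) for _ in range(dim)]
            if all(-5 <= v <= 5 for v in new):   # keep only feasible moves
                xs[i] = new
                if f(xs[i]) < f(mem[i]):
                    mem[i] = list(xs[i])
    return min(mem, key=f)

sphere = lambda x: sum(v * v for v in x)
best = crow_search(sphere)
```

For scheduling, a position would encode a job permutation rather than a real vector; the memory-following move is what the paper adapts to sequences.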

  17. A Simple Time Domain Collocation Method to Precisely Search for the Periodic Orbits of Satellite Relative Motion

    Directory of Open Access Journals (Sweden)

    Xiaokui Yue

    2014-01-01

Full Text Available A numerical approach for obtaining periodic orbits of satellite relative motion is proposed, based on using the time domain collocation (TDC) method to search for the periodic solutions of an exact J2 nonlinear relative model. The initial conditions for periodic relative orbits of the Clohessy-Wiltshire (C-W) equations or Tschauner-Hempel (T-H) equations can be refined with this approach to generate nearly bounded orbits. With these orbits, a method based on the least-squares principle is then proposed to generate a projected closed orbit (PCO), which serves as a reference for relative motion control. Numerical simulations reveal that the presented TDC searching scheme is effective and simple, and that the projected closed orbit is very fuel-saving.

  18. ElasticSearch cookbook

    CERN Document Server

    Paro, Alberto

    2013-01-01

Written in an engaging, easy-to-follow style, the recipes will help you to extend the capabilities of ElasticSearch to manage your data effectively. If you are a developer who implements ElasticSearch in your web applications, manages data, or has decided to start using ElasticSearch, this book is ideal for you. It assumes that you have a working knowledge of JSON and Java.

  19. A Lifelog Browser for Visualization and Search of Mobile Everyday-Life

    Directory of Open Access Journals (Sweden)

    Keum-Sung Hwang

    2014-01-01

Full Text Available Mobile devices can now handle a great deal of information thanks to the convergence of diverse functionalities. Mobile environments have already shown great potential for providing customized service to users because they can continually record meaningful and private information over long periods of time. Research on understanding, searching, and summarizing the everyday life of humans has received increasing attention in recent years due to this digital convergence. In this paper, we propose a mobile life browser, which visualizes and searches a person's mobile life based on the contents and context of lifelog data. The mobile life browser is for effectively searching the personal information collected on a user's mobile device and for supporting concept-based searching by using concept networks and Bayesian networks. In the experiments, we collected real mobile log data from three users for a month and visualized the users' mobile lives with the developed mobile life browser. Tests on searching tasks confirmed that the results of the proposed concept-based searching method are promising.

  20. Combined heat and power economic dispatch by harmony search algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Vasebi, A.; Bathaee, S.M.T. [Power System Research Laboratory, Department of Electrical and Electronic Engineering, K.N.Toosi University of Technology, 322-Mirdamad Avenue West, 19697 Tehran (Iran); Fesanghary, M. [Department of Mechanical Engineering, Amirkabir University of Technology, 424-Hafez Avenue, Tehran (Iran)

    2007-12-15

The optimal utilization of multiple combined heat and power (CHP) systems is a complicated problem that requires powerful methods to solve. This paper presents a harmony search (HS) algorithm to solve the combined heat and power economic dispatch (CHPED) problem. The HS algorithm is a recently developed meta-heuristic algorithm and has been very successful in a wide variety of optimization problems. The method is illustrated using a test case taken from the literature as well as a new one proposed by the authors. Numerical results reveal that the proposed algorithm can find better solutions than conventional methods and is an efficient search algorithm for the CHPED problem. (author)
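The harmony search ingredients the abstract relies on — a harmony memory, a memory-considering rate (HMCR), and a pitch-adjusting rate (PAR) with a bandwidth — can be sketched on a toy continuous problem. This is a generic HS sketch, not the CHPED formulation with its unit and feasibility constraints; all parameter values are illustrative assumptions.

```python
import random

def harmony_search(f, dim=2, hms=10, hmcr=0.9, par=0.3, bw=0.2,
                   iters=2000, lo=-5.0, hi=5.0, seed=7):
    rng = random.Random(seed)
    # Harmony memory: hms randomly improvised solution vectors.
    hm = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:          # memory consideration
                v = hm[rng.randrange(hms)][d]
                if rng.random() < par:       # pitch adjustment
                    v += rng.uniform(-bw, bw)
            else:                            # random improvisation
                v = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, v)))
        # Replace the worst harmony if the new improvisation is better.
        worst = max(range(hms), key=lambda i: f(hm[i]))
        if f(new) < f(hm[worst]):
            hm[worst] = new
    return min(hm, key=f)

sphere = lambda x: sum(v * v for v in x)
best = harmony_search(sphere)
```

A CHPED version would make f the fuel cost and add penalty terms (or repairs) for the heat-power feasibility region of each unit.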

  1. Real-Time Search in Clouds

    OpenAIRE

    Uddin, Misbah; Skinner, Amy; Stadler, Rolf; Clemm, Alexander

    2013-01-01

    We developed a novel approach for management of networks/networked systems based on network search [4]. Network search provides a simple, uniform interface, through which human administrators and management applications can obtain network information, configuration or operational, without knowing its schema and location. We believe that the capability of network search will spur the development of new tools for human administrators and enable the rapid development of new classes of network co...

  2. Generalized Pattern Search methods for a class of nonsmooth optimization problems with structure

    Science.gov (United States)

    Bogani, C.; Gasparo, M. G.; Papini, A.

    2009-07-01

    We propose a Generalized Pattern Search (GPS) method to solve a class of nonsmooth minimization problems, where the set of nondifferentiability is included in the union of known hyperplanes and, therefore, is highly structured. Both unconstrained and linearly constrained problems are considered. At each iteration the set of poll directions is enforced to conform to the geometry of both the nondifferentiability set and the boundary of the feasible region, near the current iterate. This is the key issue to guarantee the convergence of certain subsequences of iterates to points which satisfy first-order optimality conditions. Numerical experiments on some classical problems validate the method.
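The simplest member of the pattern search family that the abstract generalizes is compass search, which polls the coordinate directions around the current iterate and halves the step when no poll point improves. A minimal sketch follows, using a smooth toy objective and a fixed coordinate poll set rather than the paper's poll directions conforming to the nondifferentiability set.

```python
def compass_search(f, x, step=1.0, tol=1e-6, max_iter=10000):
    """Poll +/- step along each axis; move to the first improving
    poll point, and halve the step when the whole poll fails."""
    x = list(x)
    for _ in range(max_iter):
        improved = False
        for d in range(len(x)):
            for s in (step, -step):
                cand = list(x)
                cand[d] += s
                if f(cand) < f(x):
                    x = cand
                    improved = True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5
            if step < tol:
                break
    return x

# Toy smooth objective with minimum at (1, -2).
f = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
x_star = compass_search(f, [0.0, 0.0])
```

The GPS convergence theory hinges on the poll set: for the structured nonsmooth problems in the paper, the directions must conform to the known hyperplanes of nondifferentiability, which this coordinate-only sketch does not attempt.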

  3. Serendipity in dark photon searches

    Science.gov (United States)

    Ilten, Philip; Soreq, Yotam; Williams, Mike; Xue, Wei

    2018-06-01

Searches for dark photons provide serendipitous discovery potential for other types of vector particles. We develop a framework for recasting dark photon searches to obtain constraints on more general theories, which includes a data-driven method for determining hadronic decay rates. We demonstrate our approach by deriving constraints on a vector that couples to the B-L current, a leptophobic B boson that couples directly to baryon number and to leptons via B-γ kinetic mixing, and on a vector that mediates a protophobic force. Our approach can easily be generalized to any massive gauge boson with vector couplings to the Standard Model fermions, and software to perform any such recasting is provided at https://gitlab.com/philten/darkcast.

  4. Search for the top quark using multivariate analysis techniques

    International Nuclear Information System (INIS)

    Bhat, P.C.

    1994-08-01

The D0 collaboration is developing top search strategies using multivariate analysis techniques. We report here on applications of the H-matrix method to the eμ channel and of neural networks to the e+jets channel.

  5. In search of new methods. Qigong in stuttering therapy

    Directory of Open Access Journals (Sweden)

    Paweł Półrola

    2013-10-01

Full Text Available Introduction: Even though stuttering is probably as old a phenomenon as human speech itself, stuttering therapy is still a challenge for the therapist and requires constant searching for new methods. Qigong may prove to be one of them. Aim of the research: This paper presents the results of an experimental investigation evaluating the usefulness of qigong practice in stuttering therapy. Material and methods: Two groups of stuttering adults underwent 6-month therapy. In group I, the experimental group (n = 11), the therapy consisted of speech fluency training, psychotherapy, and qigong practice. In group II, the control group (n = 12), it included speech fluency training and psychotherapy. In both groups 2-hour sessions of speech fluency training and psychotherapy were conducted twice a week; 2-hour qigong sessions took place once a week. Results: After 6 months the therapy results were compared with regard to the basic stuttering parameters, such as the degree of speech disfluency, the level of logophobia, and speech disfluency symptoms. Improvement was observed in both groups, the beneficial effects, however, being more prominent in the qigong-practising group. Conclusions: Qigong exercises used in the therapy of stuttering people, along with speech fluency training and psychotherapy, give beneficial effects.

  6. Recent developments in dark matter searches

    Indian Academy of Sciences (India)

    results from indirect and direct detection dark matter search experiments is given. .... Such particles can be very light but still be CDM since their interaction was so extremely weak that they could not thermalize in the early Universe. ..... was caused by the report of two events in the signal region, the first time direct detection.

  7. Global Optimization Based on the Hybridization of Harmony Search and Particle Swarm Optimization Methods

    Directory of Open Access Journals (Sweden)

    A. P. Karpenko

    2014-01-01

Full Text Available We consider a class of stochastic global optimization algorithms that various publications call behavioural, intellectual, metaheuristic, nature-inspired, swarm, multi-agent, population, etc. We use the last term. Experience in using population algorithms to solve global optimization challenges shows that applying a single such algorithm is not always effective. Therefore, great attention is now paid to the hybridization of population algorithms for global optimization. Hybrid algorithms combine different algorithms, or identical algorithms with different values of their free parameters; the strengths of one algorithm can thus compensate for the weaknesses of another. The purposes of this work are the development of a hybrid global optimization algorithm based on the known harmony search (HS) and particle swarm optimization (PSO) algorithms, its software implementation, and a study of its efficiency on a number of known benchmark problems and on a problem of dimensional optimization of a truss structure. We state the global optimization problem, consider the basic HS and PSO algorithms, give a flow chart of the proposed hybrid algorithm, called PSO-HS, present the results of computing experiments with the developed algorithm and software, and formulate the main results of the work and prospects for its development.

  8. Search for promising compositions for developing new multiphase casting alloys based on Al-Cu-Mg matrix using thermodynamic calculations and mathematic simulation

    Science.gov (United States)

    Zolotorevskii, V. S.; Pozdnyakov, A. V.; Churyumov, A. Yu.

    2012-11-01

    A calculation-experimental study is carried out to improve the concept of searching for new alloying systems in order to develop new casting alloys using mathematical simulation methods in combination with thermodynamic calculations. The results show the high effectiveness of the applied methods. The real possibility of selecting the promising compositions with the required set of casting and mechanical properties is exemplified by alloys with thermally hardened Al-Cu and Al-Cu-Mg matrices, as well as poorly soluble additives that form eutectic components using mainly the calculation study methods and the minimum number of experiments.

  9. Information-Fusion Methods Based Simultaneous Localization and Mapping for Robot Adapting to Search and Rescue Postdisaster Environments

    Directory of Open Access Journals (Sweden)

    Hongling Wang

    2018-01-01

Full Text Available This paper develops unique information-fusion SLAM (IF-SLAM) methods for mobile robots performing simultaneous localization and mapping (SLAM) in search and rescue (SAR) environments. Several fusion approaches are proposed: parallel filtering of measurements, fusing of exploration trajectories, and combination of sensors' measurements with mobile robots' trajectories. The novel integration particle filter (IPF) and optimal improved EKF (IEKF) algorithms are derived for information-fusion systems to perform the SLAM task in SAR scenarios. The information-fusion architecture consists of multiple robots and multiple sensors (MAM); the robots mount on-board laser range finder (LRF) sensors, localization sonars, gyro odometry, a Kinect sensor, an RGB-D camera, and other proprioceptive sensors. This information-fusion SLAM (IF-SLAM) is compared with conventional methods, which indicates that the fusion trajectory is more consistent with the estimated trajectories and real observation trajectories. Simulations and experiments of the SLAM process are conducted in both a cluttered indoor environment and an outdoor collapsed unstructured scenario, and the experimental results validate the effectiveness of the proposed information-fusion methods in improving SLAM performance in SAR scenarios.

  10. Developing as new search engine and browser for libraries to search and organize the World Wide Web library resources

    OpenAIRE

    Sreenivasulu, V.

    2000-01-01

Internet Granthalaya urges worldwide advocates and aims at the task of creating a new search engine and dedicated browser. Internet Granthalaya may be the ultimate search engine, exclusively dedicated to library use, for searching and organizing the World Wide Web library resources.

  11. Towards Semantic Search and Inference in Electronic Medical Records

    Directory of Open Access Journals (Sweden)

    Bevan Koopman

    2012-09-01

Full Text Available Background: This paper presents a novel approach to searching electronic medical records that is based on concept matching rather than keyword matching. Aims: The concept-based approach is intended to overcome specific challenges we identified in searching medical records. Method: Queries and documents were transformed from their term-based originals into medical concepts as defined by the SNOMED-CT ontology. Results: Evaluation on a real-world collection of medical records showed our concept-based approach outperformed a keyword baseline by 25% in Mean Average Precision. Conclusion: The concept-based approach provides a framework for further development of inference-based search systems for dealing with medical data.

  12. Representation Methods in AI. Searching by Graphs

    Directory of Open Access Journals (Sweden)

    Angel GARRIDO

    2012-12-01

Full Text Available The historical origin of Artificial Intelligence (AI) is usually placed at the Dartmouth Conference of 1956, but we can find many more arcane origins [1]. We can also consider, in more recent times, very great thinkers such as János Neumann (later John von Neumann, after arriving in the USA), Norbert Wiener, Alan Mathison Turing, or Lotfi Zadeh, for instance [6, 7]. AI frequently requires logic, but its classical version shows too many insufficiencies, so it was necessary to introduce more sophisticated tools, such as fuzzy logic, modal logic, non-monotonic logic, and so on [2]. Among the things that AI needs to represent are: categories, objects, properties, relations between objects, situations, states, time, events, causes and effects, knowledge about knowledge, and so on. The problems in AI can be classified into two general types [3, 4]: search problems and representation problems. In this last "mountain" there exist different ways to reach the summit. So we have [3]: logics, rules, frames, associative nets, scripts, and so on, many times connected among them. We attempt, in this paper, a panoramic vision of the scope of application of such representation methods in AI. The two most disputed questions of both modern philosophy of mind and AI are the Turing Test and the Chinese Room Argument. To elucidate these very difficult questions, see both final Appendices.

  13. Managing the Grey Literature of a Discipline through Collaboration: AgEcon Search

    Science.gov (United States)

    Kelly, Julia; Letnes, Louise

    2005-01-01

    AgEcon Search, http://www.agecon.lib.umn.edu, is an important and ground-breaking example of an alternative method of delivering current research results to many potential users. AgEcon Search, through a distributed model, collects and disseminates the grey literature of the fields of agricultural and resource economics. The development of this…

  14. Coherent search of continuous gravitational wave signals: extension of the 5-vectors method to a network of detectors

    International Nuclear Information System (INIS)

    Astone, P; Colla, A; Frasca, S; Palomba, C; D'Antonio, S

    2012-01-01

    We describe the extension to multiple datasets of a coherent method for the search of continuous gravitational wave signals, based on the computation of 5-vectors. In particular, we show how to coherently combine different datasets belonging to the same detector or to different detectors. In the latter case the coherent combination is the way to have the maximum increase in signal-to-noise ratio. If the datasets belong to the same detector the advantage comes mainly from the properties of a quantity called coherence which is helpful (in both cases, in fact) in rejecting false candidates. The method has been tested searching for simulated signals injected in Gaussian noise and the results of the simulations are discussed.

  15. The impact of PICO as a search strategy tool on literature search quality

    DEFF Research Database (Denmark)

    Eriksen, Mette Brandt; Frandsen, Tove Faber

    2018-01-01

Objective: This review aimed to determine if the use of the PICO model (Patient Intervention Comparison Outcome) as a search strategy tool affects the quality of the literature search. Methods: A comprehensive literature search was conducted in PubMed, Embase, CINAHL, PsycInfo, Cochrane Library ... and three studies were included, data was extracted, risk of bias was assessed, and a qualitative analysis was conducted. The included studies compared PICO to PIC or a link to related articles in PubMed; PICOS; and SPIDER. One study compared PICO to unguided searching. Due to differences in intervention

  16. Supporting inter-topic entity search for biomedical Linked Data based on heterogeneous relationships.

    Science.gov (United States)

    Zong, Nansu; Lee, Sungin; Ahn, Jinhyun; Kim, Hong-Gee

    2017-08-01

The keyword-based entity search restricts the search space based on the preference of the search. When the given keywords and preferences are not related to the same biomedical topic, existing biomedical Linked Data search engines fail to deliver satisfactory results. This research aims to tackle this issue by supporting inter-topic search: improving search when the inputs, keywords and preferences, fall under different topics. This study developed an effective algorithm in which the relations between biomedical entities were used in tandem with a keyword-based entity search, Siren. The algorithm, PERank, which is an adaptation of Personalized PageRank (PPR), uses a pair of inputs, (1) search preferences and (2) entities from a keyword-based entity search with a keyword query, to formalize the search results on the fly based on an index of precomputed Individual Personalized PageRank Vectors (IPPVs). Our experiments were performed over ten linked life datasets for two query sets, one with keyword-preference topic correspondence (intra-topic search) and the other without (inter-topic search). The experiments showed that the proposed method achieved better search results, for example a 14% increase in precision for the inter-topic search over the baseline keyword-based search engine. The proposed method improved keyword-based biomedical entity search by supporting inter-topic search without affecting intra-topic search, based on the relations between different entities. Copyright © 2017 Elsevier Ltd. All rights reserved.
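The Personalized PageRank computation that PERank adapts can be sketched with plain power iteration: a random walk that follows graph edges with probability α and teleports back to the preference distribution with probability 1 − α. The toy entity graph, entity names, and parameter values below are invented for illustration and are not the paper's datasets or index structure.

```python
def personalized_pagerank(adj, prefs, alpha=0.85, iters=100):
    """Power iteration for PPR: walk the graph with probability alpha,
    teleport to the preference distribution with probability 1 - alpha."""
    nodes = list(adj)
    total = sum(prefs.get(n, 0.0) for n in nodes)
    v = {n: prefs.get(n, 0.0) / total for n in nodes}   # teleport vector
    pr = dict(v)
    for _ in range(iters):
        new = {n: (1 - alpha) * v[n] for n in nodes}
        for n in nodes:
            out = adj[n]
            if out:
                share = alpha * pr[n] / len(out)
                for m in out:
                    new[m] += share
            else:  # dangling node: restart from the preference vector
                for m in nodes:
                    new[m] += alpha * pr[n] * v[m]
        pr = new
    return pr

# Hypothetical entity graph; scores are biased toward the preferred entity.
graph = {"aspirin": ["pain"], "pain": ["aspirin", "ibuprofen"],
         "ibuprofen": ["pain"], "fever": ["aspirin"]}
scores = personalized_pagerank(graph, {"aspirin": 1.0})
```

Precomputing one such vector per possible preference (the IPPVs of the abstract) is what lets the ranking be combined with keyword hits on the fly instead of iterating at query time.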

  17. Four-Dimensional Golden Search

    Energy Technology Data Exchange (ETDEWEB)

    Fenimore, Edward E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-25

    The Golden search technique is a method to search a multiple-dimension space to find the minimum. It basically subdivides the possible ranges of parameters until it brackets the minimum to within an arbitrarily small distance. It has the advantages that (1) the function to be minimized can be non-linear, (2) it does not require derivatives of the function, and (3) the convergence criterion does not depend on the magnitude of the function. Thus, if the function is a goodness-of-fit parameter such as chi-square, convergence does not depend on the noise being correctly estimated or on the function correctly following the chi-square statistic. Finally, (4) the convergence criterion does not depend on the shape of the function, so long, shallow surfaces can be searched without the problem of premature convergence. As with many methods, the Golden search technique can be confused by surfaces with multiple minima.
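The report describes the technique in words only. A minimal one-dimensional golden-section sketch is below; extending it to four dimensions by golden-searching one parameter at a time (cyclic coordinate descent) is an assumption on my part, not necessarily how the LANL code subdivides the space:

```python
import math

def golden_section(f, lo, hi, tol=1e-8):
    """Golden-section search: bracket the minimum of a unimodal f on [lo, hi].

    Needs no derivatives, and the stopping rule depends only on the bracket
    width, not on the magnitude or shape of f.
    """
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi, about 0.618
    a, b = lo, hi
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c           # minimum lies in [a, d]; reuse c as new d
            c = b - invphi * (b - a)
        else:
            a, c = c, d           # minimum lies in [c, b]; reuse d as new c
            d = a + invphi * (b - a)
    return (a + b) / 2

x = golden_section(lambda t: (t - 2.0) ** 2, 0.0, 5.0)
```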

  18. Assessment of methods for computing the closest point projection, penetration, and gap functions in contact searching problems

    Czech Academy of Sciences Publication Activity Database

    Kopačka, Ján; Gabriel, Dušan; Plešek, Jiří; Ulbin, M.

    2016-01-01

    Roč. 105, č. 11 (2016), s. 803-833 ISSN 0029-5981 R&D Projects: GA ČR(CZ) GAP101/12/2315; GA MŠk(CZ) ME10114 Institutional support: RVO:61388998 Keywords : closest point projection * local contact search * quadratic elements * Newtons methods * geometric iteration methods * simplex method Subject RIV: JC - Computer Hardware ; Software Impact factor: 2.162, year: 2016 http://onlinelibrary.wiley.com/doi/10.1002/nme.4994/abstract

  19. Clinician search behaviors may be influenced by search engine design.

    Science.gov (United States)

    Lau, Annie Y S; Coiera, Enrico; Zrimec, Tatjana; Compton, Paul

    2010-06-30

    Searching the Web for documents using information retrieval systems plays an important part in clinicians' practice of evidence-based medicine. While much research focuses on the design of methods to retrieve documents, there has been little examination of the way different search engine capabilities influence clinician search behaviors. Previous studies have shown that use of task-based search engines allows for faster searches with no loss of decision accuracy compared with resource-based engines. We hypothesized that changes in search behaviors may explain these differences. In all, 75 clinicians (44 doctors and 31 clinical nurse consultants) were randomized to use either a resource-based or a task-based version of a clinical information retrieval system to answer questions about 8 clinical scenarios in a controlled setting in a university computer laboratory. Clinicians using the resource-based system could select 1 of 6 resources, such as PubMed; clinicians using the task-based system could select 1 of 6 clinical tasks, such as diagnosis. Clinicians in both systems could reformulate search queries. System logs unobtrusively capturing clinicians' interactions with the systems were coded and analyzed for clinicians' search actions and query reformulation strategies. The most frequent search action of clinicians using the resource-based system was to explore a new resource with the same query, that is, these clinicians exhibited a "breadth-first" search behaviour. Of 1398 search actions, clinicians using the resource-based system conducted 401 (28.7%, 95% confidence interval [CI] 26.37-31.11) in this way. In contrast, the majority of clinicians using the task-based system exhibited a "depth-first" search behavior in which they reformulated query keywords while keeping to the same task profiles. Of 585 search actions conducted by clinicians using the task-based system, 379 (64.8%, 95% CI 60.83-68.55) were conducted in this way. This study provides evidence that

  20. Hybrid Multistarting GA-Tabu Search Method for the Placement of BtB Converters for Korean Metropolitan Ring Grid

    Directory of Open Access Journals (Sweden)

    Remund J. Labios

    2016-01-01

    Full Text Available This paper presents a method to determine the optimal locations for installing back-to-back (BtB) converters in a power grid as a countermeasure to reduce fault current levels. The installation of BtB converters can be regarded as network reconfiguration. For this purpose, a hybrid multistarting GA-tabu search method was used to determine the best locations from a preselected list of candidate locations. The constraints used in determining the best locations include circuit breaker fault current limits, proximity of proposed locations, and capability of the solution to reach power flow convergence. A simple power injection model after applying line-opening on selected branches was used to compute power flows with BtB converters. Kron reduction was also applied for network reduction, enabling fast evaluation of fault currents for a given topology. Simulations of the search method were performed on the Korean power system, particularly the Seoul metropolitan area.
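Kron reduction, used above to shrink the network for fast fault-current evaluation, eliminates a bus p from the admittance matrix via Y'ij = Yij - Yip*Ypj/Ypp. A minimal sketch with a real-valued toy matrix (actual bus admittances are complex, and the 3-bus values here are invented):

```python
def kron_reduce(Y, eliminate):
    """Remove the listed bus indices from square admittance matrix Y.

    Each elimination step applies Y'[i][j] = Y[i][j] - Y[i][p]*Y[p][j]/Y[p][p].
    Indices in `eliminate` refer to the original matrix; eliminating higher
    indices first keeps the remaining indices valid.
    """
    Y = [row[:] for row in Y]
    for p in sorted(eliminate, reverse=True):
        piv = Y[p][p]
        Y = [
            [Y[i][j] - Y[i][p] * Y[p][j] / piv for j in range(len(Y)) if j != p]
            for i in range(len(Y))
            if i != p
        ]
    return Y

# 3-bus chain; eliminating the middle bus folds its effect into the ends.
Y3 = [[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 2.0]]
Yred = kron_reduce(Y3, [1])
```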

  1. Search Results | Page 538 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Search Results. Showing 5371 - 5380 of 9604 results. Reports ... Business training for microfinance clients : how it matters and for whom? ... building a regional platform for political engagement and strategic action; final technical report.

  2. Keyword Search in Databases

    CERN Document Server

    Yu, Jeffrey Xu; Chang, Lijun

    2009-01-01

    It has become highly desirable to provide users with flexible ways to query/search information over databases as simply as with a keyword search, like Google search. This book surveys the recent developments in keyword search over databases and focuses on finding structural information among objects in a database using a set of keywords. Such structural information to be returned can be either trees or subgraphs representing how the objects that contain the required keywords are interconnected in a relational database or in an XML database. The structural keyword search is completely different from ...

  3. Identifying quality improvement intervention publications - A comparison of electronic search strategies

    Directory of Open Access Journals (Sweden)

    Rubenstein Lisa V

    2011-08-01

    Full Text Available Abstract Background The evidence base for quality improvement (QI) interventions is expanding rapidly. The diversity of the initiatives and the inconsistency in labeling these as QI interventions make it challenging for researchers, policymakers, and QI practitioners to access the literature systematically and to identify relevant publications. Methods We evaluated search strategies developed for MEDLINE (Ovid) and PubMed based on free-text words, Medical Subject Headings (MeSH), QI intervention components, continuous quality improvement (CQI) methods, and combinations of these strategies. Three sets of pertinent QI intervention publications were used for validation. Two independent expert reviewers screened publications for relevance. We compared the yield, recall rate, and precision of the search strategies for the identification of QI publications and for a subset of empirical studies on effects of QI interventions. Results The search yields ranged from 2,221 to 216,167 publications. Mean recall rates for reference publications ranged from 5% to 53% for strategies with yields of 50,000 publications or fewer. The 'best case' strategy, a simple text word search with high face validity ('quality' AND 'improv*' AND 'intervention*'), identified 44%, 24%, and 62% of influential intervention articles selected by Agency for Healthcare Research and Quality (AHRQ) experts, a set of exemplar articles provided by members of the Standards for Quality Improvement Reporting Excellence (SQUIRE) group, and a sample from the Cochrane Effective Practice and Organization of Care Group (EPOC) register of studies, respectively. We applied the search strategy to a PubMed search for articles published in 10 pertinent journals in a three-year period, which retrieved 183 publications. Among these, 67% were deemed relevant to QI by at least one of two independent raters. Forty percent were classified as empirical studies reporting on a QI intervention. Conclusions The presented ...

  4. Search for intervalmodels

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar

    1996-01-01

    Methods are presented that carry out sorting of data according to some criteria, and investigate the possibilities of finding intervals that give separate models relative to the given data. The methods presented are more reliable than related clustering methods, because the search is carried out...

  5. Search Results | Page 763 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Search Results. Showing 7621 - 7630 of 8490 results. Books ... and practice in natural resource management are often depicted as a cyclical and rational process. ... to provide quality education, improve health care, allow open government, ...

  6. Local Path Planning of Driverless Car Navigation Based on Jump Point Search Method Under Urban Environment

    Directory of Open Access Journals (Sweden)

    Kaijun Zhou

    2017-09-01

    Full Text Available The Jump Point Search (JPS) algorithm, a fast search method for path planning, is adopted for local path planning of a driverless car in an urban environment. Firstly, a vector Geographic Information System (GIS) map, including Global Positioning System (GPS) position, direction, and lane information, is built for global path planning. Secondly, the GIS map database is utilized in global path planning for the driverless car. Then, the JPS algorithm is adopted to avoid obstacles ahead and to find an optimal local path for the driverless car in the urban environment. Finally, 125 different simulation experiments in the urban environment demonstrate that JPS can successfully find optimal and safe paths, and meanwhile has lower time complexity than the Vector Field Histogram (VFH), Rapidly Exploring Random Tree (RRT), A*, and Probabilistic Roadmaps (PRM) algorithms. Furthermore, JPS is validated as useful in structured urban environments.

  7. User-generated Exploratory Search Routes: ENCONTRAR UN TÉRMINO in the Accounting Dictionaries

    DEFF Research Database (Denmark)

    Fuertes-Olivera, Pedro A.; Leroyer, Patrick

    2014-01-01

    This paper presents the theories and methods that have been used to develop a specific user-generated search route in the Spanish Accounting Dictionary, which consists in offering users the search button Encontrar un término. This allows users who are uncertain of the exact form of the term...

  8. Inverse atmospheric radiative transfer problems - A nonlinear minimization search method of solution. [aerosol pollution monitoring

    Science.gov (United States)

    Fymat, A. L.

    1976-01-01

    The paper studies the inversion of the radiative transfer equation describing the interaction of electromagnetic radiation with atmospheric aerosols. The interaction can be considered as the propagation in the aerosol medium of two light beams: the direct beam in the line-of-sight attenuated by absorption and scattering, and the diffuse beam arising from scattering into the viewing direction, which propagates more or less in random fashion. The latter beam has single scattering and multiple scattering contributions. In the former case and for single scattering, the problem is reducible to first-kind Fredholm equations, while for multiple scattering it is necessary to invert partial integrodifferential equations. A nonlinear minimization search method, applicable to the solution of both types of problems, has been developed and is applied here to the problem of monitoring aerosol pollution, namely determining the complex refractive index and size distribution of aerosol particles.

  9. Development and testing of a medline search filter for identifying patient and public involvement in health research.

    Science.gov (United States)

    Rogers, Morwenna; Bethel, Alison; Boddy, Kate

    2017-06-01

    Research involving the public as partners often proves difficult to locate due to the variations in terms used to describe public involvement, and the inability of medical databases to index this concept effectively. To design a search filter to identify literature where patient and public involvement (PPI) was used in health research. A reference standard of 172 PPI papers was formed. The references were divided into a development set and a test set. Search terms were identified from common words, phrases and synonyms in the development set. These terms were combined as a search strategy for medline via OvidSP, which was then tested for sensitivity against the test set. The resultant search filter was then assessed for sensitivity, specificity and precision using a previously published systematic review. The search filter was found to be highly sensitive (98.5%) in initial testing. When tested against results generated by a 'real-life' systematic review, the filter had a specificity of 81%; however, sensitivity dropped to 58%. Adjustments to the population group of terms increased the sensitivity to 73%. The PPI filter designed for medline via OvidSP could aid information specialists and researchers trying to find literature specific to PPI. © 2016 Health Libraries Group.
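The three reported metrics follow the usual retrieval definitions; the toy record sets below are invented purely to show the arithmetic:

```python
def filter_metrics(retrieved, relevant, collection):
    """Sensitivity (recall), specificity, and precision of a search filter.

    retrieved: records the filter returns; relevant: gold-standard records;
    collection: every record screened. All arguments are sets of ids.
    """
    tp = len(retrieved & relevant)            # relevant and retrieved
    fp = len(retrieved - relevant)            # retrieved but not relevant
    fn = len(relevant - retrieved)            # relevant but missed
    tn = len(collection - retrieved - relevant)
    return tp / (tp + fn), tn / (tn + fp), tp / (tp + fp)

sens, spec, prec = filter_metrics({1, 2, 3}, {1, 2, 4}, {1, 2, 3, 4, 5, 6})
```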

  10. A method for untriggered time-dependent searches for multiple flares from neutrino point sources

    International Nuclear Information System (INIS)

    Gora, D.; Bernardini, E.; Cruz Silva, A.H.

    2011-04-01

    A method for a time-dependent search for flaring astrophysical sources which can be potentially detected by large neutrino experiments is presented. The method uses a time-clustering algorithm combined with an unbinned likelihood procedure. By including in the likelihood function a signal term which describes the contribution of many small clusters of signal-like events, this method provides an effective way for looking for weak neutrino flares over different time-scales. The method is sensitive to an overall excess of events distributed over several flares which are not individually detectable. For standard cases (one flare) the discovery potential of the method is worse than a standard time-dependent point source analysis with unknown duration of the flare by a factor depending on the signal-to-background level. However, for flares sufficiently shorter than the total observation period, the method is more sensitive than a time-integrated analysis. (orig.)
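The abstract describes the approach only at a high level. As a hypothetical stand-in for the time-clustering step, the sketch below scans every interval bounded by a pair of event times and scores the excess over a steady background with a Poisson log-likelihood ratio; the scoring choice and the toy data are my assumptions, not the authors' likelihood function:

```python
import math

def best_flare(times, bg_rate):
    """Return (score, (t_start, t_end)) for the strongest event clustering.

    For each interval bounded by two event times, compare the n observed
    events with the mu = bg_rate * duration expected from background using
    the Poisson log-likelihood ratio n*ln(n/mu) - (n - mu).
    """
    times = sorted(times)
    best_score, best_interval = 0.0, None
    for i in range(len(times)):
        for j in range(i + 1, len(times)):
            n = j - i + 1
            mu = bg_rate * (times[j] - times[i])
            if mu <= 0 or n <= mu:
                continue  # no excess over background
            score = n * math.log(n / mu) - (n - mu)
            if score > best_score:
                best_score, best_interval = score, (times[i], times[j])
    return best_score, best_interval

# Sparse background events plus one tight cluster around t = 10.
score, flare = best_flare([0.0, 10.0, 10.1, 10.2, 10.3, 20.0], bg_rate=0.1)
```

A multi-flare extension would keep all intervals above a threshold rather than only the maximum, matching the paper's sensitivity to several weak flares.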

  12. Interactive searching of facial image databases

    Science.gov (United States)

    Nicholls, Robert A.; Shepherd, John W.; Shepherd, Jean

    1995-09-01

    A set of psychological facial descriptors has been devised to enable computerized searching of criminal photograph albums. The descriptors have been used to encode image databases of up to twelve thousand images. Using a system called FACES, the databases are searched by translating a witness's verbal description into corresponding facial descriptors. Trials of FACES have shown that this coding scheme is more productive and efficient than searching traditional photograph albums. An alternative method of searching the encoded database using a genetic algorithm is currently being tested. The genetic search method does not require the witness to verbalize a description of the target but merely to indicate a degree of similarity between the target and a limited selection of images from the database. The major drawback of FACES is that it requires manual encoding of images. Research is being undertaken to automate the process; however, it will require an algorithm that can predict human descriptive values. Alternatives to human-derived coding schemes exist using statistical classifications of images. Since databases encoded using statistical classifiers do not have an obvious direct mapping to human-derived descriptors, a search method that does not require the entry of human descriptors is needed. A genetic search algorithm is being tested for this purpose.

  13. Composite Differential Search Algorithm

    Directory of Open Access Journals (Sweden)

    Bo Liu

    2014-01-01

    Full Text Available Differential search algorithm (DS) is a relatively new evolutionary algorithm inspired by the Brownian-like random-walk movement which an organism uses to migrate. It has been verified to be more effective than ABC, JDE, JADE, SADE, EPSDE, GSA, PSO2011, and CMA-ES. In this paper, we propose four improved solution search schemes, namely “DS/rand/1,” “DS/rand/2,” “DS/current to rand/1,” and “DS/current to rand/2,” to search the new space and enhance the convergence rate for global optimization problems. To verify the performance of the different solution search methods, 23 benchmark functions are employed. Experimental results indicate that the proposed algorithms perform better than, or at least comparably to, the original algorithm in terms of the quality of the solution obtained. However, these schemes still cannot achieve the best solution for all functions. To further enhance the convergence rate and the diversity of the algorithm, a composite differential search algorithm (CDS) is proposed in this paper. This new algorithm combines three of the proposed search schemes, “DS/rand/1,” “DS/rand/2,” and “DS/current to rand/1,” with three control parameters, using a random method to generate the offspring. Experimental results show that CDS has a faster convergence rate and better search ability on the 23 benchmark functions.
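The ".../rand/1"-style names follow the mutation-scheme convention familiar from differential evolution, where a trial vector is built as x_r1 + F*(x_r2 - x_r3) from three distinct random population members. The sketch below shows only that generic recipe; how DS embeds it in its migration model is not spelled out in the abstract, so treat this as an assumption:

```python
import random

def rand1_trial(pop, f=0.5):
    """Generic 'rand/1' trial vector: x_r1 + F * (x_r2 - x_r3).

    pop is a list of equal-length real vectors; F scales the difference
    of two randomly chosen members, added to a third distinct member.
    """
    r1, r2, r3 = random.sample(range(len(pop)), 3)
    return [a + f * (b - c) for a, b, c in zip(pop[r1], pop[r2], pop[r3])]

random.seed(42)
pop = [[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [3.0, 3.0]]
trial = rand1_trial(pop)
```

"rand/2" variants add a second scaled difference pair, and "current to rand" variants start from the current member instead of a random one.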

  14. Efficient heuristics for maximum common substructure search.

    Science.gov (United States)

    Englert, Péter; Kovács, Péter

    2015-05-26

    Maximum common substructure search is a computationally hard optimization problem with diverse applications in the field of cheminformatics, including similarity search, lead optimization, molecule alignment, and clustering. Most of these applications have strict constraints on running time, so heuristic methods are often preferred. However, the development of an algorithm that is both fast enough and accurate enough for most practical purposes is still a challenge. Moreover, in some applications, the quality of a common substructure depends not only on its size but also on various topological features of the one-to-one atom correspondence it defines. Two state-of-the-art heuristic algorithms for finding maximum common substructures have been implemented at ChemAxon Ltd., and effective heuristics have been developed to improve both their efficiency and the relevance of the atom mappings they provide. The implementations have been thoroughly evaluated and compared with existing solutions (KCOMBU and Indigo). The heuristics have been found to greatly improve the performance and applicability of the algorithms. The purpose of this paper is to introduce the applied methods and present the experimental results.

  15. Search Results | Page 776 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Results 7751 - 7760 of 9175 ... Enhancing food production in semi arid coastal lowlands Kenya through ... nonlocal search trajectories, and ties with service intermediaries ... the role of social networks is identified as one of the research frontiers ...

  16. Molecule database framework: a framework for creating database applications with chemical structure search capability.

    Science.gov (United States)

    Kiener, Joos

    2013-12-11

    Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The involved scientists must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However, for specific requirements of in-house databases and processes no such solutions exist. Another issue is that commercial solutions carry the risk of vendor lock-in and may require an expensive license for a proprietary relational database management system. To speed up and simplify the development of applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls, so software developers do not require extensive knowledge about chemistry and the underlying database cartridge. This decreases application development time. Molecule Database Framework is written in Java and was created by integrating existing free and open-source tools and frameworks. The core functionality includes: support for multi-component compounds (mixtures); import and export of SD-files; and optional security (authorization). For chemical structure searching, Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions and optional method-level security. Furthermore, the design of entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and import and export of SD-files. By using a simple web application it was shown that Molecule Database Framework ...

  17. Searching methods for biometric identification systems: Fundamental limits

    NARCIS (Netherlands)

    Willems, F.M.J.

    2009-01-01

    We study two-stage search procedures for biometric identification systems in an information-theoretical setting. Our main conclusion is that clustering based on vector-quantization achieves the optimum trade-off between the number of clusters (cluster rate) and the number of individuals within a

  18. Querying archetype-based EHRs by search ontology-based XPath engineering.

    Science.gov (United States)

    Kropf, Stefan; Uciteli, Alexandr; Schierle, Katrin; Krücken, Peter; Denecke, Kerstin; Herre, Heinrich

    2018-05-11

    Legacy data and new structured data can be stored in a standardized format as XML-based EHRs on XML databases. Querying documents on these databases is crucial for answering research questions. Instead of using free text searches, that lead to false positive results, the precision can be increased by constraining the search to certain parts of documents. A search ontology-based specification of queries on XML documents defines search concepts and relates them to parts in the XML document structure. Such query specification method is practically introduced and evaluated by applying concrete research questions formulated in natural language on a data collection for information retrieval purposes. The search is performed by search ontology-based XPath engineering that reuses ontologies and XML-related W3C standards. The key result is that the specification of research questions can be supported by the usage of search ontology-based XPath engineering. A deeper recognition of entities and a semantic understanding of the content is necessary for a further improvement of precision and recall. Key limitation is that the application of the introduced process requires skills in ontology and software development. In future, the time consuming ontology development could be overcome by implementing a new clinical role: the clinical ontologist. The introduced Search Ontology XML extension connects Search Terms to certain parts in XML documents and enables an ontology-based definition of queries. Search ontology-based XPath engineering can support research question answering by the specification of complex XPath expressions without deep syntax knowledge about XPaths.
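As a toy illustration of constraining a query to a document part rather than searching free text, the snippet below uses the XPath subset in Python's ElementTree; the element names and the mini-document are invented, not drawn from any real archetype:

```python
import xml.etree.ElementTree as ET

# A free-text search for "melanoma" would hit both sections; the XPath
# targets only the part of the document the search concept is mapped to,
# which is what raises precision.
doc = ET.fromstring(
    "<ehr>"
    "<section name='diagnosis'><item>melanoma</item></section>"
    "<section name='history'><item>melanoma in family</item></section>"
    "</ehr>"
)
hits = [e.text for e in doc.findall(".//section[@name='diagnosis']/item")]
```

A search ontology automates exactly this mapping, from a clinical search concept to the XPath expression, so researchers need not write the path by hand.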

  19. Comparison of genetic algorithm and harmony search for generator maintenance scheduling

    International Nuclear Information System (INIS)

    Khan, L.; Mumtaz, S.; Khattak, A.

    2012-01-01

    GMS (Generator Maintenance Scheduling) ranks very high in decision making of power generation management. A generator maintenance schedule decides the time period of maintenance tasks, and a reliable reserve margin is also maintained during this period. In this paper, a comparison of GA (Genetic Algorithm) and HS (Harmony Search) algorithms is presented to solve the generator maintenance scheduling problem for WAPDA (Water And Power Development Authority) Pakistan. GA is a search procedure used in search problems to compute exact and optimized solutions, and is considered a global search heuristic technique. The HS algorithm is quite efficient because its convergence rate is very fast. It is based on the concept of the music improvisation process of searching for a perfect state of harmony. The two algorithms generate feasible and optimal solutions and overcome the limitations of conventional methods, including extensive computational effort, which increases exponentially as the size of the problem increases. The proposed methods are tested, validated and compared on the WAPDA electric system. (author)
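A minimal, generic harmony search loop is sketched below for a continuous test objective; the memory size, rates, and objective are illustrative choices, not the paper's WAPDA maintenance-scheduling formulation:

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, iters=2000, seed=1):
    """Minimal harmony search minimizing f over box-bounded reals.

    hms: harmony memory size; hmcr: probability of drawing a value from
    memory; par: pitch-adjusting rate (small perturbation of a memory value).
    """
    rng = random.Random(seed)
    dim = len(bounds)
    mem = [[rng.uniform(*bounds[d]) for d in range(dim)] for _ in range(hms)]
    mem.sort(key=f)
    for _ in range(iters):
        new = []
        for d in range(dim):
            lo, hi = bounds[d]
            if rng.random() < hmcr:
                x = rng.choice(mem)[d]          # memory consideration
                if rng.random() < par:          # pitch adjustment
                    x += rng.uniform(-1, 1) * 0.01 * (hi - lo)
            else:
                x = rng.uniform(lo, hi)         # random improvisation
            new.append(min(max(x, lo), hi))
        if f(new) < f(mem[-1]):                 # replace the worst harmony
            mem[-1] = new
            mem.sort(key=f)
    return mem[0]

best = harmony_search(lambda v: sum(x * x for x in v), [(-5, 5)] * 2)
```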

  20. BIOMedical Search Engine Framework: Lightweight and customized implementation of domain-specific biomedical search engines.

    Science.gov (United States)

    Jácome, Alberto G; Fdez-Riverola, Florentino; Lourenço, Anália

    2016-07-01

    Text mining and semantic analysis approaches can be applied to the construction of biomedical domain-specific search engines and provide an attractive alternative to create personalized and enhanced search experiences. Therefore, this work introduces the new open-source BIOMedical Search Engine Framework for the fast and lightweight development of domain-specific search engines. The rationale behind this framework is to incorporate core features typically available in search engine frameworks with flexible and extensible technologies to retrieve biomedical documents, annotate meaningful domain concepts, and develop highly customized Web search interfaces. The BIOMedical Search Engine Framework integrates taggers for major biomedical concepts, such as diseases, drugs, genes, proteins, compounds and organisms, and enables the use of domain-specific controlled vocabulary. Technologies from the Typesafe Reactive Platform, the AngularJS JavaScript framework and the Bootstrap HTML/CSS framework support the customization of the domain-oriented search application. Moreover, the RESTful API of the BIOMedical Search Engine Framework allows the integration of the search engine into existing systems or a complete web interface personalization. The construction of the Smart Drug Search is described as proof-of-concept of the BIOMedical Search Engine Framework. This public search engine catalogs scientific literature about antimicrobial resistance, microbial virulence and similar topics. The keyword-based queries of the users are transformed into concepts and search results are presented and ranked accordingly. The semantic graph view portrays all the concepts found in the results, and the researcher may look into the relevance of different concepts, the strength of direct relations, and non-trivial, indirect relations. The number of occurrences of a concept shows its importance to the query, and the frequency of concept co-occurrence is indicative of biological relations.

  1. RNA motif search with data-driven element ordering.

    Science.gov (United States)

    Rampášek, Ladislav; Jimenez, Randi M; Lupták, Andrej; Vinař, Tomáš; Brejová, Broňa

    2016-05-18

    In this paper, we study the problem of RNA motif search in long genomic sequences. This approach uses a combination of sequence and structure constraints to uncover new distant homologs of known functional RNAs. The problem is NP-hard and is traditionally solved by backtracking algorithms. We have designed a new algorithm for RNA motif search and implemented a new motif search tool RNArobo. The tool enhances the RNAbob descriptor language, allowing insertions in helices, which enables better characterization of ribozymes and aptamers. A typical RNA motif consists of multiple elements and the running time of the algorithm is highly dependent on their ordering. By approaching the element ordering problem in a principled way, we demonstrate more than 100-fold speedup of the search for complex motifs compared to previously published tools. We have developed a new method for RNA motif search that allows for a significant speedup of the search of complex motifs that include pseudoknots. Such speed improvements are crucial at a time when the rate of DNA sequencing outpaces growth in computing. RNArobo is available at http://compbio.fmph.uniba.sk/rnarobo .

  2. Search Results | Page 11 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Results 101 - 110 of 8530 ... ... searching for ways of building international climate agreements between ... Supporting indigenous women in science, technology, engineering and mathematics careers in Mexico and Central America ... and international organizations to address challenges faced by indigenous ... Knowledge.

  3. Search Results | Page 918 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Results 9171 - 9180 of 9601 ... ... for public policies advocacy work and to build a more equitable society. ... mark for organic smallholders- representing family agriculture, ... the attention on the search for a balance to protect plant breeder''s and ...

  4. A Comparison of Local Search Methods for the Multicriteria Police Districting Problem on Graph

    Directory of Open Access Journals (Sweden)

    F. Liberatore

    2016-01-01

    Full Text Available In the current economic climate, law enforcement agencies are facing resource shortages. The effective and efficient use of scarce resources is therefore of the utmost importance to provide a high standard public safety service. Optimization models specifically tailored to the necessity of police agencies can help to ameliorate their use. The Multicriteria Police Districting Problem (MC-PDP on a graph concerns the definition of sound patrolling sectors in a police district. The objective of this problem is to partition a graph into convex and continuous subsets, while ensuring efficiency and workload balance among the subsets. The model was originally formulated in collaboration with the Spanish National Police Corps. We propose for its solution three local search algorithms: a Simple Hill Climbing, a Steepest Descent Hill Climbing, and a Tabu Search. To improve their diversification capabilities, all the algorithms implement a multistart procedure, initialized by randomized greedy solutions. The algorithms are empirically tested on a case study on the Central District of Madrid. Our experiments show that the solutions identified by the novel Tabu Search outperform the other algorithms. Finally, research guidelines for future developments on the MC-PDP are given.
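The abstract does not include the algorithms' details; the skeleton below shows only the shared pattern of steepest-descent local search wrapped in randomized multistart, on an invented one-dimensional toy landscape rather than the MC-PDP objective:

```python
import random

def steepest_descent(f, x, neighbours, iters=100):
    """Steepest Descent Hill Climbing: always move to the best neighbour."""
    for _ in range(iters):
        best = min(neighbours(x), key=f)
        if f(best) >= f(x):
            return x  # local optimum reached
        x = best
    return x

def multistart(f, sample, neighbours, starts=20, seed=0):
    """Multistart: run the local search from randomized starts, keep the best."""
    rng = random.Random(seed)
    return min(
        (steepest_descent(f, sample(rng), neighbours) for _ in range(starts)),
        key=f,
    )

# Toy landscape on 0..99 with many local minima (one per 7-wide basin).
f = lambda x: x % 7 + abs(x - 50) / 10
result = multistart(
    f,
    sample=lambda rng: rng.randrange(100),
    neighbours=lambda x: [max(x - 1, 0), min(x + 1, 99)],
)
```

Simple hill climbing would take the first improving neighbour instead of the best one, and tabu search would additionally keep a short memory of recent moves to escape local optima.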

  5. The Effects of Presentation Method and Information Density on Visual Search Ability and Working Memory Load

    Science.gov (United States)

    Chang, Ting-Wen; Kinshuk; Chen, Nian-Shing; Yu, Pao-Ta

    2012-01-01

    This study investigates the effects of successive and simultaneous information presentation methods on learner's visual search ability and working memory load for different information densities. Since the processing of information in the brain depends on the capacity of visual short-term memory (VSTM), the limited information processing capacity…

  6. Intelligent methods for data retrieval in fusion databases

    International Nuclear Information System (INIS)

    Vega, J.

    2008-01-01

    The plasma behaviour is identified through the recognition of patterns inside signals. The search for patterns is usually a manual and tedious procedure in which signals need to be examined individually. A breakthrough in data retrieval for fusion databases is the development of intelligent methods to search for patterns. A pattern (in the broadest sense) could be a single segment of a waveform, a set of pixels within an image or even a heterogeneous set of features made up of waveforms, images and any kind of experimental data. Intelligent methods will allow searching for data according to technical, scientific and structural criteria instead of an identifiable time interval or pulse number. Such search algorithms should be intelligent enough to avoid passing over the entire database. Benefits of such access methods are discussed and several available techniques are reviewed. In addition, the applicability of the methods from general purpose searching systems to ad hoc developments is covered
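
The idea of retrieving data by pattern rather than by pulse number can be illustrated with a minimal sliding-window match. This is a generic sketch of similarity search over a stored waveform, not the actual fusion-database system described above.

```python
import math

def best_match(signal, pattern):
    """Return (start_index, distance) of the window of `signal` closest to `pattern`,
    using Euclidean distance over a sliding window."""
    m = len(pattern)
    best_i, best_d = -1, math.inf
    for i in range(len(signal) - m + 1):
        d = math.sqrt(sum((signal[i + j] - pattern[j]) ** 2 for j in range(m)))
        if d < best_d:
            best_i, best_d = i, d
    return best_i, best_d

signal = [0.0, 0.1, 0.0, 1.0, 2.0, 1.0, 0.1, 0.0]
pattern = [1.0, 2.0, 1.0]           # the 'bump' we are searching for
print(best_match(signal, pattern))  # → (3, 0.0)
```

A production system would index windows (e.g., with dimensionality reduction) precisely to avoid the exhaustive pass over the entire database that the abstract warns against; this sketch shows only the underlying matching criterion.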

  7. Searching for Stable SinCn Clusters: Combination of Stochastic Potential Surface Search and Pseudopotential Plane-Wave Car-Parrinello Simulated Annealing Simulations

    Directory of Open Access Journals (Sweden)

    Larry W. Burggraf

    2013-07-01

    Full Text Available To find low energy SinCn structures out of hundreds to thousands of isomers, we have developed a general method to search for stable isomeric structures that combines a Stochastic Potential Surface Search with Pseudopotential Plane-Wave Density Functional Theory Car-Parrinello Molecular Dynamics simulated annealing (PSPW-CPMD-SA). We enhanced the Saunders stochastic search method to generate random cluster structures used as seed structures for PSPW-CPMD-SA simulations. This method ensures that each SA simulation samples a different potential surface region to find the regional minimum structure. By iterating this automated, parallel process on a high performance computer we located hundreds to more than a thousand stable isomers for each SinCn cluster. Among these, five to ten of the lowest energy isomers were further optimized using the B3LYP/cc-pVTZ method. We applied this method to SinCn (n = 4–12) clusters and found the lowest energy structures, most not previously reported. By analyzing the bonding patterns of the low energy structures of each SinCn cluster, we observed that carbon segregations tend to form condensed conjugated rings, while Si connects to unsaturated bonds at the periphery of the carbon segregation as single atoms or clusters when n is small; when n is large, a silicon network spans the carbon segregation region.
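
The simulated annealing component of such workflows follows a generic Metropolis acceptance loop. The sketch below shows that loop on a toy one-dimensional potential; it is an illustration of the SA principle only, not the PSPW-CPMD-SA machinery (which anneals atomic coordinates under DFT forces).

```python
import math, random

def simulated_annealing(energy, neighbor, x0, t0=1.0, cooling=0.95, steps=2000, seed=1):
    """Generic simulated annealing: accept uphill moves with probability exp(-dE/T),
    cool the temperature geometrically, and track the best state visited."""
    rng = random.Random(seed)
    x = x0
    e = energy(x)
    best, best_e = x, e
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        ey = energy(y)
        if ey <= e or rng.random() < math.exp(-(ey - e) / t):
            x, e = y, ey
            if e < best_e:
                best, best_e = x, e
        t *= cooling
    return best, best_e

# Toy 'potential surface': a tilted double well with its global minimum near x = +1.
energy = lambda x: (x * x - 1) ** 2 - 0.1 * x
neighbor = lambda x, rng: x + rng.uniform(-0.3, 0.3)
x, e = simulated_annealing(energy, neighbor, x0=2.0)
print(round(e, 3))
```

Running many such annealings from different random seeds, as the abstract describes, samples different regions of the potential surface; each run settles into a regional minimum.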

  8. Search Results | Page 11 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Results 101 - 110 of 8531 ... ... climate policy, negotiations and implementation in Latin America ... body searching for ways of building international climate agreements ... women in science, technology, engineering and mathematics careers in ... and international organizations to address challenges faced by ... Knowledge.

  9. Search Results | Page 10 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Results 91 - 100 of 8523 ... ... to inform climate policy, negotiations and implementation in Latin America ... body searching for ways of building international climate agreements ... in science, technology, engineering and mathematics careers in ... and international organizations to address challenges faced by ... Knowledge.

  10. Search Results | Page 7 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2013-11-06


  11. PROPOSAL OF METHOD FOR AN AUTOMATIC COMPLEMENTARITIES SEARCH BETWEEN COMPANIES' R&D

    OpenAIRE

    PAULO VINÍCIUS MARCONDES CORDEIRO; DARIO EDUARDO AMARAL DERGINT; KAZUO HATAKEYAMA

    2014-01-01

    The open innovation model is the best choice for firms that cannot afford R&D costs but intend to continue playing the innovation game. This model offers any firm the possibility of having companies spread worldwide, across all research fields, as partners in R&D. However, the possible partnerships can be restricted to the manager's know-who. Patent documents can be a source of rich information about technical development and innovation from a huge number of firms. Search through all these da...

  12. Use of Web Search Engines and Personalisation in Information Searching for Educational Purposes

    Science.gov (United States)

    Salehi, Sara; Du, Jia Tina; Ashman, Helen

    2018-01-01

    Introduction: Students increasingly depend on Web search for educational purposes. This causes concerns among education providers as some evidence indicates that in higher education, the disadvantages of Web search and personalised information are not justified by the benefits. Method: One hundred and twenty university students were surveyed about…

  13. ElasticSearch server

    CERN Document Server

    Rogozinski, Marek

    2014-01-01

    This book is a detailed, practical, hands-on guide packed with real-life scenarios and examples which will show you how to implement an ElasticSearch search engine on your own websites. If you are a web developer or a user who wants to learn more about ElasticSearch, then this is the book for you. You do not need to know anything about ElasticSearch, Java, or Apache Lucene in order to use this book, though basic knowledge of databases and queries is required.

  14. Searching for supersymmetry scalelessly

    Energy Technology Data Exchange (ETDEWEB)

    Schlaffer, M. [DESY, Hamburg (Germany); Weizmann Institute of Science, Department of Particle Physics and Astrophysics, Rehovot (Israel); Spannowsky, M. [Durham University, Department of Physics, Institute for Particle Physics Phenomenology, Durham (United Kingdom); Weiler, A. [Technische Universitaet Muenchen, Physik Department T75, Garching (Germany)

    2016-08-15

    In this paper we propose a scale invariant search strategy for hadronic top or bottom plus missing energy final states. We present a method which shows flat efficiencies and background rejection factors over broad ranges of parameters and masses. The resulting search can easily be recast into a limit on alternative models. We show the strength of the method in a natural SUSY setup where stop and sbottom squarks are pair produced and decay into hadronically decaying top quarks or bottom quarks and higgsinos. (orig.)

  15. VPD residue search by monitoring scattered x-rays

    International Nuclear Information System (INIS)

    Mori, Y.; Yamagami, M.; Yamada, T.

    2000-01-01

    Recently, VPD-TXRF has come into wide use for semiconductor analysis. In the VPD-TXRF technique, adjusting the mechanical measuring point to the center of the dried residue is important for accurate determination. Until now, the following searching methods have been used: monitoring light scattering under bright illumination, using a laser scattering particle mapper, and applying an internal standard as a marker. However, each method has its own disadvantages. For example, interference of Kβ lines (e.g., Sc-Kβ with Ti-Kα) occurs in the internal standard method. We propose a new searching method, 'scattered x-ray search', which utilizes x-ray scattering from the dried residue as a marker. Since the line profile of x-ray scattering agrees with that of fluorescent x-rays, scattered x-rays can be used as an alternative marker instead of an internal standard. According to our experimental results, this search method shows the same accuracy as the internal standard method. Its merits are as follows: 1) no need to add an internal standard, 2) rapid search because of the high intensity of scattered x-rays, 3) the searching software for the internal standard method can be applied without any modification. In this method, diffraction of the incident x-rays by the substrate causes irregular changes in the detected scattered x-rays. Therefore, this method works better with an x-y controlled stage than an r-Θ one. (author)

  16. Search Results | Page 24 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2007-01-01

    Results 231 - 240 of 1368 ... Search. Home; South Asia .... Critical linkages between land use transition and human health in the Himalayan region. Published date. January 1, 2007. Papers. Civil society ISLAMIC CULTURE RADICALISM DEMOCRATIZATION ... Why studying attitudes and perceptions towards GMOs in India is ...

  17. Search Results | Page 843 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)


  18. Epsilon-Q: An Automated Analyzer Interface for Mass Spectral Library Search and Label-Free Protein Quantification.

    Science.gov (United States)

    Cho, Jin-Young; Lee, Hyoung-Joo; Jeong, Seul-Ki; Paik, Young-Ki

    2017-12-01

    Mass spectrometry (MS) is a widely used proteome analysis tool in biomedical science. In MS-based bottom-up proteomic approaches to protein identification, sequence database (DB) searching has been routinely used because of its simplicity and convenience. However, searching a sequence DB with multiple variable modification options can increase processing time and false-positive errors in large, complicated MS data sets. Spectral library searching is an alternative that avoids the limitations of sequence DB searching and allows the detection of more peptides with high sensitivity. Unfortunately, this technique has lower proteome coverage, limiting the detection of novel and whole peptide sequences in biological samples. To solve these problems, we previously developed the "Combo-Spec Search" method, which manually uses multiple reference and simulated spectral libraries to analyze whole proteomes in a biological sample. In this study, we have developed a new analytical interface tool called "Epsilon-Q" to enhance the functions of both the Combo-Spec Search method and label-free protein quantification. Epsilon-Q automatically performs multiple spectral library searches, class-specific false-discovery-rate control, and result integration. It has a user-friendly graphical interface and demonstrates good performance in identifying and quantifying proteins by supporting standard MS data formats and spectrum-to-spectrum matching powered by SpectraST. Furthermore, when the Epsilon-Q interface is combined with the Combo-Spec Search method, the resulting Epsilon-Q system shows a synergistic function, outperforming other sequence DB search engines in identifying and quantifying low-abundance proteins in biological samples. The Epsilon-Q system can be a versatile tool for comparative proteome analysis based on multiple spectral libraries and label-free quantification.

  19. Investigations on search methods for speech recognition using weighted finite state transducers

    OpenAIRE

    Rybach, David

    2014-01-01

    The search problem in the statistical approach to speech recognition is to find the most likely word sequence for an observed speech signal using a combination of knowledge sources, i.e. the language model, the pronunciation model, and the acoustic models of phones. The resulting search space is enormous. Therefore, an efficient search strategy is required to compute the result with a feasible amount of time and memory. The structured statistical models as well as their combination, the searc...

  20. Comparisons of peak-search and photopeak-integration methods in the computer analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Baedecker, P.A.

    1980-01-01

    Myriad methods have been devised for extracting quantitative information from gamma-ray spectra by means of a computer, and a critical evaluation of the relative merits of the various programs that have been written would represent a Herculean, if not an impossible, task. The results from the International Atomic Energy Agency (IAEA) intercomparison, which may represent the most straightforward approach to making such an evaluation, showed a wide range in the quality of the results, even among laboratories where similar methods were used. The most clear-cut way of differentiating between programs is by the method used to evaluate peak areas: by the iterative fitting of the spectral features to an often complex model, or by a simple summation procedure. Previous comparisons have shown that relatively simple algorithms can compete favorably with fitting procedures, although fitting holds the greatest promise for the detection and measurement of complex peaks. However, fitting algorithms, which are generally complex and time consuming, are often ruled out by practical limitations based on the type of computing equipment available, cost limitations, the number of spectra to be processed in a given time period, and the ultimate goal of the analysis. Comparisons of methods can be useful, however, in helping to illustrate the limitations of the various algorithms that have been devised. This paper presents a limited review of some of the more common peak-search and peak-integration methods, along with peak-search procedures.

  1. ElasticSearch cookbook

    CERN Document Server

    Paro, Alberto

    2015-01-01

    If you are a developer who implements ElasticSearch in your web applications and want to sharpen your understanding of the core elements and applications, this is the book for you. It is assumed that you've got working knowledge of JSON and, if you want to extend ElasticSearch, of Java and related technologies.

  2. PWR loading pattern optimization using Harmony Search algorithm

    International Nuclear Information System (INIS)

    Poursalehi, N.; Zolfaghari, A.; Minuchehr, A.

    2013-01-01

    Highlights: ► Numerical results reveal that the HS method is reliable. ► The great advantage of HS is a significant gain in computational cost. ► On average, the final band width of search fitness values is narrow. ► Our experiments show that the search approaches the optimal value fast. - Abstract: In this paper a core reloading technique using Harmony Search, HS, is presented in the context of finding an optimal configuration of fuel assemblies, FA, in pressurized water reactors. To implement and evaluate the proposed technique, a Harmony Search along Nodal Expansion Code for 2-D geometry, HSNEC2D, is developed to obtain a nearly optimal arrangement of fuel assemblies in PWR cores. This code consists of two sections: a Harmony Search algorithm and Nodal Expansion modules using fourth-degree flux expansion, which solve two-dimensional multi-group diffusion equations with one node per fuel assembly. Two optimization test problems are investigated to demonstrate the HS algorithm's ability to converge to a near optimal loading pattern in the fuel management field and other subjects. The results, convergence rate, and reliability of the method are quite promising and show that the HS algorithm performs very well and is comparable to competitive algorithms such as the Genetic Algorithm and Particle Swarm Intelligence. Furthermore, implementing the nodal expansion technique along with HS considerably reduces the computational time needed to process and analyze optimization in core fuel management problems.
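
The Harmony Search metaheuristic used above follows a few standard steps: keep a memory of candidate solutions, improvise a new one by drawing components from memory (with occasional pitch adjustment) or at random, and replace the worst member if the new one is better. The sketch below shows those steps on a toy continuous objective; it is a generic HS illustration, not the HSNEC2D code or a loading-pattern model.

```python
import random

def harmony_search(f, dim, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.05, iters=3000, seed=0):
    """Minimal Harmony Search: improvise new solutions from a harmony memory."""
    rng = random.Random(seed)
    lo, hi = bounds
    memory = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:            # draw this component from memory...
                v = rng.choice(memory)[d]
                if rng.random() < par:         # ...with optional pitch adjustment
                    v = min(hi, max(lo, v + rng.uniform(-bw, bw)))
            else:                              # or improvise a fresh random value
                v = rng.uniform(lo, hi)
            new.append(v)
        worst = max(memory, key=f)
        if f(new) < f(worst):                  # replace the worst harmony if better
            memory[memory.index(worst)] = new
    return min(memory, key=f)

# Toy objective: sphere function, optimum at the origin.
sphere = lambda x: sum(v * v for v in x)
best = harmony_search(sphere, dim=3, bounds=(-5.0, 5.0))
print(round(sphere(best), 4))
```

The hmcr and par parameters control the balance between exploiting the memory and exploring new values, which is the source of the fast convergence the highlights mention.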

  3. All roads lead to Rome - New search methods for the optimal triangulation problem

    Czech Academy of Sciences Publication Activity Database

    Ottosen, T. J.; Vomlel, Jiří

    2012-01-01

    Roč. 53, č. 9 (2012), s. 1350-1366 ISSN 0888-613X R&D Projects: GA MŠk 1M0572; GA ČR GEICC/08/E010; GA ČR GA201/09/1891 Grant - others:GA MŠk(CZ) 2C06019 Institutional support: RVO:67985556 Keywords : Bayesian networks * Optimal triangulation * Probabilistic inference * Cliques in a graph Subject RIV: BD - Theory of Information Impact factor: 1.729, year: 2012 http://library.utia.cas.cz/separaty/2012/MTR/vomlel-all roads lead to rome - new search methods for the optimal triangulation problem.pdf

  4. A SIMPLE Bubble Chamber for Dark Matter Searches: Testing and Development

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, A.R.; Fernandes, A.C.; Marques, J.G.; Kling, A. [C2TN, Instituto Superior Tecnico, Universidade de Lisboa, E.N. 10 - km 139.7, 2695-066 Bobadela, LRS (Portugal); Felizardo, M.; Girard, T.A. [Centro de Fisica Nuclear, Universidade de Lisboa, Av. Prof. Gama Pinto 2, 1649-003, Lisbon (Portugal); Lazaro, I. [Laboratoire Souterrain a Bas Bruit, UMS 3538 UNS/UAPV/CNRS, 84400 Rustrel-Pays d' Apt (France); Puibasset, J. [Centre de Recherche sur la Matiere Divisee CNRS et Universite d' Orleans, 45071 Orleans, 02 (France)

    2015-07-01

    SIMPLE (Superheated Instrument for Massive Particle Experiments) is one of only three experiments worldwide in search of evidence of astroparticle dark matter (WIMPs) using halocarbon-loaded superheated liquid (SHL) detectors. The 2012 Phase II SIMPLE measurements yielded the most restrictive exclusion contour in the spin-dependent (SD) sector of WIMP-proton interactions from a direct search experiment at the time, overlapping for the first time results previously obtained only indirectly [1]. In order to remain competitive with other experiments in the field, the next phase measurement requires larger exposure over shorter observation times with significantly improved neutron shielding. To increase exposure, SIMPLE plans, as a first step, to replace its superheated droplet detectors (SDDs), each containing an active mass of about 15 g of halocarbon, with bubble chambers capable of holding up to 20 kg of active halocarbon mass. We report on the development of the first 1 kg halocarbon SIMPLE bubble chamber prototype, including chamber recompression system design and testing and initial acoustic detection of bubble formation. (authors)

  5. NKS/SRV seminar on Barents Rescue 2001 LIVEX. Gamma search cell

    International Nuclear Information System (INIS)

    Ulvsand, T.; Finck, R.R.; Lauritzen, N.

    2002-04-01

    At the seminar, results from the Gamma Search Cell of the Barents Rescue 2001 LIVEX were presented and the performance and experiences of airborne and car-borne teams that took part in the exercise were evaluated. In the Gamma Search Cell, the mobile teams found about 50 % of a large number of radioactive sources hidden within the exercise area. The exercise demonstrated that it is necessary to practise and test equipment under out-door conditions. By which method a source is found is important information in the evaluation of the result. Complementary methods are necessary to find hidden sources. For heavily shielded sources methods based on scattered radiation should be developed. (au)

  6. Search Results | Page 30 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Results 291 - 300 of 8491 ... ... Policies in Francophone Africa. Africa's persistent job crisis calls for more effective employment policies, including training programs and support for job searches. ... Violence Prevention, Access to Justice, and Economic Empowerment of Women in Latin America. This project will identify and ...

  7. Search Results | Page 846 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)


  8. Search for the top quark at D0 using multivariate methods

    International Nuclear Information System (INIS)

    Bhat, P.C.

    1995-07-01

    We report on the search for the top quark in pp̄ collisions at the Fermilab Tevatron (√s = 1.8 TeV) in the di-lepton and lepton+jets channels using multivariate methods. An H-matrix analysis of the eμ data corresponding to an integrated luminosity of 13.5±1.6 pb⁻¹ yields one event whose likelihood to be a top quark event, assuming m_top = 180 GeV/c², is ten times more than that of WW and eighteen times more than that of Z → ττ. A neural network analysis of the e+jets channel using a data sample corresponding to an integrated luminosity of 47.9±5.7 pb⁻¹ shows an excess of events in the signal region and yields a cross-section for tt̄ production of 6.7±2.3 (stat.) pb, assuming a top mass of 200 GeV/c². An analysis of the e+jets data using the probability density estimation method yields a cross-section that is consistent with the above result

  9. Development of user-centered interfaces to search the knowledge resources of the Virginia Henderson International Nursing Library.

    Science.gov (United States)

    Jones, Josette; Harris, Marcelline; Bagley-Thompson, Cheryl; Root, Jane

    2003-01-01

    This poster describes the development of user-centered interfaces in order to extend the functionality of the Virginia Henderson International Nursing Library (VHINL) from library to web based portal to nursing knowledge resources. The existing knowledge structure and computational models are revised and made complementary. Nurses' search behavior is captured and analyzed, and the resulting search models are mapped to the revised knowledge structure and computational model.

  10. Is searching full text more effective than searching abstracts?

    Science.gov (United States)

    Lin, Jimmy

    2009-02-03

    With the growing availability of full-text articles online, scientists and other consumers of the life sciences literature now have the ability to go beyond searching bibliographic records (title, abstract, metadata) to directly access full-text content. Motivated by this emerging trend, I posed the following question: is searching full text more effective than searching abstracts? This question is answered by comparing text retrieval algorithms on MEDLINE abstracts, full-text articles, and spans (paragraphs) within full-text articles using data from the TREC 2007 genomics track evaluation. Two retrieval models are examined: bm25 and the ranking algorithm implemented in the open-source Lucene search engine. Experiments show that treating an entire article as an indexing unit does not consistently yield higher effectiveness compared to abstract-only search. However, retrieval based on spans, or paragraphs-sized segments of full-text articles, consistently outperforms abstract-only search. Results suggest that highest overall effectiveness may be achieved by combining evidence from spans and full articles. Users searching full text are more likely to find relevant articles than searching only abstracts. This finding affirms the value of full text collections for text retrieval and provides a starting point for future work in exploring algorithms that take advantage of rapidly-growing digital archives. Experimental results also highlight the need to develop distributed text retrieval algorithms, since full-text articles are significantly longer than abstracts and may require the computational resources of multiple machines in a cluster. The MapReduce programming model provides a convenient framework for organizing such computations.
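
One of the two retrieval models compared above, bm25, can be illustrated with a minimal scorer. This is a generic sketch of the Okapi BM25 formula over toy documents; it is not the TREC genomics setup, and real engines such as Lucene use tuned variants of the same idea.

```python
import math
from collections import Counter

def bm25_score(query, doc, docs, k1=1.2, b=0.75):
    """Score one tokenized doc against a tokenized query with Okapi BM25:
    term frequency saturated by k1, length-normalized by b, weighted by IDF."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    tf = Counter(doc)
    score = 0.0
    for term in query:
        df = sum(1 for d in docs if term in d)          # document frequency
        idf = math.log(1 + (N - df + 0.5) / (df + 0.5))
        f = tf[term]
        score += idf * f * (k1 + 1) / (f + k1 * (1 - b + b * len(doc) / avgdl))
    return score

docs = [
    "gene expression in mouse liver".split(),
    "protein folding dynamics".split(),
    "expression of liver enzymes in mouse models".split(),
]
query = "mouse liver expression".split()
ranked = sorted(docs, key=lambda d: bm25_score(query, d, docs), reverse=True)
print(ranked[0])  # → ['gene', 'expression', 'in', 'mouse', 'liver']
```

The length normalization term (controlled by b) is what makes indexing unit size matter: the same matches in a long full-text "document" score lower than in a short span, which is one mechanism behind the span-versus-article results reported above.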

  11. Is searching full text more effective than searching abstracts?

    Directory of Open Access Journals (Sweden)

    Lin Jimmy

    2009-02-01

    Full Text Available Abstract Background With the growing availability of full-text articles online, scientists and other consumers of the life sciences literature now have the ability to go beyond searching bibliographic records (title, abstract, metadata) to directly access full-text content. Motivated by this emerging trend, I posed the following question: is searching full text more effective than searching abstracts? This question is answered by comparing text retrieval algorithms on MEDLINE® abstracts, full-text articles, and spans (paragraphs) within full-text articles using data from the TREC 2007 genomics track evaluation. Two retrieval models are examined: bm25 and the ranking algorithm implemented in the open-source Lucene search engine. Results Experiments show that treating an entire article as an indexing unit does not consistently yield higher effectiveness compared to abstract-only search. However, retrieval based on spans, or paragraphs-sized segments of full-text articles, consistently outperforms abstract-only search. Results suggest that highest overall effectiveness may be achieved by combining evidence from spans and full articles. Conclusion Users searching full text are more likely to find relevant articles than searching only abstracts. This finding affirms the value of full text collections for text retrieval and provides a starting point for future work in exploring algorithms that take advantage of rapidly-growing digital archives. Experimental results also highlight the need to develop distributed text retrieval algorithms, since full-text articles are significantly longer than abstracts and may require the computational resources of multiple machines in a cluster. The MapReduce programming model provides a convenient framework for organizing such computations.

  12. Comparison of Decisions Quality of Heuristic Methods with Limited Depth-First Search Techniques in the Graph Shortest Path Problem

    Directory of Open Access Journals (Sweden)

    Vatutin Eduard

    2017-12-01

    Full Text Available The article analyzes the effectiveness of heuristic methods that use limited depth-first search techniques in the test problem of finding the shortest path in a graph. It briefly describes the group of methods based on limiting the number of branches in the combinatorial search tree and the depth of the analyzed subtrees. The methodology for comparing experimental data to estimate solution quality, based on computational experiments with samples of pseudo-random graphs of selected vertex and arc counts using the BOINC platform, is considered. The experimental results identify the areas where the selected subset of heuristic methods is preferable, depending on problem size and the strength of the constraints. It is shown that the considered pair of methods is ineffective on this problem and yields solutions significantly inferior in quality to those provided by the ant colony optimization method and its modification with combinatorial returns.
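
As a generic illustration of the depth-limiting idea (not the authors' exact algorithms), the sketch below bounds the number of edges a depth-first search may follow: the search becomes cheaper but may miss, or misprice, paths longer than the limit.

```python
import math

def dfs_limited(graph, node, goal, depth, cost=0.0, visited=frozenset()):
    """Depth-limited DFS: cheapest cost to `goal` using at most `depth` edges,
    or infinity if no such path exists within the limit."""
    if node == goal:
        return cost
    if depth == 0:
        return math.inf
    best = math.inf
    for nxt, w in graph.get(node, []):
        if nxt not in visited:
            best = min(best, dfs_limited(graph, nxt, goal, depth - 1,
                                         cost + w, visited | {node}))
    return best

graph = {
    'a': [('b', 1), ('c', 4)],
    'b': [('d', 5)],
    'c': [('d', 1)],
}
print(dfs_limited(graph, 'a', 'd', depth=2))  # a→c→d found within 2 edges → 5.0
print(dfs_limited(graph, 'a', 'd', depth=1))  # no 1-edge path to d → inf
```

Tightening the depth (or branch) limit shrinks the explored subtree, which is exactly the trade-off between solution quality and search effort that the comparison above measures.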

  13. Comparison of Decisions Quality of Heuristic Methods with Limited Depth-First Search Techniques in the Graph Shortest Path Problem

    Science.gov (United States)

    Vatutin, Eduard

    2017-12-01

    The article analyzes the effectiveness of heuristic methods that use limited depth-first search techniques in the test problem of finding the shortest path in a graph. It briefly describes the group of methods based on limiting the number of branches in the combinatorial search tree and the depth of the analyzed subtrees. The methodology for comparing experimental data to estimate solution quality, based on computational experiments with samples of pseudo-random graphs of selected vertex and arc counts using the BOINC platform, is considered. The experimental results identify the areas where the selected subset of heuristic methods is preferable, depending on problem size and the strength of the constraints. It is shown that the considered pair of methods is ineffective on this problem and yields solutions significantly inferior in quality to those provided by the ant colony optimization method and its modification with combinatorial returns.

  14. DEVELOPING AND PROPOSING A CONCEPTUAL MODEL OF THE FLOW EXPERIENCE DURING ONLINE INFORMATION SEARCH

    Directory of Open Access Journals (Sweden)

    Lazoc Alina

    2012-07-01

    Full Text Available Information search is an essential part of the consumer's decision-making process. The online medium offers new opportunities and challenges for information search activities (in and outside the marketing context), and we are interested in how human information experiences and behaviors are affected by this. Online games and social web activities are often perceived as challenging, engaging, and enjoyable, while online information search is rated far below them. Our research proposal implies that using the online medium for information search may provoke enjoyable experiences through the flow state, which may in turn positively influence an individual's exploratory information behavior and encourage his or her pro-active market behavior. The present study sets out to improve the understanding of the online medium's impact on human exploratory behavior. We hypothesize that including the online flow experience in our research model will better explain exploratory information search behaviors. An 11-component conceptual framework is proposed to explain the manifestations of flow, its personal and technological determinants, and its behavioral consequences in the context of online information search. The primary purpose of our research is to present an integrated online flow model; its secondary objective is to stimulate extended research in the area of informational behaviors in the digital age. The paper is organized in three sections. In the first section we briefly report the results of an analysis of the most relevant online flow theory literature and, drawing on it, try to identify variables and the relationships among them. In the second part we propose a research model and use prior flow models to specify a range of testable hypotheses. Drawing on the conceptual model developed, the last section presents the final conclusions and proposes further steps for evaluating the model's validity. Future research directions

  15. Redundancy allocation of series-parallel systems using a variable neighborhood search algorithm

    International Nuclear Information System (INIS)

    Liang, Y.-C.; Chen, Y.-C.

    2007-01-01

    This paper presents a meta-heuristic algorithm, variable neighborhood search (VNS), for the redundancy allocation problem (RAP). The RAP, an NP-hard problem, has attracted the attention of much prior research, generally in a restricted form where each subsystem must consist of identical components. The newer meta-heuristic methods overcome this limitation and offer a practical way to solve large instances of the relaxed RAP where different components can be used in parallel. The authors' previously published work has shown promise for the variable neighborhood descent (VND) method, the simplest version among VNS variations, on the RAP. The variable neighborhood search method itself has not been used in reliability design, yet it is a method that fits combinatorial problems with potential neighborhood structures, as in the case of the RAP. Therefore, the authors further extended their work to develop a VNS algorithm for the RAP and tested it on a set of well-known benchmark problems from the literature. Results on 33 test instances ranging from less to severely constrained conditions show that the variable neighborhood search method improves the performance of VND and provides competitive solution quality at economical computational expense in comparison with the best-known heuristics, including ant colony optimization, genetic algorithm, and tabu search.
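
The VNS scheme described above alternates between "shaking" in progressively larger neighborhoods and local descent. The sketch below shows that skeleton on a toy bit-string objective; it is a generic VNS illustration, not the authors' RAP formulation.

```python
import random

def local_search(score, x):
    """First-improvement descent: flip bits while any single flip improves the score."""
    s = score(x)
    improved = True
    while improved:
        improved = False
        for i in range(len(x)):
            y = x[:i] + (1 - x[i],) + x[i + 1:]
            if score(y) < s:
                x, s, improved = y, score(y), True
                break
    return x, s

def shake(k):
    """Neighborhood N_k: flip k randomly chosen bits."""
    def move(x, rng):
        y = list(x)
        for i in rng.sample(range(len(x)), k):
            y[i] = 1 - y[i]
        return tuple(y)
    return move

def vns(score, x0, neighborhoods, iters=100, seed=0):
    """Basic VNS: shake in N_k, descend locally; on improvement return to N_1,
    otherwise move to the next (larger) neighborhood."""
    rng = random.Random(seed)
    x, s = local_search(score, x0)
    k = 0
    for _ in range(iters):
        y, sy = local_search(score, neighborhoods[k](x, rng))
        if sy < s:
            x, s, k = y, sy, 0
        else:
            k = (k + 1) % len(neighborhoods)
    return x, s

target = (1, 0, 1, 1, 0, 0, 1, 0)
score = lambda x: sum(a != b for a, b in zip(x, target))  # toy objective
best, s = vns(score, (0,) * 8, [shake(k) for k in (1, 2, 3)])
print(s)  # → 0: single-bit descent alone already solves this toy objective
```

On harder objectives with local optima, the systematic widening of the shake neighborhood is what lets VNS escape basins that VND (which never shakes) cannot leave.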

  17. An R-peak detection method that uses an SVD filter and a search back system.

    Science.gov (United States)

    Jung, Woo-Hyuk; Lee, Sang-Goog

    2012-12-01

    In this paper, we present a method for detecting the R-peak of an ECG signal by using a singular value decomposition (SVD) filter and a search back system. The ECG signal was processed in two phases: the pre-processing phase and the decision phase. The pre-processing phase consisted of stages for the SVD filter, a Butterworth high-pass filter (HPF), a moving average (MA), and squaring, whereas the decision phase consisted of a single stage that detected the R-peak. In the pre-processing phase, the SVD filter removed noise while the Butterworth HPF eliminated baseline wander. The MA removed the noise remaining in the signal after the SVD filter to make the signal smooth, and squaring strengthened the signal. In the decision phase, a threshold was used to set the interval before detecting the R-peak. When the latest R-R interval (RRI), as suggested by Hamilton et al., was greater than 150% of the previous RRI, the detection interval was modified to be at least 150% of the smaller of the two most recent RRIs. When the modified search back system was used, the error rate of the peak detection decreased to 0.29%, compared to 1.34% when it was not used. Consequently, the sensitivity was 99.47%, the positive predictivity was 99.47%, and the detection error was 1.05%. Furthermore, the quality of the signal in data with a substantial amount of noise was improved, and thus the R-peak was detected effectively. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
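The pre-processing/decision pipeline can be illustrated with simplified stand-ins: a first-difference filter plays the role of the SVD filter and Butterworth HPF, and a fixed threshold replaces the paper's adaptive rules and search back system. The synthetic signal and all parameter values below are assumptions for illustration only:

```python
import math

def moving_average(x, w):
    out = []
    for i in range(len(x)):
        lo = max(0, i - w + 1)
        out.append(sum(x[lo:i + 1]) / (i - lo + 1))
    return out

def detect_peaks(sig, fs, thresh_ratio=0.5, refractory=0.2):
    # crude high-pass: first difference removes slow baseline wander
    hp = [0.0] + [sig[i] - sig[i - 1] for i in range(1, len(sig))]
    sq = [v * v for v in hp]                    # squaring strengthens peaks
    feat = moving_average(sq, int(0.08 * fs))   # smoothing stage
    thr = thresh_ratio * max(feat)
    peaks, last = [], -10 ** 9
    for i, v in enumerate(feat):
        if v > thr and (i - last) / fs > refractory:
            peaks.append(i)                     # decision stage
            last = i
    return peaks

# synthetic "ECG": spikes every second on a slow baseline, fs = 100 Hz
fs = 100
sig = [0.3 * math.sin(2 * math.pi * 0.05 * i / fs) for i in range(1000)]
for p in range(50, 1000, 100):
    sig[p] += 2.0
peaks = detect_peaks(sig, fs)
print(len(peaks))
```

The refractory period (0.2 s here) prevents one QRS complex from producing multiple detections, mirroring the role of the RRI-based interval rules in the paper.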

  18. Engineering Bacteria to Search for Specific Concentrations of Molecules by a Systematic Synthetic Biology Design Method.

    Science.gov (United States)

    Tien, Shin-Ming; Hsu, Chih-Yuan; Chen, Bor-Sen

    2016-01-01

    Bacteria navigate environments full of various chemicals to seek favorable places for survival by controlling the flagella's rotation using a complicated signal transduction pathway. By influencing the pathway, bacteria can be engineered to search for specific molecules, which has great potential for application to biomedicine and bioremediation. In this study, genetic circuits were constructed to make bacteria search for a specific molecule at particular concentrations in their environment through a synthetic biology method. In addition, by replacing the "brake component" in the synthetic circuit with some specific sensitivities, the bacteria can be engineered to locate areas containing specific concentrations of the molecule. Measured by the swarm assay qualitatively and microfluidic techniques quantitatively, the characteristics of each "brake component" were identified and represented by a mathematical model. Furthermore, we established another mathematical model to anticipate the characteristics of the "brake component". Based on this model, an abundant component library can be established to provide adequate component selection for different searching conditions without identifying all components individually. Finally, a systematic design procedure was proposed. Following this systematic procedure, one can design a genetic circuit for bacteria to rapidly search for and locate different concentrations of particular molecules by selecting the most adequate "brake component" in the library. Moreover, following simple procedures, one can also establish an exclusive component library suitable for other cultivated environments, promoter systems, or bacterial strains.

  20. Infodemiology and infoveillance: framework for an emerging set of public health informatics methods to analyze search, communication and publication behavior on the Internet.

    Science.gov (United States)

    Eysenbach, Gunther

    2009-03-27

    Infodemiology can be defined as the science of distribution and determinants of information in an electronic medium, specifically the Internet, or in a population, with the ultimate aim to inform public health and public policy. Infodemiology data can be collected and analyzed in near real time. Examples of infodemiology applications include the analysis of queries from Internet search engines to predict disease outbreaks (e.g. influenza), monitoring people's status updates on microblogs such as Twitter for syndromic surveillance, detecting and quantifying disparities in health information availability, identifying and monitoring public health relevant publications on the Internet (e.g. anti-vaccination sites, but also news articles or expert-curated outbreak reports), automated tools to measure information diffusion and knowledge translation, and tracking the effectiveness of health marketing campaigns. Moreover, analyzing how people search and navigate the Internet for health-related information, as well as how they communicate and share this information, can provide valuable insights into the health-related behavior of populations. Seven years after the infodemiology concept was first introduced, this paper revisits the emerging fields of infodemiology and infoveillance and proposes an expanded framework, introducing some basic metrics such as information prevalence, concept occurrence ratios, and information incidence. The framework distinguishes supply-based applications (analyzing what is being published on the Internet, e.g. on Web sites, newsgroups, blogs, microblogs and social media) from demand-based methods (search and navigation behavior), and further distinguishes passive from active infoveillance methods. Infodemiology metrics follow population health relevant events or predict them. Thus, these metrics and methods are potentially useful for public health practice and research, and should be further developed and standardized.
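A concept occurrence ratio of the kind proposed can be computed from query logs as the share of queries mentioning a concept in a given period. The weekly logs below are invented toy data, not from the paper:

```python
# toy weekly query logs (invented data)
logs = {
    "2009-W10": ["flu symptoms", "weather", "flu shot", "news"],
    "2009-W11": ["flu symptoms", "flu shot", "flu treatment", "news"],
}

def occurrence_ratio(queries, concept):
    """Share of queries in which the concept string occurs."""
    hits = sum(concept in q for q in queries)
    return hits / len(queries)

for week, qs in sorted(logs.items()):
    print(week, occurrence_ratio(qs, "flu"))
```

Tracked week over week, a rising ratio is the kind of demand-based signal that infoveillance would follow or use to predict an outbreak.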

  1. Geant4 Developments for the Radon Electric Dipole Moment Search at TRIUMF

    Science.gov (United States)

    Rand, E. T.; Bangay, J. C.; Bianco, L.; Dunlop, R.; Finlay, P.; Garrett, P. E.; Leach, K. G.; Phillips, A. A.; Sumithrarachchi, C. S.; Svensson, C. E.; Wong, J.

    2011-09-01

    An experiment is being developed at TRIUMF to search for a time-reversal violating electric dipole moment (EDM) in odd-A isotopes of Rn. Extensive simulations of the experiment are being performed with GEANT4 to study the backgrounds and sensitivity of the proposed measurement technique involving the detection of γ rays emitted following the β decay of polarized Rn nuclei. GEANT4 developments for the RnEDM experiment include both realistic modelling of the detector geometry and full tracking of the radioactive β, γ, internal conversion, and x-ray processes, including the γ-ray angular distributions essential for measuring an atomic EDM.

  2. Joint LIGO and TAMA300 search for gravitational waves from inspiralling neutron star binaries

    International Nuclear Information System (INIS)

    Abbott, B.; Abbott, R.; Adhikari, R.; Agresti, J.; Anderson, S.B.; Araya, M.; Armandula, H.; Asiri, F.; Barish, B.C.; Barnes, M.; Barton, M.A.; Bhawal, B.; Billingsley, G.; Black, E.; Blackburn, K.; Bork, R.; Brown, D.A.; Busby, D.; Cardenas, L.; Chandler, A.

    2006-01-01

    We search for coincident gravitational wave signals from inspiralling neutron star binaries using LIGO and TAMA300 data taken during early 2003. Using a simple trigger exchange method, we perform an intercollaboration coincidence search during times when TAMA300 and only one of the LIGO sites were operational. We find no evidence of any gravitational wave signals. We place an observational upper limit on the rate of binary neutron star coalescence with component masses between 1 and 3 M⊙ of 49 per year per Milky Way equivalent galaxy at a 90% confidence level. The methods developed during this search will find application in future network inspiral analyses.
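At its core, a trigger-exchange coincidence search reduces to pairing triggers from two detectors whose times fall within a coincidence window. The trigger times and window below are illustrative assumptions, not LIGO/TAMA300 values:

```python
def coincidences(triggers_a, triggers_b, window=0.05):
    """Pairs of trigger times (seconds) from two detectors that lie
    within the coincidence window of each other."""
    pairs = []
    for ta in triggers_a:
        for tb in triggers_b:
            if abs(ta - tb) <= window:
                pairs.append((ta, tb))
    return pairs

# invented trigger times for two detectors
site_a = [10.00, 42.31, 97.60]
site_b = [10.02, 55.00, 97.58]
print(coincidences(site_a, site_b))
```

Uncoincident triggers (42.31 and 55.00 here) are discarded, which is how a network search suppresses instrumental noise that appears in only one detector.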

  3. Development and Evaluation of Thesauri-Based Bibliographic Biomedical Search Engine

    Science.gov (United States)

    Alghoson, Abdullah

    2017-01-01

    Due to the large volume and exponential growth of biomedical documents (e.g., books, journal articles), it has become increasingly challenging for biomedical search engines to retrieve relevant documents based on users' search queries. Part of the challenge is the matching mechanism of free-text indexing that performs matching based on…

  4. A penalty guided stochastic fractal search approach for system reliability optimization

    International Nuclear Information System (INIS)

    Mellal, Mohamed Arezki; Zio, Enrico

    2016-01-01

    Modern industry requires components and systems with high reliability levels. In this paper, we address the system reliability optimization problem. A penalty guided stochastic fractal search approach is developed for solving reliability allocation, redundancy allocation, and reliability–redundancy allocation problems. Numerical results for ten case studies are presented as benchmark problems, highlighting the superiority of the proposed approach compared to others from the literature. - Highlights: • System reliability optimization is investigated. • A penalty guided stochastic fractal search approach is developed. • Results of ten case studies are compared with previously published methods. • Performance of the approach is demonstrated.

  5. Forward-Looking Search Within Innovation Projects

    DEFF Research Database (Denmark)

    Jissink, Tymen; Rohrbeck, René; Schweitzer, Fiona

    To develop highly innovative projects, which are fraught with uncertainty and longer development times, one cannot rely solely on initial planning and budgeting to ensure the project's outcome remains novel. This study posits that to develop innovative projects, project teams need to engage in forward-looking search during development to ensure the project's outcome remains novel and relevant. We refer to forward-looking search as the search for and evaluation of information on markets, customers, and technologies in terms of their future impact. Data on 159 unique innovation projects from the Danish manufacturing industry shows that forward-looking search significantly impacts innovativeness. The effect follows an inverted-U shape, where the greatest positive effect on innovativeness occurs in moderately planned projects, with significantly lower effects in low- and highly planned projects…

  6. Searching for qualitative research for inclusion in systematic reviews: a structured methodological review.

    Science.gov (United States)

    Booth, Andrew

    2016-05-04

    the prospect of rapid development of search methods.

  7. Policy implications for familial searching.

    Science.gov (United States)

    Kim, Joyce; Mammo, Danny; Siegel, Marni B; Katsanis, Sara H

    2011-11-01

    In the United States, several states have made policy decisions regarding whether and how to use familial searching of the Combined DNA Index System (CODIS) database in criminal investigations. Familial searching pushes DNA typing beyond merely identifying individuals to detecting genetic relatedness, an application previously reserved for missing persons identifications and custody battles. The intentional search of CODIS for partial matches to an item of evidence offers law enforcement agencies a powerful tool for developing investigative leads, apprehending criminals, revitalizing cold cases and exonerating wrongfully convicted individuals. As familial searching involves a range of logistical, social, ethical and legal considerations, states are now grappling with policy options for implementing familial searching to balance crime fighting with its potential impact on society. When developing policies for familial searching, legislators should take into account the impact of familial searching on select populations and the need to minimize personal intrusion on relatives of individuals in the DNA database. This review describes the approaches used to narrow a suspect pool from a partial match search of CODIS and summarizes the economic, ethical, logistical and political challenges of implementing familial searching. We examine particular US state policies and the policy options adopted to address these issues. The aim of this review is to provide objective background information on the controversial approach of familial searching to inform policy decisions in this area. Herein we highlight key policy options and recommendations regarding effective utilization of familial searching that minimize harm to and afford maximum protection of US citizens.

  8. Modified harmony search

    Science.gov (United States)

    Mohamed, Najihah; Lutfi Amri Ramli, Ahmad; Majid, Ahmad Abd; Piah, Abd Rahni Mt

    2017-09-01

    A metaheuristic algorithm called Harmony Search (HS) is widely applied to parameter optimization in many areas. HS is a derivative-free real-parameter optimization algorithm that draws inspiration from the musical improvisation process of searching for a perfect state of harmony. This paper proposes a Modified Harmony Search (MHS) for solving optimization problems, which employs concepts from the genetic algorithm method and particle swarm optimization to generate new solution vectors, enhancing the performance of the HS algorithm. The performances of MHS and HS are investigated on ten benchmark optimization problems in order to make a comparison reflecting the efficiency of MHS in terms of final accuracy, convergence speed and robustness.
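The basic HS improvisation loop that MHS builds on (memory consideration, pitch adjustment, random selection, replace the worst harmony) can be sketched as follows. The abstract does not detail the GA/PSO operators of MHS, so this shows plain HS on a sphere function with assumed parameter values:

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.1,
                   iters=2000, seed=7):
    """Basic Harmony Search minimizing f over box bounds."""
    rng = random.Random(seed)
    # harmony memory: hms random solution vectors
    hm = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                 # memory consideration
                v = rng.choice(hm)[d]
                if rng.random() < par:              # pitch adjustment
                    v = min(hi, max(lo, v + rng.uniform(-bw, bw)))
            else:                                   # random selection
                v = rng.uniform(lo, hi)
            new.append(v)
        worst = max(range(hms), key=lambda i: f(hm[i]))
        if f(new) < f(hm[worst]):                   # replace worst harmony
            hm[worst] = new
    return min(hm, key=f)

sphere = lambda x: sum(v * v for v in x)
best = harmony_search(sphere, [(-5, 5)] * 3)
print(round(sphere(best), 4))
```

MHS-style modifications typically act on the improvisation step, e.g. recombining memory rows (GA crossover) or biasing pitch adjustment toward the best harmony (PSO-like attraction).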

  9. Optimal Route Searching with Multiple Dynamical Constraints—A Geometric Algebra Approach

    Directory of Open Access Journals (Sweden)

    Dongshuang Li

    2018-05-01

    Full Text Available The process of searching for a dynamic constrained optimal path has received increasing attention in traffic planning, evacuation, and personalized or collaborative traffic service. As most existing multiple constrained optimal path (MCOP methods cannot search for a path given various types of constraints that dynamically change during the search, few approaches for dynamic multiple constrained optimal path (DMCOP with type II dynamics are available for practical use. In this study, we develop a method to solve the DMCOP problem with type II dynamics based on the unification of various types of constraints under a geometric algebra (GA framework. In our method, the network topology and three different types of constraints are represented by using algebraic base coding. With a parameterized optimization of the MCOP algorithm based on a greedy search strategy under the generation-refinement paradigm, this algorithm is found to accurately support the discovery of optimal paths as the constraints of numerical values, nodes, and route structure types are dynamically added to the network. The algorithm was tested with simulated cases of optimal tourism route searches in China’s road networks with various combinations of constraints. The case study indicates that our algorithm can not only solve the DMCOP with different types of constraints but also use constraints to speed up the route filtering.
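Setting aside the geometric algebra encoding, the effect of dynamically adding a constraint to an optimal route search can be illustrated with a plain Dijkstra search that accepts a node-exclusion constraint. The graph below is a toy assumption, not the paper's road-network data:

```python
import heapq

def shortest_path(graph, src, dst, forbidden=frozenset()):
    """Dijkstra search with a simple node-exclusion constraint."""
    pq, seen = [(0, src, [src])], set()
    while pq:
        d, u, path = heapq.heappop(pq)
        if u == dst:
            return d, path
        if u in seen:
            continue
        seen.add(u)
        for v, w in graph.get(u, []):
            if v not in seen and v not in forbidden:
                heapq.heappush(pq, (d + w, v, path + [v]))
    return None

g = {"A": [("B", 1), ("C", 4)], "B": [("C", 1), ("D", 5)], "C": [("D", 1)]}
print(shortest_path(g, "A", "D"))                   # unconstrained
print(shortest_path(g, "A", "D", forbidden={"C"}))  # constraint added
```

Adding the constraint reroutes the optimum from A-B-C-D (cost 3) to A-B-D (cost 6); the DMCOP setting generalizes this to numerical, node, and route-structure constraints arriving during the search.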

  10. The Weaknesses of Full-Text Searching

    Science.gov (United States)

    Beall, Jeffrey

    2008-01-01

    This paper provides a theoretical critique of the deficiencies of full-text searching in academic library databases. Because full-text searching relies on matching words in a search query with words in online resources, it is an inefficient method of finding information in a database. This matching fails to retrieve synonyms, and it also retrieves…

  11. New Architectures for Presenting Search Results Based on Web Search Engines Users Experience

    Science.gov (United States)

    Martinez, F. J.; Pastor, J. A.; Rodriguez, J. V.; Lopez, Rosana; Rodriguez, J. V., Jr.

    2011-01-01

    Introduction: The Internet is a dynamic environment which is continuously being updated. Search engines have been, currently are and in all probability will continue to be the most popular systems in this information cosmos. Method: In this work, special attention has been paid to the series of changes made to search engines up to this point,…

  12. Heuristic Search Theory and Applications

    CERN Document Server

    Edelkamp, Stefan

    2011-01-01

    Search has been vital to artificial intelligence from the very beginning as a core technique in problem solving. The authors present a thorough overview of heuristic search with a balance of discussion between theoretical analysis and efficient implementation and application to real-world problems. Current developments in search such as pattern databases and search with efficient use of external memory and parallel processing units on main boards and graphics cards are detailed. Heuristic search as a problem solving tool is demonstrated in applications for puzzle solving, game playing, constra

  13. DRUMS: a human disease related unique gene mutation search engine.

    Science.gov (United States)

    Li, Zuofeng; Liu, Xingnan; Wen, Jingran; Xu, Ye; Zhao, Xin; Li, Xuan; Liu, Lei; Zhang, Xiaoyan

    2011-10-01

    With the completion of the human genome project and the development of new methods for gene variant detection, the integration of mutation data and its phenotypic consequences has become more important than ever. Among all available resources, locus-specific databases (LSDBs) curate mutation data for one or more specific genes along with high-quality phenotypes. Although some genotype-phenotype data from LSDBs have been integrated into central databases, little effort has been made to integrate all these data through a search engine approach. In this work, we have developed the disease related unique gene mutation search engine (DRUMS), a convenient tool for biologists or physicians to retrieve gene variant and related phenotype information. Gene variant and phenotype information is stored in a gene-centred relational database. Moreover, the relationships between mutations and diseases are indexed by the uniform resource identifier from the LSDB or another central database. By querying DRUMS, users can access the most popular mutation databases under one interface. DRUMS can be treated as a domain-specific search engine. By using web crawling, indexing, and searching technologies, it provides a competitively efficient interface for searching and retrieving mutation data and their relationships to diseases. The present system is freely accessible at http://www.scbit.org/glif/new/drums/index.html. © 2011 Wiley-Liss, Inc.

  14. Gaussian variable neighborhood search for the file transfer scheduling problem

    Directory of Open Access Journals (Sweden)

    Dražić Zorica

    2016-01-01

    Full Text Available This paper presents new modifications of the Variable Neighborhood Search approach for solving the file transfer scheduling problem. To obtain better solutions in a small neighborhood of a current solution, we implement two new local search procedures. As Gaussian Variable Neighborhood Search showed promising results when solving continuous optimization problems, its implementation in solving the discrete file transfer scheduling problem is also presented. In order to apply this continuous optimization method to the discrete problem, an uncountable set of feasible solutions is mapped into a finite set. Both local search modifications gave better results for the large instances, as well as better average performance for medium and large instances. One local search modification achieved a significant acceleration of the algorithm. The numerical experiments showed that the results obtained by the Gaussian modifications are comparable with those obtained by standard VNS-based algorithms developed for combinatorial optimization, and in some cases the Gaussian modifications gave even better results. [Project of the Ministry of Science of the Republic of Serbia, no. 174010]

  15. Self-adaptive global best harmony search algorithm applied to reactor core fuel management optimization

    International Nuclear Information System (INIS)

    Poursalehi, N.; Zolfaghari, A.; Minuchehr, A.; Valavi, K.

    2013-01-01

    Highlights: • SGHS enhanced the convergence rate of LPO through several improvements over basic HS and GHS. • The SGHS optimization algorithm obtained better average fitness than the basic HS and GHS algorithms. • The outcome of the SGHS implementation in LPO reveals its flexibility, efficiency and reliability. - Abstract: The aim of this work is to apply a newly developed optimization algorithm, Self-adaptive Global best Harmony Search (SGHS), to PWR fuel management optimization. The SGHS algorithm includes some modifications relative to the basic Harmony Search (HS) and Global-best Harmony Search (GHS) algorithms, such as dynamically changing parameters. To demonstrate the ability of SGHS to find an optimal configuration of fuel assemblies, basic HS and GHS algorithms were also developed and investigated. For this purpose, the Self-adaptive Global best Harmony Search Nodal Expansion package (SGHSNE) was developed, implementing the HS, GHS and SGHS optimization algorithms for the fuel management operation of nuclear reactor cores. This package uses a newly developed average current nodal expansion code which solves the multigroup diffusion equation by employing first and second orders of the Nodal Expansion Method (NEM) for two-dimensional hexagonal and rectangular geometries, respectively, with one node per fuel assembly. Loading pattern optimization was performed using the SGHSNE package for several test cases to demonstrate the SGHS algorithm's capability of converging to a near-optimal loading pattern. Results indicate that the convergence rate and reliability of the SGHS method are quite promising and that SGHS practically improves the quality of loading pattern optimization results relative to the HS and GHS algorithms. As a result, it has the potential to be used in other nuclear engineering optimization problems.

  16. Search-Order Independent State Caching

    DEFF Research Database (Denmark)

    Evangelista, Sami; Kristensen, Lars Michael

    2009-01-01

    State caching is a memory reduction technique used by model checkers to alleviate the state explosion problem. It has traditionally been coupled with a depth-first search to ensure termination. We propose and experimentally evaluate an extension of the state caching method for general state exploring algorithms that are independent of the search order (i.e., search algorithms that partition the state space into closed (visited) states, open (to visit) states and unmet states).
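The closed/open/unmet partition with a bounded closed-state cache can be sketched as follows. Eviction may cause a state to be re-expanded (extra work rather than incorrectness), and on this toy acyclic state space the exploration terminates regardless of search order. The state space and eviction policy are illustrative assumptions:

```python
from collections import OrderedDict, deque

def explore(init, successors, cache_size=4):
    """Breadth-first exploration with a bounded closed-state cache.
    Evicted states may be re-expanded, but on an acyclic state space
    the exploration still terminates."""
    cache = OrderedDict()                  # closed states, bounded
    open_ = deque([init])                  # open states; the rest are unmet
    expanded = 0
    while open_:
        s = open_.popleft()
        if s in cache:                     # already closed: skip
            continue
        cache[s] = True
        if len(cache) > cache_size:
            cache.popitem(last=False)      # evict oldest closed state
        expanded += 1
        open_.extend(successors(s))
    return expanded

# toy acyclic space: counters 0..9, successors i -> i+1, i+2
succ = lambda i: [j for j in (i + 1, i + 2) if j <= 9]
print(explore(0, succ, cache_size=3))      # small cache: some re-expansion
print(explore(0, succ, cache_size=100))    # full cache: each state once
```

The trade-off the paper studies is exactly this one: a smaller cache saves memory at the price of redundant expansions, and the challenge is bounding that redundancy without relying on depth-first order.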

  17. Methods and pitfalls in searching drug safety databases utilising the Medical Dictionary for Regulatory Activities (MedDRA).

    Science.gov (United States)

    Brown, Elliot G

    2003-01-01

    The Medical Dictionary for Regulatory Activities (MedDRA) is a unified standard terminology for recording and reporting adverse drug event data. Its introduction is widely seen as a significant improvement on the previous situation, where a multitude of terminologies of widely varying scope and quality were in use. However, there are some complexities that may cause difficulties, and these will form the focus for this paper. Two methods of searching MedDRA-coded databases are described: searching based on term selection from all of MedDRA and searching based on terms in the safety database. There are several potential traps for the unwary in safety searches. There may be multiple locations of relevant terms within a system organ class (SOC) and lack of recognition of appropriate group terms; the user may think that group terms are more inclusive than is the case. MedDRA may distribute terms relevant to one medical condition across several primary SOCs. If the database supports the MedDRA model, it is possible to perform multiaxial searching: while this may help find terms that might have been missed, it is still necessary to consider the entire contents of the SOCs to find all relevant terms and there are many instances of incomplete secondary linkages. It is important to adjust for multiaxiality if data are presented using primary and secondary locations. Other sources for errors in searching are non-intuitive placement and the selection of terms as preferred terms (PTs) that may not be widely recognised. Some MedDRA rules could also result in errors in data retrieval if the individual is unaware of these: in particular, the lack of multiaxial linkages for the Investigations SOC, Social circumstances SOC and Surgical and medical procedures SOC and the requirement that a PT may only be present under one High Level Term (HLT) and one High Level Group Term (HLGT) within any single SOC. Special Search Categories (collections of PTs assembled from various SOCs by
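The difference between searching only primary SOC locations and a multiaxial search, and why the Investigations SOC's lack of multiaxial links still causes misses, can be sketched on hypothetical MedDRA-like records. The terms and linkages below are invented for illustration, not actual MedDRA content:

```python
# hypothetical records: each preferred term (PT) has one primary SOC
# and possibly secondary (multiaxial) SOC links
terms = [
    {"pt": "Hepatic failure",
     "primary": "Hepatobiliary disorders", "secondary": []},
    {"pt": "Hepatic enzyme increased",
     "primary": "Investigations", "secondary": []},   # no multiaxial links
    {"pt": "Drug-induced liver injury",
     "primary": "Injury, poisoning and procedural complications",
     "secondary": ["Hepatobiliary disorders"]},
]

def search_primary(soc):
    """PTs whose primary location is the given SOC."""
    return [t["pt"] for t in terms if t["primary"] == soc]

def search_multiaxial(soc):
    """PTs linked to the SOC through primary or secondary locations."""
    return [t["pt"] for t in terms
            if soc == t["primary"] or soc in t["secondary"]]

print(search_primary("Hepatobiliary disorders"))
print(search_multiaxial("Hepatobiliary disorders"))
```

The multiaxial search picks up the term filed primarily elsewhere, but the Investigations term is still missed by both queries, which is the trap the paper warns about: a liver-safety search must also scan that SOC's contents explicitly.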

  18. Online Information Search Performance and Search Strategies in a Health Problem-Solving Scenario.

    Science.gov (United States)

    Sharit, Joseph; Taha, Jessica; Berkowsky, Ronald W; Profita, Halley; Czaja, Sara J

    2015-01-01

    Although access to Internet health information can be beneficial, solving complex health-related problems online is challenging for many individuals. In this study, we investigated the performance of a sample of 60 adults ages 18 to 85 years in using the Internet to resolve a relatively complex health information problem. The impact of age, Internet experience, and cognitive abilities on measures of search time, amount of search, and search accuracy was examined, and a model of Internet information seeking was developed to guide the characterization of participants' search strategies. Internet experience was found to have no impact on performance measures. Older participants exhibited longer search times and lower amounts of search but similar search accuracy performance as their younger counterparts. Overall, greater search accuracy was related to an increased amount of search but not to increased search duration and was primarily attributable to higher cognitive abilities, such as processing speed, reasoning ability, and executive function. There was a tendency for those who were younger, had greater Internet experience, and had higher cognitive abilities to use a bottom-up (i.e., analytic) search strategy, although use of a top-down (i.e., browsing) strategy was not necessarily unsuccessful. Implications of the findings for future studies and design interventions are discussed.

  19. Searching for stable Si(n)C(n) clusters: combination of stochastic potential surface search and pseudopotential plane-wave Car-Parinello simulated annealing simulations.

    Science.gov (United States)

    Duan, Xiaofeng F; Burggraf, Larry W; Huang, Lingyu

    2013-07-22

    To find low energy Si(n)C(n) structures out of hundreds to thousands of isomers we have developed a general method to search for stable isomeric structures that combines a stochastic potential surface search with pseudopotential plane-wave density functional theory Car-Parrinello molecular dynamics simulated annealing (PSPW-CPMD-SA). We enhanced the Sunders stochastic search method to generate random cluster structures used as seed structures for the PSPW-CPMD-SA simulations. This method ensures that each SA simulation samples a different region of the potential surface to find the regional minimum structure. By iterating this automated, parallel process on a high performance computer we located hundreds to more than a thousand stable isomers for each Si(n)C(n) cluster. Among these, five to ten of the lowest energy isomers were further optimized using the B3LYP/cc-pVTZ method. We applied this method to Si(n)C(n) (n = 4-12) clusters and found the lowest energy structures, most not previously reported. By analyzing the bonding patterns of the low energy structures of each Si(n)C(n) cluster, we observed that carbon segregations tend to form condensed conjugated rings, while Si connects to unsaturated bonds at the periphery of the carbon segregation as single atoms or small clusters when n is small; when n is large, a silicon network spans the carbon segregation region.

  20. Action Search: Learning to Search for Human Activities in Untrimmed Videos

    KAUST Repository

    Alwassel, Humam

    2017-06-13

    Traditional approaches for action detection use trimmed data to learn sophisticated action detector models. Although these methods have achieved great success at detecting human actions, we argue that a great deal of information is discarded when the process through which this trimmed data is obtained is ignored. In this paper, we propose Action Search, a novel approach that mimics the way people annotate activities in video sequences. Using a Recurrent Neural Network, Action Search can efficiently explore a video and determine the time boundaries during which an action occurs. Experiments on the THUMOS14 dataset reveal that our model is not only able to explore the video efficiently but also to find human activities accurately, outperforming state-of-the-art methods.

  1. Simplified automatic on-line document searching

    International Nuclear Information System (INIS)

    Ebinuma, Yukio

    1983-01-01

    The author proposes a search method for users who do not need comprehensive retrieval, that is, one that automatically provides a flexible number of related documents. A group of technical terms is used as search terms to express an inquiry. Logical sums of the terms, taken in ascending order of frequency of usage, are prepared sequentially and automatically, and the search formulas qsub(m) and qsub(m-1) that meet certain threshold values are then selected, also automatically. Users judge the precision of the search output on up to 20 items retrieved by the formula qsub(m). If a user wishes a recall ratio above 30%, the search result should be output by qsub(m); if he wishes one below 30%, it should be output by qsub(m-1). A search by this method using one year's volume of the INIS Database (76,600 items) and five inquiries resulted in a 32% recall ratio and a 36% precision ratio on average in the case of qsub(m). The connection time of a terminal was within 15 minutes per inquiry. The method was more efficient than an inexperienced searcher. It can be applied to on-line search systems for databases in which natural language only, or natural language and a controlled vocabulary, are used. (author)
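
    The scheme described above can be sketched in a few lines: order the inquiry terms by ascending database frequency, build cumulative OR formulas, and pick the widest formula whose hit count stays under a threshold. This is a hedged illustration of the idea, not the paper's implementation; the function names and the toy inverted index are invented.

```python
def build_formulas(terms_with_freq):
    """Cumulative OR groups q_1..q_n, rarest terms first."""
    ordered = sorted(terms_with_freq, key=lambda tf: tf[1])
    return [[t for t, _ in ordered[:k]] for k in range(1, len(ordered) + 1)]

def select_formula(formulas, hit_count, threshold):
    """Pick the widest formula whose result set is still <= threshold."""
    chosen = formulas[0]
    for q in formulas:
        if hit_count(q) <= threshold:
            chosen = q
        else:
            break
    return chosen

# Toy inverted index: term -> set of document ids.
index = {"reactor": {1, 2, 3, 4, 5, 6}, "dosimetry": {2, 3, 7}, "tritium": {7}}
freqs = [(t, len(docs)) for t, docs in index.items()]
formulas = build_formulas(freqs)
hits = lambda q: len(set().union(*(index[t] for t in q)))
print(select_formula(formulas, hits, threshold=4))  # → ['tritium', 'dosimetry']
```

    With threshold 4, the OR of the two rarest terms (3 hits) is accepted, while adding the most frequent term (7 hits) would overshoot, mirroring the qsub(m) / qsub(m-1) choice in the abstract.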

  2. Novel citation-based search method for scientific literature: application to meta-analyses

    NARCIS (Netherlands)

    Janssens, A.C.J.W.; Gwinn, M.

    2015-01-01

    Background: Finding eligible studies for meta-analysis and systematic reviews relies on keyword-based searching as the gold standard, despite its inefficiency. Searching based on direct citations is not sufficiently comprehensive. We propose a novel strategy that ranks articles on their degree of

  3. Search for Higgs bosons in ττqq-bar topologies with the Delphi detector at LEP

    International Nuclear Information System (INIS)

    Fichet, St.

    1998-03-01

    During my thesis, I carried out the official Higgs search in the ττqq-bar topologies for the Delphi collaboration. These final states are relevant to 3 different processes in the Higgs search, which lead to 3 different analyses. In 1996, I used an inclusive method to search for τ leptons in hadronic events, identifying τ as thin jets fairly well isolated from the hadronic system. These analyses gave good results, published in the official Delphi paper on the search for Higgs bosons. In 1997, with a higher luminosity, I developed an exclusive search for τ leptons in hadronic events. This method is a real improvement over the inclusive one in terms of background rejection and signal efficiency. This kind of identification is necessary to study τ polarization, which could help measure the Higgs spin. (author)

  4. Source Security Program in the Philippines: a lost source search experience

    International Nuclear Information System (INIS)

    Romallosa, Kristine M.; Salabit, Maria T.; Caseria, Estrella; Valdezco, Eulinia

    2008-01-01

    The Philippine Nuclear Research Institute (PNRI), the national agency for the licensing and regulation of radioactive materials in the country, is strengthening its capabilities in the security of radioactive sources. Part of this program is the PNRI's participation in the Regional Security of Radioactive Sources (RSRS) Project of the Australian Nuclear Science and Technology Organization (ANSTO). The project has provided equipment and methods training, assistance in the development of PNRI's own training program, and support for actual orphan source search activities. In May 2007, a source search for the two lost Cs-137 level gauges of a steel manufacturing company was conducted by the PNRI and ANSTO. The source search comprised: a) development of instrument and source search training for the team, the National Training Workshop on Orphan Source Searches, which was organized and conducted as a result of a train-the-trainers fellowship under the RSRS project; and b) planning and implementation of the lost source search activity. The actual search of the warehouses, product yard, canals, dust storage, steel making building, scrap yards and nearby junk shops of the steel plant took one week. The week-long search did not find the lost sources. However, naturally occurring radioactive materials, identified as thorium, were found on sand, brick and sack piles that are stored and/or generally present in the warehouses, yard and steel making building. The search activity therefore cleared the facility of the lost sources and their corresponding hazards. The NORM found on the plant's premises, on the other hand, brought to the management's attention the measures needed to ensure the safety of the staff from the possible hazards of these materials. Currently, the course syllabus that was developed is continuously enhanced to accommodate the training needs of the PNRI staff, particularly for emergency response and preparedness. This component of the source

  5. User Oriented Trajectory Search for Trip Recommendation

    KAUST Repository

    Ding, Ruogu

    2012-07-08

    Trajectory sharing and searching have received significant attention in recent years. In this thesis, we propose and investigate the methods to find and recommend the best trajectory to the traveler, and mainly focus on a novel technique named User Oriented Trajectory Search (UOTS) query processing. In contrast to conventional trajectory search by locations (spatial domain only), we consider both spatial and textual domains in the new UOTS query. Given a trajectory data set, the query input contains a set of intended places given by the traveler and a set of textual attributes describing the traveler’s preference. If a trajectory is connecting/close to the specified query locations, and the textual attributes of the trajectory are similar to the traveler’s preference, it will be recommended to the traveler. This type of query enables many popular applications such as trip planning and recommendation. There are two challenges in UOTS query processing: (i) how to constrain the searching range in the two domains and (ii) how to schedule multiple query sources effectively. To overcome the challenges and answer the UOTS query efficiently, a novel collaborative searching approach is developed. Conceptually, the UOTS query processing is conducted in the spatial and textual domains alternately. A pair of upper and lower bounds are devised to constrain the searching range in the two domains. In the meantime, a heuristic searching strategy based on priority ranking is adopted for scheduling the multiple query sources, which can further reduce the searching range and enhance the query efficiency notably. Furthermore, the devised collaborative searching approach can be extended to situations where the query locations are ordered. Extensive experiments are conducted on both real and synthetic trajectory data in road networks. Our approach is verified to be effective in reducing both CPU time and disk I/O time.

  6. Search for the Higgs boson in the ZH→v$\\bar{v}$b$\\bar{b}$ channel: Development of a b-tagging method based on soft muons

    Energy Technology Data Exchange (ETDEWEB)

    Jamin, David [Univ. of the Mediterranean, Marseille (France)

    2010-09-30

    In the Standard Model of particle physics, the Higgs boson generates the elementary particle masses. Current theoretical and experimental constraints lead to a Higgs boson mass between 114.4 and 158 GeV at 95% confidence level. Moreover, the Tevatron has recently excluded the mass ranges between 100 and 109 GeV and between 158 and 175 GeV at 95% confidence level. These results give a clear indication to search for a Higgs boson at low mass. The D0 detector is located near Chicago, at the Tevatron, a proton-antiproton collider with a center-of-mass energy of 1.96 TeV. The topic of this thesis is the search for a Higgs boson produced in association with a Z boson. This channel is sensitive to a low-mass Higgs boson (<135 GeV), for which the branching ratio H → bb varies between 50% and 90% in this mass range. The decay channel ZH → v$\\bar{v}$b$\\bar{b}$ studied here has a final state with 2 heavy-flavor jets and missing transverse energy due to escaping neutrinos. The heavy-flavor jet identification ('b-tagging') is done with a new algorithm (SLTNN) developed specifically for semi-leptonic decays of b quarks. The Higgs boson search analysis was performed with 3 fb-1 of data. The use of SLTNN increases the Higgs boson signal efficiency by 10%. The improvement in global analysis sensitivity, however, is rather low (<1%) after taking into account the backgrounds and systematic uncertainties.

  7. Development of the ATLAS High-Level Trigger Steering and Inclusive Searches for Supersymmetry

    CERN Document Server

    Eifert, T

    2009-01-01

    The presented thesis is divided into two distinct parts. The subject of the first part is the ATLAS high-level trigger (HLT), in particular the development of the HLT Steering, and the trigger user-interface. The second part presents a study of inclusive supersymmetry searches, including a novel background estimation method for the relevant Standard Model (SM) processes. The trigger system of the ATLAS experiment at the Large Hadron Collider (LHC) performs the on-line physics selection in three stages: level-1 (LVL1), level-2 (LVL2), and the event filter (EF). LVL2 and EF together form the HLT. The HLT receives events containing detector data from high-energy proton (or heavy ion) collisions, which pass the LVL1 selection at a maximum rate of 75 kHz. It must reduce this rate to ~200 Hz, while retaining the most interesting physics. The HLT is a software trigger and runs on a large computing farm. At the heart of the HLT is the Steering software. The HLT Steering must reach a decision whether or not to accept ...

  8. ISART: A Generic Framework for Searching Books with Social Information.

    Science.gov (United States)

    Yin, Xu-Cheng; Zhang, Bo-Wen; Cui, Xiao-Ping; Qu, Jiao; Geng, Bin; Zhou, Fang; Song, Li; Hao, Hong-Wei

    2016-01-01

    Effective book search has been discussed for decades and remains an open problem in areas as diverse as computer science, informatics, e-commerce and even culture and the arts. A variety of social information contents (e.g., ratings, tags and reviews) emerge with the huge number of books on the Web, but how they can be utilized for searching and finding books is seldom investigated. Here we develop an Integrated Search And Recommendation Technology (IsArt), which breaks new ground by providing a generic framework for searching books with rich social information. IsArt comprises a search engine to rank books by book contents and professional metadata, a Generalized Content-based Filtering model to thereafter rerank books by user-generated social contents, and a learning-to-rank technique to finally combine a wide range of diverse reranking results. Experiments show that this technology permits embedding social information to improve book search effectiveness, and that IsArt, by making use of it, has the best performance on the CLEF/INEX Social Book Search Evaluation datasets of all 4 years (from 2011 to 2014), compared with some other state-of-the-art methods.

  9. An image segmentation method based on fuzzy C-means clustering and Cuckoo search algorithm

    Science.gov (United States)

    Wang, Mingwei; Wan, Youchuan; Gao, Xianjun; Ye, Zhiwei; Chen, Maolin

    2018-04-01

    Image segmentation is a significant step in image analysis and machine vision. Many approaches have been presented on this topic; among them, fuzzy C-means (FCM) clustering is one of the most widely used methods, owing to its efficiency and its ability to handle the ambiguity of images. However, the success of FCM cannot be guaranteed because it easily becomes trapped in local optima. Cuckoo search (CS) is a novel evolutionary algorithm that has been tested on several optimization problems and proved to be highly efficient. Therefore, a new segmentation technique blending FCM with the CS algorithm is put forward in this paper. The proposed method has been evaluated on several images and compared with other existing FCM techniques, such as genetic algorithm (GA) based FCM and particle swarm optimization (PSO) based FCM, in terms of fitness value. Experimental results indicate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper.
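
    To make the CS side of such a hybrid concrete, here is a minimal cuckoo search kernel of the kind the abstract describes, minimizing a simple sphere function that stands in for the FCM objective. This is an illustrative sketch: the population size, abandon fraction pa, and Lévy exponent are common textbook defaults, not settings taken from the paper.

```python
import math
import random

def sphere(x):
    # Stand-in objective; a real hybrid would plug in the FCM fitness here.
    return sum(v * v for v in x)

def levy_step(beta=1.5):
    # Mantegna's algorithm for a Levy-distributed step length.
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    u, v = random.gauss(0, sigma), random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(f, dim=2, n_nests=15, iters=200, pa=0.25, alpha=0.01):
    nests = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_nests)]
    best = min(nests, key=f)
    for _ in range(iters):
        # New cuckoo: Levy flight biased toward the current best nest.
        i = random.randrange(n_nests)
        new = [x + alpha * levy_step() * (x - b) for x, b in zip(nests[i], best)]
        j = random.randrange(n_nests)
        if f(new) < f(nests[j]):
            nests[j] = new  # replace a randomly chosen worse nest
        # Abandon a fraction pa of the worst nests and rebuild them at random.
        nests.sort(key=f)
        for k in range(n_nests - int(pa * n_nests), n_nests):
            nests[k] = [random.uniform(-5, 5) for _ in range(dim)]
        best = min(nests + [best], key=f)  # best-so-far never degrades
    return best

random.seed(1)
best = cuckoo_search(sphere)
print(round(sphere(best), 6))
```

    In the segmentation setting, each nest would encode a candidate set of cluster centers and f would be the FCM objective, so CS explores globally where plain FCM iteration would stall in a local optimum.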

  10. SA-Search: a web tool for protein structure mining based on a Structural Alphabet.

    Science.gov (United States)

    Guyon, Frédéric; Camproux, Anne-Claude; Hochez, Joëlle; Tufféry, Pierre

    2004-07-01

    SA-Search is a web tool that can be used to mine for protein structures and extract structural similarities. It is based on a hidden Markov model derived Structural Alphabet (SA) that allows the compression of three-dimensional (3D) protein conformations into a one-dimensional (1D) representation using a limited number of prototype conformations. Using such a representation, classical methods developed for amino acid sequences can be employed. Currently, SA-Search permits the performance of fast 3D similarity searches such as the extraction of exact words using a suffix tree approach, and the search for fuzzy words viewed as a simple 1D sequence alignment problem. SA-Search is available at http://bioserv.rpbs.jussieu.fr/cgi-bin/SA-Search.
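
    The 1D idea behind this record is easy to demonstrate: once 3D conformations are encoded as strings over a structural alphabet, exact shared "words" can be found with plain string machinery. SA-Search uses a suffix tree for this; in the hedged sketch below a naive k-mer index stands in for it, and the alphabet and strings are invented.

```python
def kmers(s, k):
    # All length-k substrings of a structural-alphabet string.
    return {s[i:i + k] for i in range(len(s) - k + 1)}

def exact_shared_words(sa_a, sa_b, k):
    """Exact words of length k present in both structural strings."""
    return sorted(kmers(sa_a, k) & kmers(sa_b, k))

# Hypothetical 1D encodings of two protein structures.
protein_a = "AABDCCABDA"
protein_b = "XABDCY"
print(exact_shared_words(protein_a, protein_b, 4))  # → ['ABDC']
```

    A suffix tree gives the same answer in time linear in the string lengths, which is what makes the approach fast enough for database-scale structure mining.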

  11. Generalised Adaptive Harmony Search: A Comparative Analysis of Modern Harmony Search

    Directory of Open Access Journals (Sweden)

    Jaco Fourie

    2013-01-01

    Full Text Available Harmony search (HS) was introduced in 2001 as a heuristic population-based optimisation algorithm. Since then HS has become a popular alternative to other heuristic algorithms like simulated annealing and particle swarm optimisation. However, some flaws, like the need for parameter tuning, were identified and have been the topic of much research over the last 10 years. Many variants of HS were developed to address some of these flaws, and most of them have made substantial improvements. In this paper we compare the performance of three recent HS variants: exploratory harmony search, self-adaptive harmony search, and dynamic local-best harmony search. We compare the accuracy of these algorithms using a set of well-known optimisation benchmark functions that include both unimodal and multimodal problems. Observations from this comparison led us to design a novel hybrid that combines the best attributes of these modern variants into a single optimiser called generalised adaptive harmony search.
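
    For readers unfamiliar with the baseline the variants build on, here is a minimal classic HS loop on a benchmark-style function. The HMCR/PAR/bandwidth values are conventional defaults, not the tuned settings of the variants compared in the paper.

```python
import random

def harmony_search(f, dim=2, hms=10, hmcr=0.9, par=0.3, bw=0.1,
                   iters=500, lo=-5.0, hi=5.0):
    # Harmony memory: hms candidate solutions drawn uniformly at random.
    memory = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if random.random() < hmcr:            # memory consideration
                x = random.choice(memory)[d]
                if random.random() < par:         # pitch adjustment
                    x += random.uniform(-bw, bw)
            else:                                 # random consideration
                x = random.uniform(lo, hi)
            new.append(min(hi, max(lo, x)))
        worst = max(range(hms), key=lambda i: f(memory[i]))
        if f(new) < f(memory[worst]):
            memory[worst] = new                   # replace worst harmony
    return min(memory, key=f)

random.seed(0)
best = harmony_search(lambda x: sum(v * v for v in x))
print(sum(v * v for v in best))
```

    The flaws the abstract mentions are visible here: hmcr, par and bw are fixed by hand, which is exactly what the self-adaptive and exploratory variants try to eliminate.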

  13. A DE-Based Scatter Search for Global Optimization Problems

    Directory of Open Access Journals (Sweden)

    Kun Li

    2015-01-01

    Full Text Available This paper proposes a hybrid scatter search (SS) algorithm for continuous global optimization problems, incorporating the evolution mechanism of differential evolution (DE) into the reference set updating procedure of SS to act as the new solution generation method. This hybrid algorithm is called a DE-based SS (SSDE) algorithm. Since different kinds of DE mutation operators have been proposed in the literature and have shown different search abilities on different kinds of problems, four traditional mutation operators are adopted in the hybrid SSDE algorithm. To adaptively select the mutation operator most appropriate to the current problem, an adaptive mechanism for the candidate mutation operators is developed. In addition, to enhance the exploration ability of SSDE, a reinitialization method is adopted to create a new population and subsequently construct a new reference set whenever the search process of SSDE is trapped in a local optimum. Computational experiments on benchmark problems show that the proposed SSDE is competitive with or superior to some state-of-the-art algorithms in the literature.
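
    Four classic DE mutation operators of the kind such a hybrid can choose among are shown below as pure functions on explicitly supplied population members, so the arithmetic is easy to check; F is the usual scale factor. This is a generic DE illustration under common naming conventions (rand/1, best/1, current-to-best/1, rand/2), not necessarily the paper's exact operator set.

```python
def rand_1(r1, r2, r3, F=0.5):
    # DE/rand/1: v = r1 + F * (r2 - r3)
    return [a + F * (b - c) for a, b, c in zip(r1, r2, r3)]

def best_1(best, r1, r2, F=0.5):
    # DE/best/1: v = best + F * (r1 - r2)
    return [a + F * (b - c) for a, b, c in zip(best, r1, r2)]

def current_to_best_1(cur, best, r1, r2, F=0.5):
    # DE/current-to-best/1: v = cur + F * (best - cur) + F * (r1 - r2)
    return [x + F * (b - x) + F * (p - q)
            for x, b, p, q in zip(cur, best, r1, r2)]

def rand_2(r1, r2, r3, r4, r5, F=0.5):
    # DE/rand/2: v = r1 + F * (r2 - r3) + F * (r4 - r5)
    return [a + F * (b - c) + F * (d - e)
            for a, b, c, d, e in zip(r1, r2, r3, r4, r5)]

print(rand_1([1.0], [2.0], [0.0]))                    # 1 + 0.5*(2-0) = [2.0]
print(current_to_best_1([0.0], [4.0], [1.0], [1.0]))  # 0 + 0.5*4 + 0 = [2.0]
```

    An adaptive mechanism like the one described would track each operator's recent success rate and bias the selection probabilities accordingly.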

  14. PLANE MATCHING WITH OBJECT-SPACE SEARCHING USING INDEPENDENTLY RECTIFIED IMAGES

    Directory of Open Access Journals (Sweden)

    H. Takeda

    2012-07-01

    Full Text Available In recent years, the social situation in cities has changed significantly, for example through redevelopment after massive earthquakes and through large-scale urban development. Numerical simulations can be used to study these phenomena. Such simulations require the construction of high-definition three-dimensional city models that accurately reflect the real world. Progress in sensor technology allows us to easily obtain multi-view images. However, the existing multi-image matching techniques are inadequate. In this paper, we propose a new technique for multi-image matching. Since the existing method of feature searching is complicated, we have developed a rectification method that can be applied independently to each image and does not depend on the stereo pair. Our study focuses on the object-space searching method, which produces mismatches due to occlusion or distortion of wall textures in images. Our proposed technique can also match building wall surfaces. The proposed technique has several advantages, and its usefulness is clarified through an experiment using actual images.

  15. An efficient search method for finding the critical slip surface using the compositional Monte Carlo technique

    International Nuclear Information System (INIS)

    Goshtasbi, K.; Ahmadi, M.; Naeimi, Y.

    2008-01-01

    Locating the critical slip surface and the associated minimum factor of safety are two complementary parts of a slope stability analysis. A large number of computer programs exist to solve slope stability problems. Most of these programs, however, have used inefficient and unreliable search procedures to locate the global minimum factor of safety. This paper presents an efficient and reliable method to determine the global minimum factor of safety, coupled with a modified version of the Monte Carlo technique. Examples are presented to illustrate the reliability of the proposed method.
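
    The core Monte Carlo idea can be sketched schematically: sample many candidate slip-surface parameters at random and keep the one with the lowest factor of safety (FS). The FS function below is a made-up smooth stand-in with its minimum at FS = 1.2; a real implementation would evaluate FS with a limit-equilibrium method, and the parameter ranges here are invented.

```python
import random

def fs_toy(center_x, radius):
    # Hypothetical FS landscape with its minimum of 1.2 near (2.0, 5.0).
    return 1.2 + (center_x - 2.0) ** 2 / 10 + (radius - 5.0) ** 2 / 20

def monte_carlo_min_fs(fs, n_trials=5000):
    best, best_fs = None, float("inf")
    for _ in range(n_trials):
        cx = random.uniform(-5.0, 10.0)   # slip-circle center abscissa
        r = random.uniform(1.0, 12.0)     # slip-circle radius
        val = fs(cx, r)
        if val < best_fs:
            best, best_fs = (cx, r), val
    return best, best_fs

random.seed(42)
surface, fs_min = monte_carlo_min_fs(fs_toy)
print(round(fs_min, 3))
```

    The modified Monte Carlo technique in the paper improves on this naive global sampling by concentrating later trials near promising surfaces; the sketch shows only the baseline sampling step.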

  16. A Practical, Robust and Fast Method for Location Localization in Range-Based Systems.

    Science.gov (United States)

    Huang, Shiping; Wu, Zhifeng; Misra, Anil

    2017-12-11

    Location localization technology is used in a number of industrial and civil applications. Real-time localization accuracy is highly dependent on the quality of the distance measurements and the efficiency of solving the localization equations. In this paper, we provide a novel approach to solve the nonlinear localization equations efficiently and simultaneously eliminate bad measurement data in range-based systems. A geometric intersection model was developed to narrow the target search area, in which Newton's Method and the Direct Search Method are used to search for the unknown position. Not only does the geometric intersection model offer a small bounded search domain for Newton's Method and the Direct Search Method, but it can also self-correct bad measurement data. The Direct Search Method is useful for coarse localization or a small target search domain, while Newton's Method can be used for accurate localization. For accurate localization, the proposed Modified Newton's Method (MNM) addresses the challenges of avoiding local extrema, singularities, and the choice of initial value. The applicability and robustness of the developed method have been demonstrated by experiments with an indoor system.
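
    The accurate-localization step can be illustrated with a Gauss-Newton iteration (a Newton-type method) on range residuals from three fixed anchors in 2D. The anchor positions and ranges below are synthetic, and the paper's geometric-intersection bounding and bad-measurement rejection are not reproduced; this is only a sketch of solving the nonlinear range equations.

```python
import math

ANCHORS = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]

def ranges_from(p):
    return [math.dist(p, a) for a in ANCHORS]

def gauss_newton(ranges, guess, iters=50):
    x, y = guess
    for _ in range(iters):
        # Residuals r_i = |p - a_i| - d_i and their 2-column Jacobian.
        res, jac = [], []
        for (ax, ay), d in zip(ANCHORS, ranges):
            dist = math.hypot(x - ax, y - ay)
            res.append(dist - d)
            jac.append(((x - ax) / dist, (y - ay) / dist))
        # Solve the 2x2 normal equations J^T J delta = -J^T r by hand.
        a = sum(jx * jx for jx, _ in jac)
        b = sum(jx * jy for jx, jy in jac)
        c = sum(jy * jy for _, jy in jac)
        gx = sum(jx * r for (jx, _), r in zip(jac, res))
        gy = sum(jy * r for (_, jy), r in zip(jac, res))
        det = a * c - b * b
        x -= (c * gx - b * gy) / det
        y -= (a * gy - b * gx) / det
    return x, y

true_pos = (2.0, 3.0)
est = gauss_newton(ranges_from(true_pos), guess=(5.0, 5.0))
print(round(est[0], 4), round(est[1], 4))
```

    With exact ranges and well-spread anchors the iteration converges to the true position in a few steps; the bounded search domain from the geometric intersection model is what supplies a safe initial guess in the paper's setting.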

  17. Development of a method for the preparation of albumin microspheres labelled with rhenium-188

    International Nuclear Information System (INIS)

    Dyomin, D.N.; Petriev, V.M.

    2001-01-01

    The basis for the effective and wide use of radioisotope methods in the diagnosis and therapy of tumoral and non-tumoral diseases is the development of new radiopharmaceuticals characterized by high functional value and safety. The efficiency and safety of radiopharmaceuticals are determined by the nuclear-physical characteristics of the radionuclides, as well as by the physicochemical and biological properties of the carriers used for the selective delivery of the radionuclides. Therefore, an important direction in radiopharmacy is the search for, development of, and evaluation of the particular properties of radiopharmaceuticals for the radiotherapy of tumoral and non-tumoral diseases, where the main task is the choice of optimal carriers and radionuclides. At present, two types of carriers are used for radionuclide delivery: soluble preparations and insoluble, so-called microparticles. (authors)

  18. GeoSearch: A lightweight broking middleware for geospatial resources discovery

    Science.gov (United States)

    Gui, Z.; Yang, C.; Liu, K.; Xia, J.

    2012-12-01

    With petabytes of geodata and thousands of geospatial web services available over the Internet, it is critical to support geoscience research and applications by finding the best-fit geospatial resources from the massive and heterogeneous resources. Past decades' developments witnessed the operation of many service components to facilitate geospatial resource management and discovery. However, efficient and accurate geospatial resource discovery is still a big challenge, for the following reasons. 1) The entry barriers (also called "learning curves") hinder the usability of discovery services for end users. Different portals and catalogues always adopt various access protocols, metadata formats and GUI styles to organize, present and publish metadata. It is hard for end users to learn all these technical details and differences. 2) The cost of federating heterogeneous services is high. To provide sufficient resources and facilitate data discovery, many registries adopt a periodic harvesting mechanism to retrieve metadata from other federated catalogues. These time-consuming processes lead to network and storage burdens, data redundancy, and also the overhead of maintaining data consistency. 3) There are heterogeneous-semantics issues in data discovery. Since keyword matching is still the primary search method in many operational discovery services, search accuracy (precision and recall) is hard to guarantee. Semantic technologies (such as semantic reasoning and similarity evaluation) offer a solution to these issues. However, integrating semantic technologies with existing services is challenging due to the expandability limitations of the service frameworks and metadata templates. 4) The capabilities to help users make a final selection are inadequate. Most of the existing search portals lack intuitive and diverse information visualization methods and functions (sort, filter) to present, explore and analyze search results. Furthermore, the presentation of the value

  19. Methods and Results of a Search for Gravitational Waves Associated with Gamma-Ray Bursts Using the GEO 600, LIGO, and Virgo Detectors

    Science.gov (United States)

    Aasi, J.; Abbott, B. P.; Abbott, R.; Abbott, T.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Blackburn, Lindy L.; hide

    2013-01-01

    In this paper we report on a search for short-duration gravitational wave bursts in the frequency range 64 Hz-1792 Hz associated with gamma-ray bursts (GRBs), using data from GEO600 and one of the LIGO or Virgo detectors. We introduce the method of a linear search grid to analyze GRB events with large sky localization uncertainties, such as the localizations provided by the Fermi Gamma-ray Burst Monitor (GBM). Coherent searches for gravitational waves (GWs) can be computationally intensive when the GRB sky position is not well localized, due to the corrections required for the difference in arrival time between detectors. Using a linear search grid we are able to reduce the computational cost of the analysis by a factor of O(10) for GBM events. Furthermore, we demonstrate that our analysis pipeline can improve upon the sky localization of GRBs detected by the GBM, if a high-frequency GW signal is observed in coincidence. We use the linear search grid method in a search for GWs associated with 129 GRBs observed by satellite-based gamma-ray experiments between 2006 and 2011. The GRBs in our sample had not been previously analyzed for GW counterparts. A fraction of our GRB events are analyzed using data from GEO600 taken while the detector was using squeezed-light states to improve its sensitivity; this is the first search for GWs using data from a squeezed-light interferometric observatory. We find no evidence for GW signals, either with any individual GRB in this sample or with the population as a whole. For each GRB we place lower bounds on the distance to the progenitor, assuming a fixed GW emission energy of 10(exp -2) solar masses times c(exp 2), with a median exclusion distance of 0.8 Mpc for emission at 500 Hz and 0.3 Mpc at 1 kHz. The reduced computational cost associated with a linear search grid will enable rapid searches for GWs associated with Fermi GBM events in the Advanced detector era.

  20. Natural Language Search Interfaces: Health Data Needs Single-Field Variable Search

    Science.gov (United States)

    Smith, Sam; Sufi, Shoaib; Goble, Carole; Buchan, Iain

    2016-01-01

    Background Data discovery, particularly the discovery of key variables and their inter-relationships, is key to secondary data analysis, and in turn, the evolving field of data science. Interface designers have presumed that their users are domain experts, and so they have provided complex interfaces to support these “experts.” Such interfaces hark back to a time when searches needed to be accurate first time as there was a high computational cost associated with each search. Our work is part of a governmental research initiative between the medical and social research funding bodies to improve the use of social data in medical research. Objective The cross-disciplinary nature of data science can make no assumptions regarding the domain expertise of a particular scientist, whose interests may intersect multiple domains. Here we consider the common requirement for scientists to seek archived data for secondary analysis. This has more in common with search needs of the “Google generation” than with their single-domain, single-tool forebears. Our study compares a Google-like interface with traditional ways of searching for noncomplex health data in a data archive. Methods Two user interfaces are evaluated for the same set of tasks in extracting data from surveys stored in the UK Data Archive (UKDA). One interface, Web search, is “Google-like,” enabling users to browse, search for, and view metadata about study variables, whereas the other, traditional search, has a standard multi-option user interface. Results Using a comprehensive set of tasks with 20 volunteers, we found that the Web search interface met data discovery needs and expectations better than the traditional search. A task × interface repeated measures analysis showed a main effect indicating that answers found through the Web search interface were more likely to be correct (F 1,19=37.3, Peffect of task (F 3,57=6.3, Pinterface (F 1,19=18.0, Peffect of task (F 2,38=4.1, P=.025, Greenhouse

  1. An improved algorithm for searching all minimal cuts in modified networks

    International Nuclear Information System (INIS)

    Yeh, W.-C.

    2008-01-01

    A modified network is an updated network after inserting a branch string (a special path) between two nodes in the original network. Modifications are common in network expansion or reinforcement evaluation and planning. The problem of searching for all minimal cuts (MCs) in a modified network is discussed and solved in this study. The existing best-known methods for solving this problem either required extensive comparison and verification or failed to solve some special but important cases. Therefore, a more efficient, intuitive and generalized method for searching for all MCs without an extensive comparison and verification procedure is proposed. In this study, we first develop an intuitive algorithm, based upon the reformation of all MCs in the original network, to search for all MCs in a modified network. Next, the correctness of the proposed algorithm is analyzed and proven. The computational complexity of the proposed algorithm is analyzed and compared with that of the existing best-known methods. Finally, two examples illustrate how all MCs are generated in a modified network using the information on all of the MCs in the corresponding original network.

  2. Whole-Exome Sequencing in Searching for New Variants Associated With the Development of Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Marina V. Shulskaya

    2018-05-01

    Full Text Available Background: Parkinson’s disease (PD) is a complex disease, with its monogenic forms accounting for less than 10% of all cases. Whole-exome sequencing (WES) technology has been used successfully to find mutations in large families. However, because of the late onset of the disease, only small families and unrelated patients are usually available. WES conducted in such cases yields a large number of candidate variants. There are currently a number of imperfect software tools that allow the pathogenicity of variants to be evaluated. Objectives: We analyzed 48 unrelated patients with an alleged autosomal dominant familial form of PD using WES and developed a strategy for selecting potentially pathogenetically significant variants using almost all available bioinformatics resources for the analysis of exonic areas. Methods: DNA sequencing of 48 patients in whom frequent mutations had been excluded was performed using an Illumina HiSeq 2500 platform. The possible pathogenetic significance of identified variants and their involvement in the pathogenesis of PD was assessed using SNP and Variation Suite (SVS), Combined Annotation Dependent Depletion (CADD) and Rare Exome Variant Ensemble Learner (REVEL) software. Functional evaluation was performed using the Pathway Studio database. Results: A significant reduction in the search range from 7082 to 25 variants in 23 genes associated with PD or neuronal function was achieved. Eight (FXN, MFN2, MYOC, NPC1, PSEN1, RET, SCN3A and SPG7) were the most significant. Conclusions: The multistep approach developed made it possible to conduct an effective search for potentially pathogenetically significant variants presumably involved in the pathogenesis of PD. The data obtained need to be further verified experimentally.

  3. Stochastic local search foundations and applications

    CERN Document Server

    Hoos, Holger H; Stutzle, Thomas

    2004-01-01

    Stochastic local search (SLS) algorithms are among the most prominent and successful techniques for solving computationally difficult problems in many areas of computer science and operations research, including propositional satisfiability, constraint satisfaction, routing, and scheduling. SLS algorithms have also become increasingly popular for solving challenging combinatorial problems in many application areas, such as e-commerce and bioinformatics. Hoos and Stützle offer the first systematic and unified treatment of SLS algorithms. In this groundbreaking new book, they examine the general concepts and specific instances of SLS algorithms and carefully consider their development, analysis and application. The discussion focuses on the most successful SLS methods and explores their underlying principles, properties, and features. This book gives hands-on experience with some of the most widely used search techniques, and provides readers with the necessary understanding and skills to use this powerful too...
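    A minimal sketch of the SLS idea on a toy MAX-SAT instance (not an example from the book): start from a random assignment and repeatedly flip the variable whose flip satisfies the most clauses, i.e. best-improvement local search.

```python
import random

def sat_count(clauses, assign):
    """Number of satisfied clauses; literal v > 0 means x_v, v < 0 means NOT x_v."""
    return sum(any((lit > 0) == assign[abs(lit)] for lit in c) for c in clauses)

def sls_maxsat(clauses, n_vars, flips=1000, seed=0):
    """Best-improvement stochastic local search for MAX-SAT (toy sketch)."""
    rng = random.Random(seed)
    assign = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
    best = sat_count(clauses, assign)
    for _ in range(flips):
        if best == len(clauses):
            break  # all clauses satisfied
        # flip the variable whose flip yields the highest clause count
        v = max(range(1, n_vars + 1),
                key=lambda x: sat_count(clauses, {**assign, x: not assign[x]}))
        assign[v] = not assign[v]
        best = max(best, sat_count(clauses, assign))
    return assign, best
```

    Practical SLS solvers such as WalkSAT add noise (occasional random flips) to escape local optima; the book treats such mechanisms systematically.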

  4. How to improve your PubMed/MEDLINE searches: 3. advanced searching, MeSH and My NCBI.

    Science.gov (United States)

    Fatehi, Farhad; Gray, Leonard C; Wootton, Richard

    2014-03-01

    Although the basic PubMed search is often helpful, the results may sometimes be non-specific. For more control over the search process you can use the Advanced Search Builder interface. This allows a targeted search in specific fields, with the convenience of being able to select the intended search field from a list. It also provides a history of your previous searches. The search history is useful to develop a complex search query by combining several previous searches using Boolean operators. For indexing the articles in MEDLINE, the NLM uses a controlled vocabulary system called MeSH. This standardised vocabulary solves the problem of authors, researchers and librarians who may use different terms for the same concept. To be efficient in a PubMed search, you should start by identifying the most appropriate MeSH terms and use them in your search where possible. My NCBI is a personal workspace facility available through PubMed and makes it possible to customise the PubMed interface. It provides various capabilities that can enhance your search performance.
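    The Boolean combination of MeSH terms and field-tagged text words described above can be assembled programmatically. The topic and terms below are illustrative; [tiab] is PubMed's Title/Abstract field tag:

```python
def field(term, tag):
    """Tag a search term with a PubMed field, e.g. [MeSH Terms] or [tiab];
    multi-word terms are quoted so they are searched as phrases."""
    return f'"{term}"[{tag}]' if " " in term else f"{term}[{tag}]"

def combine(operator, *parts):
    """Combine sub-queries the way Advanced Search combines history items (#1 AND #2)."""
    return "(" + f" {operator} ".join(parts) + ")"

# One OR-block per concept, then AND the concepts together (illustrative topic).
concept1 = combine("OR", field("Telemedicine", "MeSH Terms"), field("telehealth", "tiab"))
concept2 = combine("OR", field("Aged", "MeSH Terms"), field("elderly", "tiab"))
query = combine("AND", concept1, concept2)
```

    The resulting string can be pasted into the PubMed search box or an Advanced Search Builder field.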

  5. Comparative Study on Three Major Internet Search Engines ...

    African Journals Online (AJOL)

    This study compared the Yahoo, Google and Ask.com search engines. An experimental method was used, with ten reference questions posed to query each of the search engines. Yahoo obtained the highest results (521,801,043) among the three Web search ...

  6. Search for transient ultralight dark matter signatures with networks of precision measurement devices using a Bayesian statistics method

    Science.gov (United States)

    Roberts, B. M.; Blewitt, G.; Dailey, C.; Derevianko, A.

    2018-04-01

    We analyze the prospects of employing a distributed global network of precision measurement devices as a dark matter and exotic physics observatory. In particular, we consider the atomic clocks of the global positioning system (GPS), consisting of a constellation of 32 medium-Earth orbit satellites equipped with either Cs or Rb microwave clocks and a number of Earth-based receiver stations, some of which employ highly-stable H-maser atomic clocks. High-accuracy timing data is available for almost two decades. By analyzing the satellite and terrestrial atomic clock data, it is possible to search for transient signatures of exotic physics, such as "clumpy" dark matter and dark energy, effectively transforming the GPS constellation into a 50 000 km aperture sensor array. Here we characterize the noise of the GPS satellite atomic clocks, describe the search method based on Bayesian statistics, and test the method using simulated clock data. We present the projected discovery reach using our method, and demonstrate that it can surpass the existing constraints by several orders of magnitude for certain models. Our method is not limited in scope to GPS or atomic clock networks, and can also be applied to other networks of precision measurement devices.
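    As a toy illustration of the Bayesian-odds idea (not the authors' actual GPS analysis pipeline), the log likelihood ratio for a step-like clock glitch of known amplitude at epoch t0, versus pure white noise of standard deviation sigma, can be evaluated directly:

```python
def log_odds_step(data, t0, amp, sigma):
    """Log likelihood ratio for a step of size `amp` starting at epoch t0
    versus pure white noise (toy version of a Bayesian transient search)."""
    ll = 0.0
    for i, x in enumerate(data):
        model = amp if i >= t0 else 0.0
        # Gaussian log-likelihood difference: noise-only minus signal-plus-noise residual
        ll += ((x ** 2) - (x - model) ** 2) / (2 * sigma ** 2)
    return ll
```

    Scanning t0 and amp (and marginalizing with priors) turns this into a search; the true epoch and amplitude maximize the odds. A network of clocks simply sums such log-odds across devices with the appropriate signal delays.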

  7. Indirect dark matter searches: current status and perspectives

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Many theoretical ideas for the particle nature of dark matter exist. The  most popular models often predict that dark matter particles self-annihilate or decay, giving rise to potentially detectable signatures in astronomical observations.  I will summarize the current status of searches for such signatures and critically reassess recent claims for dark matter signals.  I will further provide an outlook on anticipated developments in the next 10 years, and discuss new methods to facilitate strategy development.

  8. Application of fast orthogonal search to linear and nonlinear stochastic systems

    DEFF Research Database (Denmark)

    Chon, K H; Korenberg, M J; Holstein-Rathlou, N H

    1997-01-01

    Standard deterministic autoregressive moving average (ARMA) models consider prediction errors to be unexplainable noise sources. The accuracy of the estimated ARMA model parameters depends on producing minimum prediction errors. In this study, an accurate algorithm is developed for estimating linear and nonlinear stochastic ARMA model parameters by using a method known as fast orthogonal search, with an extended model containing prediction errors as part of the model estimation process. The extended algorithm uses fast orthogonal search in a two-step procedure in which deterministic terms ...
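    The core of orthogonal search is greedy forward selection of model terms: at each step, pick the candidate regressor that most reduces the residual sum of squares after orthogonalizing against the terms already chosen. The sketch below uses explicit Gram-Schmidt for clarity; Korenberg's *fast* orthogonal search achieves the same selection without forming the orthogonal basis explicitly.

```python
import numpy as np

def orthogonal_search(y, candidates, n_terms=2):
    """Greedy forward selection in the spirit of (fast) orthogonal search:
    each step adds the candidate that most reduces the residual sum of squares."""
    resid = y.astype(float).copy()
    chosen, basis = [], []
    for _ in range(n_terms):
        best_k, best_red, best_q = None, 0.0, None
        for k, c in enumerate(candidates):
            if k in chosen:
                continue
            q = c.astype(float).copy()
            for b in basis:                      # Gram-Schmidt against selected terms
                q -= (q @ b) * b
            n = np.linalg.norm(q)
            if n < 1e-12:
                continue                         # candidate is linearly dependent
            q /= n
            red = (resid @ q) ** 2               # RSS reduction from adding this term
            if red > best_red:
                best_k, best_red, best_q = k, red, q
        if best_k is None:
            break
        chosen.append(best_k)
        basis.append(best_q)
        resid -= (resid @ best_q) * best_q
    return chosen, resid
```

    For ARMA estimation the candidate set contains lagged outputs, inputs and (in the extended algorithm described above) lagged prediction errors.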

  9. Application of a heuristic search method for generation of fuel reload configurations

    International Nuclear Information System (INIS)

    Galperin, A.; Nissan, E.

    1988-01-01

    A computerized heuristic search method for the generation and optimization of fuel reload configurations is proposed and investigated. The heuristic knowledge is expressed modularly in the form of ''IF-THEN'' production rules. The method was implemented in a program coded in the Franz LISP programming language and executed under the UNIX operating system. A test problem was formulated, based on a typical light water reactor reload problem with a few simplifications assumed, in order to allow formulation of the reload strategy in a relatively small number of rules. A computer run of the problem was performed on a VAX-780 machine. A set of 312 solutions was generated in about 20 min of execution time. Testing of a few arbitrarily chosen configurations demonstrated reasonably good performance for the computer-generated solutions. A computerized generator of reload configurations may be used for the fast generation or modification of reload patterns and as a tool for the formulation, tuning, and testing of the heuristic knowledge rules used by an ''expert'' fuel manager.
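    An ''IF-THEN'' production-rule system of the kind described can be sketched as an ordered list of condition-action pairs, evaluated top to bottom. The rules, attributes and thresholds below are entirely hypothetical, purely to show the mechanism:

```python
# Hypothetical mini rule base: each rule is (condition, action).
# Attributes and thresholds are illustrative, not real reload-design knowledge.
rules = [
    (lambda fa: fa["burnup"] > 40.0, "retire"),
    (lambda fa: fa["burnup"] <= 40.0 and fa["peaking"] < 1.4, "core_interior"),
    (lambda fa: True, "core_periphery"),   # default rule fires when nothing else does
]

def place(assembly):
    """Fire the first rule whose IF-part matches the fuel assembly's attributes."""
    for cond, action in rules:
        if cond(assembly):
            return action
```

    Because the knowledge lives in the rule list rather than in control flow, a fuel manager can tune or extend the strategy by editing rules, which is exactly the modularity the abstract highlights.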

  10. Constructing Effective Search Strategies for Electronic Searching.

    Science.gov (United States)

    Flanagan, Lynn; Parente, Sharon Campbell

    Electronic databases have grown tremendously in both number and popularity since their development during the 1960s. Access to electronic databases in academic libraries was originally offered primarily through mediated search services by trained librarians; however, the advent of CD-ROM and end-user interfaces for online databases has shifted the…

  11. Application of Tabu Search Algorithm in Job Shop Scheduling

    Directory of Open Access Journals (Sweden)

    Betrianis Betrianis

    2010-10-01

    Full Text Available Tabu Search is a local search method used to solve combinatorial optimization problems. The method aims to make the search for the best solution in a complex combinatorial optimization problem (NP-hard, e.g. the job shop scheduling problem) more effective, in less computational time, though with no guarantee of an optimal solution. In this paper, tabu search is used to solve a job shop scheduling problem consisting of 3 (three) cases, the ordering packages of September, October and November, with the objective of minimizing makespan (Cmax). For each ordering package, there is a combination of initial solution and tabu list length. These results are then compared with 4 (four) other methods using basic dispatching rules: Shortest Processing Time (SPT), Earliest Due Date (EDD), Most Work Remaining (MWKR) and First Come First Served (FCFS). Scheduling using the Tabu Search algorithm is sensitive to changes in these variables and gives a shorter makespan than the four other methods.
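    A minimal tabu search can be sketched on a toy single-machine sequencing problem (minimizing total completion time rather than the paper's job shop makespan): swap moves, a recency-based tabu list keyed on job pairs, and an aspiration criterion that overrides tabu status when a move beats the best solution found so far.

```python
from itertools import combinations

def total_completion(seq, p):
    """Sum of job completion times for sequence seq with processing times p."""
    t = done = 0
    for j in seq:
        t += p[j]
        done += t
    return done

def tabu_search(p, iters=50, tenure=4):
    seq = list(range(len(p)))                 # initial solution: jobs in given order
    best, best_cost = seq[:], total_completion(seq, p)
    tabu = {}                                 # move -> iteration until which it is tabu
    for it in range(iters):
        candidates = []
        for i, j in combinations(range(len(seq)), 2):
            nb = seq[:]
            nb[i], nb[j] = nb[j], nb[i]
            cost = total_completion(nb, p)
            move = (min(seq[i], seq[j]), max(seq[i], seq[j]))
            # skip tabu moves unless they satisfy the aspiration criterion
            if tabu.get(move, -1) >= it and cost >= best_cost:
                continue
            candidates.append((cost, nb, move))
        if not candidates:
            break
        cost, seq, move = min(candidates)     # best admissible neighbour (may be uphill)
        tabu[move] = it + tenure
        if cost < best_cost:
            best, best_cost = seq[:], cost
    return best, best_cost
```

    Accepting the best admissible neighbour even when it is worse, while forbidding recent moves, is what lets tabu search escape the local optima that plain hill climbing gets stuck in.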

  12. LHCb Exotica and Higgs searches

    CERN Multimedia

    Lucchesi, Donatella

    2016-01-01

    The unique phase space coverage and features of the LHCb detector at the LHC make it an ideal environment to probe complementary New Physics parameter regions. In particular, recently developed jet tagging algorithms are ideal for searches involving $b$ and $c$ jets. This poster will review different jet-related exotica searches together with efforts in the search for a Higgs boson decaying to a pair of heavy quarks.

  13. Automatic multi-cycle reload design of pressurized water reactor using particle swarm optimization algorithm and local search

    International Nuclear Information System (INIS)

    Lin, Chaung; Hung, Shao-Chun

    2013-01-01

    Highlights: • An automatic multi-cycle core reload design tool, which searches for the fresh fuel assembly composition, is developed. • The search method adopts particle swarm optimization and local search. • The design objectives are to achieve the required cycle energy and minimum fuel cost while satisfying the constraints. • The constraints include the hot zero power moderator temperature coefficient and the hot channel factor. - Abstract: An automatic multi-cycle core reload design tool, which searches for the fresh fuel assembly composition, is developed using particle swarm optimization and local search. The local search uses heuristic rules to perturb the current search result slightly so that it can be improved. The composition of the fresh fuel assemblies should provide the required cycle energy and satisfy the constraints, such as the hot zero power moderator temperature coefficient and the hot channel factor. Instead of designing a loading pattern for each fuel assembly composition during the search process, two fixed loading patterns are used to calculate the core status, and the better fitness function value is used in the search process. The fitness function contains terms which reflect the design objectives, such as cycle energy, constraints, and fuel cost. The results show that the developed tool can achieve the desired objectives.

  14. An FMRI-compatible Symbol Search task.

    Science.gov (United States)

    Liebel, Spencer W; Clark, Uraina S; Xu, Xiaomeng; Riskin-Jones, Hannah H; Hawkshead, Brittany E; Schwarz, Nicolette F; Labbe, Donald; Jerskey, Beth A; Sweet, Lawrence H

    2015-03-01

    Our objective was to determine whether a Symbol Search paradigm developed for functional magnetic resonance imaging (FMRI) is a reliable and valid measure of cognitive processing speed (CPS) in healthy older adults. As all older adults are expected to experience cognitive declines due to aging, and CPS is one of the domains most affected by age, establishing a reliable and valid measure of CPS that can be administered inside an MR scanner may prove invaluable in future clinical and research settings. We evaluated the reliability and construct validity of a newly developed FMRI Symbol Search task by comparing participants' performance in and outside of the scanner and to the widely used and standardized Symbol Search subtest of the Wechsler Adult Intelligence Scale (WAIS). A brief battery of neuropsychological measures was also administered to assess the convergent and discriminant validity of the FMRI Symbol Search task. The FMRI Symbol Search task demonstrated high test-retest reliability when compared to performance on the same task administered outside the scanner (r = .791) and correlated strongly with the WAIS Symbol Search (r = .717). Expected effects of age on performance of the FMRI Symbol Search task were also observed. The FMRI Symbol Search task is a reliable and valid measure of CPS in healthy older adults and exhibits expected sensitivity to the effects of age on CPS performance.

  15. Accelerated Profile HMM Searches.

    Directory of Open Access Journals (Sweden)

    Sean R Eddy

    2011-10-01

    Full Text Available Profile hidden Markov models (profile HMMs) and probabilistic inference methods have made important contributions to the theory of sequence database homology search. However, practical use of profile HMM methods has been hindered by the computational expense of existing software implementations. Here I describe an acceleration heuristic for profile HMMs, the "multiple segment Viterbi" (MSV) algorithm. The MSV algorithm computes an optimal sum of multiple ungapped local alignment segments using a striped vector-parallel approach previously described for fast Smith/Waterman alignment. MSV scores follow the same statistical distribution as gapped optimal local alignment scores, allowing rapid evaluation of significance of an MSV score and thus facilitating its use as a heuristic filter. I also describe a 20-fold acceleration of the standard profile HMM Forward/Backward algorithms using a method I call "sparse rescaling". These methods are assembled in a pipeline in which high-scoring MSV hits are passed on for reanalysis with the full HMM Forward/Backward algorithm. This accelerated pipeline is implemented in the freely available HMMER3 software package. Performance benchmarks show that the use of the heuristic MSV filter sacrifices negligible sensitivity compared to unaccelerated profile HMM searches. HMMER3 is substantially more sensitive and 100- to 1000-fold faster than HMMER2. HMMER3 is now about as fast as BLAST for protein searches.

  16. Optimal search strategies for detecting cost and economic studies in EMBASE

    Directory of Open Access Journals (Sweden)

    Haynes R Brian

    2006-06-01

    Full Text Available Abstract Background Economic evaluations in the medical literature compare competing diagnostic or treatment methods for their use of resources and their expected outcomes. The best evidence currently available from research regarding both cost and economic comparisons will continue to expand as this type of information becomes more important in today's clinical practice. Researchers and clinicians need quick, reliable ways to access this information. A key source of this type of information is large bibliographic databases such as EMBASE. The objective of this study was to develop search strategies that optimize the retrieval of health costs and economics studies from EMBASE. Methods We conducted an analytic survey, comparing hand searches of journals with retrievals from EMBASE for candidate search terms and combinations. Six research assistants read all issues of 55 journals indexed by EMBASE for the publishing year 2000. We rated all articles using purpose and quality indicators and categorized them into clinically relevant original studies, review articles, general papers, or case reports. The original and review articles were then categorized for purpose (i.e., cost and economics, and other clinical topics) and, depending on the purpose, as 'pass' or 'fail' for methodologic rigor. Candidate search strategies were developed for economic and cost studies, then run in the 55 EMBASE journals, the retrievals being compared with the hand search data. The sensitivity, specificity, precision, and accuracy of the search strategies were calculated. Results Combinations of search terms for detecting both cost and economic studies attained levels of 100% sensitivity, with specificity levels of 92.9% and 92.3% respectively. When maximizing for both sensitivity and specificity, the combination of terms for detecting cost studies increased sensitivity by 2.2% over the single term, but at a slight decrease in specificity of 0.9%. The maximized combination of terms ...
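    The performance measures named above are simple functions of the 2x2 table of filter retrievals against the hand-search gold standard; a sketch:

```python
def filter_performance(tp, fp, fn, tn):
    """Sensitivity, specificity, precision and accuracy of a search filter,
    judged against a hand-search gold standard (tp = relevant and retrieved, etc.)."""
    return {
        "sensitivity": tp / (tp + fn),            # fraction of relevant studies retrieved
        "specificity": tn / (tn + fp),            # fraction of irrelevant studies excluded
        "precision":   tp / (tp + fp),            # fraction of retrieved studies that are relevant
        "accuracy":    (tp + tn) / (tp + fp + fn + tn),
    }
```

    For example, a filter that retrieves all 50 relevant studies (fn = 0) achieves 100% sensitivity regardless of how many irrelevant records it also pulls in, which is why sensitivity is always reported together with specificity or precision.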

  17. Semi-automating the manual literature search for systematic reviews increases efficiency.

    Science.gov (United States)

    Chapman, Andrea L; Morgan, Laura C; Gartlehner, Gerald

    2010-03-01

    To minimise retrieval bias, manual literature searches are a key part of the search process of any systematic review. Considering the need for accurate information, valid results of the manual literature search are essential to ensure scientific standards; likewise, efficient approaches that minimise the amount of personnel time required to conduct a manual literature search are of great interest. The objective of this project was to determine the validity and efficiency of a new manual search method that utilises the Scopus database. We used the traditional manual search approach as the gold standard to determine the validity and efficiency of the proposed Scopus method. Outcome measures included completeness of article detection and personnel time involved. Using both methods independently, we compared the results in terms of accuracy (validity) and time spent conducting the search (efficiency). Regarding accuracy, the Scopus method identified the same studies as the traditional approach, indicating its validity. In terms of efficiency, using Scopus led to a time saving of 62.5% compared with the traditional approach (3 h versus 8 h). The Scopus method can significantly improve the efficiency of manual searches and thus of systematic reviews.

  18. Development and empirical user-centered evaluation of semantically-based query recommendation for an electronic health record search engine.

    Science.gov (United States)

    Hanauer, David A; Wu, Danny T Y; Yang, Lei; Mei, Qiaozhu; Murkowski-Steffy, Katherine B; Vydiswaran, V G Vinod; Zheng, Kai

    2017-03-01

    The utility of biomedical information retrieval environments can be severely limited when users lack expertise in constructing effective search queries. To address this issue, we developed a computer-based query recommendation algorithm that suggests semantically interchangeable terms based on an initial user-entered query. In this study, we assessed the value of this approach, which has broad applicability in biomedical information retrieval, by demonstrating its application as part of a search engine that facilitates retrieval of information from electronic health records (EHRs). The query recommendation algorithm utilizes MetaMap to identify medical concepts from search queries and indexed EHR documents. Synonym variants from UMLS are used to expand the concepts along with a synonym set curated from historical EHR search logs. The empirical study involved 33 clinicians and staff who evaluated the system through a set of simulated EHR search tasks. User acceptance was assessed using the widely used technology acceptance model. The search engine's performance was rated consistently higher with the query recommendation feature turned on vs. off. The relevance of computer-recommended search terms was also rated high, and in most cases the participants had not thought of these terms on their own. The questions on perceived usefulness and perceived ease of use received overwhelmingly positive responses. A vast majority of the participants wanted the query recommendation feature to be available to assist in their day-to-day EHR search tasks. Challenges persist for users to construct effective search queries when retrieving information from biomedical documents including those from EHRs. This study demonstrates that semantically-based query recommendation is a viable solution to addressing this challenge. Published by Elsevier Inc.
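    The synonym-expansion step at the heart of the recommendation algorithm can be sketched with a toy lookup table. The table below is a hypothetical stand-in for the UMLS synonym sets and curated search-log synonyms the system actually uses:

```python
# Toy synonym table (hypothetical; the real system draws on UMLS and EHR search logs).
SYNONYMS = {
    "heart attack": ["myocardial infarction", "mi"],
    "high blood pressure": ["hypertension", "htn"],
}

def recommend(query):
    """Suggest semantically interchangeable rewrites of a user query."""
    q = query.lower()
    recs = []
    for concept, alts in SYNONYMS.items():
        if concept in q:
            recs += [q.replace(concept, alt) for alt in alts]
    return recs
```

    The production system additionally uses MetaMap to detect concept boundaries before substitution, rather than plain substring matching as here.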

  20. Parallel content-based sub-image retrieval using hierarchical searching.

    Science.gov (United States)

    Yang, Lin; Qi, Xin; Xing, Fuyong; Kurc, Tahsin; Saltz, Joel; Foran, David J

    2014-04-01

    The capacity to systematically search through large image collections and ensembles and detect regions exhibiting similar morphological characteristics is central to pathology diagnosis. Unfortunately, the primary methods used to search digitized, whole-slide histopathology specimens are slow and prone to inter- and intra-observer variability. The central objective of this research was to design, develop, and evaluate a content-based image retrieval system to assist doctors in quick and reliable content-based comparative search of similar prostate image patches. Given a representative image patch (sub-image), the algorithm will return a ranked ensemble of image patches throughout the entire whole-slide histology section which exhibit the most similar morphologic characteristics. This is accomplished by first performing hierarchical searching based on a newly developed hierarchical annular histogram (HAH). The set of candidates is then further refined in the second stage of processing by computing a color histogram from eight equally divided segments within each square annular bin defined in the original HAH. A demand-driven master-worker parallelization approach is employed to speed up the searching procedure. Using this strategy, the query patch is broadcast to all worker processes. Each worker process is dynamically assigned an image by the master process to search for and return a ranked list of similar patches in the image. The algorithm was tested using digitized hematoxylin and eosin (H&E) stained prostate cancer specimens. We have achieved an excellent image retrieval performance. The recall rate within the first 40 ranked retrieved image patches is ∼90%. Both the testing data and source code can be downloaded from http://pleiad.umdnj.edu/CBII/Bioinformatics/.
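    The annular-histogram idea can be sketched as follows: concentric square rings around the patch centre, one intensity histogram per ring, concatenated into a feature vector. This is an HAH-inspired illustration, not the authors' exact formulation:

```python
import numpy as np

def annular_histogram(patch, n_rings=4, n_bins=8):
    """HAH-inspired sketch: one normalised intensity histogram per concentric
    square ring of a grayscale patch, concatenated into a feature vector."""
    h, w = patch.shape
    cy, cx = (h - 1) / 2, (w - 1) / 2
    ys, xs = np.mgrid[0:h, 0:w]
    # Chebyshev (max-coordinate) distance yields square rings
    d = np.maximum(np.abs(ys - cy), np.abs(xs - cx))
    edges = np.linspace(0, d.max() + 1e-9, n_rings + 1)
    feats = []
    for k in range(n_rings):
        ring = patch[(d >= edges[k]) & (d < edges[k + 1])]
        hist, _ = np.histogram(ring, bins=n_bins, range=(0, 256))
        feats.append(hist / max(hist.sum(), 1))   # normalise per ring
    return np.concatenate(feats)
```

    Because each ring is histogrammed separately, the descriptor captures coarse spatial layout while staying rotation-tolerant, and two patches can be compared with any histogram distance over the concatenated vectors.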

  1. Combination of Multiple Spectral Libraries Improves the Current Search Methods Used to Identify Missing Proteins in the Chromosome-Centric Human Proteome Project.

    Science.gov (United States)

    Cho, Jin-Young; Lee, Hyoung-Joo; Jeong, Seul-Ki; Kim, Kwang-Youl; Kwon, Kyung-Hoon; Yoo, Jong Shin; Omenn, Gilbert S; Baker, Mark S; Hancock, William S; Paik, Young-Ki

    2015-12-04

    The approximately 2.9-billion-base-pair human reference genome is known to encode some 20 000 representative proteins. However, 3000 proteins, that is, ~15% of all proteins, have no or very weak proteomic evidence and are still missing. Missing proteins may be present in rare samples in very low abundance or be only temporarily expressed, causing problems in their detection and protein profiling. In particular, some technical limitations cause missing proteins to remain unassigned. For example, current mass spectrometry techniques have high detection limits and error rates for complex biological samples. Insufficient proteome coverage in reference sequence databases and spectral libraries also raises major issues. Thus, the development of a better strategy that results in greater sensitivity and accuracy in the search for missing proteins is necessary. To this end, we used a new strategy, which combines a reference spectral library search and a simulated spectral library search, to identify missing proteins. We built the human iRefSPL, which contains the original human reference spectral library and additional peptide sequence-spectrum match entries from other species. We also constructed the human simSPL, which contains the simulated spectra of 173 907 human tryptic peptides determined by MassAnalyzer (version 2.3.1). To prove the enhanced analytical performance of the combination of the human iRefSPL and simSPL methods for the identification of missing proteins, we attempted to reanalyze the placental tissue data set (PXD000754). The data from each experiment were analyzed using PeptideProphet, and the results were combined using iProphet. For quality control, we applied the class-specific false-discovery rate filtering method. All of the results were filtered at a class-specific false-discovery rate threshold. The two spectral libraries, iRefSPL and simSPL, were designed to ensure no overlap of the proteome coverage. They were shown to be complementary to spectral library ...

  2. 'Sciencenet'--towards a global search and share engine for all scientific knowledge.

    Science.gov (United States)

    Lütjohann, Dominic S; Shah, Asmi H; Christen, Michael P; Richter, Florian; Knese, Karsten; Liebel, Urban

    2011-06-15

    Modern biological experiments create vast amounts of data which are geographically distributed. These datasets consist of petabytes of raw data and billions of documents. Yet to the best of our knowledge, a search engine technology that searches and cross-links all different data types in life sciences does not exist. We have developed a prototype distributed scientific search engine technology, 'Sciencenet', which facilitates rapid searching over this large data space. By 'bringing the search engine to the data', we do not require server farms. This platform also allows users to contribute to the search index and publish their large-scale data to support e-Science. Furthermore, a community-driven method guarantees that only scientific content is crawled and presented. Our peer-to-peer approach is sufficiently scalable for the science web without performance or capacity tradeoff. The free to use search portal web page and the downloadable client are accessible at: http://sciencenet.kit.edu. The web portal for index administration is implemented in ASP.NET, the 'AskMe' experiment publisher is written in Python 2.7, and the backend 'YaCy' search engine is based on Java 1.6.

  3. Enhancing search efficiency by means of a search filter for finding all studies on animal experimentation in PubMed.

    Science.gov (United States)

    Hooijmans, Carlijn R; Tillema, Alice; Leenaars, Marlies; Ritskes-Hoitinga, Merel

    2010-07-01

    Collecting and analysing all available literature before starting an animal experiment is important, and it is indispensable when writing a systematic review (SR) of animal research. Writing such a review prevents unnecessary duplication of animal studies and thus unnecessary animal use (Reduction). One of the factors currently impeding the production of 'high-quality' SRs in laboratory animal science is the fact that searching for all available literature concerning animal experimentation is rather difficult. In order to diminish these difficulties, we developed a search filter for PubMed to detect all publications concerning animal studies. This filter was compared with the method most frequently used, the PubMed Limit: Animals, and validated further by performing two PubMed topic searches. Our filter performs much better than the PubMed limit: it retrieves, on average, 7% more records. Other important advantages of our filter are that it also finds the most recent records and that it is easy to use. All in all, by using our search filter in PubMed, all available literature concerning animal studies on a specific topic can easily be found and assessed, which will help in increasing the scientific quality and thereby the ethical validity of animal experiments.

  4. Search for intermediate vector bosons

    International Nuclear Information System (INIS)

    Cline, D.B.; Rubbia, C.; van der Meer, S.

    1983-01-01

    The problem of the registration of and search for intermediate vector bosons is discussed. According to weak-current theory there are three intermediate vector bosons, with electric charges of +1 (W+), -1 (W-) and zero (Z0). The investigation of these particles using proton-antiproton beams was proposed in 1976 by Cline, Rubbia and McIntyre. The major difficulties of the experiment are the need to produce a sufficient quantity of antiparticles and the ''cooling'' of the antiproton beam to reduce its random motion. The stochastic cooling method was suggested by van der Meer in 1968 as one possible cooling approach. Several large detectors were designed for the search for intermediate vector bosons.

  6. Dark Matter Searches at ATLAS

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    The astrophysical evidence of dark matter provides some of the most compelling clues to the nature of physics beyond the Standard Model. From these clues, ATLAS has developed a broad and systematic search program for dark matter production in LHC collisions. These searches are now entering their prime, with the LHC colliding protons at the increased centre-of-mass energy of 13 TeV and set to deliver much larger datasets than ever before. The results of these searches on the first 13 TeV data, their interpretation, and the design and possible evolution of the search program will be presented.

  7. Assessing the performance of methodological search filters to improve the efficiency of evidence information retrieval: five literature reviews and a qualitative study.

    Science.gov (United States)

    Lefebvre, Carol; Glanville, Julie; Beale, Sophie; Boachie, Charles; Duffy, Steven; Fraser, Cynthia; Harbour, Jenny; McCool, Rachael; Smith, Lynne

    2017-11-01

    Effective study identification is essential for conducting health research, developing clinical guidance and health policy and supporting health-care decision-making. Methodological search filters (combinations of search terms to capture a specific study design) can assist in searching to achieve this. This project investigated the methods used to assess the performance of methodological search filters, the information that searchers require when choosing search filters and how that information could be better provided. Five literature reviews were undertaken in 2010/11: search filter development and testing; comparison of search filters; decision-making in choosing search filters; diagnostic test accuracy (DTA) study methods; and decision-making in choosing diagnostic tests. We conducted interviews and a questionnaire with experienced searchers to learn what information assists in the choice of search filters and how filters are used. These investigations informed the development of various approaches to gathering and reporting search filter performance data. We acknowledge that there has been a regrettable delay between carrying out the project, including the searches, and the publication of this report, because of serious illness of the principal investigator. The development of filters most frequently involved using a reference standard derived from hand-searching journals. Most filters were validated internally only. Reporting of methods was generally poor. Sensitivity, precision and specificity were the most commonly reported performance measures and were presented in tables. Aspects of DTA study methods are applicable to search filters, particularly in the development of the reference standard. There is limited evidence on how clinicians choose between diagnostic tests. No published literature was found on how searchers select filters. Interviewing and questioning searchers via a questionnaire found that filters were not appropriate for all tasks but were ...

  8. Modification site localization scoring integrated into a search engine.

    Science.gov (United States)

    Baker, Peter R; Trinidad, Jonathan C; Chalkley, Robert J

    2011-07-01

    Large proteomic data sets identifying hundreds or thousands of modified peptides are becoming increasingly common in the literature. Several methods for assessing the reliability of peptide identifications both at the individual peptide or data set level have become established. However, tools for measuring the confidence of modification site assignments are sparse and are not often employed. A few tools for estimating phosphorylation site assignment reliabilities have been developed, but these are not integral to a search engine, so require a particular search engine output for a second step of processing. They may also require use of a particular fragmentation method and are mostly only applicable for phosphorylation analysis, rather than post-translational modifications analysis in general. In this study, we present the performance of site assignment scoring that is directly integrated into the search engine Protein Prospector, which allows site assignment reliability to be automatically reported for all modifications present in an identified peptide. It clearly indicates when a site assignment is ambiguous (and if so, between which residues), and reports an assignment score that can be translated into a reliability measure for individual site assignments.
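The abstract does not give Protein Prospector's actual scoring formula. As a rough illustration of the idea, a delta-style localization score compares the best and second-best site arrangements; a small gap marks the assignment as ambiguous between those two residues. Everything below (the toy ion-count score, the site indices, the data) is an illustrative assumption, not the engine's implementation.

```python
def site_score(site, matched_ions):
    # Toy score: count of site-determining fragment ions matched when the
    # modification is placed at `site` (a stand-in for a real engine score).
    return matched_ions.get(site, 0)

def localization(candidate_sites, matched_ions):
    scored = sorted(((site_score(s, matched_ions), s) for s in candidate_sites),
                    reverse=True)
    (top, best_site), (second, alt_site) = scored[0], scored[1]
    delta = top - second
    # A small delta flags an ambiguous assignment between best_site and alt_site.
    return best_site, alt_site, delta

site, alt, delta = localization([0, 6, 8], {0: 4, 6: 3, 8: 1})
print(site, alt, delta)  # → 0 6 1
```

With real search-engine scores, the delta would be mapped onto a calibrated reliability measure rather than reported raw.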

  9. Forecasting solar radiation using an optimized hybrid model by Cuckoo Search algorithm

    International Nuclear Information System (INIS)

    Wang, Jianzhou; Jiang, He; Wu, Yujie; Dong, Yao

    2015-01-01

    Due to the energy crisis and environmental problems, finding alternative energy sources is now very urgent. Solar energy, as one of the clean energies with great potential, has widely attracted the attention of researchers. In this paper, an optimized hybrid method by CS (Cuckoo Search) on the basis of the OP-ELM (Optimally Pruned Extreme Learning Machine), called CS-OP-ELM, is developed to forecast clear-sky and real-sky global horizontal radiation. First, MRSR (Multiresponse Sparse Regression) and LOO-CV (leave-one-out cross-validation) are applied to rank neurons and to prune the possibly meaningless neurons of the FFNN (Feed Forward Neural Network), respectively. Then, a Direct strategy and a Direct-Recursive strategy based on OP-ELM are introduced to build a hybrid model. Furthermore, the CS (Cuckoo Search) optimization algorithm is employed to determine the proper weight coefficients. To verify the effectiveness of the developed method, hourly solar radiation data from six sites in the United States were collected, and methods such as ARMA (Autoregressive Moving Average), BP (Back Propagation) neural network and OP-ELM were compared with CS-OP-ELM. Experimental results show that the optimized hybrid method CS-OP-ELM has the best forecasting performance. - Highlights: • An optimized hybrid method called CS-OP-ELM is proposed to forecast solar radiation. • CS-OP-ELM adopts a multiple-variable dataset as input. • Direct and Direct-Recursive strategies are introduced to build a hybrid model. • The CS (Cuckoo Search) algorithm is used to determine the optimal weight coefficients. • The proposed method has the best performance compared with other methods
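As a rough illustration of the CS component, here is a minimal Cuckoo Search sketch in the standard Yang–Deb style (Lévy flights toward the best nest, plus abandonment of a fraction of the worst nests). The quadratic objective is a toy stand-in for the forecasting error that CS-OP-ELM minimizes over its weight coefficients; the population size, step scale, and abandonment rate are arbitrary assumptions.

```python
import math
import random

def levy_step(beta=1.5):
    # Mantegna's algorithm for a Levy-distributed step length.
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u, v = random.gauss(0, sigma), random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(f, dim, n=15, iters=200, pa=0.25, lo=-5.0, hi=5.0):
    nests = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    best = min(nests, key=f)
    for _ in range(iters):
        for i in range(n):
            # Propose a new solution by a Levy flight biased toward the best nest.
            cand = [min(hi, max(lo, x + 0.01 * levy_step() * (x - b)))
                    for x, b in zip(nests[i], best)]
            if f(cand) < f(nests[i]):
                nests[i] = cand
        # A fraction pa of the worst nests is abandoned and rebuilt at random.
        nests.sort(key=f)
        for i in range(int(n * (1 - pa)), n):
            nests[i] = [random.uniform(lo, hi) for _ in range(dim)]
        best = min(nests + [best], key=f)
    return best

random.seed(0)
w = cuckoo_search(lambda v: sum(x * x for x in v), dim=2)
print(w)  # typically lands near the minimum at [0, 0]
```

In the paper's setting, `f` would be the forecast error of the hybrid OP-ELM model as a function of its weight coefficients.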

  10. PROSPECT improves cis-acting regulatory element prediction by integrating expression profile data with consensus pattern searches

    Science.gov (United States)

    Fujibuchi, Wataru; Anderson, John S. J.; Landsman, David

    2001-01-01

    Consensus pattern and matrix-based searches designed to predict cis-acting transcriptional regulatory sequences have historically been subject to large numbers of false positives. We sought to decrease false positives by incorporating expression profile data into a consensus pattern-based search method. We have systematically analyzed the expression phenotypes of over 6000 yeast genes, across 121 expression profile experiments, and correlated them with the distribution of 14 known regulatory elements over sequences upstream of the genes. Our method is based on a metric we term probabilistic element assessment (PEA), which is a ranking of potential sites based on sequence similarity in the upstream regions of genes with similar expression phenotypes. For eight of the 14 known elements that we examined, our method had a much higher selectivity than a naïve consensus pattern search. Based on our analysis, we have developed a web-based tool called PROSPECT, which allows consensus pattern-based searching of gene clusters obtained from microarray data. PMID:11574681

  11. Search Results | Page 844 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Results 8431 - 8440 of 8497 ... ... objective was to develop a method of breaking the political deadlocks that too ... In the South, the need for scientific knowledge continues to expand. ... Water as a Human Right for the Middle East and North Africa.

  12. Search Results | Page 781 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Results 7801 - 7810 of 9602 ... ... objective was to develop a method of breaking the political deadlocks that too ... In the South, the need for scientific knowledge continues to expand. ... Water as a Human Right for the Middle East and North Africa.

  13. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third-party systems interact with search engines for a variety of reasons, such as monitoring a web site’s rank, augmenting online games, or possibly maliciously altering click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.

  14. Super-long Anabiosis of Ancient Microorganisms in Ice and Terrestrial Models for Development of Methods to Search for Life on Mars, Europa and other Planetary Bodies

    Science.gov (United States)

    Abyzov, S. S.; Duxbury, N. S.; Bobin, N. E.; Fukuchi, M.; Hoover, R. B.; Kanda, H.; Mitskevich, I. N.; Mulyukin, A. L.; Naganuma, T.; Poglazova, M. N.; hide

    2007-01-01

    Successful missions to Mars, Europa and other bodies of the Solar System have created a prerequisite to search for extraterrestrial life. The first attempts at microbial life detection on the Martian surface by the Viking landed missions gave no biological results. Microbiological investigations of the Martian subsurface ground-ice layers seem more promising. It is well substantiated to consider the Antarctic ice sheet and the Antarctic and Arctic permafrost as terrestrial analogues of Martian habitats. The results of our long-standing microbiological studies of the Antarctic ice provide the basis for detection of viable microbial cells on Mars. Our microbiological investigations of the deepest and thus most ancient strata of the Antarctic ice sheet gave, for the first time, evidence for the natural phenomenon of long-term anabiosis (preservation of viability and vitality over millennia). A combination of classical microbiological methods, epifluorescence microscopy, SEM, TEM, molecular diagnostics, radioisotope labeling and other techniques made it possible for us to obtain convincing proof of the presence of pro- and eukaryotes in the Antarctic ice sheet. In this communication, we review and discuss some critical issues related to the detection of viable microorganisms in cold terrestrial environments with regard to future searches for microbial life and/or its biological signatures on extraterrestrial objects.

  15. Searching for globally optimal functional forms for interatomic potentials using genetic programming with parallel tempering.

    Science.gov (United States)

    Slepoy, A; Peters, M D; Thompson, A P

    2007-11-30

    Molecular dynamics and other molecular simulation methods rely on a potential energy function, based only on the relative coordinates of the atomic nuclei. Such a function, called a force field, approximately represents the electronic structure interactions of a condensed matter system. Developing such approximate functions and fitting their parameters remains an arduous, time-consuming process, relying on expert physical intuition. To address this problem, a functional programming methodology was developed that may enable automated discovery of entirely new force-field functional forms, while simultaneously fitting parameter values. The method uses a combination of genetic programming, Metropolis Monte Carlo importance sampling and parallel tempering, to efficiently search a large space of candidate functional forms and parameters. The methodology was tested using a nontrivial problem with a well-defined globally optimal solution: a small set of atomic configurations was generated and the energy of each configuration was calculated using the Lennard-Jones pair potential. Starting with a population of random functions, our fully automated, massively parallel implementation of the method reproducibly discovered the original Lennard-Jones pair potential by searching for several hours on 100 processors, sampling only a minuscule portion of the total search space. This result indicates that, with further improvement, the method may be suitable for unsupervised development of more accurate force fields with completely new functional forms. Copyright (c) 2007 Wiley Periodicals, Inc.
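A drastically reduced version of this validation experiment can be sketched: energies are generated from a known Lennard-Jones potential, and the two parameters are then recovered by Metropolis Monte Carlo. The full method additionally searches over functional *forms* with genetic programming and parallel tempering; this sketch fits parameters of a fixed form only, and the sample points, temperature, and step size are arbitrary assumptions.

```python
import math
import random

def lj(r, eps, sigma):
    # Lennard-Jones pair potential.
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6 * sr6 - sr6)

random.seed(42)
rs = [0.9 + 0.1 * i for i in range(12)]
target = [lj(r, 1.0, 1.0) for r in rs]          # "data" from known parameters
error = lambda p: sum((lj(r, *p) - t) ** 2 for r, t in zip(rs, target))

p, e = (1.5, 1.3), error((1.5, 1.3))            # deliberately wrong start
for _ in range(20000):
    q = (p[0] + random.gauss(0, 0.05), p[1] + random.gauss(0, 0.05))
    if q[0] > 0 and q[1] > 0:
        eq = error(q)
        # Metropolis acceptance with an arbitrary fictitious "temperature".
        if eq < e or random.random() < math.exp((e - eq) / 0.01):
            p, e = q, eq
print(round(p[0], 2), round(p[1], 2))  # should approach the true (1.0, 1.0)
```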

  16. The max–min ant system and tabu search for pressurized water reactor loading pattern design

    International Nuclear Information System (INIS)

    Lin, Chaung; Chen, Ying-Hsiu

    2014-01-01

    Highlights: • An automatic loading pattern design tool for a pressurized water reactor is developed. • The design method combines the max–min ant system and tabu search. • Heuristic rules are developed to generate the candidates for tabu search. • The initial solution of tabu search is provided by the max–min ant system. • The new algorithm shows very satisfactory results compared to the old one. - Abstract: An automatic loading pattern (LP) design tool for a pressurized water reactor is developed. The design procedure consists of two steps: first, an LP is generated using the max–min ant system (MMAS), and then tabu search (TS) is adopted to find a satisfactory LP. The MMAS was developed previously; the TS process is newly developed. Heuristic rules are implemented to generate the candidate LPs in the TS process. The rules comprise two kinds of action: a swap of the locations of two fuel assemblies, and rotation of a fuel assembly. Since the developed TS process is a local search algorithm, it is efficient for minor changes of the LP; this means that a proper initial LP should be provided by the first step, i.e., by the MMAS. Design requirements such as the hot channel factor, the hot zero power moderator temperature coefficient, and cycle length are formulated into the objective function. The results show that the developed tool can obtain satisfactory LPs and dramatically reduces the computation time compared with the previous tool using the ant system alone
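The tabu-search step described above can be sketched in miniature: candidate patterns are generated by swapping two positions, recent swaps are forbidden for a fixed tenure, and the best non-tabu candidate is accepted even when it is worse than the incumbent. The objective below is a toy placeholder (distance from a target ordering), not a core-physics evaluation, and the tenure and iteration counts are assumptions.

```python
import itertools
import random

def tabu_search(initial, objective, iters=50, tenure=7):
    current = list(initial)
    best, best_cost = list(current), objective(current)
    tabu = {}  # swap -> iteration index until which it stays forbidden
    for it in range(iters):
        candidates = []
        for i, j in itertools.combinations(range(len(current)), 2):
            if tabu.get((i, j), -1) >= it:
                continue  # recently used swaps are tabu
            cand = list(current)
            cand[i], cand[j] = cand[j], cand[i]
            candidates.append((objective(cand), (i, j), cand))
        cost, move, current = min(candidates)  # best non-tabu neighbor
        tabu[move] = it + tenure
        if cost < best_cost:
            best, best_cost = list(current), cost
    return best, best_cost

# Toy stand-in for the LP objective: distance from a target ordering.
target = sorted([3.2, 1.8, 2.4, 4.0, 2.9])
cost = lambda p: sum(abs(a - b) for a, b in zip(p, target))
random.seed(1)
start = random.sample(target, len(target))
best, best_cost = tabu_search(start, cost)
print(best, best_cost)
```

Because only swap moves are used, every candidate remains a permutation of the initial pattern, mirroring how assembly swaps conserve the fuel inventory.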

  17. An Exploration of Retrieval-Enhancing Methods for Integrated Search in a Digital Library

    DEFF Research Database (Denmark)

    Sørensen, Diana Ransgaard; Bogers, Toine; Larsen, Birger

    2012-01-01

    Integrated search is defined as searching across different document types and representations simultaneously, with the goal of presenting the user with a single ranked result list containing the optimal mix of document types. In this paper, we compare various approaches to integrating three diffe...

  18. BioCarian: search engine for exploratory searches in heterogeneous biological databases.

    Science.gov (United States)

    Zaki, Nazar; Tennakoon, Chandana

    2017-10-02

    on previously published viral integration data and were able to deduce the main conclusions of the original publication. BioCarian is accessible via http://www.biocarian.com . We have developed a search engine to explore RDF databases that can be used by both novice and advanced users.

  19. Scattering quantum random-walk search with errors

    International Nuclear Information System (INIS)

    Gabris, A.; Kiss, T.; Jex, I.

    2007-01-01

    We analyze the realization of a quantum-walk search algorithm in a passive, linear optical network. The specific model enables us to consider the effect of realistic sources of noise and losses on the search efficiency. Photon loss uniform in all directions is shown to lead to the rescaling of search time. Deviation from directional uniformity leads to the enhancement of the search efficiency compared to uniform loss with the same average. In certain cases even increasing loss in some of the directions can improve search efficiency. We show that while we approach the classical limit of the general search algorithm by introducing random phase fluctuations, its utility for searching is lost. Using numerical methods, we found that for static phase errors the averaged search efficiency displays a damped oscillatory behavior that asymptotically tends to a nonzero value

  20. Optimizing literature search in systematic reviews

    DEFF Research Database (Denmark)

    Aagaard, Thomas; Lund, Hans; Juhl, Carsten Bogh

    2016-01-01

    BACKGROUND: When conducting systematic reviews, it is essential to perform a comprehensive literature search to identify all published studies relevant to the specific research question. The Cochrane Collaborations Methodological Expectations of Cochrane Intervention Reviews (MECIR) guidelines...... of musculoskeletal disorders. METHODS: Data sources were systematic reviews published by the Cochrane Musculoskeletal Review Group, including at least five RCTs, reporting a search history, searching MEDLINE, EMBASE, CENTRAL, and adding reference- and hand-searching. Additional databases were deemed eligible...... if they indexed RCTs, were in English and used in more than three of the systematic reviews. Relative recall was calculated as the number of studies identified by the literature search divided by the number of eligible studies i.e. included studies in the individual systematic reviews. Finally, cumulative median...
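The relative recall measure defined above is straightforward to compute; a minimal sketch (with made-up study identifiers) could look like this.

```python
def relative_recall(retrieved, included):
    # Studies found by this search, as a fraction of the review's
    # included (eligible) studies.
    retrieved, included = set(retrieved), set(included)
    return len(retrieved & included) / len(included)

# Made-up study identifiers, purely for illustration.
included = {"pmid1", "pmid2", "pmid3", "pmid4"}
medline_hits = {"pmid1", "pmid2", "pmid4", "pmid9"}
print(relative_recall(medline_hits, included))  # → 0.75
```

Computing this per database, then cumulatively as databases are added, shows how quickly the search approaches full recall of the included studies.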

  1. A minimal path searching approach for active shape model (ASM)-based segmentation of the lung

    Science.gov (United States)

    Guo, Shengwen; Fei, Baowei

    2009-02-01

    We are developing a minimal path searching method for active shape model (ASM)-based segmentation for detection of lung boundaries on digital radiographs. With the conventional ASM method, the position and shape parameters of the model points are iteratively refined and the target points are updated by the least Mahalanobis distance criterion. We propose an improved searching strategy that extends the searching points in a fan-shape region instead of along the normal direction. A minimal path (MP) deformable model is applied to drive the searching procedure. A statistical shape prior model is incorporated into the segmentation. In order to keep the smoothness of the shape, a smooth constraint is employed to the deformable model. To quantitatively assess the ASM-MP segmentation, we compare the automatic segmentation with manual segmentation for 72 lung digitized radiographs. The distance error between the ASM-MP and manual segmentation is 1.75 +/- 0.33 pixels, while the error is 1.99 +/- 0.45 pixels for the ASM. Our results demonstrate that our ASM-MP method can accurately segment the lung on digital radiographs.

  2. A Minimal Path Searching Approach for Active Shape Model (ASM)-based Segmentation of the Lung.

    Science.gov (United States)

    Guo, Shengwen; Fei, Baowei

    2009-03-27

    We are developing a minimal path searching method for active shape model (ASM)-based segmentation for detection of lung boundaries on digital radiographs. With the conventional ASM method, the position and shape parameters of the model points are iteratively refined and the target points are updated by the least Mahalanobis distance criterion. We propose an improved searching strategy that extends the searching points in a fan-shape region instead of along the normal direction. A minimal path (MP) deformable model is applied to drive the searching procedure. A statistical shape prior model is incorporated into the segmentation. In order to keep the smoothness of the shape, a smooth constraint is employed to the deformable model. To quantitatively assess the ASM-MP segmentation, we compare the automatic segmentation with manual segmentation for 72 lung digitized radiographs. The distance error between the ASM-MP and manual segmentation is 1.75 ± 0.33 pixels, while the error is 1.99 ± 0.45 pixels for the ASM. Our results demonstrate that our ASM-MP method can accurately segment the lung on digital radiographs.
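The "least Mahalanobis distance" criterion of the conventional ASM update can be sketched as follows: among candidate profiles sampled near a model point, pick the one closest to the training mean under the training covariance. The statistics and candidate vectors below are synthetic assumptions; a real ASM would sample gray-level gradient profiles from the image.

```python
import numpy as np

def best_candidate(candidates, mean, cov):
    # Squared Mahalanobis distance of each candidate profile to the
    # training mean; the smallest one wins the target-point update.
    inv = np.linalg.inv(cov)
    d2 = [float((c - mean) @ inv @ (c - mean)) for c in candidates]
    return int(np.argmin(d2))

mean = np.zeros(3)                      # toy training statistics
cov = np.eye(3)
cands = [np.array([2.0, 2.0, 2.0]),
         np.array([0.1, -0.2, 0.0]),
         np.array([1.0, 1.0, -1.0])]
print(best_candidate(cands, mean, cov))  # → 1
```

The ASM-MP variant replaces this per-point normal-direction search with candidates drawn from a fan-shaped region, ranked by a minimal-path deformable model rather than by distance alone.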

  3. Computer-aided diagnostic scheme for the detection of lung nodules on chest radiographs: Localized search method based on anatomical classification

    International Nuclear Information System (INIS)

    Shiraishi, Junji; Li Qiang; Suzuki, Kenji; Engelmann, Roger; Doi, Kunio

    2006-01-01

    We developed an advanced computer-aided diagnostic (CAD) scheme for the detection of various types of lung nodules on chest radiographs intended for implementation in clinical situations. We used 924 digitized chest images (992 noncalcified nodules) which had a 500x500 matrix size with a 1024 gray scale. The images were divided randomly into two sets which were used for training and testing of the computerized scheme. In this scheme, the lung field was first segmented by use of a ribcage detection technique, and then a large search area (448x448 matrix size) within the chest image was automatically determined by taking into account the locations of a midline and a top edge of the segmented ribcage. In order to detect lung nodule candidates based on a localized search method, we divided the entire search area into 7x7 regions of interest (ROIs: 64x64 matrix size). In the next step, each ROI was classified anatomically into apical, peripheral, hilar, and diaphragm/heart regions by use of its image features. Identification of lung nodule candidates and extraction of image features were applied for each localized region (128x128 matrix size), each having its central part (64x64 matrix size) located at a position corresponding to a ROI that was classified anatomically in the previous step. Initial candidates were identified by use of the nodule-enhanced image obtained with the average radial-gradient filtering technique, in which the filter size was varied adaptively depending on the location and the anatomical classification of the ROI. We extracted 57 image features from the original and nodule-enhanced images based on geometric, gray-level, background structure, and edge-gradient features. In addition, 14 image features were obtained from the corresponding locations in the contralateral subtraction image. A total of 71 image features were employed for three sequential artificial neural networks (ANNs) in order to reduce the number of false-positive candidates. All

  4. Spatial Search, Position Papers

    OpenAIRE

    Center for Spatial Studies, UCSB

    2014-01-01

    The Spatial Search specialist meeting in Santa Barbara (December 2014) brought together 35 academic and industry representatives from computational, geospatial, and cognitive sciences with interest in focused discussions on the development of an interdisciplinary research agenda to advance spatial search from scientific and engineering viewpoints. The position papers from participants represent the shared expertise that guided discussions and the formulation of research questions about proces...

  5. Method Development in Forensic Toxicology.

    Science.gov (United States)

    Peters, Frank T; Wissenbach, Dirk K; Busardo, Francesco Paolo; Marchei, Emilia; Pichini, Simona

    2017-01-01

    In the field of forensic toxicology, the quality of analytical methods is of great importance to ensure the reliability of results and to avoid unjustified legal consequences. A key to high quality analytical methods is a thorough method development. The presented article will provide an overview on the process of developing methods for forensic applications. This includes the definition of the method's purpose (e.g. qualitative vs quantitative) and the analytes to be included, choosing an appropriate sample matrix, setting up separation and detection systems as well as establishing a versatile sample preparation. Method development is concluded by an optimization process after which the new method is subject to method validation. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  6. 'Alive' searches as complementing death searches in the epidemiological follow-up of Ontario miners

    International Nuclear Information System (INIS)

    Fair, M.E.; Newcombe, H.B.; Lalonde, P.; Poliquin, C.

    1988-02-01

    'Alive' searches have been used to complement the 'death' searches in a study of the mortality experience of a cohort of Ontario miners. The purpose was to develop a way of distinguishing between those cohort members who are 'confirmed' alive at a given time and those who are 'lost to follow-up'. A total of 30,000 Workers' Compensation Board (WCB) records with valid Social Insurance Numbers (SIN) were used to search the income tax files by computer over two consecutive years (1977 and 1978), representing nearly 27 million tax returns. These tax-file searches using SINs have provided information on the procedures to be used in, and the likely success of, the corresponding searches of the tax files that could be carried out where the SIN is not available on the work records. The latter kind of search would have to be based on names, birth dates and the like, and would be probabilistic in nature. The results of the study were as follows. After the initial death search, it was found that 7.5% of the cohort had died in Canada and their records were found on the Mortality Data Base; the remaining 92.5% had been 'assumed alive' in the earlier analysis. After the 'alive' follow-up using the income tax file, a further 89.1% of the cohort (drawn from those 'assumed alive') were confirmed to have filed an income tax return after the study period. Thus only 3.4% of the cohort remained untraced. Among those there could be an admixture of deaths outside of Canada, persons who have moved outside of Canada and are still alive, and/or persons alive within Canada who have not filed an income tax return. This study has indicated that the procedures developed are useful for purposes of 'alive' follow-up, for evaluation of the quality of the Mortality Data Base-cohort death file searches, for improving the accuracy of analytical results of epidemiological studies, and for reducing the cost and labour of resolving doubtful death searches

  7. Search Results | Page 820 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Results 8191 - 8200 of 8491 ... The Local Agenda 21 Planning Guide: An Introduction to Sustainable Development Planning. The Local Agenda 21 Planning Guide is an introductory guide on the planning elements, methods, and tools being used by local governments to implement Agenda 21 at the community level.

  8. Search Results | Page 48 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Results 471 - 480 of 902 ... Latin American Security, Drugs and Democracy (LASDD) Fellowship Program ... The most direct and effective method for reducing tobacco ... of how to better manage natural resources will contribute to the region's ... can affect developing countries' ability to implement regulatory goals and reforms ...

  9. Use of a GPGPU means for the development of search programs of defects of monochrome half-tone pictures

    International Nuclear Information System (INIS)

    Dudnik, V.A.; Kudryavtsev, V.I.; Sereda, T.M.; Us, S.A.; Shestakov, M.V.

    2013-01-01

    The application of GPGPU tools to the development of programs that search for defects in monochrome half-tone images is described. An implementation of the image-defect search algorithm using NVIDIA's CUDA technology (Compute Unified Device Architecture, a unified hardware-software solution for parallel computation on the GPU) is presented. The run-time performance of the image-processing step is compared with and without GPU acceleration, using a GeForce 8800 graphics processor.

  10. Mathematical programming solver based on local search

    CERN Document Server

    Gardi, Frédéric; Darlay, Julien; Estellon, Bertrand; Megel, Romain

    2014-01-01

    This book covers local search for combinatorial optimization and its extension to mixed-variable optimization. Although not yet understood from the theoretical point of view, local search is the paradigm of choice for tackling large-scale real-life optimization problems. Today's end-users demand interactivity with decision support systems. For optimization software, this means obtaining good-quality solutions quickly. Fast iterative improvement methods, like local search, are suited to satisfying such needs. Here the authors show local search in a new light, in particular presenting a new kind of mathematical programming solver, namely LocalSolver, based on neighborhood search. First, an iconoclast methodology is presented to design and engineer local search algorithms. The authors' concern about industrializing local search approaches is of particular interest for practitioners. This methodology is applied to solve two industrial problems with high economic stakes. Software based on local search induces ex...
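The iterative-improvement idea described above can be sketched in a few lines: repeatedly move to an improving neighbor until none exists. The bit-flip neighborhood on a small knapsack-style instance below is an illustrative assumption, not an excerpt from LocalSolver.

```python
def local_search(x, value, feasible):
    # First-improvement iterative improvement over a bit-flip neighborhood.
    improved = True
    while improved:
        improved = False
        for i in range(len(x)):
            y = x[:i] + [1 - x[i]] + x[i + 1:]  # flip one 0/1 decision
            if feasible(y) and value(y) > value(x):
                x, improved = y, True
                break
    return x

# Toy knapsack instance: maximize profit under a capacity of 9.
weights, profits, cap = [3, 4, 2, 5], [6, 7, 3, 8], 9
value = lambda x: sum(p for p, k in zip(profits, x) if k)
feasible = lambda x: sum(w for w, k in zip(weights, x) if k) <= cap
sol = local_search([0, 0, 0, 0], value, feasible)
print(sol, value(sol))  # → [1, 1, 1, 0] 16
```

Industrial solvers layer richer move pools, incremental objective evaluation, and restarts on top of this basic loop to escape local optima and scale to large instances.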

  11. A modified harmony search based method for optimal rural radial ...

    African Journals Online (AJOL)

    International Journal of Engineering, Science and Technology, Vol. 2, No. 3 (2010).

  12. Optimal correction and design parameter search by modern methods of rigorous global optimization

    International Nuclear Information System (INIS)

    Makino, K.; Berz, M.

    2011-01-01

    Frequently the design of schemes for correction of aberrations or the determination of possible operating ranges for beamlines and cells in synchrotrons exhibit multitudes of possibilities for their correction, usually appearing in disconnected regions of parameter space which cannot be directly qualified by analytical means. In such cases, frequently an abundance of optimization runs are carried out, each of which determines a local minimum depending on the specific chosen initial conditions. Practical solutions are then obtained through an often extended interplay of experienced manual adjustment of certain suitable parameters and local searches by varying other parameters. However, in a formal sense this problem can be viewed as a global optimization problem, i.e. the determination of all solutions within a certain range of parameters that lead to a specific optimum. For example, it may be of interest to find all possible settings of multiple quadrupoles that can achieve imaging; or to find ahead of time all possible settings that achieve a particular tune; or to find all possible manners to adjust nonlinear parameters to achieve correction of high order aberrations. These tasks can easily be phrased in terms of such an optimization problem; but while mathematically this formulation is often straightforward, it has been common belief that it is of limited practical value since the resulting optimization problem cannot usually be solved. However, recent significant advances in modern methods of rigorous global optimization make these methods feasible for optics design for the first time. The key ideas of the method lie in an interplay of rigorous local underestimators of the objective functions, and by using the underestimators to rigorously iteratively eliminate regions that lie above already known upper bounds of the minima, in what is commonly known as a branch-and-bound approach. Recent enhancements of the Differential Algebraic methods used in particle
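The branch-and-bound idea described above can be illustrated on a one-dimensional toy problem: a rigorous lower bound over each parameter box (here a naive interval bound, standing in for the paper's Differential Algebraic underestimators) lets whole boxes be discarded once they provably cannot contain the global minimum. The objective and tolerance are assumptions chosen for the example.

```python
def f(x):
    # Toy objective with two global minima at x = +/-1, f = -1.
    return x ** 4 - 2.0 * x * x

def lower_bound(lo, hi):
    # Naive interval underestimator for f on [lo, hi]:
    # (min of x**4 over the box) - (max of 2*x**2 over the box).
    x4_min = 0.0 if lo <= 0.0 <= hi else min(lo ** 4, hi ** 4)
    x2_max = max(lo * lo, hi * hi)
    return x4_min - 2.0 * x2_max

def branch_and_bound(lo, hi, tol=1e-4):
    boxes = [(lo, hi)]
    best, upper = lo, f(lo)
    if f(hi) < upper:
        best, upper = hi, f(hi)
    while boxes:
        a, b = boxes.pop()
        if lower_bound(a, b) > upper:
            continue  # the box provably cannot contain the global minimum
        m = (a + b) / 2.0
        if f(m) < upper:
            best, upper = m, f(m)  # improved rigorous upper bound
        if b - a > tol:
            boxes += [(a, m), (m, b)]
    return best, upper

x, fx = branch_and_bound(-2.0, 2.0)
print(x, fx)  # finds one of the two minima at x = +/-1 with f = -1
```

Tighter underestimators prune more boxes earlier; this is exactly where the Taylor-model-based bounds of the paper gain over the crude interval bound sketched here.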

  13. A practical optimization procedure for radial BWR fuel lattice design using tabu search with a multiobjective function

    International Nuclear Information System (INIS)

    Francois, J.L.; Martin-del-Campo, C.; Francois, R.; Morales, L.B.

    2003-01-01

    An optimization procedure based on the tabu search (TS) method was developed for the design of radial enrichment and gadolinia distributions for boiling water reactor (BWR) fuel lattices. The procedure was coded in a computing system in which the optimization code uses the tabu search method to select potential solutions and the HELIOS code to evaluate them. The goal of the procedure is to search for an optimal fuel utilization, looking for a lattice with minimum average enrichment, with minimum deviation of reactivity targets and with a local power peaking factor (PPF) lower than a limit value. Time-dependent-depletion (TDD) effects were considered in the optimization process. The additive utility function method was used to convert the multiobjective optimization problem into a single objective problem. A strategy to reduce the computing time employed by the optimization was developed and is explained in this paper. An example is presented for a 10x10 fuel lattice with 10 different fuel compositions. The main contribution of this study is the development of a practical TDD optimization procedure for BWR fuel lattice design, using TS with a multiobjective function, and a strategy to economize computing time

  14. Search Results | Page 836 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Results 8351 - 8360 of 8491 ... Qualitative Research for Tobacco Control: A How-to Introductory Manual for Researchers and Development Practitioners. This manual is designed to encourage users from around the world to engage in tobacco control research using qualitative methods and tools. The objective is to expand the ...

  15. One visual search, many memory searches: An eye-tracking investigation of hybrid search.

    Science.gov (United States)

    Drew, Trafton; Boettcher, Sage E P; Wolfe, Jeremy M

    2017-09-01

    Suppose you go to the supermarket with a shopping list of 10 items held in memory. Your shopping expedition can be seen as a combination of visual search and memory search. This is known as "hybrid search." There is a growing interest in understanding how hybrid search tasks are accomplished. We used eye tracking to examine how manipulating the number of possible targets (the memory set size [MSS]) changes how observers (Os) search. We found that dwell time on each distractor increased with MSS, suggesting a memory search was being executed each time a new distractor was fixated. Meanwhile, although the rate of refixation increased with MSS, it was not nearly enough to suggest a strategy that involves repeatedly searching visual space for subgroups of the target set. These data provide a clear demonstration that hybrid search tasks are carried out via a "one visual search, many memory searches" heuristic in which Os examine items in the visual array once with a very low rate of refixations. For each item selected, Os activate a memory search that produces logarithmic response time increases with increased MSS. Furthermore, the percentage of distractors fixated was strongly modulated by the MSS: More items in the MSS led to a higher percentage of fixated distractors. Searching for more potential targets appears to significantly alter how Os approach the task, ultimately resulting in more eye movements and longer response times.
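The "one visual search, many memory searches" account lends itself to a back-of-the-envelope model: each fixated item incurs a fixation cost plus a memory-search cost that grows logarithmically with the memory set size (MSS). The simulation below is purely illustrative; the timing constants and the refixation rate are invented, not taken from the study.

```python
import math
import random

def simulate_hybrid_search(n_display, mss, t_fixate=250.0, t_memory=50.0,
                           refix_rate=0.05, seed=0):
    """Toy response-time model: one pass over the display, with a
    logarithmic memory search triggered at every fixation and a small
    chance of refixating an item."""
    rng = random.Random(seed)
    rt = 0.0
    for _ in range(n_display):
        visits = 1 + (1 if rng.random() < refix_rate else 0)
        rt += visits * (t_fixate + t_memory * math.log2(mss))
    return rt
```

Because the memory cost enters as log2(MSS), doubling the memory set adds a constant amount per fixation, which is the logarithmic response-time signature the study reports.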

  16. Searching for information on the World Wide Web with a search engine: a pilot study on cognitive flexibility in younger and older users.

    Science.gov (United States)

    Dommes, Aurelie; Chevalier, Aline; Rossetti, Marilyne

    2010-04-01

    This pilot study investigated the age-related differences in searching for information on the World Wide Web with a search engine. 11 older adults (6 men, 5 women; M age=59 yr., SD=2.76, range=55-65 yr.) and 12 younger adults (2 men, 10 women; M=23.7 yr., SD=1.07, range=22-25 yr.) had to conduct six searches differing in complexity, and for which a search method was or was not induced. The results showed that the younger and older participants provided with an induced search method were less flexible than the others and produced fewer new keywords. Moreover, older participants took longer than the younger adults, especially in the complex searches. The younger participants were flexible in the first request and spontaneously produced new keywords (spontaneous flexibility), whereas the older participants only produced new keywords when confronted by impasses (reactive flexibility). Aging may influence web searches, especially the nature of keywords used.

  17. Movable geometry and eigenvalue search capability in the MC21 Monte Carlo code

    International Nuclear Information System (INIS)

    Gill, D. F.; Nease, B. R.; Griesheimer, D. P.

    2013-01-01

    A robust and flexible movable geometry implementation in the Monte Carlo code MC21 is described, along with a search algorithm that can be used in conjunction with the movable geometry capability to perform eigenvalue searches based on the position of some geometric component. The natural use of the combined movement and search capability is searching to critical through variation of control rod (or control drum) position. The movable geometry discussion provides the mathematical framework for moving surfaces in the MC21 combinatorial solid geometry description. The interface between the movable geometry system and the user is also described, particularly the ability to create a hierarchy of movable groups. Combined with the hierarchical geometry description in MC21, the movable group framework provides a very powerful system for inline geometry modification. The eigenvalue search algorithm implemented in MC21 is also described. The foundation of this algorithm is a regula falsi search, though several refinements are made in an effort to increase the efficiency of the algorithm for use with Monte Carlo. Specifically, criteria are developed to determine after each batch whether the Monte Carlo calculation should be continued, the search iteration should be rejected, or the search iteration has converged. These criteria seek to minimize the amount of time spent per iteration. Results for the regula falsi method are shown, illustrating that the method as implemented is indeed convergent and that the optimizations made ultimately reduce the total computational expense. (authors)
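The regula falsi (false position) search at the heart of this capability can be sketched generically. The example below is a minimal stand-in, not MC21 code: `k_eff` is an invented smooth function of control rod position, playing the role of the batch-averaged Monte Carlo eigenvalue estimate, and the statistical-noise and per-batch convergence criteria the paper develops are omitted.

```python
def regula_falsi(f, a, b, target=0.0, tol=1e-6, max_iter=60):
    """False-position root search: keep a bracket [a, b] whose endpoint
    values straddle the target, and iterate on the secant's crossing."""
    fa, fb = f(a) - target, f(b) - target
    assert fa * fb < 0, "the target must be bracketed by [a, b]"
    c = a
    for _ in range(max_iter):
        c = b - fb * (b - a) / (fb - fa)   # secant crossing point
        fc = f(c) - target
        if abs(fc) < tol:
            break
        if fa * fc < 0:
            b, fb = c, fc                  # root lies in [a, c]
        else:
            a, fa = c, fc                  # root lies in [c, b]
    return c

def k_eff(z):
    """Hypothetical eigenvalue vs. control rod insertion z (toy model)."""
    return 1.15 - 0.05 * z - 0.02 * z ** 2

critical_z = regula_falsi(k_eff, 0.0, 3.0, target=1.0)
```

In the Monte Carlo setting each `f` evaluation is itself a statistical estimate, which is why the paper spends its effort on deciding when a batch is good enough to accept or reject a search iterate.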

  18. Movable geometry and eigenvalue search capability in the MC21 Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Gill, D. F.; Nease, B. R.; Griesheimer, D. P. [Bettis Atomic Power Laboratory, PO Box 79, West Mifflin, PA 15122 (United States)

    2013-07-01

    A robust and flexible movable geometry implementation in the Monte Carlo code MC21 is described, along with a search algorithm that can be used in conjunction with the movable geometry capability to perform eigenvalue searches based on the position of some geometric component. The natural use of the combined movement and search capability is searching to critical through variation of control rod (or control drum) position. The movable geometry discussion provides the mathematical framework for moving surfaces in the MC21 combinatorial solid geometry description. The interface between the movable geometry system and the user is also described, particularly the ability to create a hierarchy of movable groups. Combined with the hierarchical geometry description in MC21, the movable group framework provides a very powerful system for inline geometry modification. The eigenvalue search algorithm implemented in MC21 is also described. The foundation of this algorithm is a regula falsi search, though several refinements are made in an effort to increase the efficiency of the algorithm for use with Monte Carlo. Specifically, criteria are developed to determine after each batch whether the Monte Carlo calculation should be continued, the search iteration should be rejected, or the search iteration has converged. These criteria seek to minimize the amount of time spent per iteration. Results for the regula falsi method are shown, illustrating that the method as implemented is indeed convergent and that the optimizations made ultimately reduce the total computational expense. (authors)

  19. Civic Entrepreneurship: In Search of Sustainable Development

    Energy Technology Data Exchange (ETDEWEB)

    Banuri, Tariq; Najam, Adil; Spanger-Siegfried, Erika [Stockholm Environment Institute - Boston Center (United States)

    2003-07-01

    Around the world, civic entrepreneurs are practising sustainable development through their actions. Representing civil society, business, and government, civic entrepreneurs are championing sustainable development and succeeding – often despite significant odds – in making it happen on the ground. It may often happen at a small scale, but it does so in undeniably real, robust and promising terms. Civic entrepreneurship is driven explicitly by the public interest, and seeks to create new ways of building social capital and of harnessing existing ideas, methods, inventions, technologies, resources or management systems in the service of collective goals.

  20. Design of PCB search coils for AC magnetic flux density measurement

    Science.gov (United States)

    Ulvr, Michal

    2018-04-01

    This paper presents single-layer, double-layer and ten-layer planar square search coils designed for AC magnetic flux density amplitude measurement up to 1 T in the low frequency range in a 10 mm air gap. The printed-circuit-board (PCB) method was used for producing the search coils. Special attention is given to a full characterization of the PCB search coils including a comparison between the detailed analytical design method and the finite integration technique method (FIT) on the one hand, and experimental results on the other. The results show very good agreement in the resistance, inductance and search coil constant values (the area turns) and also in the frequency dependence of the search coil constant.

  1. Two Search Techniques within a Human Pedigree Database

    OpenAIRE

    Gersting, J. M.; Conneally, P. M.; Rogers, K.

    1982-01-01

    This paper presents the basic features of two search techniques from MEGADATS-2 (MEdical Genetics Acquisition and DAta Transfer System), a system for collecting, storing, retrieving and plotting human family pedigrees. The individual search provides a quick method for locating an individual in the pedigree database. This search uses a modified soundex coding and an inverted file structure based on a composite key. The navigational search uses a set of pedigree traversal operations (individual...
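The "modified soundex coding" mentioned for the individual search builds on the classic Soundex phonetic key: a surname's first letter followed by three digits drawn from consonant classes, so that similar-sounding names collide on the same key. A plain (unmodified) Soundex sketch, for illustration only:

```python
def soundex(name):
    """Classic Soundex code: first letter plus up to three digits from
    consonant classes; vowels reset the run, h/w do not."""
    codes = {**dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
             **dict.fromkeys("dt", "3"), "l": "4",
             **dict.fromkeys("mn", "5"), "r": "6"}
    name = name.lower()
    first = name[0].upper()
    digits = []
    prev = codes.get(name[0], "")
    for ch in name[1:]:
        d = codes.get(ch, "")
        if d and d != prev:
            digits.append(d)
        if ch not in "hw":              # h and w do not break a run
            prev = d
        if len(digits) == 3:
            break
    return first + "".join(digits).ljust(3, "0")
```

An inverted file keyed on such codes lets a query for "Rupert" retrieve records stored under "Robert", since both map to R163.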

  2. A loading pattern optimization method for nuclear fuel management

    International Nuclear Information System (INIS)

    Argaud, J.P.

    1997-01-01

    Nuclear fuel reload of a PWR core leads to the search for an optimal distribution of nuclear fuel assemblies, namely a loading pattern. This large discrete optimization problem is here expressed as a cost function minimization. To deal with this problem, an approach based on gradient information is used to direct the search in the discrete space of patterns. A method using an adjoint state formulation is then developed, and final results of complete pattern search tests by this method are presented. (author)

  3. Overview of the CLEF 2016 Social Book Search Lab

    DEFF Research Database (Denmark)

    Koolen, Marijn; Bogers, Toine; Gäde, Maria

    2016-01-01

    The Social Book Search (SBS) Lab investigates book search in scenarios where users search with more than just a query, and look for more than objective metadata. Real-world information needs are generally complex, yet almost all research focuses instead on either relatively simple search based on queries, or on profile-based recommendation. The goal is to research and develop techniques to support users in complex book search tasks. The SBS Lab has three tracks. The aim of the Suggestion Track is to develop test collections for evaluating ranking effectiveness of book retrieval and recommender systems. The aim of the Interactive Track is to develop user interfaces that support users through each stage during complex search tasks and to investigate how users exploit professional metadata and user-generated content. The Mining Track focuses on detecting and linking book titles in online book...

  4. The search for understanding: the role of paradigms.

    Science.gov (United States)

    Kelly, Marcella; Dowling, Maura; Millar, Michelle

    2018-03-16

    Kuhn's (1962) acknowledgement of a paradigm as a way that scientists make sense of their world and its reality gave recognition to the idea of 'paradigm shift'. This shift exposes the transience of paradigm development shaped by societal and scientific evolution. This ongoing evolutionary development provides the researcher with many paradigms to consider regarding how research is undertaken and the search for understanding achieved. An understanding of paradigm development is necessary when planning a study and can shape the search for understanding. It is hoped that the discussion presented here will assist novice and experienced researchers in articulating the rationales for their paradigm choices. An overview of the dominant paradigms is presented, reflecting ongoing paradigm development shaped by ontological, epistemological and methodological perspectives. Potential paradigm choices that shape research aims, objectives and focus in the search for understanding are considered. The inherent debates about paradigm shift, division, war and synthesis leave the researcher many perspectives to consider. Articulating the world views underpinning constructivism, interpretivism and pragmatism is particularly challenging because of the blurring of boundaries between them. The evolutionary nature of paradigmatic development has provided nurse researchers with the opportunity for methodological openness to the myriad research approaches, methods and designs that they may choose to answer their research question. However, it is imperative that researchers consider their ontological stances and the nature of their research questions. This is challenging in constructivism, interpretivism and pragmatism, where there is often an overlap of paradigm world views. ©2018 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.

  5. Efficient methods for time-absorption (α) eigenvalue calculations

    International Nuclear Information System (INIS)

    Hill, T.R.

    1983-01-01

    The time-absorption eigenvalue (α) calculation is one of the options found in most discrete-ordinates transport codes. Several methods have been developed at Los Alamos to improve the efficiency of this calculation. Two procedures, based on coarse-mesh rebalance, to accelerate the α eigenvalue search are derived. A hybrid scheme to automatically choose the more-effective rebalance method is described. The α rebalance scheme permits some simple modifications to the iteration strategy that eliminates many unnecessary calculations required in the standard search procedure. For several fast supercritical test problems, these methods resulted in convergence with one-fifth the number of iterations required for the conventional eigenvalue search procedure

  6. A FISH-based method for assessment of HER-2 amplification status in breast cancer circulating tumor cells following CellSearch isolation

    Directory of Open Access Journals (Sweden)

    Frithiof H

    2016-11-01

    Full Text Available Henrik Frithiof,1 Kristina Aaltonen,1 Lisa Rydén2,3 1Division of Oncology and Pathology, 2Division of Surgery, Department of Clinical Sciences Lund, Lund University, Lund, 3Department of Surgery, Skåne University Hospital, Malmö, Sweden Introduction: Amplification of the HER-2/neu (HER-2) proto-oncogene occurs in 10%–15% of primary breast cancer, leading to an activated HER-2 receptor, augmenting growth of cancer cells. Tumor classification is determined in primary tumor tissue and metastatic biopsies. However, malignant cells tend to alter their phenotype during disease progression. Circulating tumor cell (CTC) analysis may serve as an alternative to repeated biopsies. The Food and Drug Administration-approved CellSearch system allows determination of the HER-2 protein, but not of the HER-2 gene. The aim of this study was to optimize a fluorescence in situ hybridization (FISH)-based method to quantitatively determine HER-2 amplification in breast cancer CTCs following CellSearch-based isolation, and to verify the method in patient samples. Methods: Using healthy donor blood spiked with human epidermal growth factor receptor 2 (HER-2)-positive breast cancer cell lines, SKBr-3 and BT-474, and a corresponding negative control (the HER-2-negative MCF-7 cell line), an in vitro CTC model system was designed. Following isolation in the CellSearch system, CTC samples were further enriched and fixed on microscope slides. Immunocytochemical staining with cytokeratin and 4',6-diamidino-2'-phenylindole dihydrochloride identified CTCs under a fluorescence microscope. A FISH-based procedure was optimized by applying the HER2 IQFISH pharmDx assay for assessment of HER-2 amplification status in breast cancer CTCs. Results: A method for defining the presence of HER-2 amplification in single breast cancer CTCs after CellSearch isolation was established using cell lines as positive and negative controls. The method was validated in blood from breast cancer patients

  7. Protein search for multiple targets on DNA

    Energy Technology Data Exchange (ETDEWEB)

    Lange, Martin [Johannes Gutenberg University, Mainz 55122 (Germany); Department of Chemistry, Rice University, Houston, Texas 77005 (United States); Kochugaeva, Maria [Department of Chemistry, Rice University, Houston, Texas 77005 (United States); Kolomeisky, Anatoly B., E-mail: tolya@rice.edu [Department of Chemistry, Rice University, Houston, Texas 77005 (United States); Center for Theoretical Biological Physics, Rice University, Houston, Texas 77005 (United States)

    2015-09-14

    Protein-DNA interactions are crucial for all biological processes. One of the most important fundamental aspects of these interactions is the process of protein searching and recognizing specific binding sites on DNA. A large number of experimental and theoretical investigations have been devoted to uncovering the molecular description of these phenomena, but many aspects of the mechanisms of protein search for the targets on DNA remain not well understood. One of the most intriguing problems is the role of multiple targets in protein search dynamics. Using a recently developed theoretical framework we analyze this question in detail. Our method is based on a discrete-state stochastic approach that takes into account most relevant physical-chemical processes and leads to fully analytical description of all dynamic properties. Specifically, systems with two and three targets have been explicitly investigated. It is found that multiple targets in most cases accelerate the search in comparison with a single target situation. However, the acceleration is not always proportional to the number of targets. Surprisingly, there are even situations when it takes longer to find one of the multiple targets in comparison with the single target. It depends on the spatial position of the targets, distances between them, average scanning lengths of protein molecules on DNA, and the total DNA lengths. Physical-chemical explanations of observed results are presented. Our predictions are compared with experimental observations as well as with results from a continuum theory for the protein search. Extensive Monte Carlo computer simulations fully support our theoretical calculations.
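The qualitative finding, that extra targets usually (but not always) speed up the search, is easy to reproduce with a crude Monte Carlo stand-in for the sliding phase: a protein performing an unbiased 1D random walk on a DNA of L sites, absorbed at the first target it reaches. This ignores the 3D hopping and the discrete-state analytical machinery of the actual theory; all numbers below are illustrative.

```python
import random

def mean_search_time(length, targets, trials=1000, seed=1):
    """Average number of sliding steps until a randomly landed protein
    first reaches any target site (reflecting DNA ends)."""
    rng = random.Random(seed)
    target_set = set(targets)
    total = 0
    for _ in range(trials):
        pos = rng.randrange(length)     # nonspecific landing site
        steps = 0
        while pos not in target_set:
            pos = max(0, min(length - 1, pos + rng.choice((-1, 1))))
            steps += 1
        total += steps
    return total / trials
```

Comparing one central target against two well-separated targets on the same DNA shows the acceleration; moving the two targets close together or near one end weakens it, in line with the dependence on target positions the paper reports.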

  8. Clinical leadership and nursing explored: A literature search.

    Science.gov (United States)

    Stanley, David; Stanley, Karen

    2017-10-27

    To explore what we know of the concept of clinical leadership and what the term means. Clues to the definition of clinical leadership, the attributes of effective and less effective clinical leaders, models of clinical leadership and the barriers that hinder clinical leadership development were explored. While nursing leadership and healthcare leadership are terms that have been evident in nursing and health industry literature for many decades, clinical leadership is a relatively new term and may still be misunderstood. A search was undertaken of formal and informal literature using a library database and a range of search engines for the terms "clinical leadership" and "clinical leadership in nursing." In each case, the full search parameters were employed, with searches between 1974-2016. Full-text articles were requested, and English was the preferred language. In total, 3,259 publications were located through seven database search tools, although these included a large number of duplicates. Following further informal searches and the removal of irrelevant material, 27 research- or literature-review-focused papers were retained; these included 17 qualitative studies, one quantitative study, one mixed-method study, one Delphi study and two that compared other research studies. As well, five literature reviews were retained in the synthesis. The data synthesis resulted in five categories: definitions of clinical leadership, characteristics most and least associated with clinical leadership, models applied to clinical leadership and limits to clinical leadership development. Clinical leaders are recognised for having their values and beliefs parallel their actions and interventions. They are found across the spectrum of health organisations, often at the highest level for clinical interaction, but not commonly at the highest management level in a ward or unit team, and they are seen in all clinical environments. Clinical Leadership and an understanding on how

  9. Complex Sequencing Problems and Local Search Heuristics

    NARCIS (Netherlands)

    Brucker, P.; Hurink, Johann L.; Osman, I.H.; Kelly, J.P.

    1996-01-01

    Many problems can be formulated as complex sequencing problems. We will present problems in flexible manufacturing that have such a formulation and apply local search methods like iterative improvement, simulated annealing and tabu search to solve these problems. Computational results are reported.
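Iterative improvement, the simplest of the three local search methods named, can be shown on a tiny single-machine sequencing problem: minimize the total completion time by repeatedly applying the first improving adjacent swap. The instance and neighborhood are illustrative choices, not those of the paper.

```python
def total_completion_time(seq, p):
    """Sum of job completion times for a sequence on one machine."""
    t = total = 0
    for job in seq:
        t += p[job]     # job finishes t time units after the start
        total += t
    return total

def iterative_improvement(seq, p):
    """First-improvement local search over adjacent transpositions."""
    seq = list(seq)
    improved = True
    while improved:
        improved = False
        for i in range(len(seq) - 1):
            cand = seq[:i] + [seq[i + 1], seq[i]] + seq[i + 2:]
            if total_completion_time(cand, p) < total_completion_time(seq, p):
                seq, improved = cand, True
    return seq
```

For this particular objective the local optimum coincides with the shortest-processing-time order, so the heuristic happens to be exact; on the harder flexible-manufacturing sequencing problems of the paper it merely supplies the inner loop for simulated annealing or tabu search.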

  10. SearchResultFinder: federated search made easy

    NARCIS (Netherlands)

    Trieschnigg, Rudolf Berend; Tjin-Kam-Jet, Kien; Hiemstra, Djoerd

    Building a federated search engine based on a large number of existing web search engines is a challenge: implementing the programming interface (API) for each search engine is an exacting and time-consuming job. In this demonstration we present SearchResultFinder, a browser plugin which speeds up

  11. IBRI-CASONTO: Ontology-based semantic search engine

    Directory of Open Access Journals (Sweden)

    Awny Sayed

    2017-11-01

    Full Text Available The vast availability of information, added at a very fast pace to data repositories, creates a challenge in extracting correct and accurate information. This has increased the competition among developers seeking technology that understands a researcher's intent and the contextual meaning of terms. Competition to develop Arabic semantic search systems is still in its infancy, which can be traced back to the complexity of the Arabic language: it has complex morphological, grammatical and semantic aspects, as it is a highly inflectional and derivational language. In this paper, we highlight and present an ontological search engine called IBRI-CASONTO for the Colleges of Applied Sciences, Oman. Our proposed engine supports both the Arabic and English languages. It also employs two types of search: a keyword-based search and a semantics-based search. IBRI-CASONTO is based on different technologies such as Resource Description Framework (RDF) data and an ontological graph. The experiments are presented in two sections: first, a comparison between the Entity-Search and the Classical-Search inside IBRI-CASONTO itself; second, a comparison of the Entity-Search of IBRI-CASONTO with currently used search engines, such as Kngine, Wolfram Alpha and the most popular engine nowadays, Google, in order to measure their performance and efficiency.

  12. Search Results | Page 843 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2008-06-02

    Results 8421 - 8430 of 8496 ... James Putzel, Crisis States Research Centre, London School of Economics ... and analyzes the 3-year-long “L-20” project, whose objective was to develop a method of breaking the political deadlocks that too often prevent progress on critical global issues. Publication Date. June 2, 2008 ...

  13. Evaluation Framework for Search Instruments

    International Nuclear Information System (INIS)

    Warren, Glen A.; Smith, Leon E.; Cooper, Matt W.; Kaye, William R.

    2005-01-01

    A framework for quantitatively evaluating current and proposed gamma-ray search instrument designs has been developed. The framework is designed to generate a large library of "virtual neighborhoods" that can be used to test and evaluate nearly any gamma-ray sensor type. Calculating nuisance-source emissions and combining various sources to create a large number of random virtual scenes places a significant computational burden on the development of the framework. To reduce this burden, a number of radiation transport simplifications have been made which maintain the essential physics ingredients for the quantitative assessment of search instruments while significantly reducing computational times. The various components of the framework, from the simulation and benchmarking of nuisance-source emissions to the computational engine for generating the gigabytes of simulated search scenes, are discussed.

  14. A systematic literature search on psychological first aid: lack of evidence to develop guidelines.

    Science.gov (United States)

    Dieltjens, Tessa; Moonens, Inge; Van Praet, Koen; De Buck, Emmy; Vandekerckhove, Philippe

    2014-01-01

    Providing psychological first aid (PFA) is generally considered to be an important element in preliminary care of disaster victims. Using the best available scientific basis for courses and educational materials, the Belgian Red Cross-Flanders wants to ensure that its volunteers are trained in the best way possible. To identify effective PFA practices, by systematically reviewing the evidence in existing guidelines, systematic reviews and individual studies. Systematic literature searches in five bibliographic databases (MEDLINE, PsycINFO, The Cochrane Library, PILOTS and G-I-N) were conducted from inception to July 2013. Five practice guidelines were included which were found to vary in the development process (AGREE II score 20-53%) and evidence base used. None of them provides solid evidence concerning the effectiveness of PFA practices. Additionally, two systematic reviews of PFA were found, both noting a lack of studies on PFA. A complementary search for individual studies, using a more sensitive search strategy, identified 11 237 references of which 102 were included for further full-text examination, none of which ultimately provides solid evidence concerning the effectiveness of PFA practices. The scientific literature on psychological first aid available to date, does not provide any evidence about the effectiveness of PFA interventions. Currently it is impossible to make evidence-based guidelines about which practices in psychosocial support are most effective to help disaster and trauma victims.

  15. A systematic literature search on psychological first aid: lack of evidence to develop guidelines.

    Directory of Open Access Journals (Sweden)

    Tessa Dieltjens

    Full Text Available Providing psychological first aid (PFA) is generally considered to be an important element in preliminary care of disaster victims. Using the best available scientific basis for courses and educational materials, the Belgian Red Cross-Flanders wants to ensure that its volunteers are trained in the best way possible. To identify effective PFA practices, by systematically reviewing the evidence in existing guidelines, systematic reviews and individual studies. Systematic literature searches in five bibliographic databases (MEDLINE, PsycINFO, The Cochrane Library, PILOTS and G-I-N) were conducted from inception to July 2013. Five practice guidelines were included which were found to vary in the development process (AGREE II score 20-53%) and evidence base used. None of them provides solid evidence concerning the effectiveness of PFA practices. Additionally, two systematic reviews of PFA were found, both noting a lack of studies on PFA. A complementary search for individual studies, using a more sensitive search strategy, identified 11 237 references of which 102 were included for further full-text examination, none of which ultimately provides solid evidence concerning the effectiveness of PFA practices. The scientific literature on psychological first aid available to date does not provide any evidence about the effectiveness of PFA interventions. Currently it is impossible to make evidence-based guidelines about which practices in psychosocial support are most effective to help disaster and trauma victims.

  16. Bomb Threats and Bomb Search Techniques.

    Science.gov (United States)

    Department of the Treasury, Washington, DC.

    This pamphlet explains how to be prepared and plan for bomb threats and describes procedures to follow once a call has been received. The content covers (1) preparation for bomb threats, (2) evacuation procedures, (3) room search methods, (4) procedures to follow once a bomb has been located, and (5) typical problems that search teams will…

  17. The Search for Stable, Massive, Elementary Particles

    International Nuclear Information System (INIS)

    Kim, Peter C.

    2001-01-01

    In this paper we review the experimental and observational searches for stable, massive, elementary particles other than the electron and proton. The particles may be neutral, may have unit charge or may have fractional charge. They may interact through the strong, electromagnetic, weak or gravitational forces or through some unknown force. The purpose of this review is to provide a guide for future searches--what is known, what is not known, and what appear to be the most fruitful areas for new searches. A variety of experimental and observational methods such as accelerator experiments, cosmic ray studies, searches for exotic particles in bulk matter and searches using astrophysical observations is included in this review

  18. A hybrid method for in-core optimization of pressurized water reactor reload core design

    International Nuclear Information System (INIS)

    Stevens, J.G.

    1995-05-01

    The objective of this research is the development of an accurate, practical, and robust method for optimizing the design of loading patterns for pressurized water reactors, a nonlinear, non-convex, integer optimization problem. The many logical constraints that may be applied during the design process are modeled here by a network construction upon which performance objectives and safety constraints from reactor physics calculations are optimized. This thesis synthesizes the strengths of previous algorithms developed for reload design optimization and extends their robustness through the development of a hybrid liberated search algorithm. The development of three independent methods for reload design optimization is presented: random direct search for local improvement, liberated search by simulated annealing, and deterministic search for local improvement via successive linear assignment by branch and bound. Comparative application of the methods to a variety of problems is discussed, including an exhaustive enumeration benchmark created to allow comparison of search results to a known global optimum for a large-scale problem. While direct search and determinism are shown to be capable of finding improvement, only the liberation of simulated annealing is found to perform robustly in the non-convex design spaces. The hybrid method SHAMAN is presented. The algorithm applies determinism to shuffle an initial solution for satisfaction of heuristics and symmetry; liberated search through simulated annealing with a bounds-cooling constraint treatment; and search bias through relational heuristics for the application of engineering judgment. The accuracy, practicality, and robustness of the SHAMAN algorithm are demonstrated through application to a variety of reload loading pattern optimization problems.
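Of the three methods combined in SHAMAN, the simulated annealing component is the one that "liberates" the search from local minima by occasionally accepting uphill moves. The sketch below shows the bare mechanism on a toy assignment of four "assemblies" to four positions; the reactivity values, position weights, and cooling schedule are invented, and the real algorithm's bounds-cooling constraint treatment and relational heuristics are omitted.

```python
import math
import random

def simulated_annealing(init, cost, neighbor, t0=10.0, cooling=0.95,
                        steps=3000, seed=7):
    """Accept any improving move; accept an uphill move with
    probability exp(-delta/T), where T decays geometrically."""
    rng = random.Random(seed)
    x, fx = init, cost(init)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        fy = cost(y)
        if fy <= fx or rng.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t = max(t * cooling, 1e-6)     # geometric cooling, floored
    return best, fbest

R = [1.0, 2.0, 3.0, 4.0]               # hypothetical assembly reactivities
W = [4.0, 3.0, 2.0, 1.0]               # hypothetical position weights

def pattern_cost(perm):
    """Toy loading-pattern objective: weighted reactivity placement."""
    return sum(R[a] * W[i] for i, a in enumerate(perm))

def swap_neighbor(x, rng):
    """Swap the assemblies in two randomly chosen positions."""
    i, j = rng.sample(range(len(x)), 2)
    y = list(x)
    y[i], y[j] = y[j], y[i]
    return y

best, fbest = simulated_annealing([3, 2, 1, 0], pattern_cost, swap_neighbor)
```

Starting from the worst-possible assignment, the high-temperature phase wanders freely and the cold phase settles into the arrangement that pairs high reactivity with low weight, which is exactly the escape-then-converge behavior the thesis relies on in the much larger loading-pattern space.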

  19. A student's guide to searching the literature using online databases

    Science.gov (United States)

    Miller, Casey W.; Belyea, Dustin; Chabot, Michelle; Messina, Troy

    2012-02-01

    A method is described to empower students to efficiently perform general and specific literature searches using online resources [Miller et al., Am. J. Phys. 77, 1112 (2009)]. The method was tested on multiple groups, including undergraduate and graduate students with varying backgrounds in scientific literature searches. Students involved in this study showed marked improvement in their awareness of how and where to find scientific information. Repeated exposure to literature searching methods appears worthwhile, starting early in the undergraduate career, and even in graduate school orientation.

  20. Harvesting social images for bi-concept search

    NARCIS (Netherlands)

    Li, X.; Snoek, C.G.M.; Worring, M.; Smeulders, A.W.M.

    2012-01-01

    Searching for the co-occurrence of two visual concepts in unlabeled images is an important step towards answering complex user queries. Traditional visual search methods use combinations of the confidence scores of individual concept detectors to tackle such queries. In this paper we introduce the