Sample records for accelerated similarity searching

  1. SW#db: GPU-Accelerated Exact Sequence Similarity Database Search.

    Matija Korpar

    In recent years we have witnessed growth in sequencing yield, in the number of samples sequenced, and, as a result, in the size of publicly maintained sequence databases. This flood of data has put high demands on protein similarity search algorithms, which face two ever-opposing goals: keeping running times acceptable while maintaining a high enough level of sensitivity. The most time-consuming step of similarity search is the local alignment between query and database sequences, usually performed with an exact local alignment algorithm such as Smith-Waterman. Due to its quadratic time complexity, aligning a query against the whole database is usually too slow. Therefore, most protein similarity search methods apply heuristics before the exact local alignment to reduce the number of candidate sequences in the database. There remains a need, however, to align a query sequence against this reduced database. In this paper we present SW#db, a tool and library for fast exact similarity search. Although its running times as a standalone tool are comparable to those of BLAST, it is primarily intended for the exact local alignment phase, after the database of sequences has already been reduced. It uses both GPU and CPU parallelization and, at the time of writing, was 4-5 times faster than SSEARCH, 6-25 times faster than CUDASW++, and more than 20 times faster than SSW for multiple queries on the Swiss-Prot and UniRef90 databases.
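    As a reference point for the exact local alignment step discussed above, a minimal (unoptimized) Smith-Waterman scorer can be sketched in a few lines; this is the quadratic-time computation that SW#db parallelizes, not the tool's actual implementation:

```python
# Minimal Smith-Waterman local alignment score (illustrative sketch, not
# the optimized GPU/CPU implementation described in SW#db).
def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Return the best local alignment score between sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]  # DP matrix, clamped at zero
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best
```

    With the default scores, four consecutive matches score 8, e.g. `smith_waterman("ACGT", "ACGT")`; the zero clamp is what makes the alignment local.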

  2. Accelerated Profile HMM Searches.

    Sean R Eddy


    Profile hidden Markov models (profile HMMs) and probabilistic inference methods have made important contributions to the theory of sequence database homology search. However, practical use of profile HMM methods has been hindered by the computational expense of existing software implementations. Here I describe an acceleration heuristic for profile HMMs, the "multiple segment Viterbi" (MSV) algorithm. The MSV algorithm computes an optimal sum of multiple ungapped local alignment segments using a striped vector-parallel approach previously described for fast Smith/Waterman alignment. MSV scores follow the same statistical distribution as gapped optimal local alignment scores, allowing rapid evaluation of significance of an MSV score and thus facilitating its use as a heuristic filter. I also describe a 20-fold acceleration of the standard profile HMM Forward/Backward algorithms using a method I call "sparse rescaling". These methods are assembled in a pipeline in which high-scoring MSV hits are passed on for reanalysis with the full HMM Forward/Backward algorithm. This accelerated pipeline is implemented in the freely available HMMER3 software package. Performance benchmarks show that the use of the heuristic MSV filter sacrifices negligible sensitivity compared to unaccelerated profile HMM searches. HMMER3 is substantially more sensitive and 100- to 1000-fold faster than HMMER2. HMMER3 is now about as fast as BLAST for protein searches.
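    The core ingredient of MSV, scoring ungapped local alignment segments, can be illustrated in simplified form. The sketch below finds the single best ungapped segment between two plain sequences; MSV itself sums multiple segments under a profile HMM and uses striped vector parallelization, neither of which is reproduced here:

```python
# Best single ungapped local alignment score between two sequences — a
# simplified ingredient of the MSV idea, shown without profiles or striping.
def best_ungapped_segment(a, b, match=1, mismatch=-1):
    best = 0
    # Scan every diagonal; on each, run Kadane's maximum-subarray algorithm.
    for offset in range(-(len(a) - 1), len(b)):
        run = 0
        for i in range(len(a)):
            j = i + offset
            if 0 <= j < len(b):
                run = max(0, run + (match if a[i] == b[j] else mismatch))
                best = max(best, run)
    return best
```

    Because no gap transitions are allowed, each diagonal is independent, which is exactly what makes this score cheap to vectorize.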

  3. Semantically enabled image similarity search

    Casterline, May V.; Emerick, Timothy; Sadeghi, Kolia; Gosse, C. A.; Bartlett, Brent; Casey, Jason


    Georeferenced data of various modalities are increasingly available for intelligence and commercial use; however, effectively exploiting these sources demands a unified data space capable of capturing the unique contribution of each input. This work presents a suite of software tools for representing geospatial vector data and overhead imagery in a shared high-dimensional vector, or "embedding", space that supports fused learning and similarity search across dissimilar modalities. While the approach is suitable for fusing arbitrary input types, including free text, the present work exploits the obvious but computationally difficult relationship between GIS and overhead imagery. GIS supplies temporally smoothed but information-limited content, while overhead imagery provides an information-rich but temporally limited perspective. This processing framework includes some important extensions of concepts in the literature but, more critically, presents a means to accomplish them as a unified framework at scale on commodity cloud architectures.

  4. Protein structural similarity search by Ramachandran codes

    Chang Chih-Hung


    Background: Protein structural data have increased exponentially, such that fast and accurate tools are necessary for structural similarity search. To improve search speed, several methods reduce three-dimensional protein structures to one-dimensional text strings that are then analyzed by traditional sequence alignment methods; however, accuracy is usually sacrificed, and the speed still cannot match that of sequence similarity search tools. Here, we aimed to improve the linear encoding methodology and develop efficient search tools that can rapidly retrieve structural homologs from large protein databases. Results: We propose a new linear encoding method, SARST (Structural similarity search Aided by Ramachandran Sequential Transformation). SARST transforms protein structures into text strings through a Ramachandran map organized by nearest-neighbor clustering and uses a regenerative approach to produce substitution matrices. Classical sequence similarity search methods can then be applied to structural similarity search. Its accuracy is similar to that of Combinatorial Extension (CE), and it works over 243,000 times faster, searching 34,000 proteins in 0.34 sec on a 3.2-GHz CPU. SARST provides statistically meaningful expectation values to assess the retrieved information. It has been implemented as a web service and as a stand-alone Java program able to run on many different platforms. Conclusion: As a database search method, SARST can rapidly distinguish high from low similarities and efficiently retrieve homologous structures. It demonstrates that the easily accessible linear encoding methodology has the potential to serve as a foundation for efficient protein structural similarity search tools. These search tools should be applicable to automated, high-throughput functional annotation and prediction for the ever-increasing number of published protein structures in this post-genomic era.
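    The linear-encoding idea can be sketched as follows: map each residue's (phi, psi) backbone angles to the nearest of a few Ramachandran clusters, producing a text string amenable to sequence alignment. The cluster centers below are illustrative placeholders, not SARST's actual nearest-neighbor clustering:

```python
# Sketch of the linear-encoding idea behind SARST: map each residue's
# (phi, psi) backbone angles (degrees) to the nearest Ramachandran cluster,
# yielding a text string for sequence-style comparison. The cluster centers
# here are illustrative placeholders, not SARST's real clustering.
CLUSTERS = {
    "H": (-60.0, -45.0),   # roughly the alpha-helical region
    "E": (-120.0, 130.0),  # roughly the beta-sheet region
    "L": (60.0, 45.0),     # roughly the left-handed helix region
}

def angle_dist(a, b):
    """Distance between two angles in degrees, respecting wrap-around."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def encode(phi_psi_pairs):
    """Encode a list of (phi, psi) pairs as a Ramachandran-code string."""
    out = []
    for phi, psi in phi_psi_pairs:
        code = min(CLUSTERS,
                   key=lambda c: angle_dist(phi, CLUSTERS[c][0]) ** 2
                               + angle_dist(psi, CLUSTERS[c][1]) ** 2)
        out.append(code)
    return "".join(out)
```

    The resulting strings can then be fed to any classical sequence aligner, which is the source of the method's speed.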

  5. Similarity search processing. Paralelization and indexing technologies.

    Eder Dos Santos


    This scientific-technical report addresses similarity search and the implementation of metric structures in parallel environments. It also presents the state of the art in similarity search over metric structures and in parallelism technologies. Comparative analyses are proposed, seeking to characterize the behavior of a set of metric spaces and metric structures on multicore- and GPU-based processing platforms.

  6. Effective semantic search using thematic similarity

    Sharifullah Khan


    Most existing semantic search systems expand search keywords using domain ontologies to deal with semantic heterogeneity. They focus on matching the semantic similarity of individual keywords in a multiple-keyword query, but they ignore the semantic relationships that exist among the keywords of the query themselves, and thus return less relevant answers for such queries. More relevant documents for a multiple-keyword query can be retrieved if a system knows the relationships among the keywords in the query. The proposed search methodology matches patterns of keywords to capture the context of the keywords, and then ranks the relevant documents by their pattern relevance score. A prototype system has been implemented to validate the proposed search methodology and compared with existing systems for evaluation. The results demonstrate improvements in search precision and recall.

  7. Molecular fingerprint similarity search in virtual screening.

    Cereto-Massagué, Adrià; Ojeda, María José; Valls, Cristina; Mulero, Miquel; Garcia-Vallvé, Santiago; Pujadas, Gerard


    Molecular fingerprints have long been used in drug discovery and virtual screening. Their ease of use (requiring little to no configuration) and the speed at which substructure and similarity searches can be performed with them, paired with virtual screening performance similar to that of more complex methods, are the reasons for their popularity. However, there are many types of fingerprints, each representing a different aspect of the molecule, and the choice can greatly affect search performance. This review focuses on commonly used fingerprint algorithms, their usage in virtual screening, and the software packages and online tools that provide these algorithms.
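    For concreteness, the Tanimoto coefficient, the most common similarity measure applied to such fingerprints, can be computed directly from sets of "on" bit positions:

```python
# Tanimoto (Jaccard) similarity on bit-vector fingerprints — the standard
# measure in fingerprint-based virtual screening. Fingerprints are shown
# as Python sets of "on" bit positions for clarity.
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient: |A & B| / |A | B|."""
    if not fp_a and not fp_b:
        return 1.0  # two empty fingerprints are conventionally identical
    return len(fp_a & fp_b) / len(fp_a | fp_b)
```

    A similarity search then amounts to ranking a library by `tanimoto(query_fp, fp)` and keeping the top hits above a chosen threshold.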

  8. Gene functional similarity search tool (GFSST)

    Russo James J


    Background: With the completion of the genome sequences of human, mouse, and other species and the advent of high-throughput functional genomics technologies such as microarray chips, more and more genes and their products have been discovered and their functions have begun to be understood. Increasing amounts of data about genes, gene products, and their functions have been stored in databases. To facilitate the selection of candidate genes for gene-disease research, genetic association studies, biomarker and drug target selection, and animal models of human diseases, it is essential to have search engines that can retrieve genes by their functions from proteome databases. In recent years, the development of the Gene Ontology (GO) has established structured, controlled vocabularies describing gene functions, making it possible to develop novel tools that search for genes by functional similarity. Results: Using a statistical model to measure the functional similarity of genes based on the Gene Ontology directed acyclic graph, we developed a novel Gene Functional Similarity Search Tool (GFSST) to identify genes with related functions from annotated proteome databases. This search engine lets users define their search targets by gene function. Conclusion: An implementation of GFSST for the human and mouse proteomes, built on UniProt (Universal Protein Resource), is available at the GFSST Web Server. GFSST provides functions not only for similar gene retrieval but also for gene search by one or more GO terms. This represents a powerful new approach for selecting similar genes and gene products from proteome databases according to their functions.

  9. Web Search Results Summarization Using Similarity Assessment

    Sawant V.V.


    The Internet has become part of everyday life, and the WWW is its most important service because it presents information such as documents and images. The Web grows rapidly and caters to diverse levels and categories of users, and search results are extracted to match user queries. With millions of pages pouring online, users have no time to browse the contents completely; moreover, much of the available information is repeated or duplicated. This creates the need to restructure search results so that they can be summarized. The proposed approach extracts several features from web pages. Web page visual similarity assessment has been employed to address problems in fields including phishing, web archiving, and web search engines. In this approach, the search results returned for a user query are first stored. The Earth Mover's Distance (EMD) is used to assess web page visual similarity: each page is treated as a low-resolution image, a signature of the page image is created from color and coordinate features, and the distance between pages is calculated by applying EMD. A layout similarity value is computed using tag comparison and template comparison algorithms. Textual similarity is computed using cosine similarity, and hyperlink analysis is performed on outward links. The final similarity value is the fusion of the layout, textual, hyperlink, and EMD values. Once the similarity matrix is found, clustering with connected components is employed. Finally, groups of similar web pages, i.e., summarized results, are displayed to the user. Experiments demonstrate the effectiveness of the four methods in generating summarized results for different web pages and user queries.
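    Of the components listed above, the textual one is the simplest to make concrete. Below is a minimal sketch of cosine similarity over term-frequency vectors; the EMD, layout, and hyperlink components are not reproduced here:

```python
import math
from collections import Counter

# Cosine similarity between two documents' term-frequency vectors — the
# textual-similarity component of the summarization pipeline described
# above, in its simplest bag-of-words form.
def cosine_similarity(text_a, text_b):
    va, vb = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(va[t] * vb[t] for t in va.keys() & vb.keys())
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0
```

    Identical texts score 1.0 and texts with no shared terms score 0.0; a production system would typically weight terms (e.g. TF-IDF) before taking the cosine.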

  10. SEAL: Spatio-Textual Similarity Search

    Fan, Ju; Zhou, Lizhu; Chen, Shanshan; Hu, Jun


    Location-based services (LBS) have become increasingly ubiquitous. Existing methods focus on finding relevant points-of-interest (POIs) based on users' locations and query keywords. Nowadays, modern LBS applications generate a new kind of spatio-textual data, regions-of-interest (ROIs), containing region-based spatial information and textual description, e.g., mobile user profiles with active regions and interest tags. To satisfy search requirements on ROIs, we study a new research problem, called spatio-textual similarity search: Given a set of ROIs and a query ROI, we find the similar ROIs by considering spatial overlap and textual similarity. Spatio-textual similarity search has many important applications, e.g., social marketing in location-aware social networks. It calls for an efficient search method to support large scales of spatio-textual data in LBS systems. To this end, we introduce a filter-and-verification framework to compute the answers. In the filter step, we generate signatures for ...

  11. Similarity searching in large combinatorial chemistry spaces

    Rarey, Matthias; Stahl, Martin


    We present a novel algorithm, called Ftrees-FS, for similarity searching in large chemistry spaces based on dynamic programming. Given a query compound, the algorithm generates sets of compounds from a given chemistry space that are similar to the query. The similarity search is based on the feature tree similarity measure representing molecules by tree structures. This descriptor allows handling combinatorial chemistry spaces as a whole instead of looking at subsets of enumerated compounds. Within a few minutes of computing time, the algorithm is able to find the most similar compound in very large spaces as well as sets of compounds at an arbitrary similarity level. In addition, the diversity among the generated compounds can be controlled. A set of 17,000 fragments of known drugs, generated by the RECAP procedure from the World Drug Index, was used as the search chemistry space. These fragments can be combined to more than 10^18 compounds of reasonable size. For validation, known antagonists/inhibitors of several targets including dopamine D4, histamine H1, and COX2 are used as queries. Comparison of the compounds created by Ftrees-FS to other known actives demonstrates the ability of the method to jump between structurally unrelated molecule classes.

  12. Predicting the performance of fingerprint similarity searching.

    Vogt, Martin; Bajorath, Jürgen


    Fingerprints are bit string representations of molecular structure that typically encode structural fragments, topological features, or pharmacophore patterns. Various fingerprint designs are utilized in virtual screening and their search performance essentially depends on three parameters: the nature of the fingerprint, the active compounds serving as reference molecules, and the composition of the screening database. It is of considerable interest and practical relevance to predict the performance of fingerprint similarity searching. A quantitative assessment of the potential that a fingerprint search might successfully retrieve active compounds, if available in the screening database, would substantially help to select the type of fingerprint most suitable for a given search problem. The method presented herein utilizes concepts from information theory to relate the fingerprint feature distributions of reference compounds to screening libraries. If these feature distributions do not sufficiently differ, active database compounds that are similar to reference molecules cannot be retrieved because they disappear in the "background." By quantifying the difference in feature distribution using the Kullback-Leibler divergence and relating the divergence to compound recovery rates obtained for different benchmark classes, fingerprint search performance can be quantitatively predicted.
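    A toy version of this idea can be sketched by comparing per-bit "on" frequencies of the reference compounds against those of the screening database with a smoothed Kullback-Leibler divergence; the exact divergence formulation used in the paper may differ:

```python
import math

# Sketch of the information-theoretic idea: compare per-bit "on"
# frequencies of reference compounds vs. the screening database with a
# smoothed Kullback-Leibler divergence, summed over fingerprint bits.
# A small divergence suggests actives will vanish into the "background".
def fingerprint_kl(ref_freqs, db_freqs, eps=1e-6):
    total = 0.0
    for p, q in zip(ref_freqs, db_freqs):
        for pp, qq in ((p, q), (1.0 - p, 1.0 - q)):  # bit on and bit off
            pp, qq = max(pp, eps), max(qq, eps)      # smoothing avoids log(0)
            total += pp * math.log(pp / qq)
    return total
```

    Relating this divergence to recovery rates on benchmark classes is what lets search performance be predicted before any screening is run.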

  13. New similarity search based glioma grading

    Haegler, Katrin; Brueckmann, Hartmut; Linn, Jennifer [Ludwig-Maximilians-University of Munich, Department of Neuroradiology, Munich (Germany); Wiesmann, Martin; Freiherr, Jessica [RWTH Aachen University, Department of Neuroradiology, Aachen (Germany); Boehm, Christian [Ludwig-Maximilians-University of Munich, Department of Computer Science, Munich (Germany); Schnell, Oliver; Tonn, Joerg-Christian [Ludwig-Maximilians-University of Munich, Department of Neurosurgery, Munich (Germany)


    MR-based differentiation between low- and high-grade gliomas is predominately based on contrast-enhanced T1-weighted images (CE-T1w). However, functional MR sequences such as perfusion- and diffusion-weighted sequences can provide additional information on tumor grade. Here, we tested the potential of a recently developed similarity search based method that integrates information of CE-T1w and perfusion maps for non-invasive MR-based glioma grading. We prospectively included 37 untreated glioma patients (23 grade I/II, 14 grade III gliomas), in whom 3T MRI with FLAIR, pre- and post-contrast T1-weighted, and perfusion sequences was performed. Cerebral blood volume, cerebral blood flow, and mean transit time maps as well as CE-T1w images were used as input for the similarity search. Data sets were preprocessed and converted to four-dimensional Gaussian Mixture Models that considered correlations between the different MR sequences. For each patient, a so-called tumor feature vector (i.e., a probability-based classifier) was defined and used for grading. Biopsy was used as the gold standard, and similarity-based grading was compared to grading solely based on CE-T1w. Accuracy, sensitivity, and specificity of pure CE-T1w based glioma grading were 64.9%, 78.6%, and 56.5%, respectively. Similarity search based tumor grading allowed differentiation between low-grade (I or II) and high-grade (III) gliomas with an accuracy, sensitivity, and specificity of 83.8%, 78.6%, and 87.0%. Our findings indicate that integration of perfusion parameters and CE-T1w information in a semi-automatic similarity search based analysis improves the potential of MR-based glioma grading compared to CE-T1w data alone.

  14. Efficient Video Similarity Measurement and Search

    Cheung, S-C S


    The amount of information on the World Wide Web has grown enormously since its creation in 1990. Duplication of content is inevitable because there is no central management on the web. Studies have shown that many similar versions of the same text documents can be found throughout the web. This redundancy problem is more severe for multimedia content such as web video sequences, as they are often stored in multiple locations and different formats to facilitate downloading and streaming. Similar versions of the same video can also be found, unknown to content creators, when web users modify and republish original content using video editing tools. Identifying similar content can benefit many web applications and content owners. For example, it will reduce the number of similar answers to a web search and identify inappropriate use of copyrighted content. In this dissertation, we present a system architecture and corresponding algorithms to efficiently measure, search, and organize similar video sequences found on any large database such as the web.

  15. GPU accelerated chemical similarity calculation for compound library comparison.

    Ma, Chao; Wang, Lirong; Xie, Xiang-Qun


    Chemical similarity calculation plays an important role in compound library design, virtual screening, and "lead" optimization. In this manuscript, we present a novel GPU-accelerated algorithm for all-vs-all Tanimoto matrix calculation and nearest neighbor search. By taking advantage of multicore GPU architecture and CUDA parallel programming technology, the algorithm is up to 39 times faster than existing commercial software running on CPUs. Because of the utilization of intrinsic GPU instructions, this approach is nearly 10 times faster than an existing GPU-accelerated sparse-vector algorithm when Unity fingerprints are used for the Tanimoto calculation. The GPU program that implements this new method takes about 20 min to complete the calculation of Tanimoto coefficients between 32 M PubChem compounds and 10 K Active Probes compounds, i.e., 324 G Tanimoto coefficients, on a 128-CUDA-core GPU.
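    The all-vs-all computation has a compact matrix form. Below is a CPU/NumPy sketch of the Tanimoto matrix on dense 0/1 fingerprint arrays, i.e., the quantity the paper accelerates on GPU; the actual CUDA kernel and Unity fingerprint handling are not reproduced:

```python
import numpy as np

# All-vs-all Tanimoto matrix on dense 0/1 fingerprint arrays — a CPU/NumPy
# sketch of the computation the paper accelerates with CUDA.
def tanimoto_matrix(A, B):
    """A: (n, bits), B: (m, bits) 0/1 arrays -> (n, m) Tanimoto matrix."""
    A = np.asarray(A, dtype=np.float64)
    B = np.asarray(B, dtype=np.float64)
    common = A @ B.T                    # |a & b| for 0/1 vectors
    pop_a = A.sum(axis=1)[:, None]      # |a|
    pop_b = B.sum(axis=1)[None, :]      # |b|
    union = pop_a + pop_b - common      # |a | b|
    with np.errstate(divide="ignore", invalid="ignore"):
        T = np.where(union > 0, common / union, 1.0)
    return T
```

    Nearest-neighbor search then reduces to `T.argmax(axis=1)`; on GPU the same matrix product and reductions are the natural units of parallel work.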

  16. Earthquake detection through computationally efficient similarity search

    Yoon, Clara E.; O’Reilly, Ossian; Bergen, Karianne J.; Beroza, Gregory C.


    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection—identification of seismic events in continuous data—is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact “fingerprints” of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176
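    The fingerprint-and-bucket idea behind FAST can be caricatured in a few lines: reduce each waveform window to a compact fingerprint, hash identical fingerprints into buckets, and pair up only colliding windows, avoiding all-pairs comparison. The sign-of-difference fingerprint below is a stand-in for FAST's actual spectrogram-derived features:

```python
from collections import defaultdict
from itertools import combinations

# Toy version of the FAST idea: compact fingerprints plus hash buckets
# replace all-pairs autocorrelation. The sign-of-difference fingerprint
# is a stand-in for FAST's actual discriminative features.
def fingerprint(window):
    """Binary fingerprint: 1 where the signal rises, 0 where it does not."""
    return tuple(1 if b > a else 0 for a, b in zip(window, window[1:]))

def candidate_pairs(windows):
    """Return index pairs of windows whose fingerprints collide."""
    buckets = defaultdict(list)
    for idx, w in enumerate(windows):
        buckets[fingerprint(w)].append(idx)
    pairs = set()
    for ids in buckets.values():
        pairs.update(combinations(ids, 2))
    return pairs
```

    Only colliding pairs go on to a full similarity check, which is how the method scales to continuous data that autocorrelation cannot handle.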

  17. Search for Dark Photons with Accelerators

    Merkel Harald


    A dark photon, as the mediator of an interaction of the dark sector, is a well-motivated extension of the Standard Model. While possible dark matter particles are heavy and seem to be beyond the reach of current accelerators, the dark photon is not necessarily heavy and might have a mass in the range of existing accelerators. In recent years, an extensive experimental program for the search for dark photons has been established at several accelerators. In this talk, recent results and progress in the determination of exclusion limits with accelerators are presented.

  19. A Similarity Search Using Molecular Topological Graphs

    Yoshifumi Fukunishi


    A molecular similarity measure has been developed using molecular topological graphs and atomic partial charges. Two kinds of topological graphs are used: the ordinary adjacency matrix and a matrix representing the minimum path length between two atoms of the molecule. The ordinary adjacency matrix is suited to comparing local structures of molecules, such as functional groups, while the path-length matrix is suited to comparing global structures. The combination of these two matrices gives a similarity measure. This method was applied to in silico drug screening, and the results showed that it is effective as a similarity measure.
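    The two matrices in this measure can be constructed as follows for a small molecular bond graph, with the minimum path-length matrix computed by Floyd-Warshall; the atomic partial charge component is omitted:

```python
# Sketch of the two matrices in this similarity measure: the adjacency
# matrix (local structure) and the all-pairs minimum path-length matrix
# (global structure), computed with Floyd-Warshall on an unweighted
# molecular bond graph. Atoms are indexed 0..n_atoms-1.
INF = float("inf")

def adjacency_and_distance(n_atoms, bonds):
    adj = [[0] * n_atoms for _ in range(n_atoms)]
    dist = [[0 if i == j else INF for j in range(n_atoms)]
            for i in range(n_atoms)]
    for i, j in bonds:
        adj[i][j] = adj[j][i] = 1
        dist[i][j] = dist[j][i] = 1
    for k in range(n_atoms):          # Floyd-Warshall relaxation
        for i in range(n_atoms):
            for j in range(n_atoms):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return adj, dist
```

    For a three-atom chain 0-1-2, the adjacency matrix records only the two bonds, while the distance matrix additionally records that atoms 0 and 2 are two bonds apart, which is the global information the measure exploits.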

  20. Outsourced similarity search on metric data assets

    Yiu, Man Lung; Assent, Ira; Jensen, Christian S.


    This paper considers a cloud computing setting in which similarity querying of metric data is outsourced to a service provider. The data is to be revealed only to trusted users, not to the service provider or anyone else. Users query the server for the most similar data objects to a query example. Outsourcing offers the data owner scalability and a low initial investment. The need for privacy may be due to the data being sensitive (e.g., in medicine), valuable (e.g., in astronomy), or otherwise confidential. Given this setting, the paper presents techniques that transform the data prior to supplying it to the service provider for similarity queries on the transformed data. Our techniques provide interesting trade-offs between query cost and accuracy. They are then further extended to offer an intuitive privacy guarantee. Empirical studies with real data demonstrate that the techniques are capable of offering privacy while enabling efficient and accurate processing of similarity queries.

  1. Accelerator-based neutrino oscillation searches

    Whitehouse, D. A.; Rameika, R.; Stanton, N.

    This paper attempts to summarize the neutrino oscillation section of the Workshop on Future Directions in Particle and Nuclear Physics at Multi-GeV Hadron Beam Facilities. There were very lively discussions about the merits of the different oscillation channels, experiments, and facilities, but we believe a substantial consensus emerged. First, the next decade is one of great potential for discovery in neutrino physics, but it is also one of great peril. The possibility that neutrino oscillations explain the solar neutrino and atmospheric neutrino experiments, and the indirect evidence that Hot Dark Matter (HDM) in the form of light neutrinos might make up 30% of the mass of the universe, point to areas where accelerator-based experiments could play a crucial role in piecing together the puzzle. At the same time, the field faces a very uncertain future. The LSND experiment at LAMPF is the only funded neutrino oscillation experiment in the United States and it is threatened by the abrupt shutdown of LAMPF proposed for fiscal 1994. The future of neutrino physics at the Brookhaven National Laboratory AGS depends on the continuation of High Energy Physics (HEP) funding after the RHIC startup. Most proposed neutrino oscillation searches at Fermilab depend on the completion of the Main Injector project and on the construction of a new neutrino beamline, which is uncertain at this point. The proposed KAON facility at TRIUMF would provide a neutrino beam similar to that at the AGS but with a much increased intensity. The future of KAON is also uncertain. Despite the difficult obstacles present, there is a real possibility that we are on the verge of understanding the masses and mixings of the neutrinos. The physics importance of such a discovery cannot be overstated. The current experimental status and future possibilities are discussed.

  2. Learning Style Similarity for Searching Infographics

    Saleh, Babak; Dontcheva, Mira; Hertzmann, Aaron; Liu, Zhicheng


    Infographics are complex graphic designs integrating text, images, charts and sketches. Despite the increasing popularity of infographics and the rapid growth of online design portfolios, little research investigates how we can take advantage of these design resources. In this paper we present a method for measuring the style similarity between infographics. Based on human perception data collected from crowdsourced experiments, we use computer vision and machine learning algorithms to learn ...

  3. Distributed Efficient Similarity Search Mechanism in Wireless Sensor Networks

    Khandakar Ahmed


    The wireless sensor network similarity search problem has received considerable research attention due to sensor hardware imprecision and environmental parameter variations. Most state-of-the-art distributed data-centric storage (DCS) schemes lack optimization for similarity queries of events. In this paper, a DCS scheme with metric-based similarity searching (DCSMSS) is proposed. DCSMSS takes its motivation from a vector distance index, called iDistance, to transform the issue of similarity searching into the problem of an interval search in one dimension. In addition, a sector-based distance routing algorithm is used to route messages efficiently. Extensive simulation results reveal that DCSMSS is highly efficient and significantly outperforms previous approaches in processing similarity search queries.
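    The iDistance transform underlying DCSMSS can be sketched as follows: each object is keyed by its distance to the nearest reference point, offset so that different reference points occupy disjoint ranges of a single axis; a similarity query then becomes a few one-dimensional interval searches. The constant `c` and the helper below are illustrative, not DCSMSS's actual parameters:

```python
# Sketch of the iDistance transform: key each object by the distance to
# its nearest reference point, offset by partition so that partitions
# occupy disjoint ranges of one 1-D axis. The constant c just needs to
# exceed any within-partition distance.
def idistance_key(point, ref_points, dist, c=1000.0):
    """Map a point to (partition_index, 1-D key)."""
    i, d = min(enumerate(dist(point, r) for r in ref_points),
               key=lambda t: t[1])
    return i, i * c + d
```

    A range query around a point then translates into an interval `[i*c + max(0, d - r), i*c + d + r]` per affected partition, which a B+-tree (or, in DCSMSS, the routing layer) can answer efficiently.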

  4. Accelerating Neuroimage Registration through Parallel Computation of Similarity Metric.

    Yun-Gang Luo

    Neuroimage registration is crucial for brain morphometric analysis and treatment efficacy evaluation. However, existing advanced registration algorithms such as FLIRT and ANTs are not efficient enough for clinical use. In this paper, a GPU implementation of FLIRT with the correlation ratio (CR) as the similarity metric, and a GPU-accelerated correlation coefficient (CC) calculation for the symmetric diffeomorphic registration of ANTs, have been developed. Comparison with the corresponding original tools shows that our accelerated algorithms greatly outperform the originals in terms of computational efficiency. This paper demonstrates the great potential of applying these registration tools in clinical applications.

  5. Visual similarity is stronger than semantic similarity in guiding visual search for numbers.

    Godwin, Hayward J; Hout, Michael C; Menneer, Tamaryn


    Using a visual search task, we explored how behavior is influenced by both visual and semantic information. We recorded participants' eye movements as they searched for a single target number in a search array of single-digit numbers (0-9). We examined the probability of fixating the various distractors as a function of two key dimensions: the visual similarity between the target and each distractor, and the semantic similarity (i.e., the numerical distance) between the target and each distractor. Visual similarity estimates were obtained using multidimensional scaling based on independent observers' similarity ratings. A linear mixed-effects model demonstrated that both visual and semantic similarity influenced the probability that distractors would be fixated. However, the visual similarity effect was substantially larger than the semantic similarity effect. We close by discussing the potential value of using this novel methodological approach and the implications for both simple and complex visual search displays.

  6. How Google Web Search copes with very similar documents

    Mettrop, W.; Nieuwenhuysen, P.; Smulders, H.


    A significant portion of the computer files that carry documents, multimedia, programs etc. on the Web are identical or very similar to other files on the Web. How do search engines cope with this? Do they perform some kind of “deduplication”? How should users take into account that web search resul

  7. RAPSearch: a fast protein similarity search tool for short reads

    Ye, Yuzhen; Choi, Jeong-Hyeon; Tang, Haixu


    ... of the very sizes of the short read datasets. We developed a fast protein similarity search tool RAPSearch that utilizes a reduced amino acid alphabet and suffix array to detect seeds of flexible length. For short reads...

  8. Accelerated expansion in a stochastic self-similar fractal universe

    Santini, Eduardo Sergio [Centro Brasileiro de Pesquisas Fisicas-MCT, Coordenacao de Cosmologia, Relatividade e Astrofisica: ICRA-BR, Rua Dr. Xavier Sigaud 150, Urca 22290-180, Rio de Janeiro, RJ (Brazil) and Comissao Nacional de Energia Nuclear-MCT, Rua General Severiano 90, Botafogo 22290-901, Rio de Janeiro, RJ (Brazil)]; Lemarchand, Guillermo Andres [Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, C.C. 8-Sucursal 25, C1425FFJ Buenos Aires (Argentina)]


    In a recent paper, a cosmological model based on El Naschie's E-infinity Cantorian space-time was presented [Iovane G. Varying G, accelerating universe, and other relevant consequences of a stochastic self-similar and fractal universe. Chaos, Solitons and Fractals 2004;20:657-67]. In that work, it was claimed that the present accelerated expansion of the universe can be obtained as the effect of a scaling law on Newtonian cosmology with a certain time-dependent gravitational constant (G). In the present work we show that such a cosmological model actually describes a decelerated universe. Then, starting from the scenario presented in that paper, we develop a complementary approach based on an extended Friedmann model. In fact, we apply the same scaling law and a time-dependent gravitational constant, which follows from the observational constraints, to relativistic cosmology, i.e. an (extended) Friedmann model. We are able to show that for a matter-dominated flat universe, with the scaling law and a varying G, an accelerated expansion emerges in such a way that the luminosity distance as a function of redshift can be made close to the corresponding function from the usual Friedmann model supplemented with a cosmological constant of value Ω_Λ ≈ 0.7. The measurements of high-redshift supernovae could then be interpreted as a consequence of the fractal self-similarity of the G-varying relativistic universe.

  9. Efficient Subgraph Similarity Search on Large Probabilistic Graph Databases

    Yuan, Ye; Chen, Lei; Wang, Haixun


    Many studies have been conducted on seeking the efficient solution for subgraph similarity search over certain (deterministic) graphs due to its wide application in many fields, including bioinformatics, social network analysis, and Resource Description Framework (RDF) data management. All these works assume that the underlying data are certain. However, in reality, graphs are often noisy and uncertain due to various factors, such as errors in data extraction, inconsistencies in data integration, and privacy preserving purposes. Therefore, in this paper, we study subgraph similarity search on large probabilistic graph databases. Different from previous works assuming that edges in an uncertain graph are independent of each other, we study the uncertain graphs where edges' occurrences are correlated. We formally prove that subgraph similarity search over probabilistic graphs is #P-complete; thus, we employ a filter-and-verify framework to speed up the search. In the filtering phase, we develop tight lower and u...

  10. Using SQL Databases for Sequence Similarity Searching and Analysis.

    Pearson, William R; Mackey, Aaron J


    Relational databases can integrate diverse types of information and manage large sets of similarity search results, greatly simplifying genome-scale analyses. By focusing on taxonomic subsets of sequences, relational databases can reduce the size and redundancy of sequence libraries and improve the statistical significance of homologs. In addition, by loading similarity search results into a relational database, it becomes possible to explore and summarize the relationships between all of the proteins in an organism and those in other biological kingdoms. This unit describes how to use relational databases to improve the efficiency of sequence similarity searching and demonstrates various large-scale genomic analyses of homology-related data. It also describes the installation and use of a simple protein sequence database, seqdb_demo, which is used as a basis for the other protocols. The unit also introduces search_demo, a database that stores sequence similarity search results. The search_demo database is then used to explore the evolutionary relationships between E. coli proteins and proteins in other organisms in a large-scale comparative genomic analysis. © 2017 by John Wiley & Sons, Inc.

  11. Exact score distribution computation for ontological similarity searches

    Schulz Marcel H


    Full Text Available Abstract Background Semantic similarity searches in ontologies are an important component of many bioinformatic algorithms, e.g., finding functionally related proteins with the Gene Ontology or phenotypically similar diseases with the Human Phenotype Ontology (HPO). We have recently shown that the performance of semantic similarity searches can be improved by ranking results according to the probability of obtaining a given score at random rather than by the scores themselves. However, to date, there are no algorithms for computing the exact distribution of semantic similarity scores, which is necessary for computing the exact P-value of a given score. Results In this paper we consider the exact computation of score distributions for similarity searches in ontologies, and introduce a simple null hypothesis which can be used to compute a P-value for the statistical significance of similarity scores. We concentrate on measures based on Resnik's definition of ontological similarity. A new algorithm is proposed that collapses subgraphs of the ontology graph and thereby allows fast score distribution computation. The new algorithm is several orders of magnitude faster than the naive approach, as we demonstrate by computing score distributions for similarity searches in the HPO. It is shown that exact P-value calculation improves clinical diagnosis using the HPO compared to approaches based on sampling. Conclusions The new algorithm enables for the first time exact P-value calculation via exact score distribution computation for ontology similarity searches. The approach is applicable to any ontology for which the annotation-propagation rule holds and can improve any bioinformatic method that makes use only of the raw similarity scores. The algorithm was implemented in Java, supports any ontology in OBO format, and is available for non-commercial and academic usage under:
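
    Resnik's ontological similarity, the measure these score distributions are computed for, can be sketched on a toy ontology. The term names and annotation counts below are hypothetical placeholders; a real implementation would parse an OBO file and apply the annotation-propagation rule:

```python
import math

# Toy ontology: child -> parents, plus propagated annotation counts per term.
PARENTS = {"root": [], "a": ["root"], "b": ["root"], "a1": ["a"], "a2": ["a"]}
ANNOTATIONS = {"root": 10, "a": 6, "b": 4, "a1": 3, "a2": 3}
TOTAL = ANNOTATIONS["root"]

def ancestors(term):
    """All ancestors of a term, including the term itself."""
    seen, stack = set(), [term]
    while stack:
        t = stack.pop()
        if t not in seen:
            seen.add(t)
            stack.extend(PARENTS[t])
    return seen

def information_content(term):
    """IC(t) = -log p(t), with p(t) the fraction of annotations under t."""
    return -math.log(ANNOTATIONS[term] / TOTAL)

def resnik_similarity(t1, t2):
    """IC of the most informative common ancestor of the two terms."""
    common = ancestors(t1) & ancestors(t2)
    return max(information_content(t) for t in common)
```

    Terms that share only the root score 0, since the root annotates everything and so has zero information content.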

  12. Search Profiles Based on User to Cluster Similarity

    Saša Bošnjak


    Full Text Available Privacy of web users' query search logs has, since the AOL dataset release a few years ago, been treated as one of the central issues concerning privacy on the Internet. Therefore, the question of privacy preservation has also raised a lot of attention in different communities surrounding the search engines. Usage of clustering methods for providing low-level contextual search while retaining a high privacy-utility tradeoff is examined in this paper. By using only the user's cluster membership, the search query terms need no longer be retained, thus raising fewer privacy concerns for both the users and companies. The paper presents a lightweight framework for combining query words, user similarities and clustering in order to provide a meaningful way of mining user searches while protecting their privacy. This differs from previous privacy-preserving approaches, which attempt to anonymize the queries instead of the users.


  13. Search Profiles Based on User to Cluster Similarity

    Ilija Subasic


    Full Text Available Privacy of web users' query search logs has, since last year's AOL dataset release, been treated as one of the central issues concerning privacy on the Internet. Therefore, the question of privacy preservation has also raised a lot of attention in different communities surrounding the search engines. Usage of clustering methods for providing low-level contextual search while retaining high privacy/utility is examined in this paper. By using only the user's cluster membership, the search query terms need no longer be retained, thus raising fewer privacy concerns for both the users and companies. The paper presents a lightweight framework for combining query words, user similarities and clustering in order to provide a meaningful way of mining user searches while protecting their privacy. This differs from previous privacy-preserving approaches, which attempt to anonymize the queries instead of the users.

  14. RAPSearch: a fast protein similarity search tool for short reads

    Choi Jeong-Hyeon


    Full Text Available Abstract Background Next Generation Sequencing (NGS) is producing enormous corpora of short DNA reads, affecting emerging fields like metagenomics. Protein similarity search--a key step to achieve annotation of protein-coding genes in these short reads, and identification of their biological functions--faces daunting challenges because of the very sizes of the short read datasets. Results We developed a fast protein similarity search tool RAPSearch that utilizes a reduced amino acid alphabet and suffix array to detect seeds of flexible length. For short reads (translated in 6 frames) we tested, RAPSearch achieved ~20-90 times speedup as compared to BLASTX. RAPSearch missed only a small fraction (~1.3-3.2%) of BLASTX similarity hits, but it also discovered additional homologous proteins (~0.3-2.1%) that BLASTX missed. By contrast, BLAT, a tool that is even slightly faster than RAPSearch, had significant loss of sensitivity as compared to RAPSearch and BLAST. Conclusions RAPSearch is implemented as open-source software and is freely accessible. It enables faster protein similarity search. The application of RAPSearch in metagenomics has also been demonstrated.
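
    The reduced-alphabet seeding idea can be sketched as follows. The 8-group reduction below is a hypothetical grouping for illustration only, not RAPSearch's actual alphabet, and the brute-force k-mer scan stands in for the suffix-array seed detection:

```python
# Hypothetical 8-group amino acid reduction (for illustration only).
GROUPS = {
    "A": "A", "G": "A", "S": "A", "T": "A",
    "C": "C",
    "D": "D", "E": "D", "N": "D", "Q": "D",
    "F": "F", "W": "F", "Y": "F",
    "H": "H",
    "I": "I", "L": "I", "M": "I", "V": "I",
    "K": "K", "R": "K",
    "P": "P",
}

def reduce_seq(seq):
    """Map a protein sequence onto the reduced alphabet."""
    return "".join(GROUPS.get(c, "X") for c in seq)

def shared_seeds(query, subject, k=4):
    """k-mer seeds of the query that match the subject after reduction."""
    rq, rs = reduce_seq(query), reduce_seq(subject)
    subject_kmers = {rs[i:i + k] for i in range(len(rs) - k + 1)}
    return {query[i:i + k] for i in range(len(rq) - k + 1)
            if rq[i:i + k] in subject_kmers}
```

    Collapsing chemically similar residues makes seeds match across conservative substitutions, which is what buys sensitivity back at high speed.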

  15. Robust hashing with local models for approximate similarity search.

    Song, Jingkuan; Yang, Yi; Li, Xuelong; Huang, Zi; Yang, Yang


    Similarity search plays an important role in many applications involving high-dimensional data. Due to the known dimensionality curse, the performance of most existing indexing structures degrades quickly as the feature dimensionality increases. Hashing methods, such as locality sensitive hashing (LSH) and its variants, have been widely used to achieve fast approximate similarity search by trading search quality for efficiency. However, most existing hashing methods make use of randomized algorithms to generate hash codes without considering the specific structural information in the data. In this paper, we propose a novel hashing method, namely, robust hashing with local models (RHLM), which learns a set of robust hash functions to map the high-dimensional data points into binary hash codes by effectively utilizing local structural information. In RHLM, for each individual data point in the training dataset, a local hashing model is learned and used to predict the hash codes of its neighboring data points. The local models from all the data points are globally aligned so that an optimal hash code can be assigned to each data point. After obtaining the hash codes of all the training data points, we design a robust method by employing l2,1 -norm minimization on the loss function to learn effective hash functions, which are then used to map each database point into its hash code. Given a query data point, the search process first maps it into the query hash code by the hash functions and then explores the buckets, which have similar hash codes to the query hash code. Extensive experimental results conducted on real-life datasets show that the proposed RHLM outperforms the state-of-the-art methods in terms of search quality and efficiency.
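
    As a point of contrast, the randomized baseline that RHLM improves upon can be sketched as classic random-hyperplane LSH. This is a minimal sketch of generic LSH, not the RHLM method itself, which learns its hash functions from local structure:

```python
import random

def make_hyperplane_hasher(dim, n_bits, seed=0):
    """Random-hyperplane LSH: one sign bit per random Gaussian direction.

    Points with small angular distance tend to receive the same code,
    so candidate neighbours can be found by bucket lookup.
    """
    rng = random.Random(seed)
    planes = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n_bits)]

    def hash_point(x):
        bits = 0
        for plane in planes:
            dot = sum(p * xi for p, xi in zip(plane, x))
            bits = (bits << 1) | (1 if dot >= 0 else 0)
        return bits

    return hash_point
```

    A query is hashed with the same functions and only the points sharing (or nearly sharing) its bucket are scored exactly, trading a little recall for large speedups.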

  16. Online multiple kernel similarity learning for visual search.

    Xia, Hao; Hoi, Steven C H; Jin, Rong; Zhao, Peilin


    Recent years have witnessed a number of studies on distance metric learning to improve visual similarity search in content-based image retrieval (CBIR). Despite their successes, most existing methods on distance metric learning are limited in two aspects. First, they usually assume the target proximity function follows the family of Mahalanobis distances, which limits their capacity of measuring similarity of complex patterns in real applications. Second, they often cannot effectively handle the similarity measure of multimodal data that may originate from multiple resources. To overcome these limitations, this paper investigates an online kernel similarity learning framework for learning kernel-based proximity functions which goes beyond the conventional linear distance metric learning approaches. Based on the framework, we propose a novel online multiple kernel similarity (OMKS) learning method which learns a flexible nonlinear proximity function with multiple kernels to improve visual similarity search in CBIR. We evaluate the proposed technique for CBIR on a variety of image data sets in which encouraging results show that OMKS outperforms the state-of-the-art techniques significantly.

  17. Dark Matter Searches at Accelerator Facilities

    Dutta, Bhaskar


    About 80 percent of the matter content of the universe is dark matter. However, the particle origin of dark matter is yet to be established. Many extensions of the Standard Model (SM) contain candidates of dark matter. The search for the particle origin is currently ongoing at the large hadron collider (LHC). In this review, I will summarize the different search strategies for this elusive particle.

  18. Similarity Search and Locality Sensitive Hashing using TCAMs

    Shinde, Rajendra; Gupta, Pankaj; Dutta, Debojyoti


    Similarity search methods are widely used as kernels in various machine learning applications. Nearest neighbor search (NNS) algorithms are often used to retrieve similar entries, given a query. While there exist efficient techniques for exact query lookup using hashing, similarity search using exact nearest neighbors is known to be a hard problem and in high dimensions, best known solutions offer little improvement over a linear scan. Fast solutions to the approximate NNS problem include Locality Sensitive Hashing (LSH) based techniques, which need storage polynomial in n with exponent greater than 1, and query time sublinear, but still polynomial in n, where n is the size of the database. In this work we present a new technique of solving the approximate NNS problem in Euclidean space using a Ternary Content Addressable Memory (TCAM), which needs near linear space and has O(1) query time. In fact, this method also works around the best known lower bounds in the cell probe model for the query time us...

  19. Semantic similarity measure in biomedical domain leverage web search engine.

    Chen, Chi-Huang; Hsieh, Sheau-Ling; Weng, Yung-Ching; Chang, Wen-Yung; Lai, Feipei


    Semantic similarity measures play an essential role in Information Retrieval and Natural Language Processing. In this paper we propose a page-count-based semantic similarity measure and apply it in biomedical domains. Previous research in semantic web related applications has deployed various semantic similarity measures. Despite the usefulness of the measurements in those applications, measuring semantic similarity between two terms remains a challenging task. The proposed method exploits page counts returned by the Web Search Engine. We define various similarity scores for two given terms P and Q, using the page counts for querying P, Q and P AND Q. Moreover, we propose a novel approach to compute semantic similarity using lexico-syntactic patterns with page counts. These different similarity scores are integrated using support vector machines, to leverage the robustness of semantic similarity measures. Experimental results on two datasets achieve correlation coefficients of 0.798 on the dataset provided by A. Hliaoutakis, 0.705 on the dataset provided by T. Pedersen with physician scores and 0.496 on the dataset provided by T. Pedersen et al. with expert scores.
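
    One common page-count association measure of this kind is WebJaccard. A minimal sketch follows; the hit counts in the example are hypothetical stand-ins for the totals a live search engine would return for P, Q, and "P AND Q":

```python
def web_jaccard(count_p, count_q, count_pq, c=5):
    """Page-count-based WebJaccard association score.

    count_p, count_q: hit counts for the individual terms.
    count_pq: hit count for the conjunctive query "P AND Q".
    c: small cutoff that suppresses noise from rare co-occurrences.
    """
    if count_pq < c:
        return 0.0
    return count_pq / (count_p + count_q - count_pq)

# Hypothetical hit counts for two biomedical terms:
score = web_jaccard(count_p=120_000, count_q=80_000, count_pq=30_000)
```

    In the paper's setting, several such scores (overlap, dice, PMI variants) become features that an SVM combines into a single similarity estimate.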

  20. Computing Semantic Similarity Measure Between Words Using Web Search Engine

    Pushpa C N


    Full Text Available Semantic similarity measures between words play an important role in information retrieval, natural language processing and in various tasks on the web. In this paper, we have proposed a Modified Pattern Extraction Algorithm to compute the supervised semantic similarity measure between words by combining both the page count method and the web snippets method. Four association measures are used to find semantic similarity between words in the page count method using web search engines. We use a Sequential Minimal Optimization (SMO) support vector machine (SVM) to find the optimal combination of page-counts-based similarity scores and top-ranking patterns from the web snippets method. The SVM is trained to classify synonymous word-pairs and non-synonymous word-pairs. The proposed Modified Pattern Extraction Algorithm achieves a correlation value of 89.8 percent.

  1. SS-Wrapper: a package of wrapper applications for similarity searches on Linux clusters

    Lefkowitz Elliot J


    Full Text Available Abstract Background Large-scale sequence comparison is a powerful tool for biological inference in modern molecular biology. Comparing new sequences to those in annotated databases is a useful source of functional and structural information about these sequences. Using software such as the basic local alignment search tool (BLAST) or HMMPFAM to identify statistically significant matches between newly sequenced segments of genetic material and those in databases is an important task for most molecular biologists. Searching algorithms are intrinsically slow and data-intensive, especially in light of the rapid growth of biological sequence databases due to the emergence of high throughput DNA sequencing techniques. Thus, traditional bioinformatics tools are impractical on PCs and even on dedicated UNIX servers. To take advantage of larger databases and more reliable methods, high performance computation becomes necessary. Results We describe the implementation of SS-Wrapper (Similarity Search Wrapper), a package of wrapper applications that can parallelize similarity search applications on a Linux cluster. Our wrapper utilizes a query segmentation-search (QS-search) approach to parallelize sequence database search applications. It takes into consideration load balancing between each node on the cluster to maximize resource usage. QS-search is designed to wrap many different search tools, such as BLAST and HMMPFAM, using the same interface. This implementation does not alter the original program, so newly obtained programs and program updates should be accommodated easily. Benchmark experiments using QS-search to optimize BLAST and HMMPFAM showed that QS-search accelerated the performance of these programs almost linearly in proportion to the number of CPUs used.
    We have also implemented a wrapper that utilizes a database segmentation approach (DS-BLAST) that provides a complementary solution for BLAST searches when the database is too large to fit into …
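
    The query-segmentation (QS-search) idea reduces to partitioning the query set across cluster nodes, each of which runs the unmodified search tool on its own slice, so results can simply be concatenated afterwards. A minimal round-robin sketch (a real scheduler would also weight slices by sequence length for better load balance):

```python
def segment_queries(queries, n_workers):
    """Round-robin split of query sequences across cluster nodes.

    Each returned chunk is handed to one node, which runs the wrapped
    search tool (e.g. BLAST) on it independently.
    """
    chunks = [[] for _ in range(n_workers)]
    for i, q in enumerate(queries):
        chunks[i % n_workers].append(q)
    return chunks
```

    Because each query's search is independent, this parallelization changes no results, which is why QS-search can wrap tools without modifying them.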

  2. CLIP: similarity searching of 3D databases using clique detection.

    Rhodes, Nicholas; Willett, Peter; Calvet, Alain; Dunbar, James B; Humblet, Christine


    This paper describes a program for 3D similarity searching, called CLIP (for Candidate Ligand Identification Program), that uses the Bron-Kerbosch clique detection algorithm to find those structures in a file that have large structures in common with a target structure. Structures are characterized by the geometric arrangement of pharmacophore points, and the similarity between two structures is calculated using modifications of the Simpson and Tanimoto association coefficients. This modification takes into account the fact that a distance tolerance is required to ensure that pairs of interatomic distances can be regarded as equivalent during the clique-construction stage of the matching algorithm. Experiments with HIV assay data demonstrate the effectiveness and the efficiency of this approach to virtual screening.
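
    The Bron-Kerbosch recursion at the core of CLIP can be sketched compactly. This is the basic non-pivoting variant; CLIP applies it to a correspondence graph built from pharmacophore-point distance matches, where a maximal clique corresponds to a maximal common substructure:

```python
def bron_kerbosch(graph, r=None, p=None, x=None, cliques=None):
    """Enumerate all maximal cliques of an undirected graph.

    graph: dict mapping each vertex to the set of its neighbours.
    r: current clique; p: candidate extensions; x: already-processed
    vertices (prevents reporting the same clique twice).
    """
    if r is None:
        r, p, x, cliques = set(), set(graph), set(), []
    if not p and not x:
        cliques.append(r)  # r cannot be extended: it is maximal
        return cliques
    for v in list(p):
        bron_kerbosch(graph, r | {v}, p & graph[v], x & graph[v], cliques)
        p.remove(v)
        x.add(v)
    return cliques
```

    Pivoting and degeneracy ordering are the usual refinements when the correspondence graphs grow large.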

  3. SHOP: scaffold hopping by GRID-based similarity searches

    Bergmann, Rikke; Linusson, Anna; Zamora, Ismael


    A new GRID-based method for scaffold hopping (SHOP) is presented. In a fully automatic manner, scaffolds were identified in a database based on three types of 3D-descriptors. SHOP's ability to recover scaffolds was assessed and validated by searching a database spiked with fragments of known ... ligands of three different protein targets relevant for drug discovery using a rational approach based on statistical experimental design. Five out of eight and seven out of eight thrombin scaffolds and all seven HIV protease scaffolds were recovered within the top 10 and 31 out of 31 neuraminidase ... scaffolds were in the 31 top-ranked scaffolds. SHOP also identified new scaffolds with substantially different chemotypes from the queries. Docking analysis indicated that the new scaffolds would have similar binding modes to those of the respective query scaffolds observed in X-ray structures ...

  4. New limits on Magnetic Monopoles searches from accelerator and non-accelerator experiments

    Cozzi, M


    Here the status of the searches for "classical Dirac" Magnetic Monopoles (MMs) at accelerators and for GUT MMs in the cosmic radiation is discussed. We present recent analyses for "classical Dirac" monopoles at accelerators and the lowest flux upper limit for Magnetic Monopoles in the mass range 10^5-10^12 GeV, obtained with the SLIM experiment at the Chacaltaya High Altitude Laboratory (5290 m a.s.l.).

  5. A genetic similarity algorithm for searching the Gene Ontology terms and annotating anonymous protein sequences.

    Othman, Razib M; Deris, Safaai; Illias, Rosli M


    A genetic similarity algorithm is introduced in this study to find a group of semantically similar Gene Ontology terms. The genetic similarity algorithm combines a semantic similarity measure algorithm with a parallel genetic algorithm. The semantic similarity measure algorithm is used to compute the similitude strength between the Gene Ontology terms. Then, the parallel genetic algorithm is employed to perform batch retrieval and to accelerate the search in the large search space of the Gene Ontology graph. The genetic similarity algorithm is implemented in the Gene Ontology browser named basic UTMGO to overcome the weaknesses of the existing Gene Ontology browsers, which use a conventional approach based on keyword matching. To show the applicability of the basic UTMGO, we extend its structure to develop a Gene Ontology-based protein sequence annotation tool named extended UTMGO. The objective of developing the extended UTMGO is to provide a simple and practical tool that is capable of producing better results and requires a reasonable amount of running time with low computing cost specifically for offline usage. The computational results and comparison with other related tools are presented to show the effectiveness of the proposed algorithm and tools.

  6. Efficient searching and annotation of metabolic networks using chemical similarity

    Pertusi, Dante A.; Stine, Andrew E.; Broadbelt, Linda J.; Keith E J Tyo


    Motivation: The urgent need for efficient and sustainable biological production of fuels and high-value chemicals has elicited a wave of in silico techniques for identifying promising novel pathways to these compounds in large putative metabolic networks. To date, these approaches have primarily used general graph search algorithms, which are prohibitively slow as putative metabolic networks may exceed 1 million compounds. To alleviate this limitation, we report two methods—SimIndex (SI) and ...

  7. Acceleration of saddle-point searches with machine learning.

    Peterson, Andrew A


    In atomistic simulations, the location of the saddle point on the potential-energy surface (PES) gives important information on transitions between local minima, for example, via transition-state theory. However, the search for saddle points often involves hundreds or thousands of ab initio force calls, which are typically all done at full accuracy. This results in the vast majority of the computational effort being spent calculating the electronic structure of states not important to the researcher, and very little time performing the calculation of the saddle point state itself. In this work, we describe how machine learning (ML) can reduce the number of intermediate ab initio calculations needed to locate saddle points. Since machine-learning models can learn from, and thus mimic, atomistic simulations, the saddle-point search can be conducted rapidly in the machine-learning representation. The saddle-point prediction can then be verified by an ab initio calculation; if it is incorrect, this strategically has identified regions of the PES where the machine-learning representation has insufficient training data. When these training data are used to improve the machine-learning model, the estimates greatly improve. This approach can be systematized, and in two simple example problems we demonstrate a dramatic reduction in the number of ab initio force calls. We expect that this approach and future refinements will greatly accelerate searches for saddle points, as well as other searches on the potential energy surface, as machine-learning methods see greater adoption by the atomistics community.

  9. Similarity for ultra-relativistic laser plasmas and the optimal acceleration regime

    Pukhov, A


    A similarity theory is developed for ultra-relativistic laser-plasmas. It is shown that the most fundamental S-similarity is valid for both under- and overdense plasmas. Optimal scalings for laser wake field electron acceleration are obtained heuristically. The strong message of the present work is that the bubble acceleration regime [see Pukhov, Meyer-ter-Vehn, Appl. Phys. B, 74, 355 (2002)] satisfies these optimal scalings.

  10. Density-based similarity measures for content based search

    Hush, Don R [Los Alamos National Laboratory; Porter, Reid B [Los Alamos National Laboratory; Ruggiero, Christy E [Los Alamos National Laboratory


    We consider the query by multiple example problem where the goal is to identify database samples whose content is similar to a collection of query samples. To assess the similarity we use a relative content density which quantifies the relative concentration of the query distribution to the database distribution. If the database distribution is a mixture of the query distribution and a background distribution then it can be shown that database samples whose relative content density is greater than a particular threshold ρ are more likely to have been generated by the query distribution than the background distribution. We describe an algorithm for predicting samples with relative content density greater than ρ that is computationally efficient and possesses strong performance guarantees. We also show empirical results for applications in computer network monitoring and image segmentation.

  11. SymDex: increasing the efficiency of chemical fingerprint similarity searches for comparing large chemical libraries by using query set indexing.

    Tai, David; Fang, Jianwen


    The large sizes of today's chemical databases require efficient algorithms to perform similarity searches. It can be very time consuming to compare two large chemical databases. This paper seeks to build upon existing research efforts by describing a novel strategy for accelerating existing search algorithms for comparing large chemical collections. The quest for efficiency has focused on developing better indexing algorithms by creating heuristics for searching an individual chemical against a chemical library, detecting and eliminating needless similarity calculations. For comparing two chemical collections, these algorithms simply execute searches for each chemical in the query set sequentially. The strategy presented in this paper achieves a speedup over these algorithms by indexing the set of all query chemicals, so that redundant calculations arising from sequential searches are eliminated. We implement this novel algorithm in a similarity search program called Symmetric inDexing, or SymDex. SymDex shows over a 232% maximum speedup compared to the state-of-the-art single query search algorithm over real data for various fingerprint lengths. Considerable speedup is seen even for batch searches where query set sizes are relatively small compared to typical database sizes. To the best of our knowledge, SymDex is the first search algorithm designed specifically for comparing chemical libraries. It can be adapted to most, if not all, existing indexing algorithms and shows potential for accelerating future similarity search algorithms for comparing chemical databases.
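
    The Tanimoto coefficient these fingerprint searches compute, together with the bit-count bound that pruning heuristics of this kind exploit, can be sketched as follows (fingerprints modelled as sets of on-bit positions; the bound shown is the standard one, not SymDex's specific index):

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient between two fingerprint bit sets."""
    inter = len(fp_a & fp_b)
    union = len(fp_a) + len(fp_b) - inter
    return inter / union if union else 1.0

def tanimoto_upper_bound(len_a, len_b):
    """Bit-count bound: Tanimoto(a, b) <= min(|a|, |b|) / max(|a|, |b|).

    If this bound falls below the similarity threshold, the exact
    comparison can be skipped entirely.
    """
    if max(len_a, len_b) == 0:
        return 1.0
    return min(len_a, len_b) / max(len_a, len_b)
```

    Indexing schemes batch database fingerprints by popcount so that whole buckets can be rejected with this bound before any exact intersections are computed.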

  12. Content-Based Search on a Database of Geometric Models: Identifying Objects of Similar Shape



    The Geometric Search Engine is a software system for storing and searching a database of geometric models. The database may be searched for modeled objects similar in shape to a target model supplied by the user. The database models are generally derived from CAD models, while the target model may be either a CAD model or a model generated from range data collected from a physical object. This document describes key generation, database layout, and search of the database.

  13. Pulsar Acceleration Searches on the GPU for the Square Kilometre Array

    Dimoudi, Sofia


    Pulsar acceleration searches are methods for recovering signals from radio telescopes that may otherwise be lost due to the effect of orbital acceleration in binary systems. The vast amount of data that will be produced by next generation instruments such as the Square Kilometre Array (SKA) necessitates real-time acceleration searches, which in turn requires the use of HPC platforms. We present our implementation of the Fourier Domain Acceleration Search (FDAS) algorithm on Graphics Processor Units (GPUs) in the context of the SKA, as part of the Astro-Accelerate real-time data processing library, currently under development at the Oxford e-Research Centre (OeRC), University of Oxford.

  14. Perceptual Grouping in Haptic Search: The Influence of Proximity, Similarity, and Good Continuation

    Overvliet, Krista E.; Krampe, Ralf Th.; Wagemans, Johan


    We conducted a haptic search experiment to investigate the influence of the Gestalt principles of proximity, similarity, and good continuation. We expected faster search when the distractors could be grouped. We chose edges at different orientations as stimuli because they are processed similarly in the haptic and visual modalities. We therefore…

  15. A MiniBooNE Accelerator-Produced (sub)-GeV Dark Matter Search

    Thornton, Remington; MiniBooNE-DM Collaboration


    Cosmological observations indicate that our universe contains dark matter (DM), yet we have no measurements of its microscopic properties. Whereas the gravitational interaction of DM is well understood, its interaction with the Standard Model is not. Direct detection experiments search for a nuclear recoil interaction produced by a DM relic particle and have a low-mass sensitivity edge of order 1 GeV. To detect DM with mass below 1 GeV, either the sensitivity of the experiments needs to be improved or the use of accelerators producing boosted low-mass DM is needed. Using neutrino detectors to search for low-mass DM is logical due to the similarity of the DM and neutrino signatures in the detector. The MiniBooNE experiment, located at Fermilab on the Booster Neutrino Beamline, ran for 10 years in neutrino and antineutrino modes and is already well understood, making it desirable to search for accelerator-produced boosted low-mass DM. A search for DM produced by 8 GeV protons hitting a steel beam-dump has finished, collecting 1.86 × 10^20 POT. Final analysis containing 90% confidence limits and a model-independent fit will be presented.

  16. Complementarity of Indirect and Accelerator Dark Matter Searches

    Bertone, G; Fornasa, M; Pieri, L; de Austri, R Ruiz; Trotta, R


    Even if Supersymmetric particles are found at the Large Hadron Collider (LHC), it will be difficult to prove that they constitute the bulk of the Dark Matter (DM) in the Universe using LHC data alone. We study the complementarity of LHC and DM indirect searches, working out explicitly the reconstruction of the DM properties for a specific benchmark model in the coannihilation region of a 24-parameter supersymmetric model. Combining mock high-luminosity LHC data with present-day null searches for gamma-rays from dwarf galaxies with the Fermi LAT, we show that current Fermi LAT limits already have the capability of ruling out a spurious Wino-like solution that would survive using LHC data only, thus leading to the correct identification of the cosmological solution. We also demonstrate that upcoming Planck constraints on the reionization history will have a similar constraining power, and discuss the impact of a possible detection of gamma-rays from DM annihilation in Draco with a CTA-like experiment. Our resu...

  17. δ-Similar Elimination to Enhance Search Performance of Multiobjective Evolutionary Algorithms

    Aguirre, Hernán; Sato, Masahiko; Tanaka, Kiyoshi

    In this paper, we propose δ-similar elimination to improve the search performance of multiobjective evolutionary algorithms in combinatorial optimization problems. This method eliminates similar individuals in objective space to fairly distribute selection among the different regions of the instantaneous Pareto front. We investigate four elimination methods, analyzing their effects using NSGA-II. In addition, we compare the search performance of NSGA-II enhanced by our method and NSGA-II enhanced by controlled elitism.

  18. RAPSearch2: a fast and memory-efficient protein similarity search tool for next-generation sequencing data.

    Zhao, Yongan; Tang, Haixu; Ye, Yuzhen


    With the wide application of next-generation sequencing (NGS) techniques, fast tools for protein similarity search that scale well to large query datasets and large databases are highly desirable. In a previous work, we developed RAPSearch, an algorithm that achieved a ~20-90-fold speedup relative to BLAST while still achieving similar levels of sensitivity for short protein fragments derived from NGS data. RAPSearch, however, requires a substantial memory footprint to identify alignment seeds, due to its use of a suffix array data structure. Here we present RAPSearch2, a new memory-efficient implementation of the RAPSearch algorithm that uses a collision-free hash table to index a similarity search database. The utilization of an optimized data structure further speeds up the similarity search by another 2-3 times. We also implemented multi-threading in RAPSearch2, and the multi-thread modes achieve significant acceleration (e.g. 3.5X for 4-thread mode). RAPSearch2 requires up to 2G memory when running in single-thread mode, or up to 3.5G memory when running in 4-thread mode. Implemented in C++, the source code is freely available for download at the RAPSearch2 website.
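The seed-based candidate identification that RAPSearch2 accelerates with its collision-free hash table can be sketched with an ordinary dictionary. This is an illustrative simplification, not the tool's actual data structure; the function names and the k=3 seed length are our own assumptions.

```python
from collections import defaultdict

def build_seed_index(database, k=3):
    """Map every length-k subsequence (seed) to the (sequence id, offset)
    pairs in the database where it occurs."""
    index = defaultdict(list)
    for sid, seq in enumerate(database):
        for pos in range(len(seq) - k + 1):
            index[seq[pos:pos + k]].append((sid, pos))
    return index

def find_seed_hits(query, index, k=3):
    """Return candidate (sequence id, database offset, query offset) seeds;
    in a real search these would be extended into local alignments."""
    hits = []
    for qpos in range(len(query) - k + 1):
        for sid, dpos in index.get(query[qpos:qpos + k], ()):
            hits.append((sid, dpos, qpos))
    return hits
```

The memory question the paper addresses is exactly the cost of this index: a suffix array or a plain hash-of-lists is large, and a collision-free hash over a fixed seed alphabet shrinks it.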

  19. Accelerated search for biomolecular network models to interpret high-throughput experimental data

    Sokhansanj Bahrad A


    Full Text Available Abstract Background The functions of human cells are carried out by biomolecular networks, which include proteins, genes, and regulatory sites within DNA that encode and control protein expression. Models of biomolecular network structure and dynamics can be inferred from high-throughput measurements of gene and protein expression. We build on our previously developed fuzzy logic method for bridging quantitative and qualitative biological data to address the challenges of noisy, low-resolution high-throughput measurements, i.e., from gene expression microarrays. We employ an evolutionary search algorithm to accelerate the search for hypothetical fuzzy biomolecular network models consistent with a biological data set. We also develop a method to estimate the probability of a potential network model fitting a set of data by chance. The resulting metric provides an estimate of both model quality and dataset quality, identifying data that are too noisy to identify meaningful correlations between the measured variables. Results Optimal parameters for the evolutionary search were identified based on artificial data, and the algorithm showed scalable and consistent performance for as many as 150 variables. The method was tested on previously published human cell cycle gene expression microarray data sets. The evolutionary search method was found to converge to the results of exhaustive search. The randomized evolutionary search was able to converge on a set of similar best-fitting network models on different training data sets after 30 generations running 30 models per generation. Consistent results were found regardless of which of the published data sets were used to train or verify the quantitative predictions of the best-fitting models for cell cycle gene dynamics. 
Conclusion Our results demonstrate the capability of scalable evolutionary search for fuzzy network models to address the problem of inferring models based on complex, noisy biomolecular

  20. Accelerating chemical database searching using graphics processing units.

    Liu, Pu; Agrafiotis, Dimitris K; Rassokhin, Dmitrii N; Yang, Eric


    The utility of chemoinformatics systems depends on the accurate computer representation and efficient manipulation of chemical compounds. In such systems, a small molecule is often digitized as a large fingerprint vector, where each element indicates the presence/absence or the number of occurrences of a particular structural feature. Since in theory the number of unique features can be exceedingly large, these fingerprint vectors are usually folded into much shorter ones using hashing and modulo operations, allowing fast "in-memory" manipulation and comparison of molecules. There is increasing evidence that lossless fingerprints can substantially improve retrieval performance in chemical database searching (substructure or similarity), which has led to the development of several lossless fingerprint compression algorithms. However, any gains in storage and retrieval afforded by compression need to be weighed against the extra computational burden required for decompression before these fingerprints can be compared. Here we demonstrate that graphics processing units (GPU) can greatly alleviate this problem, enabling the practical application of lossless fingerprints on large databases. More specifically, we show that, with the help of an ordinary ~$500 video card, the entire PubChem database of ~32 million compounds can be searched in ~0.2-2 s on average, which is 2 orders of magnitude faster than a conventional CPU. If multiple query patterns are processed in batch, the speedup is even more dramatic (less than 0.02-0.2 s/query for 1000 queries). In the present study, we use the Elias gamma compression algorithm, which results in a compression ratio as high as 0.097.
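The Elias gamma code used here is simple to sketch: each positive integer (for a fingerprint, the gap between consecutive on-bit positions) is written as N-1 zeros followed by its N-bit binary representation, so small gaps take few bits. A minimal pure-Python illustration; the function names are ours, and a production implementation would work on packed bits rather than strings.

```python
def elias_gamma_encode(n):
    """Elias gamma code of a positive integer n: (len-1) zeros, then binary n."""
    b = bin(n)[2:]
    return "0" * (len(b) - 1) + b

def encode_fingerprint(on_bits):
    """Compress a sorted list of on-bit positions as gamma-coded gaps
    (gaps are >= 1 because positions are strictly increasing)."""
    out, prev = [], -1
    for pos in on_bits:
        out.append(elias_gamma_encode(pos - prev))
        prev = pos
    return "".join(out)

def decode_fingerprint(bits):
    """Recover the on-bit positions from a gamma-coded gap stream."""
    positions, prev, i = [], -1, 0
    while i < len(bits):
        zeros = 0
        while bits[i] == "0":  # count the unary length prefix
            zeros += 1
            i += 1
        n = int(bits[i:i + zeros + 1], 2)  # read zeros+1 binary digits
        i += zeros + 1
        prev += n
        positions.append(prev)
    return positions
```

The cited compression ratio of 0.097 reflects exactly this property: sparse fingerprints have many small gaps, which gamma coding stores in very few bits.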

  1. Efficient Similarity Search Using the Earth Mover's Distance for Large Multimedia Databases

    Assent, Ira; Wichterich, Marc; Meisen, Tobias


    Multimedia similarity search in large databases requires efficient query processing. The Earth mover's distance, introduced in computer vision, is successfully used as a similarity model in a number of small-scale applications. Its computational complexity hindered its adoption in large multimedia...


    S. K. Jayanthi


    Full Text Available In the current scenario, web page result personalization plays a vital role. Nearly 80% of users expect the best results on the first page itself, without the persistence to browse further. This research work focuses on two main themes: semantic web search performed online, and domain-based search performed offline. The first part seeks an effective method that groups similar results together using the BookShelf data structure and organizes the resulting clusters. The second focuses on academic domain-based search performed offline. This paper considers how to find similar documents and how the vector space model can be used to do so, so more weight is given to the principles and working methodology of similarity propagation. The cosine similarity measure is used to determine relevance among documents.
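The cosine measure in the vector space model is easy to state concretely: each document becomes a term-frequency vector, and similarity is the cosine of the angle between the two vectors. A minimal sketch (tokenization is assumed to have been done already; the function name is ours):

```python
import math
from collections import Counter

def cosine_similarity(doc_a, doc_b):
    """Cosine of the angle between the term-frequency vectors of two
    tokenized documents; 1.0 for identical term distributions, 0.0 for
    documents sharing no terms."""
    va, vb = Counter(doc_a), Counter(doc_b)
    dot = sum(va[t] * vb[t] for t in va.keys() & vb.keys())
    norm = math.sqrt(sum(c * c for c in va.values())) * \
           math.sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0
```

Real systems typically weight the counts (e.g. tf-idf) before taking the cosine, but the geometry is the same.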

  3. Efficient Retrieval of Images for Search Engine by Visual Similarity and Re Ranking

    Viswa S S


    Full Text Available Nowadays, web-scale image search engines (e.g. Google Image Search, Microsoft Live Image Search) rely almost purely on surrounding text features. Users type keywords in the hope of finding a certain type of image. The search engine returns thousands of images ranked by the text keywords extracted from the surrounding text. However, many of the returned images are noisy, disorganized, or irrelevant, and even Google and Microsoft use no visual information for image search. The idea is to use visual information to re-rank and improve text-based image search results. This improves the precision of the text-based image search ranking by incorporating the information conveyed by the visual modality. The typical assumption that the top-ranked images in the text-based search result are equally relevant is relaxed by linking the relevance of the images to their initial rank positions. Then, a number of images from the initial search result are employed as prototypes that serve to visually represent the query and that are subsequently used to construct meta re-rankers, i.e. the most relevant images are found by visual similarity and the average scores are calculated. By applying different meta re-rankers to an image from the initial result, re-ranking scores are generated, which are then used to find the new rank position for an image in the re-ranked search result. Human supervision is introduced to learn the model weights offline, prior to the online re-ranking process. While model learning requires manual labelling of the results for a few queries, the resulting model is query-independent and therefore applicable to any other query. The experimental results on a representative web image search dataset comprising 353 queries demonstrate that the proposed method outperforms the existing supervised and unsupervised re-ranking approaches. Moreover, it improves the performance over the text-based image search engine by more than 25.48%.

  4. MS/MS similarity networking accelerated target profiling of triterpene saponins in Eleutherococcus senticosus leaves.

    Ge, Yue-Wei; Zhu, Shu; Yoshimatsu, Kayo; Komatsu, Katsuko


    Targeted mass information about compounds accelerates their discovery in large volumes of untargeted MS data, and MS/MS similarity networking is well suited to clustering structural analogues, which benefits the collection of mass information for similar compounds. The triterpene saponins extracted from Eleutherococcus senticosus leaves (ESL), a kind of functional tea, have shown promise in the relief of Alzheimer's disease. In this work, a target-precursor list (TPL) generated using MS/MS similarity networking was employed to rapidly trace 106 triterpene saponins from the aqueous extracts of ESL, of which 49 were tentatively identified as potentially new triterpene saponins. Moreover, a compound database of triterpene saponins was established and successfully applied to uncover their distribution features in ESL samples collected from different areas. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Similarity

    Apostol, Tom M. (Editor)


    In this 'Project Mathematics!' series, sponsored by the California Institute of Technology (Caltech), the mathematical concept of similarity is presented. The history of similarity and its real-life applications are discussed using actual film footage and computer animation. Terms used and various concepts of size, shape, ratio, area, and volume are demonstrated. The similarity of polygons, solids, congruent triangles, internal ratios, perimeters, and line segments is shown using the previously mentioned concepts.

  6. A comparison of field-based similarity searching methods: CatShape, FBSS, and ROCS.

    Moffat, Kirstin; Gillet, Valerie J; Whittle, Martin; Bravi, Gianpaolo; Leach, Andrew R


    Three field-based similarity methods are compared in retrospective virtual screening experiments. The methods are the CatShape module of CATALYST, ROCS, and an in-house program developed at the University of Sheffield called FBSS. The programs are used in both rigid and flexible searches carried out in the MDL Drug Data Report. UNITY 2D fingerprints are also used to provide a comparison with a more traditional approach to similarity searching, and similarity based on simple whole-molecule properties is used to provide a baseline for the more sophisticated searches. Overall, UNITY 2D fingerprints and ROCS with the chemical force field option gave comparable performance and were superior to the shape-only 3D methods. When the flexible methods were compared with the rigid methods, it was generally found that the flexible methods gave slightly better results than their respective rigid methods; however, the increased performance did not justify the additional computational cost required.

  7. Sequence heterogeneity accelerates protein search for targets on DNA

    Shvets, Alexey A.; Kolomeisky, Anatoly B., E-mail: [Department of Chemistry and Center for Theoretical Biological Physics, Rice University, Houston, Texas 77005 (United States)


    The process of protein search for specific binding sites on DNA is fundamentally important since it marks the beginning of all major biological processes. We present a theoretical investigation that probes the role of DNA sequence symmetry, heterogeneity, and chemical composition in the protein search dynamics. Using a discrete-state stochastic approach with a first-passage events analysis, which takes into account the most relevant physical-chemical processes, a full analytical description of the search dynamics is obtained. It is found that, contrary to existing views, the protein search is generally faster on DNA with more heterogeneous sequences. In addition, the search dynamics might be affected by the chemical composition near the target site. The physical origins of these phenomena are discussed. Our results suggest that biological processes might be effectively regulated by modifying chemical composition, symmetry, and heterogeneity of a genome.

  8. Similarities and differences between Web search procedure and searching in the pre-web information retrieval systems

    Yazdan Mansourian


    Full Text Available This paper presents an introductory discussion of the commonalities and dissimilarities between the Web searching procedure and the searching process in previous online information retrieval systems, including classic information retrieval systems and databases. The paper attempts to explain which factors make these two groups different, why investigating the search process in the Web environment is important, how much we know about this procedure, and what the main lines of research are for researchers in this area of study and practice. After presenting the major factors involved, the paper concludes that although the information seeking process on the Web is fairly similar to pre-Web systems in some ways, there are notable differences between them as well. These differences may provide Web searchers and Web researchers with some opportunities and challenges.

  9. A New Retrieval Model Based on TextTiling for Document Similarity Search

    Xiao-Jun Wan; Yu-Xin Peng


    Document similarity search is to find documents similar to a given query document and return a ranked list of similar documents to users; it is widely used in many text and web systems, such as digital libraries, search engines, etc. Traditional retrieval models, including the Okapi BM25 model and the Smart vector space model with length normalization, could handle this problem to some extent by taking the query document as a long query. In practice, the Cosine measure is considered the best model for document similarity search because of its good ability to measure similarity between two documents. In this paper, the quantitative performances of the above models are compared using experiments. Because the Cosine measure is not able to reflect the structural similarity between documents, a new retrieval model based on TextTiling is proposed in the paper. The proposed model takes into account the subtopic structures of documents. It first splits the documents into text segments with TextTiling and calculates the similarities for different pairs of text segments in the documents. Lastly, the overall similarity between the documents is returned by combining the similarities of different pairs of text segments with an optimal matching method. Experiments are performed and results show: 1) the popular retrieval models (the Okapi BM25 model and the Smart vector space model with length normalization) do not perform well for document similarity search; 2) the proposed model based on TextTiling is effective and outperforms other models, including the Cosine measure; 3) the methods for the three components in the proposed model are validated to be appropriately employed.
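The final combination step described above is an assignment problem: given a matrix of segment-pair similarities, pick the one-to-one pairing of segments that maximizes total similarity. A brute-force sketch under our own assumptions (the function name and the choice to average over matched pairs are ours, not necessarily the paper's exact formulation; brute force is adequate only for the handful of segments TextTiling typically yields):

```python
from itertools import permutations

def overall_similarity(seg_sims):
    """Combine pairwise segment similarities into one document score.

    seg_sims[i][j] is the similarity between segment i of document A and
    segment j of document B. Optimal matching picks the one-to-one pairing
    of segments maximizing the summed similarity.
    """
    if len(seg_sims) > len(seg_sims[0]):
        seg_sims = [list(col) for col in zip(*seg_sims)]  # ensure rows <= cols
    n, m = len(seg_sims), len(seg_sims[0])
    best = max(
        sum(seg_sims[i][p[i]] for i in range(n))
        for p in permutations(range(m), n)  # every assignment of rows to cols
    )
    return best / n  # average similarity over matched segment pairs
```

For larger segment counts, the Hungarian algorithm solves the same assignment problem in polynomial time.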

  10. Molecular fingerprint recombination: generating hybrid fingerprints for similarity searching from different fingerprint types.

    Nisius, Britta; Bajorath, Jürgen


    Molecular fingerprints have a long history in computational medicinal chemistry and continue to be popular tools for similarity searching. Over the years, a variety of fingerprint types have been introduced. We report an approach to identify preferred bit subsets in fingerprints of different design and "recombine" these bit segments into "hybrid fingerprints". These compound class-directed fingerprint representations are found to increase the similarity search performance of their parental fingerprints, which can be rationalized by the often complementary nature of distinct fingerprint features.
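The recombination idea can be sketched as two steps: score each bit of each parent fingerprint by how much more often it is set in the compound class of interest than in a background collection, then concatenate the preferred bit subsets into a hybrid. This is an illustration under our own assumptions, not the authors' exact selection procedure; all names are hypothetical.

```python
def bit_scores(class_fps, background_fps, n_bits):
    """Score each bit position by how much more often it is set in the
    compound class of interest than in the background collection."""
    def freq(fps, b):
        return sum(fp[b] for fp in fps) / len(fps)
    return [freq(class_fps, b) - freq(background_fps, b) for b in range(n_bits)]

def hybrid_fingerprint(fp_by_type, selected_bits_by_type):
    """Concatenate the preferred bit subsets of several fingerprint types
    (e.g. the top-scoring bits of each parent) into one hybrid fingerprint."""
    hybrid = []
    for ftype, bits in selected_bits_by_type.items():
        hybrid.extend(fp_by_type[ftype][b] for b in bits)
    return hybrid
```

Because the selected bits depend on the compound class used for scoring, the resulting hybrids are class-directed, which is the source of the reported gain over the parental fingerprints.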

  11. Accelerated Simplified Swarm Optimization with Exploitation Search Scheme for Data Clustering.

    Wei-Chang Yeh

    Full Text Available Data clustering is commonly employed in many disciplines. The aim of clustering is to partition a set of data into clusters, in which objects within the same cluster are similar to one another and dissimilar to objects that belong to different clusters. Over the past decade, evolutionary algorithms have commonly been used to solve clustering problems. This study presents a novel algorithm based on simplified swarm optimization, an emerging population-based stochastic optimization approach with the advantages of simplicity, efficiency, and flexibility. This approach combines variable vibrating search (VVS) and rapid centralized strategy (RCS) in dealing with the clustering problem. VVS is an exploitation search scheme that can refine the quality of solutions by searching the extreme points near the global best position. RCS is developed to accelerate the convergence rate of the algorithm by using the arithmetic average. To empirically evaluate the performance of the proposed algorithm, experiments are examined using 12 benchmark datasets, and the corresponding results are compared with recent works. Results of statistical analysis indicate that the proposed algorithm is competitive in terms of the quality of solutions.

  12. Web Image Search Re-ranking with Click-based Similarity and Typicality.

    Yang, Xiaopeng; Mei, Tao; Zhang, Yong Dong; Liu, Jie; Satoh, Shin'ichi


    In image search re-ranking, besides the well-known semantic gap, the intent gap, which is the gap between the representation of users' query/demand and the real intent of the users, is becoming a major problem restricting the development of image retrieval. To reduce human effort, in this paper, we use image click-through data, which can be viewed as "implicit feedback" from users, to help overcome the intent gap and further improve image search performance. Generally, the hypothesis that visually similar images should be close in a ranking list and the strategy that images with higher relevance should be ranked higher than others are widely accepted. Thus, to obtain satisfying search results, image similarity and the level of relevance typicality are the determining factors. However, when measuring image similarity and typicality, conventional re-ranking approaches only consider visual information and the initial ranks of images, while overlooking the influence of click-through data. This paper presents a novel re-ranking approach, named spectral clustering re-ranking with click-based similarity and typicality (SCCST). First, to learn an appropriate similarity measurement, we propose a click-based multi-feature similarity learning algorithm (CMSL), which conducts metric learning based on click-based triplet selection and integrates multiple features into a unified similarity space via multiple kernel learning. Then, based on the learnt click-based image similarity measure, we conduct spectral clustering to group visually and semantically similar images into the same clusters, and get the final re-ranked list by calculating click-based cluster typicality and within-cluster click-based image typicality in descending order. 
Our experiments conducted on two real-world query-image datasets with diverse representative queries show that the proposed re-ranking approach can significantly improve initial search results, and outperform several existing re-ranking approaches.

  13. Similarity-based search of model organism, disease and drug effect phenotypes

    Hoehndorf, Robert


    Background: Semantic similarity measures over phenotype ontologies have been demonstrated to provide a powerful approach for the analysis of model organism phenotypes, the discovery of animal models of human disease, novel pathways, gene functions, druggable therapeutic targets, and determination of pathogenicity. Results: We have developed PhenomeNET 2, a system that enables similarity-based searches over a large repository of phenotypes in real-time. It can be used to identify strains of model organisms that are phenotypically similar to human patients, diseases that are phenotypically similar to model organism phenotypes, or drug effect profiles that are similar to the phenotypes observed in a patient or model organism. PhenomeNET 2 is available at Conclusions: Phenotype-similarity searches can provide a powerful tool for the discovery and investigation of molecular mechanisms underlying an observed phenotypic manifestation. PhenomeNET 2 facilitates user-defined similarity searches and allows researchers to analyze their data within a large repository of human, mouse and rat phenotypes.

  14. Using homology relations within a database markedly boosts protein sequence similarity search.

    Tong, Jing; Sadreyev, Ruslan I; Pei, Jimin; Kinch, Lisa N; Grishin, Nick V


    Inference of homology from protein sequences provides an essential tool for analyzing protein structure, function, and evolution. Current sequence-based homology search methods are still unable to detect many similarities evident from protein spatial structures. In computer science a search engine can be improved by considering networks of known relationships within the search database. Here, we apply this idea to protein-sequence-based homology search and show that it dramatically enhances the search accuracy. Our new method, COMPADRE (COmparison of Multiple Protein sequence Alignments using Database RElationships) assesses the relationship between the query sequence and a hit in the database by considering the similarity between the query and hit's known homologs. This approach increases detection quality, boosting the precision rate from 18% to 83% at half-coverage of all database homologs. The increased precision rate allows detection of a large fraction of protein structural relationships, thus providing structure and function predictions for previously uncharacterized proteins. Our results suggest that this general approach is applicable to a wide variety of methods for detection of biological similarities. The web server is available at

  15. GHOSTM: a GPU-accelerated homology search tool for metagenomics.

    Shuji Suzuki

    Full Text Available BACKGROUND: A large number of sensitive homology searches are required for mapping DNA sequence fragments to known protein sequences in public and private databases during metagenomic analysis. BLAST is currently used for this purpose, but its calculation speed is insufficient, especially for analyzing the large quantities of sequence data obtained from a next-generation sequencer. However, faster search tools, such as BLAT, do not have sufficient search sensitivity for metagenomic analysis. Thus, a sensitive and efficient homology search tool is in high demand for this type of analysis. METHODOLOGY/PRINCIPAL FINDINGS: We developed a new, highly efficient homology search algorithm suitable for graphics processing unit (GPU) calculations that was implemented as a GPU system that we called GHOSTM. The system first searches for candidate alignment positions for a sequence from the database using pre-calculated indexes and then calculates local alignments around the candidate positions before calculating alignment scores. We implemented both of these processes on GPUs. The system achieved calculation speeds that were 130 and 407 times faster than BLAST with 1 GPU and 4 GPUs, respectively. The system also showed higher search sensitivity and had a calculation speed that was 4 and 15 times faster than BLAT with 1 GPU and 4 GPUs, respectively. CONCLUSIONS: We developed a GPU-optimized algorithm to perform sensitive sequence homology searches and implemented the system as GHOSTM. Currently, sequencing technology continues to improve, and sequencers are increasingly producing larger and larger quantities of data. This explosion of sequence data makes computational analysis with contemporary tools more difficult. We developed GHOSTM, which is a cost-efficient tool, and offer this tool as a potential solution to this problem.

  16. Accelerating dark-matter axion searches with quantum measurement technology

    Zheng, Huaixiu; Brierley, R T; Girvin, S M; Lehnert, K W


    The axion particle, a consequence of an elegant hypothesis that resolves the strong-CP problem of quantum chromodynamics, is a plausible origin for cosmological dark matter. In searches for axionic dark matter that detect the conversion of axions to microwave photons, the quantum noise associated with microwave vacuum fluctuations will soon limit the rate at which parameter space is searched. Here we show that this noise can be partially overcome either by squeezing the quantum vacuum using recently developed Josephson parametric devices, or by using superconducting qubits to count microwave photons.

  17. RScan: fast searching structural similarities for structured RNAs in large databases

    Liu Guo-Ping


    Full Text Available Abstract Background Many RNAs have evolutionarily conserved secondary structures instead of primary sequences. Recently, an increasing number of methods are being developed with a focus on structural alignments for finding conserved secondary structures as well as common structural motifs in pair-wise or multiple sequences. A challenging task is to quickly search for similar structures among structured RNA sequences in large genomic databases, since existing methods are too slow to be used on large databases. Results An implementation of a fast structural alignment algorithm, RScan, is proposed to fulfill the task. RScan is developed by leveraging the advantages of both hashing algorithms and local alignment algorithms. In our experiment, on average, the times for searching a tRNA and an rRNA in the randomized A. pernix genome are only 256 seconds and 832 seconds respectively by using RScan, but 3,178 seconds and 8,951 seconds respectively by using the existing method RSEARCH. Remarkably, RScan can handle large database queries, taking less than 4 minutes to search for similar structures for a microRNA precursor in human chromosome 21. Conclusion These results indicate that RScan is a preferable choice for real-life applications of searching structural similarities for structured RNAs in large databases. RScan software is freely available at

  18. Search for New Physics in reactor and accelerator experiments

    Di Iura, A.; Girardi, I.; Meloni, D.


    We consider two scenarios of New Physics: the Large Extra Dimensions (LED), where sterile neutrinos can propagate in a (4+d)-dimensional space-time, and the Non-Standard Interactions (NSI), where the neutrino interactions with ordinary matter are parametrized at low energy in terms of effective flavour-dependent complex couplings ε_αβ. We study how these models impact the oscillation parameters in reactor and accelerator experiments.

  19. Manifold Learning for Multivariate Variable-Length Sequences With an Application to Similarity Search.

    Ho, Shen-Shyang; Dai, Peng; Rudzicz, Frank


    Multivariate variable-length sequence data are becoming ubiquitous with the technological advancement in mobile devices and sensor networks. Such data are difficult to compare, visualize, and analyze due to the nonmetric nature of data sequence similarity measures. In this paper, we propose a general manifold learning framework for arbitrary-length multivariate data sequences driven by similarity/distance (parameter) learning in both the original data sequence space and the learned manifold. Our proposed algorithm transforms the data sequences in a nonmetric data sequence space into feature vectors in a manifold that preserves the data sequence space structure. In particular, the feature vectors in the manifold representing similar data sequences remain close to one another and far from the feature points corresponding to dissimilar data sequences. To achieve this objective, we assume a semisupervised setting where we have knowledge about whether some of data sequences are similar or dissimilar, called the instance-level constraints. Using this information, one learns the similarity measure for the data sequence space and the distance measures for the manifold. Moreover, we describe an approach to handle the similarity search problem given user-defined instance level constraints in the learned manifold using a consensus voting scheme. Experimental results on both synthetic data and real tropical cyclone sequence data are presented to demonstrate the feasibility of our manifold learning framework and the robustness of performing similarity search in the learned manifold.

  20. Software Suite for Gene and Protein Annotation Prediction and Similarity Search.

    Chicco, Davide; Masseroli, Marco


    In the computational biology community, machine learning algorithms are key instruments for many applications, including the prediction of gene functions based upon the available biomolecular annotations. Additionally, they may also be employed to compute similarity between genes or proteins. Here, we describe and discuss a software suite we developed to implement and make publicly available several such prediction methods and a computational technique based upon Latent Semantic Indexing (LSI), which leverages both inferred and available annotations to search for semantically similar genes. The suite consists of three components. BioAnnotationPredictor is a computational software module to predict new gene functions based upon Singular Value Decomposition of available annotations. SimilBio is a Web module that leverages annotations available or predicted by BioAnnotationPredictor to discover similarities between genes via LSI. The suite also includes SemSim, a new Web service built upon these modules that allows them to be accessed programmatically. We integrated SemSim in the Bio Search Computing framework (http://www.bioinformatics.deib.), where users can exploit the Search Computing technology to run multi-topic complex queries on multiple integrated Web services. Accordingly, researchers may obtain ranked answers involving the computation of the functional similarity between genes in support of biomedical knowledge discovery.

  1. NSSRF: global network similarity search with subgraph signatures and its applications.

    Zhang, Jiao; Kwong, Sam; Jia, Yuheng; Wong, Ka-Chun


    The exponential growth of biological network databases has made global network similarity search (NSS) increasingly computationally intensive. Given a query network and a network database, NSS aims to find the networks in the database most similar to the query network according to a topological similarity measure of interest. With the advent of big network data, existing search methods may become unsuitable, since some of them can render queries unsuccessful by returning empty answers or imposing arbitrary query restrictions. Therefore, the design of NSS algorithms remains challenging under the dilemma between accuracy and efficiency. We propose a global NSS method based on regression, denoted NSSRF, which boosts search speed without any significant sacrifice in practical performance. Subgraph signatures are central to the approach. NSSRF has two phases: an offline model building phase and a similarity query phase. In the offline model building phase, subgraph signatures and cosine similarity scores are used to train an efficient random forest regression (RFR) model. In the similarity query phase, the trained regression model is queried to return similar networks. We have extensively validated NSSRF on biological pathways and molecular structures; NSSRF demonstrates competitive performance relative to the state of the art. Remarkably, NSSRF works especially well for large networks, which indicates that the proposed approach can be promising in the era of big data. Case studies demonstrate efficiencies and unique capabilities of NSSRF that existing methods miss. The source code of two versions of NSSRF are freely available for downloading at and . Supplementary data are available at Bioinformatics online.
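The cosine similarity used in the model-building phase can be sketched over sparse signature count vectors (representing a network's subgraph signatures as a dict of counts is an assumption for illustration; NSSRF's actual signature construction is more involved):

```python
import math

def cosine_similarity(sig_a, sig_b):
    """Cosine similarity between two sparse signature count vectors,
    each a dict mapping a signature label to its occurrence count."""
    common = set(sig_a) & set(sig_b)
    dot = sum(sig_a[s] * sig_b[s] for s in common)
    norm_a = math.sqrt(sum(v * v for v in sig_a.values()))
    norm_b = math.sqrt(sum(v * v for v in sig_b.values()))
    return 0.0 if norm_a == 0 or norm_b == 0 else dot / (norm_a * norm_b)
```

In a regression setup like NSSRF's, such pairwise scores between a query's signatures and database networks' signatures would serve as training targets for the model.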

  2. Acceleration of stable interface structure searching using a kriging approach

    Kiyohara, Shin; Oda, Hiromi; Tsuda, Koji; Mizoguchi, Teruyasu


    Crystalline interfaces have a tremendous impact on the properties of materials, and determination of the atomic structure of an interface is crucial for a comprehensive understanding of its properties. Despite this importance, extensive calculation is necessary to determine even one interface structure. In this study, we apply a technique called kriging, borrowed from geostatistics, to accelerate the determination of interface structures. The atomic structures of simplified coincidence-site lattice interfaces were determined using the kriging approach. Our approach successfully determined the most stable interface structure with an efficiency almost two orders of magnitude better than the traditional “brute force” approach.
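The kriging idea, fitting a Gaussian-process surrogate to the energies of structures evaluated so far and querying its predictive mean and uncertainty to choose the next candidate, can be sketched in one dimension (a toy illustration, not the authors' implementation; the RBF kernel, its length scale, and the noise term are assumptions):

```python
import math

def rbf(x1, x2, length=1.0):
    """Squared-exponential (RBF) covariance between two scalar inputs."""
    return math.exp(-((x1 - x2) ** 2) / (2.0 * length ** 2))

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def krige(xs, ys, x_star, noise=1e-8):
    """Gaussian-process (kriging) predictive mean and variance at x_star,
    given evaluated configurations xs with energies ys."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, ys)
    k_star = [rbf(x, x_star) for x in xs]
    mean = sum(ks * a for ks, a in zip(k_star, alpha))
    v = solve(K, k_star)
    var = rbf(x_star, x_star) - sum(ks * vi for ks, vi in zip(k_star, v))
    return mean, max(var, 0.0)
```

A search loop would then evaluate next the candidate structure minimizing, e.g., mean − κ·√var, so that regions of low predicted energy and high uncertainty are explored before exhaustively computing every configuration.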

  3. Applying Statistical Models and Parametric Distance Measures for Music Similarity Search

    Lukashevich, Hanna; Dittmar, Christian; Bastuck, Christoph

    Automatically deriving similarity relations between music pieces is a core task of music information retrieval research. Due to the nearly unrestricted amount of musical data, real-world similarity search algorithms have to be highly efficient and scalable. One possible solution is to represent each music excerpt with a statistical model (e.g., a Gaussian mixture model) and thus to reduce the computational cost by applying parametric distance measures between the models. In this paper we discuss combinations of different parametric modelling techniques and distance measures and weigh the benefits of each against the others.
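For single-Gaussian models, a parametric distance has a closed form; a minimal sketch (the paper considers full Gaussian mixture models, for which only approximations exist; the symmetrized Kullback-Leibler divergence shown here is one common choice):

```python
import math

def kl_gauss(mu_p, var_p, mu_q, var_q):
    """Closed-form KL divergence KL(p||q) between univariate Gaussians
    p = N(mu_p, var_p) and q = N(mu_q, var_q)."""
    return (math.log(math.sqrt(var_q / var_p))
            + (var_p + (mu_p - mu_q) ** 2) / (2.0 * var_q) - 0.5)

def sym_kl(mu_p, var_p, mu_q, var_q):
    """Symmetrized KL, usable as a parametric distance between two
    single-Gaussian models of music features."""
    return kl_gauss(mu_p, var_p, mu_q, var_q) + kl_gauss(mu_q, var_q, mu_p, var_p)
```

Comparing two excerpts then costs a handful of arithmetic operations on model parameters rather than a pass over the raw feature frames, which is what makes parametric distances attractive at scale.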


    Pushpa C N


    Semantic similarity measures play an important role in information retrieval, natural language processing, and various tasks on the web such as relation extraction, community mining, document clustering, and automatic meta-data extraction. In this paper, we propose a Pattern Retrieval Algorithm (PRA) to compute the semantic similarity between words by combining both the page count method and the web snippets method. Four association measures are used to find semantic similarity between words in the page count method using web search engines. We use a Sequential Minimal Optimization (SMO) support vector machine (SVM) to find the optimal combination of page-counts-based similarity scores and top-ranking patterns from the web snippets method. The SVM is trained to classify synonymous word-pairs and non-synonymous word-pairs. The proposed approach aims to improve the correlation values, precision, recall, and F-measures compared to the existing methods. The proposed algorithm outperforms existing methods with a correlation value of 89.8%.
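The abstract does not name the four page-count association measures; a set commonly used in this line of work (WebJaccard, WebDice, WebOverlap, WebPMI) can be sketched as follows, with the co-occurrence threshold c and assumed index size n as illustrative parameters:

```python
import math

def web_jaccard(p, q, pq, c=5):
    """Jaccard coefficient from page counts: hits(P), hits(Q), hits(P AND Q)."""
    return 0.0 if pq < c else pq / (p + q - pq)

def web_dice(p, q, pq, c=5):
    """Dice coefficient from page counts."""
    return 0.0 if pq < c else 2.0 * pq / (p + q)

def web_overlap(p, q, pq, c=5):
    """Overlap (Simpson) coefficient from page counts."""
    return 0.0 if pq < c else pq / min(p, q)

def web_pmi(p, q, pq, n=10**10, c=5):
    """Pointwise mutual information normalized by log2(n), where n is the
    (assumed) number of documents indexed by the search engine."""
    if pq < c:
        return 0.0
    return math.log2((pq / n) / ((p / n) * (q / n))) / math.log2(n)
```

The threshold c suppresses noise from very rare co-occurrences; such scores would then feed the SVM alongside snippet-pattern features.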

  5. Parallel implementation of 3D protein structure similarity searches using a GPU and the CUDA.

    Mrozek, Dariusz; Brożek, Miłosz; Małysiak-Mrozek, Bożena


    Searching for similar 3D protein structures is one of the primary processes employed in the field of structural bioinformatics. However, the computational complexity of this process means that it is constantly necessary to search for new methods that can perform such a process faster and more efficiently. Finding molecular substructures that complex protein structures have in common is still a challenging task, especially when entire databases containing tens or even hundreds of thousands of protein structures must be scanned. Graphics processing units (GPUs) and general purpose graphics processing units (GPGPUs) can perform many time-consuming and computationally demanding processes much more quickly than a classical CPU can. In this paper, we describe the GPU-based implementation of the CASSERT algorithm for 3D protein structure similarity searching. This algorithm is based on the two-phase alignment of protein structures when matching fragments of the compared proteins. The GPU (GeForce GTX 560Ti: 384 cores, 2GB RAM) implementation of CASSERT ("GPU-CASSERT") parallelizes both alignment phases and yields an average 180-fold increase in speed over its CPU-based, single-core implementation on an Intel Xeon E5620 (2.40GHz, 4 cores). In this paper, we show that massive parallelization of the 3D structure similarity search process on many-core GPU devices can reduce the execution time of the process, allowing it to be performed in real time. GPU-CASSERT is available at:

  6. Similarity searching and scaffold hopping in synthetically accessible combinatorial chemistry spaces.

    Boehm, Markus; Wu, Tong-Ying; Claussen, Holger; Lemmen, Christian


    Large collections of combinatorial libraries are an integral element in today's pharmaceutical industry. It is of great interest to perform similarity searches against all virtual compounds that are synthetically accessible by any such library. Here we describe the successful application of the new software tool CoLibri to 358 combinatorial libraries based on validated reaction protocols, creating a single chemistry space containing over 10^12 possible products. Similarity searching with FTrees-FS allows the systematic exploration of this space without the need to enumerate all product structures. The search result is a set of virtual hits which are synthetically accessible by one or more of the existing reaction protocols. Grouping these virtual hits by their synthetic protocols allows the rapid design and synthesis of multiple follow-up libraries. Such library ideas support hit-to-lead design efforts for tasks like follow-up from high-throughput screening hits or scaffold hopping from one hit to another attractive series.

  7. Effects of multiple conformers per compound upon 3-D similarity search and bioassay data analysis

    Kim Sunghwan


    Background: To improve the utility of PubChem, a public repository containing biological activities of small molecules, the PubChem3D project adds computationally derived three-dimensional (3-D) descriptions to the small-molecule records contained in the PubChem Compound database and provides various search and analysis tools that exploit 3-D molecular similarity. The efficient use of PubChem3D resources therefore requires an understanding of the statistical and biological meaning of computed 3-D molecular similarity scores between molecules. Results: The present study investigated the effects of employing multiple conformers per compound upon the 3-D similarity scores between ten thousand randomly selected biologically tested compounds (the 10-K set) and between non-inactive compounds in a given biological assay (the 156-K set). When the “best-conformer-pair” approach, in which the 3-D similarity score between two compounds is taken to be the greatest similarity score among all possible conformer pairs arising from a compound pair, was employed with ten diverse conformers per compound, the average 3-D similarity scores for the 10-K set increased by 0.11, 0.09, 0.15, 0.16, 0.07, and 0.18 for STST-opt, CTST-opt, ComboTST-opt, STCT-opt, CTCT-opt, and ComboTCT-opt, respectively, relative to the corresponding averages computed using a single conformer per compound. Interestingly, the best-conformer-pair approach also increased the average 3-D similarity scores for the non-inactive–non-inactive (NN) pairs for a given assay by amounts comparable to those for the random compound pairs, although some assays showed a pronounced increase in the per-assay NN-pair 3-D similarity scores compared to the average increase for the random compound pairs. Conclusion: These results suggest that the use of ten diverse conformers per compound in PubChem bioassay data analysis using 3-D molecular similarity is not expected to increase the separation of non

  8. Semantic similarity measures in the biomedical domain by leveraging a web search engine.

    Hsieh, Sheau-Ling; Chang, Wen-Yung; Chen, Chi-Huang; Weng, Yung-Ching


    Various studies of web-related semantic similarity measures have been carried out. However, measuring the semantic similarity between two terms remains a challenging task. Traditional ontology-based methodologies have the limitation that both concepts must reside in the same ontology tree(s); in practice, this assumption does not always hold. On the other hand, if the corpus is sufficiently large, corpus-based methodologies can overcome this limitation, and the web is an enormous, continuously growing corpus. Therefore, a method of estimating semantic similarity is proposed that exploits the page counts of two biomedical concepts returned by the Google AJAX web search engine. The features are extracted as the co-occurrence patterns of two given terms P and Q, by querying P, Q, as well as P AND Q, and the web search hit counts of the defined lexico-syntactic patterns. These similarity scores of different patterns are evaluated, by adapting support vector machines for classification, to leverage the robustness of semantic similarity measures. Experimental results validating against two datasets (dataset 1 provided by A. Hliaoutakis; dataset 2 provided by T. Pedersen) are presented and discussed. In dataset 1, the proposed approach achieves the best correlation coefficient (0.802) under SNOMED-CT. In dataset 2, the proposed method obtains the best correlation coefficients (SNOMED-CT: 0.705; MeSH: 0.723) against physician scores compared with other methods; against coder scores, however, the correlation coefficients (SNOMED-CT: 0.496; MeSH: 0.539) were lower. In conclusion, the semantic similarity findings of the proposed method are close to physicians' ratings. Furthermore, the study provides a cornerstone investigation for extracting fully relevant information from digitized, free-text medical records in the National Taiwan University Hospital database.

  9. Protein structure alignment and fast similarity search using local shape signatures.

    Can, Tolga; Wang, Yuan-Fang


    We present a new method for conducting protein structure similarity searches, which improves on the efficiency of some existing techniques. Our method is grounded in the theory of differential geometry on 3D space curve matching. We generate shape signatures for proteins that are invariant, localized, robust, compact, and biologically meaningful. The invariance of the shape signatures allows us to improve similarity searching efficiency by adopting a hierarchical coarse-to-fine strategy. We index the shape signatures using an efficient hashing-based technique. With the help of this technique we screen out unlikely candidates and perform detailed pairwise alignments only for the small number of candidates that survive the screening process. Unlike other hashing-based techniques, ours employs domain-specific information (not just geometric information) in constructing the hash key, and hence is more tuned to the domain of biology. Furthermore, the invariance, localization, and compactness of the shape signatures allow us to utilize a well-known local sequence alignment algorithm for aligning two protein structures. One measure of the efficacy of the proposed technique is that we were able to perform structure alignment queries 36 times faster (on average) than a well-known method while keeping the quality of the query results at an approximately similar level.

  10. Improving GPU-accelerated adaptive IDW interpolation algorithm using fast kNN search.

    Mei, Gang; Xu, Nengxiong; Xu, Liangliang


    This paper presents an efficient parallel Adaptive Inverse Distance Weighting (AIDW) interpolation algorithm on a modern Graphics Processing Unit (GPU). The presented algorithm improves our previous GPU-accelerated AIDW algorithm by adopting fast k-nearest-neighbor (kNN) search. AIDW needs to find several nearest neighboring data points for each interpolated point in order to adaptively determine the power parameter; the desired prediction value of the interpolated point is then obtained by weighted interpolation using that power parameter. In this work, we develop a fast kNN search approach based on a space-partitioning data structure, the even grid, to improve our previous GPU-accelerated AIDW algorithm. The improved algorithm is composed of two stages: kNN search and weighted interpolation. To evaluate the performance of the improved algorithm, we perform five groups of experimental tests. The experimental results indicate that: (1) the improved algorithm can achieve a speedup of up to 1017 over the corresponding serial algorithm; (2) the improved algorithm is at least two times faster than our previous GPU-accelerated AIDW algorithm; and (3) the use of fast kNN search can significantly improve the computational efficiency of the entire GPU-accelerated AIDW algorithm.
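The weighted-interpolation stage can be sketched as follows (a serial, brute-force-kNN toy with a fixed power parameter; the paper's AIDW adapts the power to local point density and performs the kNN search on a GPU via an even-grid spatial index):

```python
def idw_predict(points, values, x, y, k=3, power=2.0):
    """Inverse Distance Weighting prediction at (x, y) from the k nearest
    data points. kNN is brute force and `power` is fixed for brevity."""
    dists = []
    for (px, py), v in zip(points, values):
        d = ((px - x) ** 2 + (py - y) ** 2) ** 0.5
        if d == 0.0:
            return v  # query coincides with a data point
        dists.append((d, v))
    dists.sort()  # nearest neighbors first
    neighbors = dists[:k]
    weights = [1.0 / d ** power for d, _ in neighbors]
    return sum(w * v for w, (_, v) in zip(weights, neighbors)) / sum(weights)
```

Because every interpolated point repeats this neighbor search independently, the workload is embarrassingly parallel, which is what the GPU grid-based kNN exploits.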

  11. SiMPSON: Efficient Similarity Search in Metric Spaces over P2P Structured Overlay Networks

    Vu, Quang Hieu; Lupu, Mihai; Wu, Sai

    Similarity search in metric spaces over centralized systems has been studied extensively in the database research community. However, much less work has been done in the context of P2P networks. This paper introduces SiMPSON: a P2P system supporting similarity search in metric spaces. The aim is to answer queries faster and with fewer resources than existing systems. For this, each peer first clusters its own data using any off-the-shelf clustering algorithm. The resulting clusters are then mapped to one-dimensional values, and these one-dimensional values are indexed into a structured P2P overlay. Our method slightly increases the indexing overhead, but allows us to greatly reduce the number of peers and messages involved in query processing: we trade a small amount of overhead in the data publishing process for a substantial reduction of costs in the querying phase. Based on this architecture, we propose algorithms for processing range and kNN queries. Extensive experimental results validate the claims of efficiency and effectiveness of SiMPSON.

  12. World Climate Classification and Search: Data Mining Approach Utilizing Dynamic Time Warping Similarity Function

    Stepinski, T. F.; Netzel, P.; Jasiewicz, J.


    We have developed a novel method for classification and search of climate over the global land surface excluding Antarctica. Our method classifies climate on the basis of the outcome of time series segmentation and clustering. We use the WorldClim 30 arc-second (approx. 1 km) resolution grid data, which are based on 50 years of climatic observations. Each cell in the grid is assigned a 12-month series consisting of 50-year monthly averages of mean, maximum, and minimum temperatures as well as total precipitation. The presented method introduces several innovations in comparison with existing data-driven methods of world climate classification. First, it uses only climatic rather than bioclimatic data. Second, it employs an object-oriented methodology: the grid is first segmented before climatic segments are classified. Third, and most importantly, the similarity between climates in two given cells is computed using the dynamic time warping (DTW) measure instead of the Euclidean distance. DTW is known to be superior to the Euclidean distance for time series, but has not been utilized before in classification of global climate. To account for the computational expense of DTW, we use the highly efficient GeoPAT software that, in the first step, segments the grid into local regions of uniform climate. In the second step, the segments are classified. We also introduce climate search: a GeoWeb-based method for interactive presentation of global climate information in the form of query-and-retrieval. A user selects a geographical location and the system returns a global map indicating the level of similarity between local climates and the climate at the selected location. The results of the search for the location "University of Cincinnati, Main Campus" are presented on the map. We have compared the results of our method to the Koeppen classification scheme
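The DTW measure at the heart of this method can be sketched with the classic dynamic-programming recurrence (a minimal pure-Python version; the GeoPAT implementation used in the paper is far more optimized):

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two numeric sequences,
    using absolute difference as the local cost."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = DTW distance between prefixes a[:i] and b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch a
                                 cost[i][j - 1],      # stretch b
                                 cost[i - 1][j - 1])  # match step
    return cost[n][m]
```

Unlike the Euclidean distance, DTW tolerates local shifts and stretches: dtw_distance([1, 2, 3], [1, 2, 2, 3]) is 0 because the repeated value is absorbed by warping, which is exactly the flexibility wanted when comparing monthly climate series with offset seasonal timing.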

  13. iSARST: an integrated SARST web server for rapid protein structural similarity searches.

    Lo, Wei-Cheng; Lee, Che-Yu; Lee, Chi-Ching; Lyu, Ping-Chiang


    iSARST is a web server for efficient protein structural similarity searches. It is a multi-processor, batch-processing, integrated implementation of several structural comparison tools and two database searching methods: SARST for common structural homologs and CPSARST for homologs with circular permutations. iSARST allows users to submit multiple PDB/SCOP entry IDs or an archive file containing many structures. After scanning the target database using SARST/CPSARST, the ordering of hits is refined with conventional structure alignment tools such as FAST, TM-align and SAMO, which are run in a PC cluster. In this way, iSARST achieves a high running speed while preserving the high precision of the refinement engines. The final outputs include tables listing co-linear or circularly permuted homologs of the query proteins and a functional summary of the best hits. Superimposed structures can be examined through an interactive and informative visualization tool. iSARST provides the first batch-mode structural comparison web service for both co-linear homologs and circular permutants. It can serve as a rapid annotation system for functionally unknown or hypothetical proteins, which are increasing rapidly in this post-genomics era. The server can be accessed at

  14. Searching the protein structure database for ligand-binding site similarities using CPASS v.2

    Caprez Adam


    Background: A recent analysis of protein sequences deposited in the NCBI RefSeq database indicates that ~8.5 million protein sequences are encoded in prokaryotic and eukaryotic genomes, of which ~30% are explicitly annotated as "hypothetical" or "uncharacterized" proteins. Our Comparison of Protein Active-Site Structures (CPASS) v.2 database and software compares the sequence and structural characteristics of experimentally determined ligand binding sites to infer a functional relationship in the absence of global sequence or structure similarity. CPASS is an important component of our Functional Annotation Screening Technology by NMR (FAST-NMR) protocol and has been successfully applied to aid the annotation of a number of proteins of unknown function. Findings: We report a major upgrade to our CPASS software and database that significantly improves its broad utility. CPASS v.2 is designed with a layered architecture to increase flexibility and portability, and also enables job distribution over the Open Science Grid (OSG) to increase speed. Similarly, the CPASS interface was enhanced to give users more flexibility in submitting a CPASS query. CPASS v.2 now allows for both automatic and manual definition of ligand-binding sites and permits pair-wise, one-versus-all, one-versus-list, or list-versus-list comparisons. Solvent accessible surface area, ligand root-mean-square difference, and Cβ distances have been incorporated into the CPASS similarity function to improve the quality of the results. The CPASS database has also been updated. Conclusions: CPASS v.2 is more than an order of magnitude faster than the original implementation, and allows for multiple simultaneous job submissions. Similarly, the CPASS database of ligand-defined binding sites has increased in size by ~38%, dramatically increasing the likelihood of a positive search result. The modification to the CPASS similarity function is effective in reducing CPASS similarity scores

  15. Massive problem reports mining and analysis based parallelism for similar search

    Zhou, Ya; Hu, Cailin; Xiong, Han; Wei, Xiafei; Li, Ling


    Massive numbers of problem reports and solutions are accumulated over time and continuously collected in XML Spreadsheet (XMLSS) format from enterprises and organizations; they record comprehensive descriptions of problems that can help technicians trace problems and their solutions. Effectively managing and analyzing these massive semi-structured data, in order to provide similar problem solutions, support decisions on immediate problems, and assist product optimization during hardware and software maintenance, is a significant and challenging issue. For this purpose, we build a data management system to manage, mine, and analyze these data; search results can be categorized and organized into several categories so that users can quickly find where their results of interest are located. Experimental results demonstrate that this system substantially outperforms a traditional centralized management system in performance and in its capability to adapt to heterogeneous data. Besides, by re-extracting topics, it enables each cluster to be described more precisely and reasonably.

  16. Gene network homology in prokaryotes using a similarity search approach: queries of quorum sensing signal transduction.

    David N Quan

    Bacterial cell-cell communication is mediated by small signaling molecules known as autoinducers. Importantly, autoinducer-2 (AI-2) is synthesized via the enzyme LuxS in over 80 species, some of which mediate their pathogenicity by recognizing and transducing this signal in a cell-density-dependent manner. AI-2-mediated phenotypes are not well understood, however, as the means for signal transduction appears varied among species, while AI-2 synthesis processes appear conserved. Approaches to reveal the recognition pathways of AI-2 will shed light on pathogenicity, as we believe recognition of the signal is likely as important, if not more so, than signal synthesis. LMNAST (Local Modular Network Alignment Similarity Tool) uses a local similarity search heuristic to study gene order, generating homology hits for the genomic arrangement of a query gene sequence. We develop and apply this tool for the E. coli lac and LuxS-regulated (Lsr) systems. Lsr is of great interest as it mediates AI-2 uptake and processing. Both test searches generated results that were subsequently analyzed through a number of different lenses, each with its own level of granularity, from a binary phylogenetic representation down to trackback plots that preserve genomic organizational information. Through a survey of these results, we demonstrate the identification of orthologs, paralogs, hitchhiking genes, gene loss, gene rearrangement within an operon context, and horizontal gene transfer (HGT). We found a variety of operon structures that are consistent with our hypothesis that the signal can be perceived and transduced by homologous protein complexes, while their regulation may be key to defining subsequent phenotypic behavior.

  17. PHOG-BLAST – a new generation tool for fast similarity search of protein families

    Mironov Andrey A


    Background: The need to compare protein profiles frequently arises in various protein research areas: comparison of protein families, domain searches, resolution of orthology and paralogy. The existing fast algorithms can only compare a protein sequence with a protein sequence and a profile with a sequence. Algorithms to compare profiles use dynamic programming and complex scoring functions. Results: We developed a new algorithm called PHOG-BLAST for fast similarity search of profiles. This algorithm uses profile discretization to convert a profile to a finite alphabet and utilizes hashing for fast search. To determine the optimal alphabet, we analyzed columns in reliable multiple alignments and obtained column clusters in the 20-dimensional profile space by applying a special clustering procedure. We show that the clustering procedure works best if its parameters are chosen so that 20 profile clusters are obtained, which can be interpreted as ancestral amino acid residues. With these clusters, fewer than 2% of columns in multiple alignments fall outside the clusters. We tested the performance of PHOG-BLAST vs. PSI-BLAST on three well-known databases of multiple alignments: COG, PFAM and BALIBASE. On the COG database both algorithms showed the same performance; on PFAM and BALIBASE, PHOG-BLAST was much superior to PSI-BLAST. PHOG-BLAST required 10–20 times less computer memory and computation time than PSI-BLAST. Conclusion: Since PHOG-BLAST can compare multiple alignments of protein families, it can be used in different areas of comparative proteomics and protein evolution. For example, PHOG-BLAST helped to build the PHOG database of phylogenetic orthologous groups. An essential step in building this database was comparing protein complements of different species and orthologous groups of different taxons on a personal computer in reasonable time. When it is applied to detect weak similarity between protein families, PHOG-BLAST is less

  18. PSimScan: algorithm and utility for fast protein similarity search.

    Anna Kaznadzey

    In the era of metagenomics and diagnostic sequencing, the importance of protein comparison methods with boosted performance cannot be overstated. Here we present PSimScan (Protein Similarity Scanner), a flexible open source protein similarity search tool which provides a significant gain in speed compared to BLASTP at the price of a controlled sensitivity loss. The PSimScan algorithm introduces a number of novel performance optimization methods that can be further used by the community to improve the speed and lower the hardware requirements of bioinformatics software. The optimization starts at the lookup table construction; the initial lookup-table-based hits are then passed through a pipeline of filtering and aggregation routines of increasing computational complexity. The first step in this pipeline is a novel algorithm that builds and selects 'similarity zones' aggregated from neighboring matches on small arrays of adjacent diagonals. PSimScan performs 5 to 100 times faster than the standard NCBI BLASTP, depending on the chosen parameters, and runs on commodity hardware. Its sensitivity and selectivity at the slowest settings are comparable to NCBI BLASTP's and decrease with increasing speed, yet stay at levels reasonable for many tasks. PSimScan is most advantageous when used on large collections of query sequences. Comparing the entire proteome of Streptococcus pneumoniae (2,042 proteins) to the NCBI non-redundant protein database of 16,971,855 records takes 6.5 hours on a moderately powerful PC, while the same task with NCBI BLASTP takes over 66 hours. We describe the innovations in the PSimScan algorithm in considerable detail to encourage bioinformaticians to improve on the tool and to use the innovations in their own software development.

  19. Meta-Storms: efficient search for similar microbial communities based on a novel indexing scheme and similarity score for metagenomic data.

    Su, Xiaoquan; Xu, Jian; Ning, Kang


    Scientists have long been intrigued by the problem of effectively comparing different microbial communities (also referred to as 'metagenomic samples' here) at a large scale: given a set of unknown samples, find similar metagenomic samples in a large repository and examine how similar these samples are. With the metagenomic samples accumulated to date, it is possible to build a database of metagenomic samples of interest; any metagenomic sample could then be searched against this database to find the most similar metagenomic sample(s). However, on one hand, current databases with large numbers of metagenomic samples mostly serve as data repositories that offer few functionalities for analysis; on the other hand, methods to measure the similarity of metagenomic data work well only for small sets of samples by pairwise comparison. It is not yet clear how to efficiently search for metagenomic samples against a large metagenomic database. In this study, we have proposed a novel method, Meta-Storms, that can systematically and efficiently organize and search metagenomic data. It includes the following components: (i) creating a database of metagenomic samples based on their taxonomical annotations, (ii) efficient indexing of samples in the database based on a hierarchical taxonomy indexing strategy, (iii) searching for a metagenomic sample against the database with a fast scoring function based on quantitative phylogeny and (iv) managing the database by index export, index import, data insertion, data deletion and database merging. We have collected more than 1300 metagenomic datasets from the public domain and in-house facilities, and tested the Meta-Storms method on them. Our experimental results show that Meta-Storms is capable of database creation and effective searching for a large number of metagenomic samples, and it achieves accuracies similar to those of the current popular significance-testing-based methods. The Meta-Storms method would serve as a suitable

  20. Application of 3D Zernike descriptors to shape-based ligand similarity searching

    Venkatraman Vishwesh


    Full Text Available Abstract Background The identification of promising drug leads from a large database of compounds is an important step in the preliminary stages of drug design. Although shape is known to play a key role in the molecular recognition process, its application to virtual screening poses significant hurdles both in terms of the encoding scheme and speed. Results In this study, we have examined the efficacy of the alignment independent three-dimensional Zernike descriptor (3DZD for fast shape based similarity searching. Performance of this approach was compared with several other methods including the statistical moments based ultrafast shape recognition scheme (USR and SIMCOMP, a graph matching algorithm that compares atom environments. Three benchmark datasets are used to thoroughly test the methods in terms of their ability for molecular classification, retrieval rate, and performance under the situation that simulates actual virtual screening tasks over a large pharmaceutical database. The 3DZD performed better than or comparable to the other methods examined, depending on the datasets and evaluation metrics used. Reasons for the success and the failure of the shape based methods for specific cases are investigated. Based on the results for the three datasets, general conclusions are drawn with regard to their efficiency and applicability. Conclusion The 3DZD has unique ability for fast comparison of three-dimensional shape of compounds. Examples analyzed illustrate the advantages and the room for improvements for the 3DZD.

  1. Fast parallel tandem mass spectral library searching using GPU hardware acceleration.

    Baumgardner, Lydia Ashleigh; Shanmugam, Avinash Kumar; Lam, Henry; Eng, Jimmy K; Martin, Daniel B


    Mass spectrometry-based proteomics is a maturing discipline of biologic research that is experiencing substantial growth. Instrumentation has steadily improved over time with the advent of faster and more sensitive instruments collecting ever larger data files. Consequently, the computational process of matching a peptide fragmentation pattern to its sequence, traditionally accomplished by sequence database searching and more recently also by spectral library searching, has become a bottleneck in many mass spectrometry experiments. In both of these methods, the main rate-limiting step is the comparison of an acquired spectrum with all potential matches from a spectral library or sequence database. This is a highly parallelizable process because the core computational element can be represented as a simple but arithmetically intense multiplication of two vectors. In this paper, we present a proof of concept project taking advantage of the massively parallel computing available on graphics processing units (GPUs) to distribute and accelerate the process of spectral assignment using spectral library searching. This program, which we have named FastPaSS (for Fast Parallelized Spectral Searching), is implemented in CUDA (Compute Unified Device Architecture) from NVIDIA, which allows direct access to the processors in an NVIDIA GPU. Our efforts demonstrate the feasibility of GPU computing for spectral assignment, through implementation of the validated spectral searching algorithm SpectraST in the CUDA environment.
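    The rate-limiting comparison described above reduces to a dot product between an acquired spectrum and each candidate library spectrum, which is why it parallelizes so well. A minimal NumPy sketch of the batched scoring step (illustrative only, not the FastPaSS/CUDA implementation; all names are hypothetical):

```python
import numpy as np

def library_scores(query, library):
    """Score one query spectrum against every library spectrum.

    query: 1-D binned intensity vector.
    library: 2-D array with one row per library spectrum.
    The batched multiply below is the arithmetically intense kernel
    that a GPU would distribute across its processors.
    """
    return library @ query

# Toy example: three 4-bin library spectra, one query.
library = np.array([[1.0, 0.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0, 0.0],
                    [0.5, 0.5, 0.5, 0.5]])
query = np.array([1.0, 0.0, 0.0, 0.0])
scores = library_scores(query, library)
best = int(np.argmax(scores))  # index of the best-matching library entry
```

On a GPU, the same product is computed by launching one thread (or thread block) per library row; the sequential sketch only fixes the arithmetic being distributed.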

  2. Target-distractor similarity has a larger impact on visual search in school-age children than spacing.

    Huurneman, Bianca; Boonstra, F Nienke


    In typically developing children, crowding decreases with increasing age. The influence of target-distractor similarity with respect to orientation and element spacing on visual search performance was investigated in 29 school-age children with normal vision (4- to 6-year-olds [N = 16], 7- to 8-year-olds [N = 13]). Children were instructed to search for a target E among distractor Es (feature search: all flanking Es pointing right; conjunction search: flankers in three orientations). Orientation of the target was manipulated in four directions: right (target absent), left (inversed), up, and down (vertical). Spacing was varied in four steps: 0.04°, 0.5°, 1°, and 2°. During feature search, high target-distractor similarity had a stronger impact on performance than spacing: Orientation affected accuracy until spacing was 1°, and spacing only influenced accuracy for identifying inversed targets. Spatial analyses showed that orientation affected oculomotor strategy: Children made more fixations in the "inversed" target area (4.6) than the vertical target areas (1.8 and 1.9). Furthermore, age groups differed in fixation duration: 4- to 6-year-old children showed longer fixation durations than 7- to 8-year-olds at the two largest element spacings (p = 0.039 and p = 0.027). Conjunction search performance was unaffected by spacing. Four conclusions can be drawn from this study: (a) Target-distractor similarity governs visual search performance in school-age children, (b) children make more fixations in target areas when target-distractor similarity is high, (c) 4- to 6-year-olds show longer fixation durations than 7- to 8-year-olds at 1° and 2° element spacing, and (d) spacing affects feature but not conjunction search-a finding that might indicate top-down control ameliorates crowding in children.

  3. Efficient Retrieval of Images for Search Engine by Visual Similarity and Re-Ranking

    Viswa S S


    Full Text Available Nowadays, web scale image search engines (e.g. Google Image Search, Microsoft Live Image Search) rely almost purely on surrounding text features. Users type keywords in hope of finding a certain type of images. The search engine returns thousands of images ranked by the text keywords extracted from the surrounding text. However, many of the returned images are noisy, disorganized, or irrelevant. Even Google and Microsoft have no visual information for searching of images. Using visual information to re-rank and improve text-based image search results is the idea. This improves the precision of the text-based image search ranking by incorporating the information conveyed by the visual modality. The typical assumption that the top images in the text-based search result are equally relevant is relaxed by linking the relevance of the images to their initial rank positions. Then, a number of images from the initial search result are employed as the prototypes that serve to visually represent the query and that are subsequently used to construct meta re-rankers, i.e., the most relevant images are found by visual similarity and the average scores are calculated. By applying different meta re-rankers to an image from the initial result, re-ranking scores are generated, which are then used to find the new rank position for an image in the re-ranked search result. Human supervision is introduced to learn the model weights offline, prior to the online re-ranking process. While model learning requires manual labelling of the results for a few queries, the resulting model is query independent and therefore applicable to any other query. The experimental results on a representative web image search dataset comprising 353 queries demonstrate that the proposed method outperforms the existing supervised and unsupervised re-ranking approaches. Moreover, it improves the performance over the text-based image search engine by more than 25.48%.

  4. WIPO Re:Search: Accelerating anthelmintic development through cross-sector partnerships

    Roopa Ramamoorthi


    Full Text Available Neglected tropical diseases (NTDs, malaria, and tuberculosis have a devastating effect on an estimated 1.6 billion people worldwide. The World Intellectual Property Organization (WIPO Re:Search consortium accelerates the development of new drugs, vaccines, and diagnostics for these diseases by connecting the assets and resources of pharmaceutical companies, such as compound libraries and expertise, to academic or nonprofit researchers with novel product discovery or development ideas. As the WIPO Re:Search Partnership Hub Administrator, BIO Ventures for Global Health (BVGH fields requests from researchers, identifies Member organizations able to fulfill these requests, and helps forge mutually beneficial collaborations. Since its inception in October 2011, WIPO Re:Search membership has expanded to more than 90 institutions, including leading pharmaceutical companies, universities, nonprofit research institutions, and product development partnerships from around the world. To date, WIPO Re:Search has facilitated over 70 research agreements between Consortium Members, including 11 collaborations focused on anthelmintic drug discovery.

  5. Massively Multi-core Acceleration of a Document-Similarity Classifier to Detect Web Attacks

    Ulmer, C; Gokhale, M; Top, P; Gallagher, B; Eliassi-Rad, T


    This paper describes our approach to adapting a text document similarity classifier based on the Term Frequency Inverse Document Frequency (TFIDF) metric to two massively multi-core hardware platforms. The TFIDF classifier is used to detect web attacks in HTTP data. In our parallel hardware approaches, we design streaming, real time classifiers by simplifying the sequential algorithm and manipulating the classifier's model to allow decision information to be represented compactly. Parallel implementations on the Tilera 64-core System on Chip and the Xilinx Virtex 5-LX FPGA are presented. For the Tilera, we employ a reduced state machine to recognize dictionary terms without requiring explicit tokenization, and achieve throughput of 37MB/s at slightly reduced accuracy. For the FPGA, we have developed a set of software tools to help automate the process of converting training data to synthesizable hardware and to provide a means of trading off between accuracy and resource utilization. The Xilinx Virtex 5-LX implementation requires 0.2% of the memory used by the original algorithm. At 166MB/s (80X the software) the hardware implementation is able to achieve Gigabit network throughput at the same accuracy as the original algorithm.
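    For readers unfamiliar with the underlying metric, the following is a compact sketch of TFIDF weighting with cosine scoring over token lists (a plain sequential illustration under an assumed tokenization, not the streaming hardware classifier described above):

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build a TF-IDF weight dict for each document in a token-list corpus."""
    n = len(docs)
    df = Counter()                      # document frequency per term
    for doc in docs:
        df.update(set(doc))
    idf = {t: math.log(n / df[t]) for t in df}
    vecs = []
    for doc in docs:
        tf = Counter(doc)               # term frequency within the document
        vecs.append({t: tf[t] * idf[t] for t in tf})
    return vecs

def cosine(u, v):
    """Cosine similarity between two sparse TF-IDF dicts."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Toy HTTP-like corpus: two benign requests and one injection-like request.
docs = [["GET", "index", "html"],
        ["GET", "index", "php"],
        ["union", "select", "passwd"]]
vecs = tfidf_vectors(docs)
```

A classifier along these lines scores an incoming request against labeled model vectors; the paper's contribution is restructuring exactly this computation for streaming execution on many cores.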

  6. Accelerated damage visualization using binary search with fixed pitch-catch distance laser ultrasonic scanning

    Park, Byeongjin; Sohn, Hoon


    Laser ultrasonic scanning, especially full-field wave propagation imaging, is attractive for damage visualization thanks to its noncontact nature, sensitivity to local damage, and high spatial resolution. However, its practicality is limited because scanning at a high spatial resolution demands a prohibitively long scanning time. Inspired by binary search, an accelerated damage visualization technique is developed to visualize damage with a reduced scanning time. The pitch-catch distance between the excitation point and the sensing point is also fixed during scanning to maintain a high signal-to-noise ratio (SNR) of measured ultrasonic responses. The approximate damage boundary is identified by examining the interactions between ultrasonic waves and damage observed at the scanning points that are sparsely selected by a binary search algorithm. Here, a time-domain laser ultrasonic response is transformed into a spatial ultrasonic domain response using a basis pursuit approach so that the interactions between ultrasonic waves and damage, such as reflections and transmissions, can be better identified in the spatial ultrasonic domain. Then, the area inside the identified damage boundary is visualized as damage. The performance of the proposed damage visualization technique is validated using a numerical simulation performed on an aluminum plate with a notch and experiments performed on an aluminum plate with a crack and a wind turbine blade with delamination.
    The proposed damage visualization technique accelerates the damage visualization process in three aspects: (1) the number of measurements necessary for damage visualization is dramatically reduced by the binary search algorithm; (2) the number of averages necessary to achieve a high SNR is reduced by keeping the wave propagation distance short; and (3) with the proposed technique, the same damage can be identified at a lower spatial resolution than the spatial resolution required by full
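    The binary-search idea in aspect (1) can be illustrated in one dimension: rather than scanning every point along a line at full resolution, the boundary between damaged and undamaged regions is bracketed and bisected. A toy sketch (hypothetical names; the actual technique operates on 2-D scan grids using measured wave/damage interactions):

```python
def find_boundary(is_damaged, lo, hi, tol=1e-3):
    """Locate the damage boundary between a damaged point lo and an
    undamaged point hi by bisection. is_damaged(x) stands in for
    inspecting the wave/damage interaction at scanning point x."""
    assert is_damaged(lo) and not is_damaged(hi)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if is_damaged(mid):
            lo = mid        # boundary lies to the right of mid
        else:
            hi = mid        # boundary lies to the left of mid
    return 0.5 * (lo + hi)

# Damage occupies x < 0.37 on a unit-length scan line; about
# log2(1/tol) ~ 10 measurements suffice instead of ~1000.
edge = find_boundary(lambda x: x < 0.37, 0.0, 1.0)
```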

  7. General relativistic self-similar waves that induce an anomalous acceleration into the standard model of cosmology

    Smoller, Joel


    We prove that the Einstein equations in Standard Schwarzschild Coordinates close to form a system of three ordinary differential equations for a family of spherically symmetric, self-similar expansion waves, and that the critical ($k=0$) Friedmann universe associated with the pure radiation phase of the Standard Model of Cosmology (FRW) is embedded as a single point in this family. Removing a scaling law and imposing regularity at the center, we prove that the family reduces to an implicitly defined one-parameter family of distinct spacetimes determined by the value of a new {\\it acceleration parameter} $a$, such that $a=1$ corresponds to FRW. We prove that all self-similar spacetimes in the family are distinct from the non-critical $k\

  8. Efficient EMD-based Similarity Search in Multimedia Databases via Flexible Dimensionality Reduction

    Wichterich, Marc; Assent, Ira; Philipp, Kranen


    The Earth Mover's Distance (EMD) was developed in computer vision as a flexible similarity model that utilizes similarities in feature space to define a high quality similarity measure in feature representation space. It has been successfully adopted in a multitude of applications with low to med...

  9. Finding and Reusing Learning Materials with Multimedia Similarity Search and Social Networks

    Little, Suzanne; Ferguson, Rebecca; Ruger, Stefan


    The authors describe how content-based multimedia search technologies can be used to help learners find new materials and learning pathways by identifying semantic relationships between educational resources in a social learning network. This helps users--both learners and educators--to explore and find material to support their learning aims.…

  11. Target-distractor similarity has a larger impact on visual search in school-age children than spacing

    Huurneman, B.; Boonstra, F.N.


    In typically developing children, crowding decreases with increasing age. The influence of target-distractor similarity with respect to orientation and element spacing on visual search performance was investigated in 29 school-age children with normal vision (4- to 6-year-olds [N = 16], 7- to 8-year

  12. Breast cancer stories on the internet : improving search facilities to help patients find stories of similar others

    Overberg, Regina Ingrid


    The primary aim of this thesis is to gain insight into which search facilities for spontaneously published stories facilitate breast cancer patients in finding stories by other patients in a similar situation. According to the narrative approach, social comparison theory, and social cognitive theory

  13. Application of belief theory to similarity data fusion for use in analog searching and lead hopping.

    Muchmore, Steven W; Debe, Derek A; Metz, James T; Brown, Scott P; Martin, Yvonne C; Hajduk, Philip J


    A wide variety of computational algorithms have been developed that strive to capture the chemical similarity between two compounds for use in virtual screening and lead discovery. One limitation of such approaches is that, while a returned similarity value reflects the perceived degree of relatedness between any two compounds, there is no direct correlation between this value and the expectation or confidence that any two molecules will in fact be equally active. A lack of a common framework for interpretation of similarity measures also confounds the reliable fusion of information from different algorithms. Here, we present a probabilistic framework for interpreting similarity measures that directly correlates the similarity value to a quantitative expectation that two molecules will in fact be equipotent. The approach is based on extensive benchmarking of 10 different similarity methods (MACCS keys, Daylight fingerprints, maximum common subgraphs, rapid overlay of chemical structures (ROCS) shape similarity, and six connectivity-based fingerprints) against a database of more than 150,000 compounds with activity data against 23 protein targets. Given this unified and probabilistic framework for interpreting chemical similarity, principles derived from decision theory can then be applied to combine the evidence from different similarity measures in such a way that both capitalizes on the strengths of the individual approaches and maintains a quantitative estimate of the likelihood that any two molecules will exhibit similar biological activity.

  14. Accelerating the search for global minima on potential energy surfaces using machine learning

    Carr, S. F.; Garnett, R.; Lo, C. S.


    Controlling molecule-surface interactions is key for chemical applications ranging from catalysis to gas sensing. We present a framework for accelerating the search for the global minimum on potential energy surfaces, corresponding to stable adsorbate-surface structures. We present a technique using Bayesian inference that enables us to predict converged density functional theory potential energies with fewer self-consistent field iterations. We then discuss how this technique fits in with the Bayesian Active Site Calculator, which applies Bayesian optimization to the problem. We demonstrate the performance of our framework using a hematite (Fe2O3) surface and present the adsorption sites found by our global optimization method for various simple hydrocarbons on the rutile TiO2 (110) surface.

  15. FSim: A Novel Functional Similarity Search Algorithm and Tool for Discovering Functionally Related Gene Products

    Qiang Hu


    Full Text Available Background. During the analysis of genomics data, it is often required to quantify the functional similarity of genes and their products based on the annotation information from gene ontology (GO with hierarchical structure. A flexible and user-friendly way to estimate the functional similarity of genes utilizing GO annotation is therefore highly desired. Results. We proposed a novel algorithm using a level coefficient-weighted model to measure the functional similarity of gene products based on multiple ontologies of hierarchical GO annotations. The performance of our algorithm was evaluated and found to be superior to the other tested methods. We implemented the proposed algorithm in a software package, FSim, based on R statistical and computing environment. It can be used to discover functionally related genes for a given gene, group of genes, or set of function terms. Conclusions. FSim is a flexible tool to analyze functional gene groups based on the GO annotation databases.

  16. SHOP: receptor-based scaffold hopping by GRID-based similarity searches

    Bergmann, Rikke; Liljefors, Tommy; Sørensen, Morten D


    A new field-derived 3D method for receptor-based scaffold hopping, implemented in the software SHOP, is presented. Information from a protein-ligand complex is utilized to substitute a fragment of the ligand with another fragment from a database of synthetically accessible scaffolds. A GRID-based interaction profile of the receptor and geometrical descriptions of a ligand scaffold are used to obtain new scaffolds with different structural features that are able to replace the original scaffold in the protein-ligand complex. An enrichment study was successfully performed verifying the ability of SHOP to find known active CDK2 scaffolds in a database. Additionally, SHOP was used for suggesting new inhibitors of p38 MAP kinase. Four p38 complexes were used to perform six scaffold searches. Several new scaffolds were suggested, and the resulting compounds were successfully docked into the query proteins.

  17. Proposal for a Similar Question Search System on a Q&A Site

    Katsutoshi Kanamori


    Full Text Available There is a service to help Internet users obtain answers to specific questions when they visit a Q&A site. A Q&A site is very useful for the Internet user, but posted questions are often not answered immediately. This delay in answering occurs because in most cases another site user is answering the question manually. In this study, we propose a system that can present a question that is similar to a question posted by a user. An advantage of this system is that a user can refer to an answer to a similar question. This research measures the similarity of a candidate question based on word and dependency parsing. In an experiment, we examined the effectiveness of the proposed system for questions actually posted on the Q&A site. The result indicates that the system can show the questioner the answer to a similar question. However, the system still has a number of aspects that should be improved.

  18. HBLAST: Parallelised sequence similarity--A Hadoop MapReducable basic local alignment search tool.

    O'Driscoll, Aisling; Belogrudov, Vladislav; Carroll, John; Kropp, Kai; Walsh, Paul; Ghazal, Peter; Sleator, Roy D


    The recent exponential growth of genomic databases has resulted in the common task of sequence alignment becoming one of the major bottlenecks in the field of computational biology. It is typical for these large datasets and complex computations to require cost prohibitive High Performance Computing (HPC) to function. As such, parallelised solutions have been proposed but many exhibit scalability limitations and are incapable of effectively processing "Big Data" - the name attributed to datasets that are extremely large, complex and require rapid processing. The Hadoop framework, comprised of distributed storage and a parallelised programming framework known as MapReduce, is specifically designed to work with such datasets but it is not trivial to efficiently redesign and implement bioinformatics algorithms according to this paradigm. The parallelisation strategy of "divide and conquer" for alignment algorithms can be applied to both data sets and input query sequences. However, scalability is still an issue due to memory constraints or large databases, with very large database segmentation leading to additional performance decline. Herein, we present Hadoop Blast (HBlast), a parallelised BLAST algorithm that proposes a flexible method to partition both databases and input query sequences using "virtual partitioning". HBlast presents improved scalability over existing solutions and well balanced computational work load while keeping database segmentation and recompilation to a minimum. Enhanced BLAST search performance on cheap memory constrained hardware has significant implications for in field clinical diagnostic testing; enabling faster and more accurate identification of pathogenic DNA in human blood or tissue samples. Copyright © 2015 Elsevier Inc. All rights reserved.
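    The "virtual partitioning" strategy can be pictured as a load-balanced assignment of database chunks and query sequences to workers without physically re-segmenting the database. A toy round-robin sketch (illustrative only; HBlast's actual MapReduce partitioning is more involved, and the function name is hypothetical):

```python
def virtual_partitions(items, n_workers):
    """Round-robin assignment of items (database chunks or query
    sequences) to n_workers partitions. Each worker receives an
    index list rather than a physically re-segmented copy, keeping
    segmentation and recompilation of the database to a minimum."""
    parts = [[] for _ in range(n_workers)]
    for i, item in enumerate(items):
        parts[i % n_workers].append(item)
    return parts

# Seven database chunks spread across three mappers.
parts = virtual_partitions(list(range(7)), 3)
```

The same split can be applied along both axes at once: queries across one dimension of the job grid, database chunks across the other.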

  19. Development of a fingerprint reduction approach for Bayesian similarity searching based on Kullback-Leibler divergence analysis.

    Nisius, Britta; Vogt, Martin; Bajorath, Jürgen


    The contribution of individual fingerprint bit positions to similarity search performance is systematically evaluated. A method is introduced to determine bit significance on the basis of Kullback-Leibler divergence analysis of bit distributions in active and database compounds. Bit divergence analysis and Bayesian compound screening share a common methodological foundation. Hence, given the significance ranking of all individual bit positions comprising a fingerprint, subsets of bits are evaluated in the context of Bayesian screening, and minimal fingerprint representations are determined that meet or exceed the search performance of unmodified fingerprints. For fingerprints of different design evaluated on many compound activity classes, we consistently find that subsets of fingerprint bit positions are responsible for search performance. In part, these subsets are very small and contain in some cases only a few fingerprint bit positions. Structural or pharmacophore patterns captured by preferred bit positions can often be directly associated with characteristic features of active compounds. In some cases, reduced fingerprint representations clearly exceed the search performance of the original fingerprints. Thus, fingerprint reduction likely represents a promising approach for practical applications.
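    The bit-significance ranking rests on comparing, for each fingerprint bit, how often it is set in active versus database compounds. A minimal sketch using a symmetrized Kullback-Leibler divergence for one Bernoulli-distributed bit (an illustration of the general idea, not the authors' exact formulation):

```python
import math

def bit_divergence(p_active, p_database, eps=1e-9):
    """Symmetrized KL divergence between the Bernoulli distributions
    of one fingerprint bit in active vs. database compounds."""
    def kl(p, q):
        p = min(max(p, eps), 1 - eps)   # clamp away from 0/1
        q = min(max(q, eps), 1 - eps)
        return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))
    return kl(p_active, p_database) + kl(p_database, p_active)

# Rank bits by divergence: a bit set in 90% of actives but only 10% of
# the database is far more discriminating than one set equally often.
bits = {"bit7": (0.90, 0.10), "bit3": (0.55, 0.50), "bit1": (0.20, 0.20)}
ranked = sorted(bits, key=lambda b: bit_divergence(*bits[b]), reverse=True)
```

Keeping only the top-ranked bits yields the reduced fingerprint representations evaluated in the Bayesian screening step.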


    Sabu Augustine


    Full Text Available In this fast-developing period the use of RFID has become more significant in many application domains due to the drastic cut in the price of RFID tags. This technology is evolving as a means of tracking objects and inventory items. One such diversified application domain is Supply Chain Management, where RFID is being applied as manufacturers and distributors need to analyse product and logistic information in order to get the right quantity of products arriving at the right time at the right locations. Usually the RFID tag information collected from RFID readers is stored in a remote database, and the RFID data are analyzed by querying data from this database based on a path encoding method using the property of prime numbers. In this paper we propose an improved encoding scheme that encodes the flows of objects in RFID tag movement. A trajectory of moving RFID tags consists of a sequence of tags that changes over time. With the integration of wireless communications and positioning technologies, the concept of a Trajectory Database has become increasingly important, and has posed great challenges to the data mining community. The support of efficient trajectory similarity techniques is indisputably very important for the quality of data analysis tasks in Supply Chain Traffic, which will enable similar product movements.

  1. Identifying Potential Protein Targets for Toluene Using a Molecular Similarity Search, in Silico Docking and in Vitro Validation


    Molecular Similarity Search, in Silico Docking and in Vitro Validation ... hazardous in relatively low concentrations (<1 mM for some, <1 μM for others); and (3) appearance in multiple toxin/poison lists provided by government ... removed and discarded while the remaining red blood cell pellet was re-suspended in a 0.9% isotonic saline solution and centrifuged at 3716g for 30

  2. The Research and Test of Fast Radio Burst Real-time Search Algorithm Based on GPU Acceleration

    Wang, J.; Chen, M. Z.; Pei, X.; Wang, Z. Q.


    In order to satisfy the research needs of the Nanshan 25 m radio telescope of Xinjiang Astronomical Observatory (XAO) and study the key technology for the planned QiTai radio Telescope (QTT), the receiver group of XAO studied a GPU (Graphics Processing Unit) based real-time FRB search algorithm, developed from the original CPU (Central Processing Unit) based FRB search algorithm, and built an FRB real-time search system. A comparison of the GPU and CPU systems shows that, while preserving search accuracy, the GPU-accelerated algorithm is 35-45 times faster than the CPU algorithm.

  3. Web Similarity

    Cohen, A.R.; Vitányi, P.M.B.


    Normalized web distance (NWD) is a similarity or normalized semantic distance based on the World Wide Web or any other large electronic database, for instance Wikipedia, and a search engine that returns reliable aggregate page counts. For sets of search terms the NWD gives a similarity on a scale fr
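    For concreteness, the NWD of two search terms x and y can be computed from the aggregate page counts f(x), f(y), f(x,y) and the index size N as NWD(x,y) = (max(log f(x), log f(y)) - log f(x,y)) / (log N - min(log f(x), log f(y))). A direct sketch (the counts below are invented for illustration):

```python
import math

def nwd(fx, fy, fxy, n):
    """Normalized web distance from aggregate page counts.

    fx, fy: hit counts for each term alone; fxy: count for both terms
    together; n: total number of indexed pages (all counts > 0)."""
    lx, ly, lxy = math.log(fx), math.log(fy), math.log(fxy)
    return (max(lx, ly) - lxy) / (math.log(n) - min(lx, ly))

# Terms that always co-occur are at distance 0; rarer co-occurrence
# pushes the distance up.
d_same = nwd(1000, 1000, 1000, 10**9)
d_rare = nwd(1000, 1000, 10, 10**9)
```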

  4. Identification of aggregation breakers for bevacizumab (Avastin®) self-association through similarity searching and interaction studies.

    Westermaier, Y; Veurink, M; Riis-Johannessen, T; Guinchard, S; Gurny, R; Scapozza, L


    Aggregation is a common challenge in the optimization of therapeutic antibody formulations. Since initial self-association of two monomers is typically a reversible process, the aim of this study is to identify different excipients that are able to shift this equilibrium to the monomeric state. The hypothesis is that a specific interaction between excipient and antibody may hinder two monomers from approaching each other, based on previous work in which dexamethasone phosphate showed the ability to partially reverse formed aggregates of the monoclonal IgG1 antibody bevacizumab back into monomers. The current study focuses on the selection of therapeutically inactive compounds with similar properties. Adenosine monophosphate, adenosine triphosphate, sucrose-6-phosphate and guanosine monophosphate were selected in silico through similarity searching and docking. All four compounds were predicted to bind to a protein-protein interaction hotspot on the Fc region of bevacizumab and thereby breaking dimer formation. The predictions were supported in vitro: An interaction between AMP and bevacizumab with a dissociation constant of 9.59±0.15 mM was observed by microscale thermophoresis. The stability of the antibody at elevated temperature (40 °C) in a 51 mM phosphate buffer pH 7 was investigated in presence and absence of the excipients. Quantification of the different aggregation species by asymmetrical flow field-flow fractionation and size exclusion chromatography demonstrates that all four excipients are able to partially overcome the initial self-association of bevacizumab monomers.

  5. Efficient generation, storage, and manipulation of fully flexible pharmacophore multiplets and their use in 3-D similarity searching.

    Abrahamian, Edmond; Fox, Peter C; Naerum, Lars; Christensen, Inge Thøger; Thøgersen, Henning; Clark, Robert D


    Pharmacophore triplets and quartets have been used by many groups in recent years, primarily as a tool for molecular diversity analysis. In most cases, slow processing speeds and the very large size of the bitsets generated have forced researchers to compromise in terms of how such multiplets were stored, manipulated, and compared, e.g., by using simple unions to represent multiplets for sets of molecules. Here we report using bitmaps in place of bitsets to reduce storage demands and to improve processing speed. Here, a bitset is taken to mean a fully enumerated string of zeros and ones, from which a compressed bitmap is obtained by replacing uniform blocks ("runs") of digits in the bitset with a pair of values identifying the content and length of the block (run-length encoding compression). High-resolution multiplets involving four features are enabled by using 64 bit executables to create and manipulate bitmaps, which "connect" to the 32 bit executables used for database access and feature identification via an extensible mark-up language (XML) data stream. The encoding system used supports simple pairs, triplets, and quartets; multiplets in which a privileged substructure is used as an anchor point; and augmented multiplets in which an additional vertex is added to represent a contingent feature such as a hydrogen bond extension point linked to a complementary feature (e.g., a donor or an acceptor atom) in a base pair or triplet. It can readily be extended to larger, more complex multiplets as well. Database searching is one particular potential application for this technology. Consensus bitmaps built up from active ligands identified in preliminary screening can be used to generate hypothesis bitmaps, a process which includes allowance for differential weighting to allow greater emphasis to be placed on bits arising from multiplets expected to be particularly discriminating. Such hypothesis bitmaps are shown to be useful queries for database searching
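    The bitset-to-bitmap compression described above, replacing uniform runs of digits with (content, length) pairs, can be sketched in a few lines (illustrative Python, not the authors' 64-bit implementation):

```python
def rle_encode(bits):
    """Compress a fully enumerated bitset (string of '0'/'1') into a
    run-length-encoded bitmap: a list of (bit, run_length) pairs."""
    runs = []
    for b in bits:
        if runs and runs[-1][0] == b:
            runs[-1] = (b, runs[-1][1] + 1)   # extend the current run
        else:
            runs.append((b, 1))               # start a new run
    return runs

def rle_decode(runs):
    """Recover the original bitset from its run-length encoding."""
    return "".join(b * n for b, n in runs)

# Sparse multiplet bitsets compress dramatically: 22 bits -> 4 runs.
bitmap = rle_encode("0000001111000000000011")
```

Because pharmacophore multiplet bitsets are overwhelmingly zeros, such bitmaps stay small, and set operations (unions, intersections, weighted consensus) can be performed run-by-run without re-expanding the bitset.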

  6. Similar head impact acceleration measured using instrumented ear patches in a junior rugby union team during matches in comparison with other sports.

    King, Doug A; Hume, Patria A; Gissane, Conor; Clark, Trevor N


    OBJECTIVE Direct impact with the head and the inertial loading of the head have been postulated as major mechanisms of head-related injuries, such as concussion. METHODS This descriptive observational study was conducted to quantify the head impact acceleration characteristics in under-9-year-old junior rugby union players in New Zealand. The impact magnitude, frequency, and location were collected with a wireless head impact sensor worn by 14 junior rugby players who participated in 4 matches. RESULTS A total of 721 impacts > 10g were recorded. The median (interquartile range [IQR]) number of impacts per player was 46 (IQR 37-58), resulting in 10 (IQR 4-18) impacts to the head per player per match. The median impact magnitudes recorded were 15g (IQR 12g-21g) for linear acceleration and 2296 rad/s^2 (IQR 1352-4152 rad/s^2) for rotational acceleration. CONCLUSIONS There were 121 impacts (16.8%) above the rotational injury risk limit and 1 impact (0.1%) above the linear injury risk limit. The acceleration magnitude and number of head impacts in junior rugby union players were higher than those previously reported in similar age-group sports participants. The median linear acceleration for the under-9-year-old rugby players was similar to that of 7- to 8-year-old American football players, but lower than that of 9- to 12-year-old youth American football players. The median rotational accelerations measured were higher than the median and 95th percentile values in youth, high school, and collegiate American football players.

  7. Low-complexity feed-forward carrier phase estimation for M-ary QAM based on phase search acceleration by quadratic approximation.

    Xiang, Meng; Fu, Songnian; Deng, Lei; Tang, Ming; Shum, Perry; Liu, Deming


    The blind phase search (BPS) algorithm for M-QAM has excellent tolerance to laser linewidth at the expense of rather high computational complexity (CC). Here, we first derive theoretically the quadratic relationship between the test angle and the corresponding distance metric in the BPS implementation. We then propose a carrier phase estimation (CPE) scheme based on a two-stage BPS with quadratic approximation (QA). Instead of searching the phase blindly with a fixed step size as in the BPS algorithm, QA significantly accelerates the phase search. As a result, the CC is reduced by a factor of 2.96/3.05, 4.55/4.67, and 2.27/2.3 (in multipliers/adders) for 16QAM, 64QAM, and 256QAM, respectively, in comparison with the traditional BPS scheme. Meanwhile, a guideline for determining the summing filter block length is put forward during performance optimization. Under the optimum filter block length, our proposed scheme shows performance similar to the traditional BPS scheme. At a 1 dB required E_s/N_0 penalty at BER = 10^-2, our proposed CPE scheme can tolerate a linewidth times symbol duration product Δf·T_s of 1.7 × 10^-4, 6 × 10^-5, and 1.5 × 10^-5 for 16/64/256-QAM, respectively.
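
    A quadratic relationship between test angle and distance metric means the minimum can be located by parabolic interpolation instead of an exhaustive fixed-step sweep. The paper's two-stage estimator is not reproduced here; the sketch below shows only the generic vertex-of-a-parabola step that such a quadratic approximation enables (names are illustrative):

```python
def parabolic_min(t1, d1, t2, d2, t3, d3):
    """Abscissa of the vertex of the parabola through three
    (test angle, distance metric) samples: the quadratic-approximation
    estimate of the phase that minimizes the distance metric."""
    num = (t2 - t1) ** 2 * (d2 - d3) - (t2 - t3) ** 2 * (d2 - d1)
    den = (t2 - t1) * (d2 - d3) - (t2 - t3) * (d2 - d1)
    return t2 - 0.5 * num / den
```

    Three coarse test angles thus suffice to refine the phase estimate, which is where the multiplier/adder savings over a fine fixed-step search originate.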

  8. Introduction of the conditional correlated Bernoulli model of similarity value distributions and its application to the prospective prediction of fingerprint search performance.

    Vogt, Martin; Bajorath, Jürgen


    A statistical approach named the conditional correlated Bernoulli model is introduced for modeling of similarity scores and predicting the potential of fingerprint search calculations to identify active compounds. Fingerprint features are rationalized as dependent Bernoulli variables and conditional distributions of Tanimoto similarity values of database compounds given a reference molecule are assessed. The conditional correlated Bernoulli model is utilized in the context of virtual screening to estimate the position of a compound obtaining a certain similarity value in a database ranking. Through the generation of receiver operating characteristic curves from cumulative distribution functions of conditional similarity values for known active and random database compounds, one can predict how successful a fingerprint search might be. The comparison of curves for different fingerprints makes it possible to identify fingerprints that are most likely to identify new active molecules in a database search given a set of known reference molecules.
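
    For concreteness, the Tanimoto similarity that the model conditions on is the ratio of shared to total on-bits of two fingerprints; a minimal sketch over sets of on-bit positions:

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto (Jaccard) similarity of two binary fingerprints, each
    given as the set of its on-bit positions."""
    shared = len(fp_a & fp_b)
    return shared / (len(fp_a) + len(fp_b) - shared)
```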

  9. Efficient SAT engines for concise logics: Accelerating proof search for zero-one linear constraint systems

    Fränzle, Martin; Herde, Christian


    We investigate the problem of generalizing acceleration techniques as found in recent satisfiability engines for conjunctive normal forms (CNFs) to linear constraint systems over the Booleans. The rationale behind this research is that rewriting the propositional formulae occurring in e.g. bounde...

  10. The High Time Resolution Universe Pulsar Survey XII : Galactic plane acceleration search and the discovery of 60 pulsars

    Ng, C; Bailes, M; Barr, E D; Bates, S D; Bhat, N D R; Burgay, M; Burke-Spolaor, S; Flynn, C M L; Jameson, A; Johnston, S; Keith, M J; Kramer, M; Levin, L; Petroff, E; Possenti, A; Stappers, B W; van Straten, W; Tiburzi, C; Eatough, R P; Lyne, A G


    We present initial results from the low-latitude Galactic plane region of the High Time Resolution Universe pulsar survey conducted at the Parkes 64-m radio telescope. We discuss the computational challenges arising from the processing of the terabyte-sized survey data. Two new radio interference mitigation techniques are introduced, as well as a partially-coherent segmented acceleration search algorithm which aims to increase our chances of discovering highly-relativistic short-orbit binary systems, covering a parameter space including potential pulsar-black hole binaries. We show that under a constant acceleration approximation, a ratio of data length over orbital period of ~0.1 results in the highest effectiveness for this search algorithm. From the 50 per cent of data processed thus far, we have re-detected 435 previously known pulsars and discovered a further 60 pulsars, two of which are fast-spinning pulsars with periods less than 30ms. PSR J1101-6424 is a millisecond pulsar whose heavy white dwarf (WD)...

  11. Motivation and short-term memory in visual search: Attention's accelerator revisited.

    Schneider, Daniel; Bonmassar, Claudia; Hickey, Clayton


    A cue indicating the possibility of cash reward will cause participants to perform memory-based visual search more efficiently. A recent study has suggested that this performance benefit might reflect the use of multiple memory systems: when needed, participants may maintain the to-be-remembered object in both long-term and short-term visual memory, with this redundancy benefitting target identification during search (Reinhart, McClenahan & Woodman, 2016). Here we test this compelling hypothesis. We had participants complete a memory-based visual search task involving a reward cue that either preceded presentation of the to-be-remembered target (pre-cue) or followed it (retro-cue). Following earlier work, we tracked memory representation using two components of the event-related potential (ERP): the contralateral delay activity (CDA), reflecting short-term visual memory, and the anterior P170, reflecting long-term storage. We additionally tracked attentional preparation and deployment in the contingent negative variation (CNV) and N2pc, respectively. Results show that only the reward pre-cue impacted our ERP indices of memory. However, both types of cue elicited a robust CNV, reflecting an influence on task preparation, both had equivalent impact on deployment of attention to the target, as indexed in the N2pc, and both had equivalent impact on visual search behavior. Reward prospect thus has an influence on memory-guided visual search, but this does not appear to be necessarily mediated by a change in the visual memory representations indexed by CDA. Our results demonstrate that the impact of motivation on search is not a simple product of improved memory for target templates. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Tempest: Accelerated MS/MS Database Search Software for Heterogeneous Computing Platforms.

    Adamo, Mark E; Gerber, Scott A


    MS/MS database search algorithms derive a set of candidate peptide sequences from in silico digest of a protein sequence database, and compute theoretical fragmentation patterns to match these candidates against observed MS/MS spectra. The original Tempest publication described these operations mapped to a CPU-GPU model, in which the CPU (central processing unit) generates peptide candidates that are asynchronously sent to a discrete GPU (graphics processing unit) to be scored against experimental spectra in parallel. The current version of Tempest expands this model, incorporating OpenCL to offer seamless parallelization across multicore CPUs, GPUs, integrated graphics chips, and general-purpose coprocessors. Three protocols describe how to configure and run a Tempest search, including discussion of how to leverage Tempest's unique feature set to produce optimal results. © 2016 by John Wiley & Sons, Inc.
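
    The in silico digest step can be illustrated with the standard trypsin cleavage rule (cut after K or R, but not before P). This is a generic sketch of candidate peptide enumeration, not Tempest's actual generator:

```python
def tryptic_digest(protein, missed_cleavages=0):
    """Enumerate tryptic peptide candidates: cleave after K/R unless the
    next residue is P, optionally allowing missed cleavage sites."""
    sites = sorted({0, len(protein)} | {
        i + 1 for i, aa in enumerate(protein)
        if aa in "KR" and protein[i + 1:i + 2] != "P"})
    peptides = []
    for i in range(len(sites) - 1):
        # j - i - 1 is the number of missed cleavages inside the peptide
        for j in range(i + 1, min(i + 2 + missed_cleavages, len(sites))):
            peptides.append(protein[sites[i]:sites[j]])
    return peptides
```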

  13. Accelerating multicriterial optimization by the intensive exploitation of accumulated search data

    Gergel, Victor; Kozinov, Evgeny


    This work proposes an efficient method for solving computationally difficult multicriterial optimization problems, which are widely used to model complex optimal decision-making problems. Under the suggested approach, it is assumed that the partial criteria can be multi-extremal and computationally intensive, and that solving a multicriterial problem can require the sequential computation of several efficient (Pareto-optimal) alternatives. This repeated search for alternatives leads to a substantial increase in computational cost, which can be overcome by fully reusing all search information obtained during the computations. The article describes the developed approach, whose efficiency is substantiated by the results of computational experiments.

  14. Throughput Analysis for a High-Performance FPGA-Accelerated Real-Time Search Application

    Wim Vanderbauwhede


    We propose an FPGA design for the relevancy computation part of a high-throughput real-time search application. The application matches terms in a stream of documents against a static profile, held in off-chip memory. We present a mathematical analysis of the throughput of the application and apply it to the problem of scaling the Bloom filter used to discard nonmatches.
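
    The Bloom filter used to discard nonmatches admits a compact software model (the FPGA version hashes in hardware; the hash construction below is an illustrative assumption):

```python
import hashlib

class BloomFilter:
    """m-bit Bloom filter with k hash functions: membership tests may give
    false positives (a harmless nonmatch passes on to full scoring) but
    never false negatives (a true match is never discarded)."""

    def __init__(self, m_bits=4096, k_hashes=3):
        self.m, self.k = m_bits, k_hashes
        self.bits = 0  # the m-bit array, kept as one big integer

    def _positions(self, term):
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{term}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, term):
        for p in self._positions(term):
            self.bits |= 1 << p

    def might_contain(self, term):
        return all(self.bits >> p & 1 for p in self._positions(term))
```

    Scaling the filter is a trade-off between m (on-chip bits) and the false-positive rate, which is exactly the throughput question the analysis above addresses.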

  15. Novel DOCK clique driven 3D similarity database search tools for molecule shape matching and beyond: adding flexibility to the search for ligand kin.

    Good, Andrew C


    With readily available CPU power and copious disk storage, it is now possible to undertake rapid comparison of 3D properties derived from explicit ligand overlay experiments. With this in mind, shape software tools originally devised in the 1990s are revisited, modified and applied to the problem of ligand database shape comparison. The utility of Connolly surface data is highlighted using the program MAKESITE, which leverages surface normal data to create a ligand shape cast. This cast is applied directly within DOCK, allowing the program to be used unmodified as a shape searching tool. In addition, DOCK has undergone multiple modifications to create a dedicated ligand shape comparison tool KIN. Scoring has been altered to incorporate the original incarnation of Gaussian function derived shape description based on STO-3G atomic electron density. In addition, a tabu-like search refinement has been added to increase search speed by removing redundant starting orientations produced during clique matching. The ability to use exclusion regions, again based on Gaussian shape overlap, has also been integrated into the scoring function. The use of both DOCK with MAKESITE and KIN in database screening mode is illustrated using a published ligand shape virtual screening template. The advantages of using a clique-driven search paradigm are highlighted, including shape optimization within a pharmacophore constrained framework, and easy incorporation of additional scoring function modifications. The potential for further development of such methods is also discussed.

  16. Visual search for real world targets under conditions of high target-background similarity: Exploring training and transfer in younger and older adults.

    Neider, Mark B; Boot, Walter R; Kramer, Arthur F


    Real world visual search tasks often require observers to locate a target that blends in with its surrounding environment. However, studies of the effect of target-background similarity on search processes have been relatively rare and have ignored potential age-related differences. We trained younger and older adults to search displays comprised of real world objects on either homogenous backgrounds or backgrounds that camouflaged the target. Training was followed by a transfer session in which participants searched for novel camouflaged objects. Although older adults were slower to locate the target compared to younger adults, all participants improved substantially with training. Surprisingly, camouflage-trained younger and older adults showed no performance decrements when transferred to novel camouflage displays, suggesting that observers learned age-invariant, generalizable skills relevant for searching under conditions of high target-background similarity. Camouflage training benefits at transfer for older adults appeared to be related to improvements in attentional guidance and target recognition rather than a more efficient search strategy.

  17. Developing Molecular Interaction Database and Searching for Similar Pathways (MOLECULAR BIOLOGY AND INFORMATION-Biological Information Science)

    Kawashima, Shuichi; Katayama, Toshiaki; Kanehisa, Minoru


    We have developed a database named BRITE, which contains knowledge of interacting molecules and/or genes concerning the cell cycle and early development. Here, we report an overview of the database and a method for automatically searching for functionally common sub-pathways between two biological pathways in BRITE.

  18. Accelerating object detection via a visual-feature-directed search cascade: algorithm and field programmable gate array implementation

    Kyrkou, Christos; Theocharides, Theocharis


    Object detection is a major step in several computer vision applications and a requirement for most smart camera systems. Recent advances in hardware acceleration for real-time object detection feature extensive use of reconfigurable hardware [field programmable gate arrays (FPGAs)], and relevant research has produced quite fascinating results, in both the accuracy of the detection algorithms as well as the performance in terms of frames per second (fps) for use in embedded smart camera systems. Detecting objects in images, however, is a daunting task and often involves hardware-inefficient steps, both in terms of the datapath design and in terms of input/output and memory access patterns. We present how a visual-feature-directed search cascade composed of motion detection, depth computation, and edge detection can have a significant impact in reducing the data that needs to be examined by the classification engine for the presence of an object of interest. Experimental results on a Spartan 6 FPGA platform for face detection indicate data search reduction of up to 95%, which results in the system being able to process up to 50 images of 1024×768 pixels per second with a significantly reduced number of false positives.

  19. Acceleration of Bidirectional Similarity Computation Based on GPU



    To address the low efficiency of serial bidirectional similarity computation on the CPU, which cannot meet practical demands, we implement a GPU-accelerated bidirectional similarity computation under the CUDA programming model, exploiting the data independence within the computation. Compared with the CPU, the GPU implementation achieves a speedup of more than 1200× in an experiment at a resolution of 392×300.
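
    The data independence exploited for GPU parallelism is visible in the structure of the measure itself: every patch's nearest-neighbor search is independent of every other's. A scalar sketch over abstract patch sets, following the usual completeness-plus-coherence formulation of bidirectional (dis)similarity (which may differ in detail from the paper's; the patch distance D is left abstract):

```python
def bidirectional_dissimilarity(S, T, D):
    """Sum of two directed terms: how completely T covers the patches of
    S, and how coherently every patch of T is explained by S. Each inner
    min() is independent of all the others, hence the easy GPU mapping."""
    completeness = sum(min(D(p, q) for q in T) for p in S) / len(S)
    coherence = sum(min(D(p, q) for p in S) for q in T) / len(T)
    return completeness + coherence
```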

  20. Using argumentation to retrieve articles with similar citations: an inquiry into improving related articles search in the MEDLINE digital library.

    Tbahriti, Imad; Chichester, Christine; Lisacek, Frédérique; Ruch, Patrick


    The aim of this study is to investigate the relationships between citations and the scientific argumentation found in abstracts. We design a related article search task and observe how the argumentation can affect the search results. We extracted citation lists from a set of 3200 full-text papers originating from a narrow domain. In parallel, we recovered the corresponding MEDLINE records for analysis of the argumentative moves. Our argumentative model is founded on four classes: PURPOSE, METHODS, RESULTS and CONCLUSION. A Bayesian classifier trained on explicitly structured MEDLINE abstracts generates these argumentative categories. The categories are used to generate four different argumentative indexes. A fifth index contains the complete abstract, together with the title and the list of Medical Subject Headings (MeSH) terms. To appraise the relationship of the moves to the citations, the citation lists were used as the criteria for determining relatedness of articles, establishing a benchmark in which two articles are considered "related" if they share a significant set of co-citations. Our results show that the average precision of queries with the PURPOSE and CONCLUSION features is the highest, while the precision of the RESULTS and METHODS features was relatively low. A linear weighting combination of the moves is proposed, which significantly improves retrieval of related articles.

  1. eF-seek: prediction of the functional sites of proteins by searching for similar electrostatic potential and molecular surface shape

    Kinoshita, Kengo; Murakami, Yoichi; Nakamura, Haruki


    We have developed a method to predict ligand-binding sites in a new protein structure by searching for similar binding sites in the Protein Data Bank (PDB). The similarities are measured according to the shapes of the molecular surfaces and their electrostatic potentials. A new web server, eF-seek, provides an interface to our search method. It simply requires a coordinate file in the PDB format, and generates a prediction result as a virtual complex structure, with the putative ligands in a PDB format file as the output. In addition, the predicted interacting interface is displayed to facilitate the examination of the virtual complex structure on our own applet viewer with the web browser (URL: PMID:17567616

  2. Improving Limit Surface Search Algorithms in RAVEN Using Acceleration Schemes: Level II Milestone

    Alfonsi, Andrea [Idaho National Laboratory (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Laboratory (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Laboratory (INL), Idaho Falls, ID (United States); Cogliati, Joshua Joseph [Idaho National Laboratory (INL), Idaho Falls, ID (United States); Sen, Ramazan Sonat [Idaho National Laboratory (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Laboratory (INL), Idaho Falls, ID (United States)


    , subject of the analysis. These methodologies are named, in the RAVEN environment, adaptive sampling strategies. These methodologies infer system responses from surrogate models constructed from already existing samples (produced using high fidelity simulations) and suggest the most relevant location (coordinate in the input space) of the next sampling point to be explored in the uncertain/parametric domain. When using those methodologies, it is possible to understand features of the system response with a small number of carefully selected samples. This report focuses on the development and improvement of the limit surface search. The limit surface is an important concept in system reliability analysis. Without going into the details, which will be covered later in the report, the limit surface can be briefly described as a hyper-surface in the system uncertainty/parametric space separating the regions that lead to a prescribed system outcome from those that do not. For example, if the uncertainty/parametric space is the one generated by the reactor power level and the duration of the batteries, the system is a nuclear power plant and the system outcome discriminating variable is the clad failure in a station blackout scenario, then the limit surface separates the combinations of reactor power level and battery duration that lead to clad failure from the ones that do not.
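
    As a toy illustration of what searching for the limit surface means, consider one line through the uncertainty space: the boundary point where the discriminating outcome flips can be found by bisection. RAVEN's adaptive sampling is far more sophisticated (surrogate models over the whole multidimensional space); the sketch below conveys only the concept:

```python
def limit_surface_point(outcome, lo, hi, tol=1e-6):
    """Bisect along one axis of the input space for the point where the
    boolean outcome (e.g. clad failure) flips; each call to `outcome`
    stands in for one high-fidelity simulation."""
    assert outcome(lo) != outcome(hi), "bracket must straddle the surface"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if outcome(mid) == outcome(lo):
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

    Each bisection step halves the bracket, so the surface point is located to tolerance tol in O(log((hi - lo) / tol)) simulations along that line.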

  3. CUDASW++ 3.0: accelerating Smith-Waterman protein database search by coupling CPU and GPU SIMD instructions.

    Liu, Yongchao; Wirawan, Adrianto; Schmidt, Bertil


    The maximal sensitivity for local alignments makes the Smith-Waterman algorithm a popular choice for protein sequence database search based on pairwise alignment. However, the algorithm is compute-intensive due to its quadratic time complexity. Corresponding runtimes are further compounded by the rapid growth of sequence databases. We present CUDASW++ 3.0, a fast Smith-Waterman protein database search algorithm, which couples CPU and GPU SIMD instructions and carries out concurrent CPU and GPU computations. For the CPU computation, this algorithm employs SSE-based vector execution units as accelerators. For the GPU computation, we have investigated for the first time a GPU SIMD parallelization, which employs CUDA PTX SIMD video instructions to gain more data parallelism beyond the SIMT execution model. Moreover, sequence alignment workloads are automatically distributed over CPUs and GPUs based on their respective compute capabilities. Evaluation on the Swiss-Prot database shows that CUDASW++ 3.0 gains a performance improvement over CUDASW++ 2.0 of up to 2.9× and 3.2×, with a maximum performance of 119.0 and 185.6 GCUPS, on a single-GPU GeForce GTX 680 and a dual-GPU GeForce GTX 690 graphics card, respectively. In addition, our algorithm has demonstrated significant speedups over other top-performing tools: SWIPE and BLAST+. CUDASW++ 3.0 is written in CUDA C++ and PTX assembly languages, targeting GPUs based on the Kepler architecture. This algorithm obtains significant speedups over its predecessor, CUDASW++ 2.0, by benefiting from the use of CPU and GPU SIMD instructions as well as the concurrent execution on CPUs and GPUs. The source code and the simulated data are available at
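
    For reference, the quadratic-time recurrence that all of these tools accelerate is short to state. A plain scalar sketch (score-only, linear gap penalty), with none of the SIMD/SIMT parallelism described above:

```python
def smith_waterman_score(query, subject, match=2, mismatch=-1, gap=-2):
    """Best local alignment score via the Smith-Waterman recurrence,
    keeping only the previous DP row (O(len(subject)) memory)."""
    prev = [0] * (len(subject) + 1)
    best = 0
    for q in query:
        cur = [0]
        for j, s in enumerate(subject, start=1):
            score = max(
                0,                                              # restart locally
                prev[j - 1] + (match if q == s else mismatch),  # diagonal
                prev[j] + gap,                                  # gap in query
                cur[j - 1] + gap,                               # gap in subject
            )
            cur.append(score)
            best = max(best, score)
        prev = cur
    return best
```

    The inner max over four terms per cell, repeated len(query) × len(subject) times, is exactly the workload that SSE vector units and CUDA PTX video instructions parallelize.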

  4. PhenoMeter: a metabolome database search tool using statistical similarity matching of metabolic phenotypes for high-confidence detection of functional links

    Adam James Carroll


    This article describes PhenoMeter, a new type of metabolomics database search that accepts metabolite response patterns as queries and searches the MetaPhen database of reference patterns for responses that are statistically significantly similar or inverse, for the purposes of detecting functional links. To identify a similarity measure that would detect functional links as reliably as possible, we compared the performance of four statistics in correctly top-matching metabolic phenotypes of Arabidopsis thaliana metabolism mutants affected in different steps of the photorespiration metabolic pathway to reference phenotypes of mutants affected in the same enzymes by independent mutations. The best performing statistic, the PhenoMeter Score (PM Score), was a function of both Pearson correlation and Fisher's Exact Test of directional overlap. This statistic outperformed Pearson correlation, biweight midcorrelation and Fisher's Exact Test used alone. To demonstrate general applicability, we show that PhenoMeter reliably retrieved the most closely functionally linked response in the database when queried with responses to a wide variety of environmental and genetic perturbations. Attempts to match metabolic phenotypes between independent studies were met with varying success and possible reasons for this are discussed. Overall, our results suggest that the integration of pattern-based search tools into metabolomics databases will aid functional annotation of newly recorded metabolic phenotypes, analogously to the way sequence similarity search algorithms have aided the functional annotation of genes and proteins. PhenoMeter is freely available at MetabolomeExpress (
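
    The abstract says the PM Score combines Pearson correlation with Fisher's Exact Test of directional overlap, without giving the exact functional form. The two ingredients, plus one plausible (purely illustrative, not the published) way to combine them, can be sketched with the standard library only:

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length metabolite response vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(var_x * var_y)

def fisher_overlap_p(k, K, n, N):
    """One-sided Fisher's exact p-value (hypergeometric tail): probability
    of >= k shared 'up' metabolites when n of N metabolites are up in the
    query and K of N are up in the reference."""
    return sum(math.comb(K, i) * math.comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / math.comb(N, n)

def pm_score(r, p):
    """Illustrative combination only; the published PM Score's exact
    functional form is not reproduced here."""
    return r * -math.log10(p)
```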

  5. Similarity-potency trees: a method to search for SAR information in compound data sets and derive SAR rules.

    Wawer, Mathias; Bajorath, Jürgen


    An intuitive and generally applicable analysis method, termed similarity-potency tree (SPT), is introduced to mine structure-activity relationship (SAR) information in compound data sets of any source. Only compound potency values and nearest-neighbor similarity relationships are considered. Rather than analyzing a data set as a whole, in part overlapping compound neighborhoods are systematically generated and represented as SPTs. This local analysis scheme simplifies the evaluation of SAR information and SPTs of high SAR information content are easily identified. By inspecting only a limited number of compound neighborhoods, it is also straightforward to determine whether data sets contain only little or no interpretable SAR information. Interactive analysis of SPTs is facilitated by reading the trees in two directions, which makes it possible to extract SAR rules, if available, in a consistent manner. The simplicity and interpretability of the data structure and the ease of calculation are characteristic features of this approach. We apply the methodology to high-throughput screening and lead optimization data sets, compare the approach to standard clustering techniques, illustrate how SAR rules are derived, and provide some practical guidance how to best utilize the methodology. The SPT program is made freely available to the scientific community.
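
    A toy version of the neighborhood construction: starting from a root compound, attach its most similar neighbors as children and label every node with its potency. The layout and parameters below are illustrative assumptions, not the published SPT algorithm (and, as in the paper, neighborhoods may overlap):

```python
def spt(root, pool, sim, potency, width=2):
    """One similarity-potency-tree neighborhood: each node adopts its
    `width` nearest remaining neighbors (by the pairwise similarity
    function `sim`) as children, labelled with its potency value."""
    nearest = sorted(pool, key=lambda c: sim(root, c), reverse=True)[:width]
    rest = [c for c in pool if c not in nearest]
    return {"id": root, "potency": potency[root],
            "children": [spt(c, rest, sim, potency, width) for c in nearest]}
```

    Reading such a tree from root to leaves (or leaves to root) then exposes how potency changes along chains of nearest-neighbor similarity relationships, which is the SAR information the method mines.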

  6. A relook on using the Earth Similarity Index for searching habitable zones around solar and extrasolar planets

    Biswas, S.; Shome, A.; Raha, B.; Bhattacharya, A. B.


    To study the distribution of Earth-like planets and to locate the habitable zones around extrasolar planets and their known satellites, we emphasize in this paper the use of the Earth similarity index (ESI), a multi-parameter quick assessment of Earth-likeness with a value between zero and one. Weight exponents for four planetary properties have been taken into account to determine the ESI. A plot of the surface ESI against the interior ESI exhibits some interesting results, which provide further information when confirmed planets are examined. From the analysis of the available catalog and existing theory, none of the solar planets achieves an ESI value greater than 0.8. The planet Mercury has a value of 0.6, Mars exhibits a value between 0.6 and 0.8, and Venus shows a value near 0.5. Finally, the locations of the habitable zones around different types of stars are critically examined and discussed.
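
    The ESI is usually written as a weighted product over planetary properties. A sketch using the weight exponents commonly quoted for radius, bulk density, escape velocity and surface temperature (treat these exact weights as assumptions of this illustration, not values taken from the paper):

```python
# Commonly quoted weight exponents (assumed here for illustration):
WEIGHTS = {"radius": 0.57, "density": 1.07, "v_escape": 0.70, "temperature": 5.58}
# Earth reference values: first three in Earth units, temperature in kelvin.
EARTH = {"radius": 1.0, "density": 1.0, "v_escape": 1.0, "temperature": 288.0}

def esi(planet):
    """ESI = product over properties of (1 - |x - x0| / (x + x0))^(w / n),
    a single Earth-likeness value between 0 and 1 (Earth itself scores 1)."""
    n = len(WEIGHTS)
    value = 1.0
    for prop, w in WEIGHTS.items():
        x, x0 = planet[prop], EARTH[prop]
        value *= (1.0 - abs(x - x0) / (x + x0)) ** (w / n)
    return value
```

    With rough Martian values (radius 0.53, density 0.71, escape velocity 0.45 in Earth units, surface temperature about 227 K) this lands near 0.7, consistent with the 0.6-0.8 range quoted above for Mars.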

  7. Gesture Recognition from Data Streams of Human Motion Sensor Using Accelerated PSO Swarm Search Feature Selection Algorithm

    Simon Fong


    Human motion sensing technology has gained tremendous popularity nowadays, with practical applications such as video surveillance for security, hand signing, smart homes, and gaming. These applications capture human motions in real time from video sensors; the data patterns are nonstationary and ever-changing. While the hardware technology of such motion sensing devices and their data collection process have become relatively mature, the computational challenge lies in the real-time analysis of these live feeds. In this paper we argue that traditional data mining methods fall short of accurately analyzing human activity patterns from the sensor data stream. The shortcoming is due to an algorithmic design that is not adaptive to the dynamic changes in gesture motions. The successor of these algorithms, known as data stream mining, is evaluated against traditional data mining through a case of gesture recognition over motion data using Microsoft Kinect sensors. Three different subjects were asked to read three comic strips and to tell the stories in front of the sensor. The data stream contains coordinates of articulation points and various positions of the parts of the human body corresponding to the actions that the user performs. In particular, a novel feature selection technique using swarm search and accelerated PSO is proposed to enable fast preprocessing for inducing an improved classification model in real time. Superior results are shown in the experiment that runs on this empirical data stream. The contribution of this paper is a comparative study between traditional and data stream mining algorithms and the incorporation of the novel improved feature selection technique in a scenario where different gesture patterns are to be recognized from streaming sensor data.
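
    Accelerated PSO variants drop per-particle velocities and memories, moving each particle toward the global best with an added random exploration term. A binary feature-selection sketch along those lines (the update rule here is an ad-hoc binary adaptation for illustration, not the paper's exact algorithm):

```python
import random

def apso_feature_select(n_features, fitness, swarm=8, iters=30,
                        beta=0.5, alpha=0.2, seed=1):
    """Each particle is a 0/1 mask (1 = keep the feature). Per iteration,
    each bit is randomly reset with probability alpha (exploration) or
    copied from the global best with probability beta (exploitation)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(swarm)]
    best = max(pop, key=fitness)[:]
    for _ in range(iters):
        for particle in pop:
            for i in range(n_features):
                if rng.random() < alpha:
                    particle[i] = rng.randint(0, 1)
                elif rng.random() < beta:
                    particle[i] = best[i]
            if fitness(particle) > fitness(best):
                best = particle[:]
    return best
```

    In the streaming setting, fitness would be the accuracy of a classifier trained on the selected features; dropping velocities keeps each update cheap enough to run as preprocessing on live feeds.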

  8. Current trends in non-accelerator particle physics: 1, Neutrino mass and oscillation. 2, High energy neutrino astrophysics. 3, Detection of dark matter. 4, Search for strange quark matter. 5, Magnetic monopole searches

    He, Yudong [California Univ., Berkeley, CA (United States)]|[Lawrence Berkeley Lab., CA (United States)


    This report is a compilation of papers reflecting current trends in non-accelerator particle physics, corresponding to talks that its author was invited to present at the Workshop on Tibet Cosmic Ray Experiment and Related Physics Topics held in Beijing, China, April 4-13, 1995. The papers are entitled 'Neutrino Mass and Oscillation', 'High Energy Neutrino Astrophysics', 'Detection of Dark Matter', 'Search for Strange Quark Matter', and 'Magnetic Monopole Searches'. The report is introduced by a survey of the field and a brief description of each of the author's papers.

  9. Ultra-short laser-accelerated proton pulses have similar DNA-damaging effectiveness but produce less immediate nitroxidative stress than conventional proton beams

    Raschke, S.; Spickermann, S.; Toncian, T.; Swantusch, M.; Boeker, J.; Giesen, U.; Iliakis, G.; Willi, O.; Boege, F.


    Ultra-short proton pulses originating from laser-plasma accelerators can provide instantaneous dose rates at least 10^7-fold in excess of conventional, continuous proton beams. The impact of such extremely high proton dose rates on A549 human lung cancer cells was compared with conventionally accelerated protons and 90 keV X-rays. Between 0.2 and 2 Gy, the yield of DNA double-strand breaks (foci of phosphorylated histone H2AX) was not significantly different between the two proton sources, or between proton irradiation and X-rays. Protein nitroxidation after 1 h, judged by 3-nitrotyrosine generation, was 2.5- and 5-fold higher in response to conventionally accelerated protons compared to laser-driven protons and X-rays, respectively. This difference was significant. Laser-accelerated proton pulses thus have a similar DNA-damaging potential to conventional proton beams, while inducing less immediate nitroxidative stress, which probably entails a distinct therapeutic potential.

  10. A spin transfer torque magnetoresistance random access memory-based high-density and ultralow-power associative memory for fully data-adaptive nearest neighbor search with current-mode similarity evaluation and time-domain minimum searching

    Ma, Yitao; Miura, Sadahiko; Honjo, Hiroaki; Ikeda, Shoji; Hanyu, Takahiro; Ohno, Hideo; Endoh, Tetsuo


    A high-density nonvolatile associative memory (NV-AM) based on spin transfer torque magnetoresistive random access memory (STT-MRAM), which achieves highly concurrent and ultralow-power nearest neighbor search with full adaptivity of the template data format, has been proposed and fabricated using the 90 nm CMOS/70 nm perpendicular-magnetic-tunnel-junction hybrid process. A truly compact current-mode circuitry is developed to realize flexibly controllable and highly parallel similarity evaluation, which makes the NV-AM adaptable to any dimensionality and component bit-width of template data. A compact dual-stage time-domain minimum searching circuit is also developed, which can freely extend the system to handle more template data by connecting multiple NV-AM cores without additional circuits for integrated processing. Both the embedded STT-MRAM module and the computing circuit modules in this NV-AM chip are synchronously power-gated to completely eliminate standby power and maximally reduce operation power by activating only the currently accessed circuit blocks. The operations of a prototype chip at 40 MHz are demonstrated by measurement. The average operation power is only 130 µW, and the circuit density is less than 11 µm²/bit. Compared with the latest conventional works in both volatile and nonvolatile approaches, a circuit area reduction of more than 31.3% and a power reduction of more than 99.2% are achieved, respectively. Further power performance analyses are discussed, which verify the particular advantages of the proposed NV-AM in low-power and large-memory-based VLSIs.

  11. CUDAMPF: a multi-tiered parallel framework for accelerating protein sequence search in HMMER on CUDA-enabled GPU.

    Jiang, Hanyu; Ganesan, Narayan


    The HMMER software suite is widely used for high-sensitivity analysis of homologous protein and nucleotide sequences. The latest version of hmmsearch in HMMER 3.x utilizes a heuristic pipeline consisting of the MSV/SSV (Multiple/Single ungapped Segment Viterbi) stage, the P7Viterbi stage and the Forward scoring stage to accelerate homology detection. Since the latest version is highly optimized for performance on modern multi-core CPUs with SSE capabilities, only a few acceleration attempts report speedup. However, the most compute-intensive tasks within the pipeline (viz., the MSV/SSV and P7Viterbi stages) still stand to benefit from the computational capabilities of massively parallel processors. The Multi-Tiered Parallel Framework (CUDAMPF) presented here, implemented on CUDA-enabled GPUs, offers finer-grained parallelism for the MSV/SSV and Viterbi algorithms. We couple the SIMT (Single Instruction Multiple Threads) mechanism with SIMD (Single Instruction Multiple Data) video instructions and warp synchronism to achieve high-throughput processing and eliminate thread idling. We also propose a hardware-aware optimal allocation scheme for scarce resources such as on-chip memory and caches in order to boost the performance and scalability of CUDAMPF. In addition, runtime compilation via NVRTC, available with CUDA 7.0, is incorporated into the presented framework; it not only helps unroll the innermost loop to yield up to a 2- to 3-fold speedup over static compilation but also enables dynamic loading and switching of kernels depending on the query model size, in order to achieve optimal performance. CUDAMPF is designed as a hardware-aware parallel framework for accelerating computational hotspots within the hmmsearch pipeline as well as other sequence alignment applications. It achieves significant speedup by exploiting hierarchical parallelism on a single GPU and takes full advantage of limited resources based on their performance features. In addition to exceeding performance of other

  12. Search for major genes with progeny test data to accelerate the development of genetically superior loblolly pine



    This research project develops a novel approach that fully utilizes the current breeding materials and genetic test information available from the NCSU-Industry Cooperative Tree Improvement Program to identify major genes that are segregating for growth and disease resistance in loblolly pine. If major genes can be identified in the existing breeding population, they can be utilized directly in the conventional loblolly pine breeding program. With the putative genotypes of parents identified, tree breeders can make effective decisions on management of breeding populations and operational deployment of genetically superior trees. Forest productivity will be significantly enhanced if genetically superior genotypes with major genes for economically important traits can be deployed in an operational plantation program. The overall objective of the project is to develop genetic models and analytical methods for major gene detection with progeny test data and to accelerate the development of genetically superior loblolly pine. Specifically, there are three main tasks: (1) develop genetic models for major gene detection, implement statistical methods, and develop computer software for screening progeny test data; (2) confirm major gene segregation with molecular markers; and (3) develop strategies for using major genes in tree breeding.

  13. Accelerators and Dinosaurs

    Turner, Michael Stanley


    Using naturally occurring particles for research might have made accelerators extinct. But in fact, results from astrophysics have made accelerator physics even more important. Not only are accelerators used in hospitals, but they are also being used to understand nature's inner workings by searching for Higgs bosons, CP violation, neutrino mass and dark matter (2 pages)

  14. Combination of 2D/3D Ligand-Based Similarity Search in Rapid Virtual Screening from Multimillion Compound Repositories. Selection and Biological Evaluation of Potential PDE4 and PDE5 Inhibitors

    Krisztina Dobi


    Full Text Available Rapid in silico selection of target-focused libraries from commercial repositories is an attractive and cost-effective approach. If structures of active compounds are available, a rapid 2D similarity search can be performed on multimillion-compound databases, but the generated library requires further focusing by various 2D/3D chemoinformatics tools. We report here a combination of the 2D approach with a ligand-based 3D method (Screen3D), which applies flexible matching to align reference and target compounds in a dynamic manner and thus to assess their structural and conformational similarity. In the first case study we compared the 2D and 3D similarity scores on an existing dataset derived from the biological evaluation of a PDE5-focused library. Based on the obtained similarity metrics, a fusion score was proposed. The fusion score was applied to refine the 2D similarity search in a second case study, where we aimed at selecting and evaluating a PDE4B-focused library. The application of this fused 2D/3D similarity measure led to an increase of the hit rate from 8.5% (1st round, 47% inhibition at 10 µM) to 28.5% (2nd round, 50% inhibition at 10 µM), and the best two hits had 53 nM inhibitory activities.

  15. Accelerated Cure Project for Multiple Sclerosis

    Accelerating research toward a cure for multiple sclerosis. The Accelerated Cure Project's mission is to accelerate efforts toward a cure for multiple sclerosis by rapidly advancing research that determines its causes ...

  16. Compression-based Similarity

    Vitanyi, Paul M B


    First we consider pair-wise distances for literal objects consisting of finite binary files. These files are taken to contain all of their meaning, like genomes or books. The distances are based on compression of the objects concerned, normalized, and can be viewed as similarity distances. Second, we consider pair-wise distances between names of objects, like "red" or "christianity." In this case the distances are based on searches of the Internet. Such a search can be performed by any search engine that returns aggregate page counts. We can extract a code length from the numbers returned, use the same formula as before, and derive a similarity or relative semantics between names for objects. The theory is based on Kolmogorov complexity. We test both similarities extensively experimentally.
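
    The compression-based distance sketched in this abstract can be illustrated with the normalized compression distance (NCD) computed by a general-purpose compressor; zlib below is an illustrative stand-in for the compressors used in the authors' experiments:

```python
import zlib

def compressed_size(data: bytes) -> int:
    """Length in bytes of the zlib-compressed representation."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
    where C is the compressed length. Values near 0 indicate similar
    objects; values near 1 (or slightly above) indicate unrelated ones."""
    cx, cy = compressed_size(x), compressed_size(y)
    cxy = compressed_size(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

text1 = b"the quick brown fox jumps over the lazy dog " * 20
text2 = b"the quick brown fox jumps over the lazy cat " * 20
noise = bytes(range(256)) * 4  # unrelated, poorly compressible content

print(round(ncd(text1, text2), 2))  # small: near-duplicate texts
print(round(ncd(text1, noise), 2))  # large: unrelated content
```

    The same formula applies to the web variant: replace compressed lengths with code lengths derived from search-engine page counts.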

  17. Search of phenotype related candidate genes using gene ontology-based semantic similarity and protein interaction information: application to Brugada syndrome.

    Massanet, Raimon; Gallardo-Chacon, Joan-Josep; Caminal, Pere; Perera, Alexandre


    This work presents a methodology for finding phenotype candidate genes starting from a set of known related genes. This is accomplished by automatically mining and organizing the available scientific literature using Gene Ontology-based semantic similarity. As a case study, Brugada syndrome related genes were used as input in order to obtain a list of other possible candidate genes related to this disease. The Brugada anomaly produces a typical alteration in the electrocardiogram, and carriers of the disease show an increased probability of sudden death. Results show a set of semantically coherent proteins that are related to the physiological processes of synaptic transmission and muscle contraction.

  18. Personalized Search



    As the volume of electronically available information grows, relevant items become harder to find. This work presents an approach to personalizing search results in scientific publication databases, focusing on re-ranking results returned by existing search engines such as Solr or Elasticsearch. It also includes the development of Obelix, a new recommendation system used to re-rank search results. The project was proposed and performed at CERN, using the scientific publications available on the CERN Document Server (CDS). Re-ranking was evaluated through offline and online experiments with users and documents in CDS. The experiments conclude that personalized search results outperform both latest-first and word-similarity rankings in terms of click position in the search results for global search in CDS.

  19. Efficient Video Similarity Measurement and Search


    Recovered citation fragments from the report's bibliography: "sequence matching techniques for video copy detection," in Proceedings of SPIE – Storage and Retrieval for Media Databases 2002, San Jose, CA, January 2002; Proceedings of the Storage and Retrieval for Media Databases 2001, San Jose, USA, January 2001, vol. 4315, pp. 188–195; D. Adjeroh, I. King, and M.C. Lee, "A..."; N. Vasconcelos, "On the complexity of probabilistic image retrieval," in Proceedings of the Eighth IEEE International Conference on Computer Vision, Vancouver, B.C.

  20. Similarity Scaling

    Schnack, Dalton D.

    In Lecture 10, we introduced a non-dimensional parameter called the Lundquist number, denoted by S. This is just one of many non-dimensional parameters that can appear in the formulations of both hydrodynamics and MHD. These generally express the ratio of the time scale associated with some dissipative process to the time scale associated with either wave propagation or transport by flow. These are important because they define regions in parameter space that separate flows with different physical characteristics. All flows that have the same non-dimensional parameters behave in the same way. This property is called similarity scaling.
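
    For reference, the Lundquist number mentioned above is conventionally defined as the ratio of the resistive diffusion time to the Alfvén transit time; these are the standard MHD definitions, stated here for completeness rather than taken from the lecture itself:

```latex
\tau_R = \frac{\mu_0 L^2}{\eta}, \qquad
\tau_A = \frac{L}{v_A}, \qquad
S \equiv \frac{\tau_R}{\tau_A} = \frac{\mu_0 L v_A}{\eta}
```

    Here L is a characteristic length, v_A the Alfvén speed, and η the resistivity; large S means resistive dissipation is slow compared with wave propagation.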

  1. Proteomic analysis of cellular soluble proteins from human bronchial smooth muscle cells by combining nondenaturing micro 2DE and quantitative LC-MS/MS. 2. Similarity search between protein maps for the analysis of protein complexes.

    Jin, Ya; Yuan, Qi; Zhang, Jun; Manabe, Takashi; Tan, Wen


    Human bronchial smooth muscle cell soluble proteins were analyzed by a combined method of nondenaturing micro 2DE, grid gel-cutting, and quantitative LC-MS/MS and a native protein map was prepared for each of the identified 4323 proteins [1]. A method to evaluate the degree of similarity between the protein maps was developed since we expected the proteins comprising a protein complex would be separated together under nondenaturing conditions. The following procedure was employed using Excel macros; (i) maps that have three or more squares with protein quantity data were selected (2328 maps), (ii) within each map, the quantity values of the squares were normalized setting the highest value to be 1.0, (iii) in comparing a map with another map, the smaller normalized quantity in two corresponding squares was taken and summed throughout the map to give an "overlap score," (iv) each map was compared against all the 2328 maps and the largest overlap score, obtained when a map was compared with itself, was set to be 1.0 thus providing 2328 "overlap factors," (v) step (iv) was repeated for all maps providing 2328 × 2328 matrix of overlap factors. From the matrix, protein pairs that showed overlap factors above 0.65 from both protein sides were selected (431 protein pairs). Each protein pair was searched in a database (UniProtKB) on complex formation and 301 protein pairs, which comprise 35 protein complexes, were found to be documented. These results demonstrated that native protein maps and their similarity search would enable simultaneous analysis of multiple protein complexes in cells.
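
    The five-step procedure (i)-(v) can be condensed into a small numerical sketch. The synthetic random data below stands in for the 2328 real maps; the normalization, overlap score as a sum of element-wise minima, self-score scaling, and the 0.65 threshold follow the description above:

```python
import numpy as np

# Hypothetical data: each "map" is a flattened vector of protein quantities
# over the grid squares of the nondenaturing 2DE gel. Sizes are illustrative.
rng = np.random.default_rng(0)
n_maps, n_squares = 6, 40
maps = rng.random((n_maps, n_squares)) * (rng.random((n_maps, n_squares)) > 0.5)

# (ii) normalize each map so its largest quantity is 1.0
norm = maps / maps.max(axis=1, keepdims=True)

def overlap_factors(norm):
    """(iii)-(v): overlap score = sum of element-wise minima; the self-score
    (comparing a map with itself) sets the scale, giving overlap factors."""
    n = len(norm)
    factors = np.empty((n, n))
    for i in range(n):
        self_score = norm[i].sum()  # min(norm[i], norm[i]).sum()
        for j in range(n):
            factors[i, j] = np.minimum(norm[i], norm[j]).sum() / self_score
    return factors

F = overlap_factors(norm)

# Candidate complex partners: pairs above the threshold from BOTH sides
pairs = [(i, j) for i in range(n_maps) for j in range(i + 1, n_maps)
         if F[i, j] > 0.65 and F[j, i] > 0.65]
```

    Requiring the threshold from both sides, as in step (v) of the paper, avoids flagging a small map that happens to sit entirely inside a much larger one.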

  2. Proteomic analysis of cellular soluble proteins from human bronchial smooth muscle cells by combining nondenaturing micro 2DE and quantitative LC‐MS/MS. 2. Similarity search between protein maps for the analysis of protein complexes

    Jin, Ya; Yuan, Qi; Zhang, Jun; Manabe, Takashi


    Human bronchial smooth muscle cell soluble proteins were analyzed by a combined method of nondenaturing micro 2DE, grid gel‐cutting, and quantitative LC‐MS/MS and a native protein map was prepared for each of the identified 4323 proteins [1]. A method to evaluate the degree of similarity between the protein maps was developed since we expected the proteins comprising a protein complex would be separated together under nondenaturing conditions. The following procedure was employed using Excel macros; (i) maps that have three or more squares with protein quantity data were selected (2328 maps), (ii) within each map, the quantity values of the squares were normalized setting the highest value to be 1.0, (iii) in comparing a map with another map, the smaller normalized quantity in two corresponding squares was taken and summed throughout the map to give an “overlap score,” (iv) each map was compared against all the 2328 maps and the largest overlap score, obtained when a map was compared with itself, was set to be 1.0 thus providing 2328 “overlap factors,” (v) step (iv) was repeated for all maps providing 2328 × 2328 matrix of overlap factors. From the matrix, protein pairs that showed overlap factors above 0.65 from both protein sides were selected (431 protein pairs). Each protein pair was searched in a database (UniProtKB) on complex formation and 301 protein pairs, which comprise 35 protein complexes, were found to be documented. These results demonstrated that native protein maps and their similarity search would enable simultaneous analysis of multiple protein complexes in cells. PMID:26031785

  3. Accelerating Smith-Waterman Alignment for Protein Database Search Using Frequency Distance Filtration Scheme Based on CPU-GPU Collaborative System.

    Liu, Yu; Hong, Yang; Lin, Chun-Yuan; Hung, Che-Lun


    The Smith-Waterman (SW) algorithm has been widely utilized for searching biological sequence databases in bioinformatics. Recently, several works have adopted graphics cards with Graphics Processing Units (GPUs) and the associated CUDA model to enhance the performance of SW computations. However, these works mainly focused on protein database search using the intertask parallelization technique, using the GPU capability only to perform the SW computations one by one. Hence, in this paper, we propose an efficient SW alignment method, called CUDA-SWfr, for protein database search using the intratask parallelization technique based on a CPU-GPU collaborative system. Before doing the SW computations on the GPU, a procedure is applied on the CPU using the frequency distance filtration scheme (FDFS) to eliminate unnecessary alignments. The experimental results indicate that CUDA-SWfr runs 9.6 times and 96 times faster than the CPU-based SW method without and with FDFS, respectively.
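
    The abstract does not spell out the FDFS formula, so the following is a hedged sketch of the general idea behind frequency-based filtration: a residue-count distance that never exceeds the edit distance, and can therefore safely prune database sequences before the quadratic SW step. The function names and the half-L1 formula are illustrative, not the paper's exact scheme:

```python
from collections import Counter

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def freq_vector(seq):
    """Residue-count vector over the 20 standard amino acids."""
    counts = Counter(seq)
    return [counts.get(a, 0) for a in AMINO_ACIDS]

def frequency_distance(q, s):
    """Half the L1 distance between residue-count vectors.
    One substitution changes two counts by 1 (L1 += 2) and one indel changes
    one count (L1 += 1), so this value is a lower bound on the edit distance
    and pruning with it cannot discard a sequence within the threshold."""
    fq, fs = freq_vector(q), freq_vector(s)
    return sum(abs(a - b) for a, b in zip(fq, fs)) // 2

def filter_candidates(query, database, max_dist):
    """Keep only sequences whose cheap lower bound passes the cutoff;
    only these survivors would proceed to the expensive SW alignment."""
    return [s for s in database if frequency_distance(query, s) <= max_dist]
```

    The filter costs O(length) per sequence versus O(length²) for SW, which is where the reported speedup with FDFS comes from.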

  4. $A$ searches

    Beacham, James

    The Standard Model of particle physics encompasses three of the four known fundamental forces of nature, and has remarkably withstood all of the experimental tests of its predictions, a fact solidified by the discovery, in 2012, of the Higgs boson. However, it cannot be the complete picture. Many measurements have been made that hint at physics beyond the Standard Model, and the main task of the high-energy experimental physics community is to conduct searches for new physics in as many different places and regimes as possible. I present three searches for new phenomena in three different high-energy collider experiments, namely, a search for events with at least three photons in the final state, which is sensitive to an exotic decay of a Higgs boson into four photons via intermediate pseudoscalar particles, a, with ATLAS, at the Large Hadron Collider; a search for a dark photon, also known as an A′, with APEX, at Thomas Jefferson National Accelerator Facility; and a search for a Higgs decaying into four...

  5. Particle accelerator; the Universe machine

    Yurkewicz, Katie


    "In summer 2008, scientists will switch on one of the largest machines in the world to search for the smallest of particles. CERN's Large Hadron Collider particle accelerator has the potential to change our understanding of the Universe."

  6. Wavelet transform in similarity paradigm

    Z.R. Struzik; A.P.J.M. Siebes (Arno)


    [INS-R9802] Searching for similarity in time series finds ever broader applications in data mining. However, due to the very broad spectrum of data involved, there is no possibility of defining one single notion of similarity suitable to serve all applications. We present a powerful

  7. Studies of accelerated compact toruses

    Hartman, C.W.; Eddleman, J.; Hammer, J.H.


    In an earlier publication we considered acceleration of plasma rings (Compact Torus). Several possible accelerator configurations were suggested and the possibility of focusing the accelerated rings was discussed. In this paper we consider one scheme, acceleration of a ring between coaxial electrodes by a B_θ field as in a coaxial rail-gun. If the electrodes are conical, a ring accelerated towards the apex of the cone undergoes self-similar compression (focusing) during acceleration. Because the allowable acceleration force, F_a = κU_m/R where (κ < 1), increases as R⁻², the accelerating distance for conical electrodes is considerably shortened over that required for coaxial electrodes. In either case, however, since the accelerating flux can expand as the ring moves, most of the accelerating field energy can be converted into kinetic energy of the ring, leading to high efficiency.
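
    The quoted scaling of the allowable force with ring radius can be made plausible under an assumption not stated explicitly in this abstract: that the ring's magnetic flux Φ is conserved during self-similar compression, so its magnetic energy scales like that of a current loop with inductance of order μ₀R:

```latex
U_m \sim \frac{\Phi^2}{2\mu_0 R}
\quad\Longrightarrow\quad
F_a = \frac{\kappa U_m}{R} \sim \frac{\kappa\,\Phi^2}{2\mu_0 R^2} \propto R^{-2}
```

    This is why conical electrodes, which shrink R as the ring advances, permit a larger force and hence a shorter accelerating distance than fixed-radius coaxial electrodes.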

  8. Ground-Based Gamma-Ray Astronomy at Energies Above 10 TeV: Searching for Galactic PeV Cosmic-Ray Accelerators

    Rowell, G; Plyasheshnikov, A


    The origin of Galactic CRs up to the knee energy remains unanswered and provides strong motivation for the study of gamma-ray sources at energies above 10 TeV. We discuss recent results from ground-based gamma-ray Cherenkov imaging systems at these energies as well as future observational efforts in this direction. The exciting results of H.E.S.S. give clues as to the nature of Galactic CR accelerators, and suggest that there is a population of Galactic gamma-ray sources with emission extending beyond 10 TeV. A dedicated system of Cherenkov imaging telescopes optimised for higher energies appears to be a promising way to study the multi-TeV gamma-ray sky.

  9. Applying ligands profiling using multiple extended electron distribution based field templates and feature trees similarity searching in the discovery of new generation of urea-based antineoplastic kinase inhibitors.

    Eman M Dokla

    Full Text Available This study provides a comprehensive computational procedure for the discovery of novel urea-based antineoplastic kinase inhibitors while focusing on diversification of both chemotype and selectivity pattern. It presents a systematic structural analysis of the different binding motifs of urea-based kinase inhibitors and the corresponding configurations of the kinase enzymes. The computational model depends on simultaneous application of two protocols. The first protocol applies multiple consecutive validated virtual screening filters including SMARTS, a support vector-machine model (ROC = 0.98), a Bayesian model (ROC = 0.86) and structure-based pharmacophore filters based on urea-based kinase inhibitor complexes retrieved from the literature. This is followed by hits profiling against different extended electron distribution (XED) based field templates representing different kinase targets. The second protocol enables cancericidal activity verification by using the algorithm of feature trees (Ftrees) similarity searching against the NCI database. Being a proof-of-concept study, this combined procedure was experimentally validated by its utilization in developing a novel series of urea-based derivatives of strong anticancer activity. This new series is based on the 3-benzylbenzo[d]thiazol-2(3H)-one scaffold, which has interesting chemical feasibility and wide diversification capability. Antineoplastic activity of this series was assayed in vitro against NCI 60 tumor-cell lines, showing very strong inhibition with GI(50) as low as 0.9 µM. Additionally, its mechanism was elucidated using the KINEX™ protein kinase microarray-based small molecule inhibitor profiling platform and cell cycle analysis, showing a peculiar selectivity pattern against Zap70, c-src, Mink1, csk and MeKK2 kinases. Interestingly, it showed activity on syk kinase, confirming recent findings of the high activity of diphenyl urea containing compounds against this kinase. Overall, the new series

  10. Applying ligands profiling using multiple extended electron distribution based field templates and feature trees similarity searching in the discovery of new generation of urea-based antineoplastic kinase inhibitors.

    Dokla, Eman M; Mahmoud, Amr H; Elsayed, Mohamed S A; El-Khatib, Ahmed H; Linscheid, Michael W; Abouzid, Khaled A


    This study provides a comprehensive computational procedure for the discovery of novel urea-based antineoplastic kinase inhibitors while focusing on diversification of both chemotype and selectivity pattern. It presents a systematic structural analysis of the different binding motifs of urea-based kinase inhibitors and the corresponding configurations of the kinase enzymes. The computational model depends on simultaneous application of two protocols. The first protocol applies multiple consecutive validated virtual screening filters including SMARTS, a support vector-machine model (ROC = 0.98), a Bayesian model (ROC = 0.86) and structure-based pharmacophore filters based on urea-based kinase inhibitor complexes retrieved from the literature. This is followed by hits profiling against different extended electron distribution (XED) based field templates representing different kinase targets. The second protocol enables cancericidal activity verification by using the algorithm of feature trees (Ftrees) similarity searching against the NCI database. Being a proof-of-concept study, this combined procedure was experimentally validated by its utilization in developing a novel series of urea-based derivatives of strong anticancer activity. This new series is based on the 3-benzylbenzo[d]thiazol-2(3H)-one scaffold, which has interesting chemical feasibility and wide diversification capability. Antineoplastic activity of this series was assayed in vitro against NCI 60 tumor-cell lines, showing very strong inhibition with GI(50) as low as 0.9 µM. Additionally, its mechanism was elucidated using the KINEX™ protein kinase microarray-based small molecule inhibitor profiling platform and cell cycle analysis, showing a peculiar selectivity pattern against Zap70, c-src, Mink1, csk and MeKK2 kinases. Interestingly, it showed activity on syk kinase, confirming recent findings of the high activity of diphenyl urea containing compounds against this kinase. Overall, the new series, which is based on

  11. Future accelerators (?)

    John Womersley


    I describe the future accelerator facilities that are currently foreseen for electroweak scale physics, neutrino physics, and nuclear structure. I will explore the physics justification for these machines, and suggest how the case for future accelerators can be made.

  12. Fast Structural Search in Phylogenetic Databases

    William H. Piel


    Full Text Available As the size of phylogenetic databases grows, the need for efficiently searching these databases arises. Thanks to previous and ongoing research, searching by attribute value and by text has become commonplace in these databases. However, searching by topological or physical structure, especially for large databases and especially for approximate matches, is still an art. We propose structural search techniques that, given a query or pattern tree P and a database of phylogenies D, find trees in D that are sufficiently close to P. The "closeness" is a measure of the topological relationships in P that are found to be the same or similar in a tree T in D. We develop a filtering technique that accelerates searches and present algorithms for rooted and unrooted trees, where the trees can be weighted or unweighted. Experimental results on comparing the similarity measure with existing tree metrics and on evaluating the efficiency of the search techniques demonstrate that the proposed approach is promising.

  13. Custom Search Engines: Tools & Tips

    Notess, Greg R.


    Few have the resources to build a Google or Yahoo! from scratch. Yet anyone can build a search engine based on a subset of the large search engines' databases. Use Google Custom Search Engine or Yahoo! Search Builder or any of the other similar programs to create a vertical search engine targeting sites of interest to users. The basic steps to…

  14. Accelerating Value Creation with Accelerators

    Jonsson, Eythor Ivar


    accelerator programs. Microsoft runs accelerators in seven different countries. Accelerators have grown out of the infancy stage and are now an accepted approach to develop new ventures based on cutting-edge technology like the internet of things, mobile technology, big data and virtual reality. It is also ... with the traditional audit and legal universes and industries are examples of emerging potentials both from a research and business point of view to exploit and explore further. The accelerator approach may therefore be an Idea Watch to consider, no matter which industry you are in, because in essence accelerators ...

  15. Accelerating Value Creation with Accelerators

    Jonsson, Eythor Ivar


    Accelerators can help to accelerate value creation. Accelerators are short-term programs that have the objective of creating innovative and fast growing ventures. They have gained attraction as larger corporations like Microsoft, Barclays bank and Nordea bank have initiated and sponsored ... an approach to facilitate implementation and realization of business ideas and is a lucrative approach to transform research into ventures and to revitalize regions and industries in transition. Investors have noticed that the accelerator approach is a way to increase the possibility of success by funnelling...

  16. On Solar Wind Origin and Acceleration: Measurements from ACE

    Stakhiv, Mark; Lepri, Susan T.; Landi, Enrico; Tracy, Patrick; Zurbuchen, Thomas H.


    The origin and acceleration of the solar wind are still debated. In this paper, we search for signatures of the source region and acceleration mechanism of the solar wind in the plasma properties measured in situ by the Advanced Composition Explorer spacecraft. Using the elemental abundances as a proxy for the source region and the differential velocity and ion temperature ratios as a proxy for the acceleration mechanism, we are able to identify signatures pointing toward possible source regions and acceleration mechanisms. We find that the fast solar wind in the ecliptic plane is the same as that observed from the polar regions and is consistent with wave acceleration and coronal-hole origin. We also find that the slow wind is composed of two components: one similar to the fast solar wind (with slower velocity) and the other likely originating from closed magnetic loops. Both components of the slow solar wind show signatures of wave acceleration. From these findings, we draw a scenario that envisions two types of wind, with different source regions and release mechanisms, but the same wave acceleration mechanism.

  17. Efficient protein structure search using indexing methods.

    Kim, Sungchul; Sael, Lee; Yu, Hwanjo


    Understanding the functions of proteins is one of the most important challenges in many studies of biological processes. The function of a protein can be predicted by analyzing the functions of structurally similar proteins, so finding structurally similar proteins accurately and efficiently in a large set of proteins is crucial. A protein structure can be represented as a vector by the 3D-Zernike Descriptor (3DZD), which compactly represents the surface shape of the protein tertiary structure. This simplified representation accelerates the searching process. However, computing the similarity of two protein structures is still computationally expensive, so it is hard to efficiently process many simultaneous requests for structurally similar protein search. This paper proposes indexing techniques which substantially reduce the search time to find structurally similar proteins. In particular, we first exploit two indexing techniques, i.e., iDistance and iKernel, on the 3DZDs. After that, we extend the techniques to further improve the search speed for protein structures. The extended indexing techniques build and utilize a reduced index constructed from the first few attributes of the 3DZDs of protein structures. To retrieve the top-k similar structures, the top-10 × k similar structures are first found using the reduced index, and the top-k structures are selected among them. We also modify the indexing techniques to support θ-based nearest neighbor search, which returns the data points whose distance to the query point is less than θ. The results show that both iDistance and iKernel significantly enhance the searching speed. In top-k nearest neighbor search, the searching time is reduced by 69.6%, 77%, 77.4% and 87.9%, respectively, using iDistance, iKernel, the extended iDistance, and the extended iKernel. In θ-based nearest neighbor search, the searching time is reduced by 80%, 81%, 95.6% and 95.6% using iDistance, iKernel, the extended iDistance, and the extended iKernel, respectively.
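
    The two-stage retrieval described here (a candidate set of top-10 × k found via the reduced index, then exact re-ranking on full descriptors) can be sketched as follows. The data, dimensionality, prefix size and Euclidean metric are illustrative stand-ins, not the paper's iDistance/iKernel implementations:

```python
import numpy as np

rng = np.random.default_rng(1)
database = rng.random((1000, 121))  # hypothetical 3DZD-like descriptors
query = rng.random(121)

REDUCED_DIMS = 8  # "first few attributes" kept in the reduced index

def top_k_two_stage(query, database, k):
    # Stage 1: cheap scan using only the reduced prefix -> 10*k candidates
    d_reduced = np.linalg.norm(
        database[:, :REDUCED_DIMS] - query[:REDUCED_DIMS], axis=1)
    candidates = np.argsort(d_reduced)[: 10 * k]
    # Stage 2: exact distances on full descriptors, candidates only
    d_full = np.linalg.norm(database[candidates] - query, axis=1)
    return candidates[np.argsort(d_full)[:k]]

def theta_search(query, database, theta):
    # theta-based nearest neighbor search: all points closer than theta
    d = np.linalg.norm(database - query, axis=1)
    return np.where(d < theta)[0]

top5 = top_k_two_stage(query, database, 5)
```

    The speedup comes from stage 1 touching only 8 of 121 dimensions per database entry; stage 2 pays the full-dimension cost only for the small candidate set.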




    This paper compares various types of recirculating accelerators, outlining the advantages and disadvantages of various approaches. The accelerators are characterized according to the types of arcs they use: whether there is a single arc for the entire recirculator or there are multiple arcs, and whether the arc(s) are isochronous or non-isochronous.

  19. LIBO accelerates


    The prototype module of LIBO, a linear accelerator project designed for cancer therapy, has passed its first proton-beam acceleration test. In parallel a new version - LIBO-30 - is being developed, which promises to open up even more interesting avenues.

  20. Accelerating Inspire



    CERN has been involved in the dissemination of scientific results since its early days and has continuously updated the distribution channels. Currently, Inspire hosts catalogues of articles, authors, institutions, conferences, jobs, experiments, journals and more. Successful orientation among this amount of data requires comprehensive linking between the content. Inspire has lacked a system for linking experiments and articles together based on which accelerator they were conducted at. The purpose of this project has been to create such a system. Records for 156 accelerators were created and all 2913 experiments on Inspire were given corresponding MARC tags. Records of 18404 accelerator physics related bibliographic entries were also tagged with corresponding accelerator tags. Finally, as a part of the endeavour to broaden CERN's presence on Wikipedia, existing Wikipedia articles of accelerators were updated with short descriptions and links to Inspire. In total, 86 Wikipedia articles were updated. This repo...

  1. Domain similarity based orthology detection.

    Bitard-Feildel, Tristan; Kemena, Carsten; Greenwood, Jenny M; Bornberg-Bauer, Erich


    Orthologous protein detection software mostly uses pairwise comparisons of amino-acid sequences to assert whether two proteins are orthologous or not. Accordingly, when the number of sequences for comparison increases, the number of comparisons to compute grows quadratically. A current challenge of bioinformatics research, especially given the increasing number of sequenced organisms available, is to make this ever-growing number of comparisons computationally feasible in a reasonable amount of time. We propose to speed up the detection of orthologous proteins by using strings of domains to characterize the proteins. We present two new protein similarity measures, a cosine score and a maximal weight matching score based on domain content similarity, and new software, named porthoDom. The quality of the cosine and maximal weight matching similarity measures is assessed against curated datasets. The measures show that domain content similarities are able to correctly group proteins into their families. Accordingly, the cosine similarity measure is used inside porthoDom, the wrapper developed for proteinortho. porthoDom makes use of domain content similarity measures to group proteins together before searching for orthologs. By using domains instead of amino-acid sequences, the reduction of the search space decreases the computational complexity of an all-against-all sequence comparison. We demonstrate that representing and comparing proteins as strings of discrete domains, i.e. as concatenations of their unique identifiers, allows a drastic simplification of the search space. porthoDom has the advantage of speeding up orthology detection while maintaining a degree of accuracy similar to proteinortho. porthoDom is implemented in Python and C++ and is available under the GNU GPL licence 3 at .
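
    A cosine similarity over domain content can be sketched as follows; this is an illustration of the general idea, not porthoDom's exact scoring, and the domain identifiers are hypothetical:

```python
from collections import Counter
import math

def domain_cosine(protein_a, protein_b):
    """Cosine similarity between two proteins represented as lists of
    domain identifiers (counts act as the vector coordinates)."""
    ca, cb = Counter(protein_a), Counter(protein_b)
    dot = sum(ca[d] * cb[d] for d in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0
```

    Because each protein collapses to a small bag of domain IDs rather than a full amino-acid sequence, an all-against-all comparison over such vectors is far cheaper than sequence alignment.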

  2. Horizontal Accelerator

    Federal Laboratory Consortium — The Horizontal Accelerator (HA) Facility is a versatile research tool available for use on projects requiring simulation of the crash environment. The HA Facility is...

  3. Future accelerators

    Hübner, K


    An overview of the various schemes for electron-positron linear colliders is given, together with the status of the development of key components and of the various test facilities. The present studies of muon-muon colliders and very large hadron colliders are summarized, including the plans for component development and tests. Accelerator research and development to achieve the highest gradients in linear accelerators is outlined. (44 refs).

  4. Contextual Bandits with Similarity Information

    Slivkins, Aleksandrs


    In a multi-armed bandit (MAB) problem, an online algorithm makes a sequence of choices. In each round it chooses from a time-invariant set of alternatives and receives the payoff associated with this alternative. While the case of small strategy sets is by now well-understood, a lot of recent work has focused on MAB problems with exponentially or infinitely large strategy sets, where one needs to assume extra structure in order to make the problem tractable. In particular, recent literature considered information on similarity between arms. We consider similarity information in the setting of "contextual bandits", a natural extension of the basic MAB problem where before each round an algorithm is given the "context" -- a hint about the payoffs in this round. Contextual bandits are directly motivated by placing advertisements on webpages, one of the crucial problems in sponsored search. A particularly simple way to represent similarity information in the contextual bandit setting is via a "similarity distance...

  5. Accelerated Parallel Texture Optimization

    Hao-Da Huang; Xin Tong; Wen-Cheng Wang


    Texture optimization is a texture synthesis method that can efficiently reproduce various features of exemplar textures. However, its slow synthesis speed limits its use in many interactive or real-time applications. In this paper, we propose a parallel texture optimization algorithm that runs on GPUs. In our algorithm, k-coherence search and principal component analysis (PCA) are used for hardware acceleration, and two acceleration techniques are further developed to speed up our GPU-based texture optimization. With a reasonable precomputation cost, the online synthesis speed of our algorithm is 4000+ times faster than that of the original texture optimization algorithm, making it capable of supporting interactive applications. The advantages of the new scheme are demonstrated by applying it to the interactive editing of flow-guided synthesis.

  6. Professional Microsoft search fast search, Sharepoint search, and search server

    Bennett, Mark; Kehoe, Miles; Voskresenskaya, Natalya


    Use Microsoft's latest search-based technology-FAST search-to plan, customize, and deploy your search solutionFAST is Microsoft's latest intelligent search-based technology that boasts robustness and an ability to integrate business intelligence with Search. This in-depth guide provides you with advanced coverage on FAST search and shows you how to use it to plan, customize, and deploy your search solution, with an emphasis on SharePoint 2010 and Internet-based search solutions.With a particular appeal for anyone responsible for implementing and managing enterprise search, this book presents t

  7. Accelerated Unification

    Arkani-Hamed, Nima; Cohen, Andrew; Georgi, Howard


    We construct four dimensional gauge theories in which the successful supersymmetric unification of gauge couplings is preserved but accelerated by N-fold replication of the MSSM gauge and Higgs structure. This results in a low unification scale of $10^{13/N}$ TeV.

  8. Requirements for very high energy accelerators

    Richter, B.


    In this introductory paper at the second Workshop on Laser Acceleration, my main goal is to set out what I believe to be the energy and luminosity requirements of the machines of the future. These specifications are independent of the technique of acceleration. But before getting to these technical questions, I will briefly review where we are in particle physics, for it is the large number of unanswered questions in physics that motivates the search for effective accelerators.

  9. Search Cloud


  10. Exceptional Ground Accelerations and Velocities Caused by Earthquakes

    Anderson, John


    This project aims to understand the characteristics of the free-field strong-motion records that have yielded the 100 largest peak accelerations and the 100 largest peak velocities recorded to date. The peak is defined as the maximum magnitude of the acceleration or velocity vector during the strong shaking. This compilation includes 35 records with peak acceleration greater than gravity, and 41 records with peak velocities greater than 100 cm/s. The results represent an estimated 150,000 instrument-years of strong-motion recordings. The mean horizontal acceleration or velocity, as used for the NGA ground motion models, is typically 0.76 times the magnitude of this vector peak. Accelerations in the top 100 come from earthquakes as small as magnitude 5, while velocities in the top 100 all come from earthquakes with magnitude 6 or larger. Records are dominated by crustal earthquakes with thrust, oblique-thrust, or strike-slip mechanisms. Normal faulting mechanisms in crustal earthquakes constitute under 5% of the records in the databases searched, and an even smaller percentage of the exceptional records. All NEHRP site categories have contributed exceptional records, in proportions similar to the extent that they are represented in the larger database.

  11. Perceptual grouping by similarity of surface roughness in haptics: the influence of task difficulty.

    Van Aarsen, V; Overvliet, K E


    We investigated grouping by similarity of surface roughness in the context of task difficulty. We hypothesized that grouping yields a larger benefit at higher levels of task complexity, because efficient processing is more helpful when more cognitive resources are needed to execute a task. Participants searched for a patch of a different roughness as compared to the distractors in two strips of similar or dissimilar roughness values. We reasoned that if the distractors could be grouped based on similar roughness values, exploration time would be shorter and fewer errors would occur. To manipulate task complexity, we varied task difficulty (high target saliency equalling low task difficulty), and we varied the fingers used to explore the display (two fingers of one hand being more cognitively demanding than two fingers of opposite hands). We found much better performance in the easy condition as compared to the difficult condition (in both error rates and mean search slopes). Moreover, we found a larger effect of the similarity manipulation in the difficult condition as compared to the easy condition. Within the difficult condition, we found a larger effect in the one-hand condition as compared to the two-hand condition. These results show that haptic search is accelerated by grouping by similarity of surface roughness, especially when the task is relatively complex. We conclude that the effect of perceptual grouping is more prominent when more cognitive resources are needed to perform a task.

  12. Particle Accelerators in China

    Zhang, Chuang; Fang, Shouxian

    As the special machines that can accelerate charged particle beams to high energy by using electromagnetic fields, particle accelerators have been widely applied in scientific research and various areas of society. The development of particle accelerators in China started in the early 1950s. After a brief review of the history of accelerators, this article describes in the following sections: particle colliders, heavy-ion accelerators, high-intensity proton accelerators, accelerator-based light sources, pulsed power accelerators, small scale accelerators, accelerators for applications, accelerator technology development and advanced accelerator concepts. The prospects of particle accelerators in China are also presented.




    One of the major motivations driving recent interest in FFAGs is their use for the cost-effective acceleration of muons. This paper summarizes the progress in this area that was achieved leading up to and at the FFAG workshop at KEK from July 7-12, 2003. Much of the relevant background and references are also given here, to give a context to the progress we have made.

  14. Laser acceleration

    Tajima, T.; Nakajima, K.; Mourou, G.


    The fundamental idea of Laser Wakefield Acceleration (LWFA) is reviewed. An ultrafast intense laser pulse drives a coherent wakefield with a relativistic amplitude robustly supported by the plasma. While the large amplitude of the wakefield involves collective resonant oscillations of the eigenmode of the entire plasma electron population, the wake phase velocity ~ c and the ultrafastness of the laser pulse give the wake its stability and rigidity. A large number of worldwide experiments show rapid progress of this concept toward both the high-energy accelerator prospect and broad applications. The strong interest in this area has been spurring and stimulating novel laser technologies, including Chirped Pulse Amplification, Thin Film Compression, the Coherent Amplification Network, and Relativistic Mirror Compression. These in turn have created a conglomerate of novel science and technology with LWFA, forming a new genre of high-field science in which many parameters of merit have lately been increasing exponentially. This science has triggered a number of worldwide research centers and initiatives. The associated physics of ion acceleration, X-ray generation, and astrophysical processes of ultrahigh-energy cosmic rays is reviewed. Applications such as X-ray free electron lasers, cancer therapy, and radioisotope production are considered. A new avenue of LWFA using nanomaterials is also emerging.

  15. Search Recipes


  16. Search Patterns

    Morville, Peter


    What people are saying about Search Patterns "Search Patterns is a delight to read -- very thoughtful and thought provoking. It's the most comprehensive survey of designing effective search experiences I've seen." --Irene Au, Director of User Experience, Google "I love this book! Thanks to Peter and Jeffery, I now know that search (yes, boring old yucky who cares search) is one of the coolest ways around of looking at the world." --Dan Roam, author, The Back of the Napkin (Portfolio Hardcover) "Search Patterns is a playful guide to the practical concerns of search interface design. It cont

  17. Accelerators and the Accelerator Community

    Malamud, Ernest; Sessler, Andrew


    In this paper, standing back--looking from afar--and adopting a historical perspective, the field of accelerator science is examined. How it grew, what are the forces that made it what it is, where it is now, and what it is likely to be in the future are the subjects explored. Clearly, a great deal of personal opinion is invoked in this process.

  18. Accelerating cavity

    On the inside of the cavity there is a layer of niobium. Operating at 4.2 degrees above absolute zero, the niobium is superconducting and carries an accelerating field of 6 million volts per metre with negligible losses. Each cavity has a surface of 6 m2. The niobium layer is only 1.2 microns thick, ten times thinner than a hair. Such a large area had never been coated to such a high accuracy. A speck of dust could ruin the performance of the whole cavity so the work had to be done in an extremely clean environment.

  19. Impact accelerations

    Vongierke, H. E.; Brinkley, J. W.


    The degree to which impact acceleration is an important factor in space flight environments depends primarily upon the technology of capsule landing deceleration and the weight permissible for the associated hardware: parachutes or deceleration rockets, inflatable air bags, or other impact attenuation systems. The problem most specific to space medicine is the potential change of impact tolerance due to reduced bone mass and muscle strength caused by prolonged weightlessness and physical inactivity. Impact hazards, tolerance limits, and human impact tolerance related to space missions are described.

  20. Mass spectrometry with accelerators.

    Litherland, A E; Zhao, X-L; Kieser, W E


    As one in a series of articles on Canadian contributions to mass spectrometry, this review begins with an outline of the history of accelerator mass spectrometry (AMS), noting roles played by researchers at three Canadian AMS laboratories. After a description of the unique features of AMS, three examples, (14)C, (10)Be, and (129)I, are given to illustrate the methods. The capabilities of mass spectrometry have been extended by the addition of atomic isobar selection, molecular isobar attenuation, and further ion acceleration, followed by ion detection and ion identification at essentially zero dark current or ion flux. This has been accomplished by exploiting the techniques and accelerators of atomic and nuclear physics. In 1939, the first principles of AMS were established using a cyclotron. In 1977 the selection of isobars in the ion source was established when it was shown that the (14)N(-) ion was very unstable, or extremely difficult to create, making a tandem electrostatic accelerator highly suitable for assisting the mass spectrometric measurement of the rare long-lived radioactive isotope (14)C in the environment. This observation, together with the large attenuation of the molecular isobars (13)CH(-) and (12)CH2(-) during tandem acceleration and the observed very low background contamination from the ion source, was found to facilitate the mass spectrometry of (14)C to at least a level of (14)C/C ~ 6 × 10(-16), the equivalent of a radiocarbon age of 60,000 years. Tandem Accelerator Mass Spectrometry, or AMS, has now made possible the accurate radiocarbon dating of milligram-sized carbon samples by ion counting as well as dating and tracing with many other long-lived radioactive isotopes such as (10)Be, (26)Al, (36)Cl, and (129)I. The difficulty of obtaining large anion currents for elements with low electron affinities and the difficulties of isobar separation, especially for the heavier mass ions, have prompted the use of molecular anions and the search for alternative

  1. Query Language for Complex Similarity Queries

    Budikova, Petra; Zezula, Pavel


    For complex data types such as multimedia, traditional data management methods are not suitable. Instead of attribute-matching approaches, access methods based on object similarity are becoming popular. Recently, this has resulted in intensive research on indexing and searching methods for similarity-based retrieval. Nowadays, many efficient methods are already available, but using them to build an actual search system still requires specialists who tune the methods and build the system manually. Several attempts have already been made to provide a more convenient high-level interface in the form of query languages for such systems, but these are limited to supporting only basic similarity queries. In this paper, we propose a new language that allows users to formulate content-based queries in a flexible way, taking into account the functionality offered by the particular search engine in use. To ensure this, the language is based on a general data model with an abstract set of operations. Consequently, the language s...

  2. A measurement of hadron production cross sections for the simulation of accelerator neutrino beams and a search for muon-neutrino to electron-neutrino oscillations in the Δm2 ~ 1 eV2 region

    Schmitz, David W. [Columbia Univ., New York, NY (United States)


    A measurement of hadron production cross-sections for the simulation of accelerator neutrino beams and a search for muon-neutrino to electron-neutrino oscillations in the Δm2 ~ 1 eV2 region. This dissertation presents measurements from two different high energy physics experiments with a very strong connection: the Hadron Production (HARP) experiment located at CERN in Geneva, Switzerland, and the Mini Booster Neutrino Experiment (MiniBooNE) located at Fermilab in Batavia, Illinois.

  3. Similarity transformations of MAPs

    Andersen Allan T.


    Full Text Available We introduce the notion of similar Markovian Arrival Processes (MAPs) and show that the event-stationary point processes related to two similar MAPs are stochastically equivalent. This holds true for the time-stationary point processes too. We show that several well-known stochastic equivalences, e.g. that between the H2 renewal process and the Interrupted Poisson Process (IPP), can be expressed by similarity transformations of MAPs. In the appendix the valid region of similarity transformations for two-state MAPs is characterized.
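
    In the standard matrix notation for MAPs, where a MAP is specified by rate matrices (D_0, D_1), the similarity notion used above can be sketched as follows (a reconstruction from standard MAP conventions, not necessarily the paper's exact statement):

```latex
% Two MAPs (D_0, D_1) and (\tilde{D}_0, \tilde{D}_1) are similar if there
% exists an invertible matrix T such that
\tilde{D}_0 = T^{-1} D_0 T, \qquad \tilde{D}_1 = T^{-1} D_1 T,
% so the generator D = D_0 + D_1 of the phase process transforms as
% \tilde{D} = T^{-1} D T and therefore has the same eigenvalues.
```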

  4. Search Combinators

    Schrijvers, Tom; Wuille, Pieter; Samulowitz, Horst; Stuckey, Peter J


    The ability to model search in a constraint solver can be an essential asset for solving combinatorial problems. However, existing infrastructure for defining search heuristics is often inadequate. Either modeling capabilities are extremely limited or users are faced with a general-purpose programming language whose features are not tailored towards writing search heuristics. As a result, major improvements in performance may remain unexplored. This article introduces search combinators, a lightweight and solver-independent method that bridges the gap between a conceptually simple modeling language for search (high-level, functional and naturally compositional) and an efficient implementation (low-level, imperative and highly non-modular). By allowing the user to define application-tailored search strategies from a small set of primitives, search combinators effectively provide a rich domain-specific language (DSL) for modeling search to the user. Remarkably, this DSL comes at a low implementation cost to the...

  5. Clustering by Pattern Similarity

    Hai-xun Wang; Jian Pei


    The task of clustering is to identify classes of similar objects among a set of objects. The definition of similarity varies from one clustering model to another. However, in most of these models the concept of similarity is often based on such metrics as Manhattan distance, Euclidean distance or other Lp distances. In other words, similar objects must have close values in at least a set of dimensions. In this paper, we explore a more general type of similarity. Under the pCluster model we proposed, two objects are similar if they exhibit a coherent pattern on a subset of dimensions. The new similarity concept models a wide range of applications. For instance, in DNA microarray analysis, the expression levels of two genes may rise and fall synchronously in response to a set of environmental stimuli. Although the magnitude of their expression levels may not be close, the patterns they exhibit can be very much alike. Discovery of such clusters of genes is essential in revealing significant connections in gene regulatory networks. E-commerce applications, such as collaborative filtering, can also benefit from the new model, because it is able to capture not only the closeness of values of certain leading indicators but also the closeness of (purchasing, browsing, etc.) patterns exhibited by the customers. In addition to the novel similarity model, this paper also introduces an effective and efficient algorithm to detect such clusters, and we perform tests on several real and synthetic data sets to show its performance.
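
    In the pCluster literature, the coherence of two objects on a pair of attributes is typically measured by the "pScore" of the corresponding 2 × 2 submatrix. A minimal sketch of that test, with illustrative names and a dict-based object representation:

```python
def p_score(dxa, dxb, dya, dyb):
    # pScore of the 2x2 submatrix [[dxa, dxb], [dya, dyb]]:
    # how far the two rows are from exhibiting the same rise/fall pattern.
    return abs((dxa - dxb) - (dya - dyb))

def is_p_cluster(x, y, cols, delta):
    """x, y: dicts mapping attribute -> value. The pair forms a pCluster
    on `cols` if every 2x2 submatrix has pScore <= delta."""
    for i in range(len(cols)):
        for j in range(i + 1, len(cols)):
            a, b = cols[i], cols[j]
            if p_score(x[a], x[b], y[a], y[b]) > delta:
                return False
    return True
```

    Two gene-expression rows that rise and fall synchronously pass this test even when their absolute magnitudes are far apart, which is exactly the DNA microarray scenario described above.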



  8. Judgments of brand similarity

    Bijmolt, THA; Wedel, M; Pieters, RGM; DeSarbo, WS


    This paper provides empirical insight into the way consumers make pairwise similarity judgments between brands, and how familiarity with the brands, serial position of the pair in a sequence, and the presentation format affect these judgments. Within the similarity judgment process both the formatio

  9. Visual search

    Toet, A.; Bijl, P.


    Visual search, with or without the aid of optical or electro-optical instruments, plays a significant role in various types of military and civilian operations (e.g., reconnaissance, surveillance, and search and rescue). Advance knowledge of human visual search and target acquisition performance is

  10. New Similarity Functions

    Yazdani, Hossein; Ortiz-Arroyo, Daniel; Kwasnicka, Halina


    In data science, there are important parameters that affect the accuracy of the algorithms used. Some of these parameters are: the type of data objects, the membership assignments, and distance or similarity functions. This paper discusses similarity functions as fundamental elements in membership assignments. The paper introduces Weighted Feature Distance (WFD) and Prioritized Weighted Feature Distance (PWFD), two new distance functions that take into account the diversity in feature spaces. WFD functions perform better in supervised and unsupervised methods by comparing data objects on their feature spaces, in addition to their similarity in the vector space. Prioritized Weighted Feature Distance (PWFD) works similarly to WFD, but provides the ability to give priorities to desirable features. The accuracy of the proposed functions is compared with other similarity functions on several data sets...
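
    A hedged sketch of what a weighted feature distance and a prioritized variant might look like; the paper's exact formulas are not reproduced in the abstract, so these are illustrative stand-ins:

```python
import math

def wfd(u, v, weights):
    """Weighted feature distance (sketch): each squared coordinate
    difference is scaled by a per-feature weight w_i."""
    return math.sqrt(sum(w * (a - b) ** 2 for a, b, w in zip(u, v, weights)))

def pwfd(u, v, weights, priorities):
    """Prioritized variant (sketch): user-assigned priorities are
    folded into the per-feature weights before measuring distance."""
    return wfd(u, v, [w * p for w, p in zip(weights, priorities)])
```

    With unit weights and unit priorities both functions reduce to the ordinary Euclidean distance, which makes the role of the weights easy to check.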

  11. A study of reflex tandem accelerator

    Nakajima, Takao; Morinobu, Shunpei; Gono, Yasuyuki; Sagara, Kenji; Sugimitsu, Tsuyoshi; Mitarai, Shiro; Nakamura, Hiroyuki; Ikeda, Nobuo; Morikawa, Tsuneyasu [Kyushu Univ., Fukuoka (Japan). Faculty of Science


    An investigation on 'developing research themes and the experimental apparatus to realize them', based on the tandem accelerator facility, was carried out. Recognizing that preparation, improvement, and novel technical development capable of greatly increasing the capacity of the tandem accelerator facility are essential to form a COE with high uniqueness, numerous ideas were proposed, investigated, and explored. In this paper, the results of the study of 'beam reacceleration using the tandem accelerator' are presented as follows: (1) short-lived unstable nuclei formed by nuclear reactions with the tandem primary beam are ionized to negative ions and reaccelerated by the same tandem accelerator; and (2) by attaching several electrons to the tandem-accelerated primary beam, the number of charges is reduced for reacceleration by the tandem. (G.K.)

  12. The semantic similarity ensemble

    Andrea Ballatore


    Full Text Available Computational measures of semantic similarity between geographic terms provide valuable support across geographic information retrieval, data mining, and information integration. To date, a wide variety of approaches to geo-semantic similarity have been devised. A judgment of similarity is not intrinsically right or wrong, but obtains a certain degree of cognitive plausibility, depending on how closely it mimics human behavior. Thus selecting the most appropriate measure for a specific task is a significant challenge. To address this issue, we make an analogy between computational similarity measures and soliciting domain expert opinions, which incorporate a subjective set of beliefs, perceptions, hypotheses, and epistemic biases. Following this analogy, we define the semantic similarity ensemble (SSE) as a composition of different similarity measures, acting as a panel of experts having to reach a decision on the semantic similarity of a set of geographic terms. The approach is evaluated in comparison to human judgments, and the results indicate that an SSE performs better than the average of its parts. Although the best member tends to outperform the ensemble, all ensembles outperform the average performance of each ensemble's members. Hence, in contexts where the best measure is unknown, the ensemble provides a more cognitively plausible approach.
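
    The ensemble idea, combining the scores of several measures as a panel decision, can be sketched as follows; the member measures below are placeholder callables, not the measures evaluated in the paper:

```python
def ensemble_similarity(term_a, term_b, measures):
    """Semantic similarity ensemble (sketch): each measure is a
    function (a, b) -> score in [0, 1]; the ensemble acts as a panel
    of experts and returns the mean of their scores."""
    scores = [m(term_a, term_b) for m in measures]
    return sum(scores) / len(scores)
```

    Any aggregation other than the mean (median, trimmed mean) would slot in the same way; the abstract's claim is only that the panel beats the average individual member.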

  13. Similar component analysis

    ZHANG Hong; WANG Xin; LI Junwei; CAO Xianguang


    A new unsupervised feature extraction method called similar component analysis (SCA) is proposed in this paper. The SCA method has a self-aggregation property: the data objects move towards each other to form clusters through SCA, which can theoretically reveal the inherent pattern of similarity hidden in the dataset. The inputs of SCA are just the pairwise similarities of the dataset, which makes it easier to apply to time series analysis given the variable lengths of time series. Our experimental results on many problems have verified the effectiveness of SCA in some engineering applications.

  14. Trajectory similarity join in spatial networks

    Shang, Shuo


    The matching of similar pairs of objects, called similarity join, is fundamental functionality in data management. We consider the case of trajectory similarity join (TS-Join), where the objects are trajectories of vehicles moving in road networks. Thus, given two sets of trajectories and a threshold θ, the TS-Join returns all pairs of trajectories from the two sets with similarity above θ. This join targets applications such as trajectory near-duplicate detection, data cleaning, ridesharing recommendation, and traffic congestion prediction. With these applications in mind, we provide a purposeful definition of similarity. To enable efficient TS-Join processing on large sets of trajectories, we develop search space pruning techniques and take into account the parallel processing capabilities of modern processors. Specifically, we present a two-phase divide-and-conquer algorithm. For each trajectory, the algorithm first finds similar trajectories. Then it merges the results to achieve a final result. The algorithm exploits an upper bound on the spatiotemporal similarity and a heuristic scheduling strategy for search space pruning. The algorithm's per-trajectory searches are independent of each other and can be performed in parallel, and the merging has constant cost. An empirical study with real data offers insight into the performance of the algorithm and demonstrates that it is capable of outperforming a well-designed baseline algorithm by an order of magnitude.
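
    A toy sketch of the join's parallel per-trajectory phase followed by a merge. It uses a stand-in Jaccard similarity over visited road segments instead of the paper's spatiotemporal measure, and omits the pruning; all names are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

def jaccard(t1, t2):
    # Stand-in similarity: Jaccard over sets of visited road segments.
    s1, s2 = set(t1), set(t2)
    return len(s1 & s2) / len(s1 | s2) if s1 | s2 else 0.0

def ts_join(P, Q, theta):
    """Return all (i, j) with similarity(P[i], Q[j]) > theta.
    Each per-trajectory search is independent, so they parallelize."""
    def search(i):
        return [(i, j) for j, q in enumerate(Q) if jaccard(P[i], q) > theta]
    with ThreadPoolExecutor() as ex:
        parts = list(ex.map(search, range(len(P))))  # phase 1: parallel searches
    return [pair for part in parts for pair in part]  # phase 2: merge
```

    The real algorithm replaces the inner scan with an upper-bound test that prunes most of Q before any exact similarity is computed.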

  15. Active browsing using similarity pyramids

    Chen, Jau-Yuen; Bouman, Charles A.; Dalton, John C.


    In this paper, we describe a new approach to managing large image databases, which we call active browsing. Active browsing integrates relevance feedback into the browsing environment, so that users can modify the database's organization to suit the desired task. Our method is based on a similarity pyramid data structure, which hierarchically organizes the database, so that it can be efficiently browsed. At coarse levels, the similarity pyramid allows users to view the database as large clusters of similar images. Alternatively, users can 'zoom into' finer levels to view individual images. We discuss relevance feedback for the browsing process, and argue that it is fundamentally different from relevance feedback for more traditional search-by-query tasks. We propose two fundamental operations for active browsing: pruning and reorganization. Both of these operations depend on a user-defined relevance set, which represents the image or set of images desired by the user. We present statistical methods for accurately pruning the database, and we propose a new 'worm hole' distance metric for reorganizing the database, so that members of the relevance set are grouped together.

  16. Gender similarities and differences.

    Hyde, Janet Shibley


    Whether men and women are fundamentally different or similar has been debated for more than a century. This review summarizes major theories designed to explain gender differences: evolutionary theories, cognitive social learning theory, sociocultural theory, and expectancy-value theory. The gender similarities hypothesis raises the possibility of theorizing gender similarities. Statistical methods for the analysis of gender differences and similarities are reviewed, including effect sizes, meta-analysis, taxometric analysis, and equivalence testing. Then, relying mainly on evidence from meta-analyses, gender differences are reviewed in cognitive performance (e.g., math performance), personality and social behaviors (e.g., temperament, emotions, aggression, and leadership), and psychological well-being. The evidence on gender differences in variance is summarized. The final sections explore applications of intersectionality and directions for future research.

  17. Cluster Tree Based Hybrid Document Similarity Measure

    M. Varshana Devi


    Full Text Available A hybrid similarity measure is established over a cluster tree. In the cluster tree, the hybrid similarity can be calculated even for random data that do not co-occur, generating different views. The different views of the tree can be combined, choosing the one that is most significant in cost. A method is proposed to combine the multiple views, where views produced by different distance measures are represented in a single cluster. Compared with traditional statistical methods, the cluster-tree-based hybrid similarity gives better feasibility for intelligent search. It helps in improving dimensionality reduction and semantic analysis.

  18. Music Retrieval based on Melodic Similarity

    Typke, R.


    This thesis introduces a method for measuring melodic similarity for notated music such as MIDI files. This music search algorithm views music as sets of notes that are represented as weighted points in the two-dimensional space of time and pitch. Two point sets can be compared by calculating how mu

  19. Efficient Similarity Retrieval in Music Databases

    Ruxanda, Maria Magdalena; Jensen, Christian Søndergaard


    Audio music is increasingly becoming available in digital form, and the digital music collections of individuals continue to grow. Addressing the need for effective means of retrieving music from such collections, this paper proposes new techniques for content-based similarity search. Each music ...

  20. Similarity or difference?

    Villadsen, Anders Ryom


    While the organizational structures and strategies of public organizations have attracted substantial research attention among public management scholars, little research has explored how these organizational core dimensions are interconnected and influenced by pressures for similarity. In this paper I address this topic by exploring the relation between expenditure strategy isomorphism and structure isomorphism in Danish municipalities. Different literatures suggest that organizations exist in concurrent pressures for being similar to and different from other organizations in their field… …-shaped relation exists between expenditure strategy isomorphism and structure isomorphism in a longitudinal quantitative study of Danish municipalities…

  1. Segmentation Similarity and Agreement

    Fournier, Chris


    We propose a new segmentation evaluation metric, called segmentation similarity (S), that quantifies the similarity between two segmentations as the proportion of boundaries that are not transformed when comparing them using edit distance, essentially using edit distance as a penalty function and scaling penalties by segmentation size. We propose several adapted inter-annotator agreement coefficients which use S that are suitable for segmentation. We show that S is configurable enough to suit a wide variety of segmentation evaluations, and is an improvement upon the state of the art. We also propose using inter-annotator agreement coefficients to evaluate automatic segmenters in terms of human performance.
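    A simplified variant of such a boundary-based similarity can be sketched as follows. The paper's S also discounts near-miss boundaries via cheap transposition edits; this toy version counts only additions and deletions, so it is a sketch of the idea, not the published metric:

    ```python
    def boundaries(masses):
        """Internal boundary positions of a segmentation given as segment
        sizes, e.g. [3, 2, 4] -> {3, 5} over a sequence of 9 units."""
        pos, out = 0, set()
        for m in masses[:-1]:
            pos += m
            out.add(pos)
        return out

    def seg_similarity(a, b):
        """Simplified S: the fraction of potential boundary positions that
        need no edit when transforming one segmentation into the other."""
        assert sum(a) == sum(b), "segmentations must cover the same sequence"
        ba, bb = boundaries(a), boundaries(b)
        potential = sum(a) - 1       # every unit gap could hold a boundary
        edits = len(ba ^ bb)         # boundaries present on only one side
        return 1 - edits / potential

    print(seg_similarity([3, 2, 4], [3, 6]))  # 0.875: one boundary disagrees
    ```

    Scaling the edit count by the number of potential boundaries is what keeps the score comparable across segmentations of different sizes, the property the abstract emphasizes.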

  2. Quantum search by measurement

    Childs, Andrew M.; Deotto, Enrico; Farhi, Edward; Goldstone, Jeffrey; Gutmann, Sam; Landahl, Andrew J.


    We propose a quantum algorithm for solving combinatorial search problems that uses only a sequence of measurements. The algorithm is similar in spirit to quantum computation by adiabatic evolution, in that the goal is to remain in the ground state of a time-varying Hamiltonian. Indeed, we show that the running times of the two algorithms are closely related. We also show how to achieve the quadratic speedup for Grover's unstructured search problem with only two measurements. Finally, we discuss some similarities and differences between the adiabatic and measurement algorithms.

  3. Faceted Search

    Tunkelang, Daniel


    We live in an information age that requires us, more than ever, to represent, access, and use information. Over the last several decades, we have developed a modern science and technology for information retrieval, relentlessly pursuing the vision of a "memex" that Vannevar Bush proposed in his seminal article, "As We May Think." Faceted search plays a key role in this program. Faceted search addresses weaknesses of conventional search approaches and has emerged as a foundation for interactive information retrieval. User studies demonstrate that faceted search provides more

  4. Incremental Similarity and Turbulence

    Barndorff-Nielsen, Ole E.; Hedevang, Emil; Schmiegel, Jürgen

    This paper discusses the mathematical representation of an empirically observed phenomenon, referred to as Incremental Similarity. We discuss this feature from the viewpoint of stochastic processes and present a variety of non-trivial examples, including those that are of relevance for turbulence...

  5. The Application of Similar Image Retrieval in Electronic Commerce

    Hu, YuPing; Yin, Hua; Han, Dezhi; Yu, Fei


    Traditional online shopping platform (OSP), which searches product information by keywords, faces three problems: indirect search mode, large search space, and inaccuracy in search results. For solving these problems, we discuss and research the application of similar image retrieval in electronic commerce. Aiming at improving the network customers' experience and providing merchants with the accuracy of advertising, we design a reasonable and extensive electronic commerce application system, which includes three subsystems: image search display subsystem, image search subsystem, and product information collecting subsystem. This system can provide seamless connection between information platform and OSP, on which consumers can automatically and directly search similar images according to the pictures from information platform. At the same time, it can be used to provide accuracy of internet marketing for enterprises. The experiment shows the efficiency of constructing the system. PMID:24883411

  6. The Application of Similar Image Retrieval in Electronic Commerce

    YuPing Hu


    Full Text Available Traditional online shopping platform (OSP), which searches product information by keywords, faces three problems: indirect search mode, large search space, and inaccuracy in search results. For solving these problems, we discuss and research the application of similar image retrieval in electronic commerce. Aiming at improving the network customers’ experience and providing merchants with the accuracy of advertising, we design a reasonable and extensive electronic commerce application system, which includes three subsystems: image search display subsystem, image search subsystem, and product information collecting subsystem. This system can provide seamless connection between information platform and OSP, on which consumers can automatically and directly search similar images according to the pictures from information platform. At the same time, it can be used to provide accuracy of internet marketing for enterprises. The experiment shows the efficiency of constructing the system.

  7. The application of similar image retrieval in electronic commerce.

    Hu, YuPing; Yin, Hua; Han, Dezhi; Yu, Fei


    Traditional online shopping platform (OSP), which searches product information by keywords, faces three problems: indirect search mode, large search space, and inaccuracy in search results. For solving these problems, we discuss and research the application of similar image retrieval in electronic commerce. Aiming at improving the network customers' experience and providing merchants with the accuracy of advertising, we design a reasonable and extensive electronic commerce application system, which includes three subsystems: image search display subsystem, image search subsystem, and product information collecting subsystem. This system can provide seamless connection between information platform and OSP, on which consumers can automatically and directly search similar images according to the pictures from information platform. At the same time, it can be used to provide accuracy of internet marketing for enterprises. The experiment shows the efficiency of constructing the system.

  8. Performance analysis of acceleration resolution for radar signal

    ZHAO Hongzhong (赵宏钟); FU Qiang (付强)


    The high acceleration of moving targets has brought severe problems to radar signal processing, such as a decrease in the output signal-to-noise ratio and a deterioration of Doppler resolution. This paper presents an acceleration ambiguity function (AAF) for characterizing acceleration effects and the acceleration resolution property in radar signal processing. A definition of acceleration resolution based on the AAF is also presented. Using the AAF as an analysis tool, several factors are derived, including the loss factor of the output SNR, the broadening factor of the Doppler resolution, and the optimal accumulative time (OPT), all caused by acceleration in linear-phase matched filtering. The convergence property of the quadratic-phase matched filter for searching for and estimating the acceleration is discussed. The results and conclusions are helpful for the quantitative analysis of acceleration effects on signal processing, and for evaluating the performance of acceleration processing in radar signal waveform design.

  9. Microwave View on Particle Acceleration in Flares

    Fleishman, Gregory D


    The thermal-to-nonthermal partition was found to vary greatly from one flare to another, resulting in a broad variety of cases from 'heating without acceleration' to 'acceleration without heating'. Recent analysis of microwave data for these differing cases suggests that a similar acceleration mechanism, forming a power-law nonthermal tail up to a few MeV or even higher, operates in all the cases. However, the level of this nonthermal spectrum compared to the original thermal distribution differs significantly from one case to another, implying a highly different thermal-to-nonthermal energy partition in various cases. This further requires a specific mechanism, operating in flares in addition to the bulk acceleration process itself, that is capable of extracting charged particles from the thermal pool and supplying them to the bulk acceleration process, which, in contrast, efficiently accelerates the seed particles but cannot accelerate the thermal particles. Within this 'microwave' view on the flare ener...

  10. Searches for Magnetic Monopoles and ... beyond

    Giacomelli, G; Sahnoun, Z


    The searches for classical Magnetic Monopoles (MMs) at accelerators, for GUT Superheavy MMs in the penetrating cosmic radiation and for Intermediate Mass MMs at high altitudes are discussed. The status of the search for other massive exotic particles such as nuclearites and Q-balls is briefly reviewed.

  11. Unifying physics of accelerators, lasers and plasma

    Seryi, Andrei


    Unifying Physics of Accelerators, Lasers and Plasma introduces the physics of accelerators, lasers and plasma in tandem with the industrial methodology of inventiveness, a technique that teaches that similar problems and solutions appear again and again in seemingly dissimilar disciplines. This unique approach builds bridges and enhances connections between the three aforementioned areas of physics that are essential for developing the next generation of accelerators.

  12. A New Search for $ \

    Dore, U; Kodama, K; Ushida, N; Loverre, P F


    WA95: The question whether neutrino flavours mix at some level - and the related question whether neutrinos have non-zero mass - is one of the remaining great challenges of experimental physics. Neutrinos from supernovae, from the sun, from the earth's atmosphere, from nuclear reactors and from radioactive decays are currently under study; in this frame, experiments using accelerators play a privileged role because the well-known neutrino source properties allow high-precision measurements and background control. The main goal of the CHORUS experiment is to search for neutrino oscillations in the $\

  13. Accelerating the Response of Query in Semantic Web

    Nooshin Azimi


    Full Text Available Today, XML has become one of the important formats for saving and exchanging data. The flexibility of the XML structure broadens its use, and the volume of XML documents is increasing constantly. Since a file management system is not able to manage such a volume of data, handling XML documents requires a comprehensive management system. With the striking growth of such databases, accelerating the execution of queries becomes a necessity. In this paper, we search for a method with the capability required for a large set of queries: a method that accesses fewer nodes and obtains the answer in a shorter time than similar approaches, that can match the indexes of similar approaches and use them to accelerate queries, and that can jump over useless nodes and produce less intermediate data than similar approaches. In the proposed method, node processing is not performed directly and automatically, but through a pattern-matching guide.

  14. Search for persons

    Vogel, H. [Asklepios Klinik St. Georg, Radiology, Lohmuehlenstr. 5, 20099 Hamburg (Germany)]


    X-rays and gamma-rays are used to detect hidden persons in vehicles, containers, and railway wagons. They are produced with accelerators, X-ray tubes, cobalt 60 and caesium 137. Fan beams adjusted to a line of digital detectors produce the image. The resolution is sufficient to recognise a human being. The recognition of persons with transmission images is limited by superimposition; backscatter imaging produces clearer images but of one single layer only. The future will bring new applications of search for persons with X-rays. Crimes and terrorist attacks will induce added demand for security, where search with X-rays and gamma-rays will keep its important role or even increase it.

  15. More Similar Than Different

    Pedersen, Mogens Jin


    What role do employee features play in the success of different personnel management practices for serving high performance? Using data from a randomized survey experiment among 5,982 individuals of all ages, this article examines how gender conditions the compliance effects of different… …incentive treatments, each relating to the basic content of distinct types of personnel management practices. The findings show that males and females are more similar than different in terms of the incentive treatments’ effects: significant average effects are found for three out of five incentive…

  16. Similar dissection of sets

    Akiyama, Shigeki; Okazaki, Ryotaro; Steiner, Wolfgang; Thuswaldner, Jörg


    In 1994, Martin Gardner stated a set of questions concerning the dissection of a square or an equilateral triangle in three similar parts. Meanwhile, Gardner's questions have been generalized and some of them are already solved. In the present paper, we solve more of his questions and treat them in a much more general context. Let $D\\subset \\mathbb{R}^d$ be a given set and let $f_1,...,f_k$ be injective continuous mappings. Does there exist a set $X$ such that $D = X \\cup f_1(X) \\cup ... \\cup f_k(X)$ is satisfied with a non-overlapping union? We prove that such a set $X$ exists for certain choices of $D$ and $\\{f_1,...,f_k\\}$. The solutions $X$ often turn out to be attractors of iterated function systems with condensation in the sense of Barnsley. Coming back to Gardner's setting, we use our theory to prove that an equilateral triangle can be dissected in three similar copies whose areas have ratio $1:1:a$ for $a \\ge (3+\\sqrt{5})/2$.

  17. Status of Searches for Magnetic Monopoles

    Patrizii, L


    The search for magnetic monopoles (MMs) is a fascinating interdisciplinary field with implications for fundamental theories, particle physics, astrophysics, and cosmology. The quantum theory of MMs and its consistency with electrodynamics was derived by Dirac. This marked the start of searches for classical monopoles at every new accelerator, up to the LHC. Magnetic monopoles are required by Grand Unification Theories, but unlike classical monopoles they would be incredibly massive, out of the reach of any conceivable accelerator. Large efforts have been made to search for them in the cosmic radiation as relic particles from the early Universe, in the widest range of mass and velocity experimentally accessible. In this paper the status of the searches for classical MMs at accelerators, for GUT superheavy MMs in the penetrating cosmic radiation, and for intermediate-mass MMs at high altitudes is discussed, with emphasis on the most recent results and future perspectives.

  18. New algorithms for radio pulsar search

    Smith, Kendrick M


    The computational cost of searching for new pulsars is a limiting factor for upcoming radio telescopes such as SKA. We introduce four new algorithms: an optimal constant-period search, a coherent tree search which permits optimal searching with O(1) cost per model, a semicoherent search which combines information from coherent subsearches while preserving as much phase information as possible, and a hierarchical search which interpolates between the coherent and semicoherent limits. Taken together, these algorithms improve the computational cost of pulsar search by several orders of magnitude. In this paper, we consider the simple case of a constant-acceleration phase model, but our methods should generalize to more complex search spaces.
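    The constant-period baseline the paper starts from can be pictured as a brute-force fold-and-score scan over trial periods. The statistic and grid below are toy choices for illustration, not the paper's optimal algorithms:

    ```python
    def fold_power(times, period, nbins=16):
        """Fold event arrival times at a trial period and measure how
        strongly they concentrate in phase (a crude chi-square-like
        statistic: large when one phase bin dominates)."""
        bins = [0] * nbins
        for t in times:
            bins[int((t % period) / period * nbins) % nbins] += 1
        mean = len(times) / nbins
        return sum((b - mean) ** 2 for b in bins) / mean

    def period_search(times, candidates):
        """Brute-force constant-period search: score every trial period on
        a grid and keep the best. The cost of scans like this, over far
        larger grids and richer phase models, is what the paper's coherent
        tree and semicoherent searches reduce."""
        return max(candidates, key=lambda p: fold_power(times, p))

    # Pulses every 0.5 s (plus two stragglers) should pick out p = 0.5.
    times = [0.5 * n for n in range(1, 40)] + [0.13, 0.77]
    print(period_search(times, [0.3, 0.4, 0.5, 0.6]))  # 0.5
    ```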

  19. Advancements in Catheter-Directed Ultrasound-Accelerated Thrombolysis

    Doomernik, Denise E.; Schrijver, A. Marjolein; Zeebregts, Clark J.; de Vries, Jean-Paul P. M.; Reijnen, Michel M. P. J.


    Purpose: To review all available literature on catheter-directed ultrasound-accelerated thrombolysis for peripheral artery occlusions, stroke, deep venous thrombosis, and pulmonary embolism. Methods: A systematic literature search was performed, using MEDLINE, EMBASE and Cochrane databases. A total

  20. Advancements in catheter-directed ultrasound-accelerated thrombolysis.

    Doomernik, D.E.; Schrijver, A.M.; Zeebregts, C.J.A.; Vries, J.P. de; Reijnen, M.M.P.J.


    PURPOSE: To review all available literature on catheter-directed ultrasound-accelerated thrombolysis for peripheral artery occlusions, stroke, deep venous thrombosis, and pulmonary embolism. METHODS: A systematic literature search was performed, using MEDLINE, EMBASE and Cochrane databases. A total

  1. Similarity transformed semiclassical dynamics

    Van Voorhis, Troy; Heller, Eric J.


    In this article, we employ a recently discovered criterion for selecting important contributions to the semiclassical coherent state propagator [T. Van Voorhis and E. J. Heller, Phys. Rev. A 66, 050501 (2002)] to study the dynamics of many dimensional problems. We show that the dynamics are governed by a similarity transformed version of the standard classical Hamiltonian. In this light, our selection criterion amounts to using trajectories generated with the untransformed Hamiltonian as approximate initial conditions for the transformed boundary value problem. We apply the new selection scheme to some multidimensional Henon-Heiles problems and compare our results to those obtained with the more sophisticated Herman-Kluk approach. We find that the present technique gives near-quantitative agreement with the standard results, but that the amount of computational effort is less than Herman-Kluk requires even when sophisticated integral smoothing techniques are employed in the latter.

  2. OpenMP for Accelerators

    Beyer, J C; Stotzer, E J; Hart, A; de Supinski, B R


    OpenMP [13] is the dominant programming model for shared-memory parallelism in C, C++ and Fortran due to its easy-to-use directive-based style, portability and broad support by compiler vendors. Similar characteristics are needed for a programming model for devices such as GPUs and DSPs that are gaining popularity to accelerate compute-intensive application regions. This paper presents extensions to OpenMP that provide that programming model. Our results demonstrate that a high-level programming model can provide accelerated performance comparable to hand-coded implementations in CUDA.

  3. Piezoelectric particle accelerator

    Kemp, Mark A.; Jongewaard, Erik N.; Haase, Andrew A.; Franzi, Matthew


    A particle accelerator is provided that includes a piezoelectric accelerator element, where the piezoelectric accelerator element includes a hollow cylindrical shape, and an input transducer, where the input transducer is disposed to provide an input signal to the piezoelectric accelerator element, where the input signal induces a mechanical excitation of the piezoelectric accelerator element, where the mechanical excitation is capable of generating a piezoelectric electric field proximal to an axis of the cylindrical shape, where the piezoelectric accelerator is configured to accelerate a charged particle longitudinally along the axis of the cylindrical shape according to the piezoelectric electric field.

  4. Prospects for searching axion-like particle dark matter with dipole, toroidal and wiggler magnets

    Baker, Oliver K. [Yale Univ., New Haven, CT (United States). Dept. of Physics; Betz, Michael; Caspers, Fritz [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Jaeckel, Joerg [Institute for Particle Physics Phenomenology, Durham (United Kingdom); Lindner, Axel; Ringwald, Andreas [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Semertzidis, Yannis [Brookhaven National Lab., Upton, NY (United States); Sikivie, Pierre [Florida Univ., Gainesville, FL (United States). Dept. of Physics; Zioutas, Konstantin [Patras Univ. (Greece)


    In this work we consider searches for dark matter made of axions or axion-like particles (ALPs) using resonant radio frequency cavities inserted into dipole magnets from particle accelerators, wiggler magnets developed for accelerator based advanced light sources, and toroidal magnets similar to those used in particle physics detectors. We investigate the expected sensitivity of such ALP dark matter detectors and discuss the engineering aspects of building and tuning them. Brief mention is also made of even stronger field magnets that are becoming available due to improvements in magnetic technology. It is concluded that new experiments utilizing already existing magnets could greatly enlarge the mass region in searches for axion-like dark matter particles. (orig.)

  5. Autonomous search

    Hamadi, Youssef; Saubion, Frédéric


    Autonomous combinatorial search (AS) represents a new field in combinatorial problem solving. Its major standpoint and originality is that it considers that problem solvers must be capable of self-improvement operations. This is the first book dedicated to AS.

  6. Acceleration without Horizons

    Doria, Alaric


    We derive the metric of an accelerating observer moving with non-constant proper acceleration in flat spacetime. With the exception of a limiting case representing a Rindler observer, there are no horizons. In our solution, observers can accelerate to any desired terminal speed $v_{\\infty} < c$. The motion of the accelerating observer is completely determined by the distance of closest approach and terminal velocity or, equivalently, by an acceleration parameter and terminal velocity.

  7. Accelerating flight: Edge with arbitrary acceleration

    Gledhill, Irvy MA


    Full Text Available [Abstract consists of presentation slides contrasting temporal scales (Euler, convection, Reynolds, translational viscous/Ekman, rotational viscous) with acceleration terms (translational acceleration related to g, rotational acceleration, Rossby, Coriolis, centrifugal, gravitational). CSIR 2009]

  8. Simrank: Rapid and sensitive general-purpose k-mer search tool

    DeSantis, T.Z.; Keller, K.; Karaoz, U.; Alekseyenko, A.V; Singh, N.N.S.; Brodie, E.L; Pei, Z.; Andersen, G.L; Larsen, N.


    Terabyte-scale collections of string-encoded data are expected from consortia efforts such as the Human Microbiome Project. Intra- and inter-project data similarity searches are enabled by rapid k-mer matching strategies. Software applications for sequence database partitioning, guide tree estimation, molecular classification and alignment acceleration have benefited from embedded k-mer searches as sub-routines. However, a rapid, general-purpose, open-source, flexible, stand-alone k-mer tool has not been available. Here we present a stand-alone utility, Simrank, which allows users to rapidly identify the database strings most similar to query strings. Performance testing of Simrank and related tools against DNA, RNA, protein and human-language datasets found Simrank 10X to 928X faster depending on the dataset. Simrank provides molecular ecologists with a high-throughput, open-source choice for comparing large sequence sets to find similarity.
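    The core of a k-mer matching filter of this kind can be sketched in a few lines. The score and the choice of k below are illustrative assumptions, not Simrank's exact implementation:

    ```python
    def kmers(seq, k=7):
        """All overlapping k-mers of a string (k = 7 is an assumption here,
        not necessarily Simrank's default)."""
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}

    def kmer_similarity(query, subject, k=7):
        """Fraction of the query's k-mers found in the subject: the kind of
        score a k-mer filter ranks database strings by, with no alignment."""
        q = kmers(query, k)
        return len(q & kmers(subject, k)) / max(len(q), 1)

    db = {"s1": "ACGTACGTGGTTAACC", "s2": "TTTTGGGGCCCCAAAA"}
    query = "ACGTACGTGGTT"
    ranked = sorted(db, key=lambda name: kmer_similarity(query, db[name]),
                    reverse=True)
    print(ranked[0])  # s1 shares all of the query's k-mers; s2 shares none
    ```

    Because set intersection replaces character-by-character alignment, scans like this stay fast even over very large string collections, which is why k-mer matching serves as a sub-routine in partitioning and classification tools.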

  9. Applications of accelerator mass spectrometry to nuclear physics and astrophysics

    Guo, Z Y


    As an ultra-sensitive analytical method, accelerator mass spectrometry (AMS) plays an important role in studies of nuclear physics and astrophysics. AMS applications in the search for a violation of the Pauli exclusion principle and in the study of supernovae are discussed as examples.

  10. Radiative damping in plasma-based accelerators

    Kostyukov, I. Yu.; Nerush, E. N.; Litvak, A. G.


    The electrons accelerated in a plasma-based accelerator undergo betatron oscillations and emit synchrotron radiation. The energy loss to synchrotron radiation may seriously affect electron acceleration. The electron dynamics under the combined influence of a constant accelerating force and the classical radiation reaction force is studied. It is shown that electron acceleration cannot be limited by radiation reaction. If initially the accelerating force was stronger than the radiation reaction force, then the electron acceleration is unlimited. Otherwise the electron is decelerated by radiative damping up to a certain instant of time and then accelerated without limit. It is shown that, regardless of the initial conditions, the infinite-time asymptotic behavior of an electron is governed by a self-similar solution in which the radiative damping becomes exactly equal to 2/3 of the accelerating force. The relative energy spread induced by the radiative damping decreases with time in the infinite-time limit. Multistage schemes operating in the asymptotic acceleration regime, where electron dynamics is determined by the radiation reaction, are discussed.

  11. 2014 CERN Accelerator Schools: Plasma Wake Acceleration


    A specialised school on Plasma Wake Acceleration will be held at CERN, Switzerland from 23-29 November, 2014.   This course will be of interest to staff and students in accelerator laboratories, university departments and companies working in or having an interest in the field of new acceleration techniques. Following introductory lectures on plasma and laser physics, the course will cover the different components of a plasma wake accelerator and plasma beam systems. An overview of the experimental studies, diagnostic tools and state of the art wake acceleration facilities, both present and planned, will complement the theoretical part. Topical seminars and a visit of CERN will complete the programme. Further information can be found at:

  12. Integrated Semantic Similarity Model Based on Ontology

    LIU Ya-Jun; ZHAO Yun


    To solve the problem of the inadequacy of semantic processing in an intelligent question answering system, an integrated semantic similarity model which calculates the semantic similarity using geometric distance and information content is presented in this paper. With the help of the interrelationships between concepts, the information content of concepts, and the strength of the edges in the ontology network, we can calculate the semantic similarity between two concepts and provide information for the further calculation of the semantic similarity between a user's question and the answers in the knowledge base. The results of experiments on the prototype show that the semantic problem in natural language processing can also be solved with the help of the knowledge and the abundant semantic information in the ontology. More than 90% accuracy with less than 50 ms average search time has been reached in the intelligent question answering prototype system based on ontology. The result is very satisfactory.
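    One standard way to turn ontology structure and corpus statistics into a similarity score is Lin's information-content measure, sketched below on a toy taxonomy. The paper's integrated model additionally folds in geometric (edge) distance, which this sketch omits; the taxonomy and counts are invented for illustration:

    ```python
    import math

    # Toy ontology: child -> parent, plus corpus counts per concept.
    parent = {"dog": "mammal", "cat": "mammal", "mammal": "animal",
              "bird": "animal", "animal": None}
    count = {"dog": 30, "cat": 30, "bird": 20, "mammal": 80, "animal": 120}
    TOTAL = count["animal"]

    def ic(c):
        """Information content -log p(c): rarer concepts are more informative."""
        return -math.log(count[c] / TOTAL)

    def ancestors(c):
        out = []
        while c is not None:
            out.append(c)
            c = parent[c]
        return out

    def lin_similarity(a, b):
        """Lin's IC-based similarity: ratio of the information shared via the
        most specific common ancestor to the concepts' total information."""
        common = [c for c in ancestors(a) if c in set(ancestors(b))]
        lcs = max(common, key=ic)        # most specific shared ancestor
        return 2 * ic(lcs) / (ic(a) + ic(b))

    print(lin_similarity("dog", "cat") > lin_similarity("dog", "bird"))  # True
    ```

    Scores like this, computed between question terms and candidate-answer terms, are one way such a system can rank answers in the knowledge base.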

  13. BIOCONAID System (Bionic Control of Acceleration Induced Dimming). Final Report.

    Rogers, Dana B.; And Others

    The system described represents a new technique for enhancing the fidelity of flight simulators during high acceleration maneuvers. This technique forces the simulator pilot into active participation and energy expenditure similar to the aircraft pilot undergoing actual accelerations. The Bionic Control of Acceleration Induced Dimming (BIOCONAID)…

  14. MINOS Sterile Neutrino Search

    Koskinen, David Jason [Univ. College London, Bloomsbury (United Kingdom)


    The Main Injector Neutrino Oscillation Search (MINOS) is a long-baseline accelerator neutrino experiment designed to measure properties of neutrino oscillation. Using a high intensity muon neutrino beam, produced by the Neutrinos at the Main Injector (NuMI) complex at Fermilab, MINOS makes two measurements of neutrino interactions. The first measurement is made using the Near Detector situated at Fermilab and the second is made using the Far Detector located in the Soudan Underground Laboratory in northern Minnesota. The primary goal of MINOS is to verify, and measure the properties of, neutrino oscillation between the two detectors using the νμ → ντ transition. A complementary measurement can be made to search for the existence of sterile neutrinos; an oft-theorized, but experimentally unvalidated particle. The following thesis will show the results of a sterile neutrino search using MINOS RunI and RunII data totaling ~2.5 × 10^20 protons on target. Due to the theoretical nature of sterile neutrinos, complete formalism that covers transition probabilities for the three known active states with the addition of a sterile state is also presented.

  15. Face Search at Scale.

    Wang, Dayong; Otto, Charles; Jain, Anil K


    …persons of interest among the billions of shared photos on these websites. Despite significant progress in face recognition, searching a large collection of unconstrained face images remains a difficult problem. To address this challenge, we propose a face search system which combines a fast search procedure, coupled with a state-of-the-art commercial off-the-shelf (COTS) matcher, in a cascaded framework. Given a probe face, we first filter the large gallery of photos to find the top-k most similar faces using features learned by a convolutional neural network. The k retrieved candidates are re-ranked by combining similarities based on deep features and those output by the COTS matcher. We evaluate the proposed face search system on a gallery containing 80 million web-downloaded face images. Experimental results demonstrate that while the deep features perform worse than the COTS matcher on a mugshot dataset (93.7% vs. 98.6% TAR@FAR of 0.01%), fusing the deep features with the COTS matcher improves the overall performance (99.5% TAR@FAR of 0.01%). This shows that the learned deep features provide complementary information over representations used in state-of-the-art face matchers. On the unconstrained face image benchmarks, the performance of the learned deep features is competitive with reported accuracies. LFW database: 98.20% accuracy under the standard protocol and 88.03% TAR@FAR of 0.1% under the BLUFR protocol; IJB-A benchmark: 51.0% TAR@FAR of 0.1% (verification), rank 1 retrieval of 82.2% (closed-set search), 61.5% FNIR@FAR of 1% (open-set search). The proposed face search system offers an excellent trade-off between accuracy and scalability on galleries with millions of images.
Additionally, in a face search experiment involving photos of the Tsarnaev brothers, convicted of the Boston Marathon bombing, the proposed cascade face search system could find the younger brother's (Dzhokhar Tsarnaev) photo at rank 1 in 1 second on a 5M gallery and at rank 8 in 7
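The cascade described in this record can be sketched as a two-stage search: a cheap deep-feature filter over the whole gallery, then fused re-ranking of the short list. The array shapes, the `cots_score` callable, and the fusion weight `alpha` are illustrative assumptions, not the paper's actual interface:

```python
import numpy as np

def cascaded_search(probe_feat, gallery_feats, cots_score, k=100, alpha=0.5):
    """Two-stage cascade: cheap deep-feature filter, then fused re-ranking.

    probe_feat    : (d,) L2-normalised deep feature of the probe face
    gallery_feats : (N, d) L2-normalised gallery features
    cots_score    : callable(gallery_index) -> COTS matcher similarity
    Returns gallery indices of the top-k candidates, best first.
    """
    # Stage 1: cosine similarity against the whole gallery (dot product
    # of unit vectors); keep only the k most similar candidates.
    sims = gallery_feats @ probe_feat
    top_k = np.argpartition(-sims, min(k, len(sims)) - 1)[:k]

    # Stage 2: re-rank the short list by fusing deep and COTS similarities.
    fused = [alpha * sims[i] + (1 - alpha) * cots_score(i) for i in top_k]
    order = np.argsort(fused)[::-1]
    return top_k[order]
```

Only the k candidates surviving stage 1 ever reach the expensive COTS matcher, which is what makes the cascade scale to galleries of millions of images.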

  16. High Energy Particle Accelerators

    Audio Productions, Inc, New York


    Film about the different particle accelerators in the US. Nuclear research in the US has developed into a broad and well-balanced program. Tour of accelerator installations, accelerator development work now in progress, and a number of typical experiments with high-energy particles. Brookhaven, Cosmotron. Univ. Calif. Berkeley, Bevatron. Anti-proton experiment. Negative K meson experiment. Bubble chambers. A section on an electron accelerator. Projection of new accelerators. Princeton/Penn. build proton synchrotron. Argonne National Lab. Brookhaven, PS construction. Cambridge Electron Accelerator; Harvard/MIT. SLAC studying a linear accelerator. Other research at Madison, Wisconsin: Fixed Field Alternating Gradient focusing (FFAG). Oak Ridge, Tenn., cyclotron. Two-beam machine. Comments: Interesting overview of high-energy particle accelerator installations in the US in these early years.

  17. Improved plasma accelerator

    Cheng, D. Y.


    Converging, coaxial accelerator electrode configuration operates in vacuum as plasma gun. Plasma forms by periodic injections of high pressure gas that is ionized by electrical discharges. Deflagration mode of discharge provides acceleration, and converging contours of plasma gun provide focusing.

  18. Accelerator Technology Division


    In fiscal year (FY) 1991, the Accelerator Technology (AT) division continued fulfilling its mission to pursue accelerator science and technology and to develop new accelerator concepts for application to research, defense, energy, industry, and other areas of national interest. This report discusses the following programs: The Ground Test Accelerator Program; APLE Free-Electron Laser Program; Accelerator Transmutation of Waste; JAERI, OMEGA Project, and Intense Neutron Source for Materials Testing; Advanced Free-Electron Laser Initiative; Superconducting Super Collider; The High-Power Microwave Program; (Phi) Factory Collaboration; Neutral Particle Beam Power System Highlights; Accelerator Physics and Special Projects; Magnetic Optics and Beam Diagnostics; Accelerator Design and Engineering; Radio-Frequency Technology; Free-Electron Laser Technology; Accelerator Controls and Automation; Very High-Power Microwave Sources and Effects; and GTA Installation, Commissioning, and Operations.

  19. Accelerators, Colliders, and Snakes

    Courant, Ernest D.


    The author traces his involvement in the evolution of particle accelerators over the past 50 years. He participated in building the first billion-volt accelerator, the Brookhaven Cosmotron, which led to the introduction of the "strong-focusing" method that has in turn led to the very large accelerators and colliders of the present day. The problems of acceleration of spin-polarized protons are also addressed, with discussions of depolarizing resonances and "Siberian snakes" as a technique for mitigating these resonances.

  20. Internet Search Engines

    Fatmaa El Zahraa Mohamed Abdou


    A general study of internet search engines covering seven main points: the difference between search engines and search directories, the components of search engines, the percentage of sites covered by search engines, the cataloging of sites, the time needed for sites to appear in search engines, search capabilities, and types of search engines.

  2. Perceptual training for visual search.

    Schuster, David; Rivera, Javier; Sellers, Brittany C; Fiore, Stephen M; Jentsch, Florian


    People are better at visual search than the best fully automated methods. Despite this, visual search remains a difficult perceptual task. The goal of this investigation was to experimentally test the ways in which visual search performance could be improved through two categories of training interventions: perceptual training and conceptual training. To determine the effect of each training on a later performance task, the two types of training were manipulated using a between-subjects design (conceptual vs. perceptual × training present vs. training absent). Perceptual training led to speed and accuracy improvements in visual search. Issues with the design and administration of the conceptual training limited conclusions on its effectiveness but provided useful lessons for conceptual training design. The results suggest that when the visual search task involves detecting heterogeneous or otherwise unpredictable stimuli, perceptual training can improve both the speed and accuracy of visual search; likewise, careful consideration of the performance task and training design is required to evaluate the effectiveness of conceptual training. Visual search is a difficult yet critical task in industries such as baggage screening and radiology, where these findings apply.

  3. The CERN Accelerator School


    Introduction to accelerator physics: the CERN Accelerator School's Introduction to Accelerator Physics course, which was to have taken place in Istanbul, Turkey, later this year, has now been relocated to Budapest, Hungary. Further details regarding the new hotel and dates will be made available as soon as possible, on a new Indico site, at the end of May.

  4. Far field acceleration

    Fernow, R.C.


    Far fields are propagating electromagnetic waves far from their source, boundary surfaces, and free charges. The general principles governing the acceleration of charged particles by far fields are reviewed. A survey of proposed field configurations is given. The two most important schemes, Inverse Cerenkov acceleration and Inverse free electron laser acceleration, are discussed in detail.

  5. Acceleration: It's Elementary

    Willis, Mariam


    Acceleration is one tool for providing high-ability students the opportunity to learn something new every day. Some people talk about acceleration as taking a student out of step. In actuality, what one is doing is putting a student in step with the right curriculum. Whole-grade acceleration, also called grade-skipping, usually happens between…

  6. Textual and chemical information processing: different domains but similar algorithms

    Peter Willett


    Full Text Available This paper discusses the extent to which algorithms developed for the processing of textual databases are also applicable to the processing of chemical structure databases, and vice versa. Applications discussed include: an algorithm for distribution sorting that has been applied to the design of screening systems for rapid chemical substructure searching; the use of measures of inter-molecular structural similarity for the analysis of hypertext graphs; a genetic algorithm for calculating term weights for relevance feedback searching and for determining whether a molecule is likely to exhibit biological activity; and the use of data fusion to combine the results of different chemical similarity searches.
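The data-fusion idea in the final clause — combining the ranked outputs of different similarity searches — can be sketched with simple sum-of-ranks fusion. This is one common fusion rule chosen for illustration; the paper's exact rule is not specified here:

```python
def fuse_rankings(rankings):
    """Data fusion of ranked hit lists from different similarity searches.

    Each list in `rankings` is ordered best-first. An item's fused score
    is the sum of its rank positions across all lists (lower is better);
    items absent from a list are penalised with that list's length.
    """
    universe = set().union(*rankings)
    scores = {item: 0 for item in universe}
    for ranking in rankings:
        pos = {item: r for r, item in enumerate(ranking)}
        for item in universe:
            scores[item] += pos.get(item, len(ranking))
    # Best fused rank first.
    return sorted(universe, key=scores.get)
```

For example, fusing `["a", "b", "c"]` with `["b", "a", "c"]` keeps `c` last, since both searches agree it is the weakest hit.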

  7. Conjunctive Wildcard Search over Encrypted Data

    Bösch, Christoph; Brinkman, Richard; Hartel, Pieter; Jonker, Willem; Jonker, Willem; Petkovic, Milan


    Searchable encryption allows a party to search over encrypted data without decrypting it. Prior schemes in the symmetric setting deal only with exact or similar keyword matches. We describe a scheme for the problem of wildcard searches over encrypted data to make search queries more flexible, provid

  8. Low voltage electron beam accelerators

    Ochi, Masafumi [Iwasaki Electric Co., Ltd., Tokyo (Japan)


    The electron accelerators widely used in industry are electron beam processors with acceleration voltages of 300 kV or less. Typical examples are given of manufacturers in Japan, equipment configuration, operation, determination of process parameters, and the basic maintenance requirements of electron beam processors. New electron beam processors with acceleration voltages around 100 kV have been introduced by ESI (Energy Science Inc., USA, Iwasaki Electric Group) that maintain a relatively high dose-speed capability of around 10,000 kGy x mpm in production. Application fields such as printing and coating for packaging require treating thicknesses of 30 microns or less and do not require high voltages over 110 kV. Also recently developed is a miniature bulb-type electron beam tube with energy less than 60 kV, for which new application areas are being explored. The driving force for this technology to spread in industry will be further development of new applications, processes, and markets, as well as price reduction of the equipment, upon which further acknowledgement and acceptance of the technology by society and industry will depend. (Y. Tanaka)

  9. Notions of similarity for computational biology models

    Waltemath, Dagmar


    Computational models used in biology are rapidly increasing in complexity, size, and number. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may vary greatly. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarity, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.
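A minimal sketch of the aspect-weighted similarity the abstract proposes, under two stated assumptions: each model is reduced to a set of features per aspect (annotations, parameters, etc.), and Jaccard overlap serves as a stand-in per-aspect measure (the paper leaves the measures open):

```python
def jaccard(a, b):
    """Jaccard similarity of two feature sets (1.0 when both are empty)."""
    return len(a & b) / len(a | b) if (a or b) else 1.0

def model_similarity(model_a, model_b, weights):
    """Overall model similarity as a weighted mean of per-aspect scores.

    Each model is a dict mapping aspect name -> set of features, e.g.
    {'annotations': {...}, 'parameters': {...}}. `weights` selects the
    problem-specific mix of aspects the abstract calls for.
    """
    total = sum(weights.values())
    score = sum(w * jaccard(model_a[aspect], model_b[aspect])
                for aspect, w in weights.items())
    return score / total
```

Changing `weights` lets the same comparison routine emphasise, say, shared biological annotations for retrieval but shared equations for version control.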

  10. The Accelerator Reliability Forum

    Lüdeke, Andreas; Giachino, R


    High reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator, one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution describes the forum and advertises its use in the community.

  11. Estimating similarity of XML Schemas using path similarity measure

    Veena Trivedi


    Full Text Available In this paper, an attempt has been made to develop an algorithm that estimates the similarity of XML Schemas using multiple similarity measures. To perform the task, the XML Schema element information is represented in string form and four different similarity-measure approaches are employed. To further improve the similarity measure, an overall similarity measure is also calculated. The approach used in this paper is distinctive in that it calculates the similarity between two XML Schemas using four approaches and gives an integrated value for the similarity measure.
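The general idea — scoring a pair of element paths with several string measures and averaging them into an integrated value — can be sketched as below. The three measures here are illustrative stand-ins, not the four used in the paper:

```python
from difflib import SequenceMatcher

def ngram_jaccard(s, t, n=2):
    """Jaccard overlap of character n-grams of two strings."""
    a = {s[i:i + n] for i in range(len(s) - n + 1)}
    b = {t[i:i + n] for i in range(len(t) - n + 1)}
    return len(a & b) / len(a | b) if a | b else 1.0

def prefix_ratio(s, t):
    """Length of the common leading prefix over the longer length."""
    m = 0
    for x, y in zip(s, t):
        if x != y:
            break
        m += 1
    return m / max(len(s), len(t), 1)

def path_similarity(p, q):
    """Integrated similarity of two XML Schema element paths:
    the mean of several independent string similarity measures."""
    measures = [SequenceMatcher(None, p, q).ratio(),
                ngram_jaccard(p, q),
                prefix_ratio(p, q)]
    return sum(measures) / len(measures)
```

Identical paths score 1.0; unrelated paths score low on all component measures, so the integrated value degrades gracefully rather than flipping on a single measure's quirks.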

  12. Industrial Application of Accelerators

    CERN. Geneva


    At CERN, we are very familiar with large, high-energy particle accelerators. However, in the world outside CERN, there are more than 35,000 accelerators used for applications ranging from treating cancer, through making better electronics, to removing harmful micro-organisms from food and water. These are responsible for around $0.5T of commerce each year. Almost all operate below 20 MeV, and most use accelerator types that are somewhat different from those at CERN. These lectures will describe some of the most common applications, some of the newer applications in development, and the accelerator technology used for them. They will also show examples of where technology developed for particle physics is now being studied for these applications. Rob Edgecock is a Professor of Accelerator Science, with a particular interest in the medical applications of accelerators. He works jointly for the STFC Rutherford Appleton Laboratory and the International Institute for Accelerator Applications at the Univer...

  14. Acceleration in astrophysics

    Colgate, S.A.


    The origin of cosmic rays and applicable laboratory experiments are discussed. Some of the problems of shock acceleration for the production of cosmic rays are discussed in the context of astrophysical conditions. These are: the presumed unique explanation of the power-law spectrum is shown instead to be a universal property of all lossy accelerators; the extraordinary isotropy of cosmic rays and the limited diffusion distances implied by supernova-induced shock acceleration require a more frequent and space-filling source than supernovae; the near-perfect adiabaticity of strong hydromagnetic turbulence necessary for reflecting the accelerated particles (each doubling in energy requiring roughly 10^5 to 10^6 scatterings with negligible energy loss) seems most unlikely; the evidence for acceleration due to quasi-parallel heliosphere shocks is weak, with little evidence for the expected strong hydromagnetic turbulence and, instead, only a small number of particles accelerating after only a few shock traversals; and the acceleration of electrons in the same collisionless shock that accelerates ions is difficult to reconcile with the theoretical picture of strong hydromagnetic turbulence that reflects the ions. The hydromagnetic turbulence will appear adiabatic to the electrons at their much higher Larmor frequency, so the electrons should not be scattered incoherently as they must be for acceleration. Therefore the electrons must be accelerated by a different mechanism. This is unsatisfactory, because wherever electrons are accelerated, these sites, observed in radio emission, may accelerate ions more favorably. The acceleration is coherent provided the reconnection is coherent, in which case the total flux, as for example of collimated radio sources, predicts single-charge accelerated energies much greater than observed.

  15. Search for nu_mu → nu_e oscillations

    Astier, P.; Autiero, D.; Baldisseri, A.; Baldo-Ceolin, M.; Banner, M.; Bassompierre, G.; Benslama, K.; Besson, N.; Bird, I.; Blumenfeld, B.; Bobisut, F.; J. Bouchez; Boyd, S.; A. Bueno; Bunyatov, S.


    We present the results of a search for nu_mu → nu_e oscillations in the NOMAD experiment at CERN. The experiment looked for the appearance of nu_e in a predominantly nu_mu wide-band neutrino beam at the CERN SPS. No evidence for oscillations was found. The 90% confidence limits obtained are Delta m^2 ~ 10 eV^2.

  16. Search for nu_mu → nu_e oscillations

    Astier, Pierre; Baldisseri, Alberto; Baldo-Ceolin, Massimilla; Banner, M; Bassompierre, Gabriel; Benslama, K; Besson, N; Bird, I; Blumenfeld, B; Bobisut, F; Bouchez, J; Boyd, S; Bueno, A G; Bunyatov, S A; Camilleri, L L; Cardini, A; Cattaneo, Paolo Walter; Cavasinni, V; Cervera-Villanueva, A; Challis, R C; Chukanov, A; Collazuol, G; Conforto, G; Conta, C; Contalbrigo, M; Cousins, R D; Daniels, D; De Santo, A; Degaudenzi, H M; Del Prete, T; Di Lella, L; Dignan, T; Dumarchez, J; Feldman, G J; Ferrari, A; Ferrari, R; Ferrère, D; Flaminio, Vincenzo; Fraternali, M; Gaillard, J M; Gangler, E; Geiser, A; Geppert, D; Gibin, D; Gninenko, S N; Godley, A; Gosset, J; Gouanère, M; Grant, A; Graziani, G; Guglielmi, A M; Gómez-Cadenas, J J; Gössling, C; Hagner, C; Hernando, J; Hong, T M; Hubbard, D B; Hurst, P; Hyett, N; Iacopini, E; Joseph, C L; Juget, F R; Kent, N; Kirsanov, M M; Klimov, O; Kokkonen, J; Kovzelev, A; Krasnoperov, A V; Kustov, D; La Rotonda, L; Lacaprara, S; Lachaud, C; Lakic, B; Lanza, A; Laveder, M; Letessier-Selvon, A A; Linssen, Lucie; Ljubicic, A; Long, J; Lupi, A; Lévy, J M; Marchionni, A; Martelli, F; Mendiburu, J P; Meyer, J P; Mezzetto, Mauro; Mishra, S R; Moorhead, G F; Méchain, X; Naumov, D V; Nefedov, Yu A; Nguyen-Mau, C; Nédélec, P; Orestano, D; Pastore, F; Peak, L S; Pennacchio, E; Pessard, H; Petti, R; Placci, A; Polesello, G; Pollmann, D; Polyarush, A Yu; Popov, B; Poulsen, C; Rebuffi, L; Renò, R; Rico, J; Riemann, P; Roda, C; Rubbia, André; Salvatore, F; Schahmaneche, K; Schmidt, B; Schmidt, T; Sconza, A; Sevior, M E; Shih, D; Sillou, D; Soler, F J P; Sozzi, G; Steele, D; Stiegler, U; Stipcevic, M; Stolarczyk, T; Tareb-Reyes, M; Taylor, G N; Tereshchenko, V V; Toropin, A N; Touchard, A M; Tovey, Stuart N; Tran, M T; Tsesmelis, E; Ulrichs, J; Vacavant, L; Valdata-Nappi, M; Valuev, V Yu; Vannucci, François; Varvell, K E; Veltri, M; Vercesi, V; Vidal-Sitjes, G; Vieira, J M; Vinogradova, T G; Weber, F V; Weisse, T; Wilson, F F; Winton, L J; 
Yabsley, B D; Zaccone, Henri; Zuber, K; Zuccon, P; do Couto e Silva, E


    We present the results of a search for nu_mu → nu_e oscillations in the NOMAD experiment at CERN. The experiment looked for the appearance of nu_e in a predominantly nu_mu wide-band neutrino beam at the CERN SPS. No evidence for oscillations was found. The 90% confidence limits obtained are Delta m^2 ~ 10 eV^2.

  17. Search for nu_mu → nu_e oscillations

    Astier, Pierre; Baldisseri, Alberto; Baldo-Ceolin, Massimilla; Banner, M; Bassompierre, Gabriel; Benslama, K; Besson, N; Bird, I; Blumenfeld, B; Bobisut, F; Bouchez, J; Boyd, S; Bueno, A G; Bunyatov, S; Camilleri, L L; Cardini, A; Cattaneo, Paolo Walter; Cavasinni, V; Cervera-Villanueva, A; Challis, R C; Chukanov, A; Collazuol, G; Conforto, G; Conta, C; Contalbrigo, M; Cousins, R; Daniels, D; Degaudenzi, H M; Del Prete, T; De Santo, A; Dignan, T; Di Lella, L; do Couto e Silva, E; Dumarchez, J; Ellis, M; Feldman, G J; Ferrari, R; Ferrère, D; Flaminio, Vincenzo; Fraternali, M; Gaillard, J M; Gangler, E; Geiser, A; Geppert, D; Gibin, D; Gninenko, S N; Godley, A; Gómez-Cadenas, J J; Gosset, J; Gössling, C; Gouanère, M; Grant, A; Graziani, G; Guglielmi, A M; Hagner, C; Hernando, J A; Hubbard, D B; Hurst, P; Hyett, N; Iacopini, E; Joseph, C L; Juget, F R; Kent, N; Kirsanov, M M; Klimov, O; Kokkonen, J; Kovzelev, A; Krasnoperov, A V; Kustov, D; Lacaprara, S; Lachaud, C; Lakic, B; Lanza, A; La Rotonda, L; Laveder, M; Letessier-Selvon, A A; Lévy, J M; Linssen, Lucie; Ljubicic, A; Long, J; Lupi, A; Marchionni, A; Martelli, F; Méchain, X; Mendiburu, J P; Meyer, J P; Mezzetto, Mauro; Mishra, S R; Moorhead, G F; Naumov, D V; Nédélec, P; Nefedov, Yu A; Nguyen-Mau, C; Orestano, D; Pastore, F; Peak, L S; Pennacchio, E; Pessard, H; Petti, R; Placci, A; Polesello, G; Pollmann, D; Polyarush, A Yu; Popov, B; Poulsen, C; Rebuffi, L; Renò, R; Rico, J; Riemann, P; Roda, C; Rubbia, André; Salvatore, F; Schahmaneche, K; Schmidt, B; Schmidt, T; Sconza, A; Sevior, M E; Sillou, D; Soler, F J P; Sozzi, G; Steele, D; Stiegler, U; Stipcevic, M; Stolarczyk, T; Tareb-Reyes, M; Taylor, G; Tereshchenko, V V; Toropin, A N; Touchard, A M; Tovey, Stuart N; Tran, M T; Tsesmelis, E; Ulrichs, J; Vacavant, L; Valdata-Nappi, M; Valuev, V Y; Vannucci, François; Varvell, K E; Veltri, M; Vercesi, V; Vidal-Sitjes, G; Vieira, J M; Vinogradova, T G; Weber, F V; Weisse, T; Wilson, F F; Winton, L J; Yabsley, B 
D; Zaccone, Henri; Zuber, K; Zuccon, P


    We present the results of a search for nu_mu → nu_e oscillations in the NOMAD experiment at CERN. The experiment looked for the appearance of nu_e in a predominantly nu_mu wide-band neutrino beam at the CERN SPS. No evidence for oscillations was found. The 90% confidence limits obtained are Delta m^2 ~ 10 eV^2.

  18. Arabic Stemmer for Search Engines Information Retrieval

    Ahmed Khalid


    Full Text Available The Arabic language has a very different and more difficult structure than other languages because it is a very rich language with complex morphology. Many stemmers have been developed for Arabic, but they still have many weaknesses and problems, and there is still a lack of use of Arabic stemming in search engines. This paper introduces a rooted-word Arabic stemmer technique. The results of the introduced technique for six Arabic sentences are used in the widely used browsers Google Chrome, Internet Explorer, and Mozilla Firefox to check the effect of Arabic stemming on search in terms of the total number of pages found and the search-time ratio for the actual sentences and their stemmed forms. The results show that stemming Arabic words increases and accelerates search engine output.

  19. CAD-based neighbor search and bounding box algorithms for geometry navigation acceleration in Monte Carlo particle transport simulation

    陈珍平; 宋婧; 吴斌; 郝丽娟; 胡丽琴; 孙光耀


    Geometry navigation plays the most fundamental role in Monte Carlo particle transport simulation. It is mainly responsible for locating the geometry volume in which a particle resides and for computing the distance to the volume boundary along the particle trajectory during each particle history. Geometry navigation directly affects the run-time performance of the Monte Carlo particle transport simulation, especially for complicated fusion reactor models. Thus, two CAD-based geometry acceleration algorithms, the neighbor search and the bounding box, are presented for improving geometry navigation performance. The algorithms have been implemented in the Super Monte Carlo Simulation Program for Nuclear and Radiation Process (SuperMC). The fusion reactor FDS-II and ITER benchmark models have been tested to highlight the efficiency gains that can be achieved by using the acceleration algorithms. Testing results showed that the efficiency of the Monte Carlo simulation can be considerably enhanced, by 50% to 60%, with the acceleration algorithms. [Translated from the Chinese abstract:] Geometry tracking computes particle positions and track lengths in Monte Carlo particle transport calculations and is one of its key techniques. Because fusion reactor geometries are extremely complex, geometry tracking takes 30% to 80% of the total computation time, so the efficiency of the geometry tracking method is one of the decisive factors for the efficiency of fusion reactor Monte Carlo transport calculations. This paper proposes CAD-based neighbor-list and bounding-box acceleration methods, implemented in SuperMC, the Super Monte Carlo nuclear calculation and simulation software system independently developed by the FDS Team. The methods were numerically verified with the fusion reactor FDS-II and ITER models; the results show that they do not affect the computed results and improve efficiency by 50% to 60%, demonstrating their correctness and effectiveness.
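A bounding-box filter of this kind rests on a cheap ray-versus-box test that rules out volumes a particle track cannot hit, so the expensive exact boundary-distance computation is done only for surviving candidates. A minimal slab-method sketch (illustrative only, not SuperMC's implementation):

```python
def ray_hits_aabb(origin, direction, box_min, box_max, eps=1e-12):
    """Slab-method test of a ray against an axis-aligned bounding box.

    Returns the entry distance along the ray (0.0 if the origin is
    inside the box), or None if the ray misses the box entirely.
    """
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < eps:
            # Ray parallel to this slab pair: miss unless origin lies inside.
            if o < lo or o > hi:
                return None
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            if t1 > t2:
                t1, t2 = t2, t1
            t_near, t_far = max(t_near, t1), min(t_far, t2)
            if t_near > t_far:
                # Slab intervals do not overlap: the ray misses the box.
                return None
    return t_near
```

In a navigation loop, each candidate volume's box is tested first; combined with a neighbor list that restricts the candidates to volumes adjacent to the current one, most exact surface intersections are never computed.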

  20. Particle-accelerator decommissioning

    Opelka, J.H.; Mundis, R.L.; Marmer, G.J.; Peterson, J.M.; Siskind, B.; Kikta, M.J.


    Generic considerations involved in decommissioning particle accelerators are examined. There are presently several hundred accelerators operating in the United States that can produce material containing nonnegligible residual radioactivity. Residual radioactivity after final shutdown is generally short-lived induced activity and is localized in hot spots around the beam line. The decommissioning options addressed are mothballing, entombment, dismantlement with interim storage, and dismantlement with disposal. The recycle of components or entire accelerators following dismantlement is a definite possibility and has occurred in the past. Accelerator components can be recycled either immediately at accelerator shutdown or following a period of storage, depending on the nature of induced activation. Considerations of cost, radioactive waste, and radiological health are presented for four prototypic accelerators. Prototypes considered range from small accelerators having minimal amounts of radioactive mmaterial to a very large accelerator having massive components containing nonnegligible amounts of induced activation. Archival information on past decommissionings is presented, and recommendations concerning regulations and accelerator design that will aid in the decommissioning of an accelerator are given.

  1. Triplet Focusing for Recirculating Linear Muon Accelerators

    Keil, Eberhard


    Focusing by symmetrical triplets is studied for the linear accelerator lattices in recirculating muon accelerators with several passes, where the ratio of final to initial muon energy is about four. Triplet and FODO lattices are compared. At similar acceptance, triplet lattices have straight sections for the RF cavities that are about twice as long as in FODO lattices. For the same energy gain, the total lengths of the linear accelerators with triplet lattices are about the same as those with FODO lattices.

  2. On the structure of acceleration in turbulence

    Liberzon, A.; Lüthi, B.; Holzner, M.


    …vorticity. Geometrical alignments with respect to the vorticity vector and to the strain eigenvectors, and the curvature of Lagrangian trajectories and of streamlines, are studied in detail for the total acceleration and for its convective part. We discriminate the alignment features of total and convective acceleration statistics that are genuine features of turbulent nature from those of kinematic nature. We find pronounced alignment of acceleration with vorticity. Similarly, the total and especially the convective acceleration are predominantly aligned at 45° with the most stretching and compressing eigenvectors of the rate-of-strain tensor…

  3. On the structure of acceleration in turbulence

    Liberzon, Alex; Lüthi, Beat; Holzner, Markus; Ott, Søren; Berg, Jacob; Mann, Jakob


    Acceleration and spatial velocity gradients are obtained simultaneously in an isotropic turbulent flow via three-dimensional particle tracking velocimetry. We observe two distinct populations of intense acceleration events: one in flow regions of strong strain and another in regions of strong vorticity. Geometrical alignments with respect to the vorticity vector and to the strain eigenvectors, and the curvature of Lagrangian trajectories and of streamlines, are studied in detail for the total acceleration, a = Du/Dt, and for its convective part, a_c = (u·∇)u. We discriminate the alignment features of total and convective acceleration statistics that are genuine features of turbulent nature from those of kinematic nature. We find pronounced alignment of acceleration with vorticity. Similarly, a and especially a_c are predominantly aligned at 45° with the most stretching and compressing eigenvectors of the rate-of-strain tensor, respectively. Via autocorrelation functions of acceleration, conditioned on preferential directions, the vorticity vector field is found to play an important role as an ordering reference axis for acceleration orientation. Associating a velocity-acceleration structure function with an energy flux gives a clear indication that a strong energy flux occurs via compression in strain-dominated events and via stretching in vorticity-dominated events.

  4. Leaky Fermi accelerators

    Shah, Kushal; Rom-Kedar, Vered; Turaev, Dmitry


    A Fermi accelerator is a billiard with oscillating walls. A leaky accelerator interacts with an environment of an ideal gas at equilibrium by exchange of particles through a small hole on its boundary. Such interaction may heat the gas: we estimate the net energy flow through the hole under the assumption that the particles inside the billiard do not collide with each other and remain in the accelerator for a sufficiently long time. The heat production is found to depend strongly on the type of Fermi accelerator. An ergodic accelerator, i.e. one which has a single ergodic component, produces a weaker energy flow than a multi-component accelerator. Specifically, in the ergodic case the energy gain is independent of the hole size, whereas in the multi-component case the energy flow may be significantly increased by shrinking the hole size.

  5. Accelerator reliability workshop

    Hardy, L.; Duru, Ph.; Koch, J.M.; Revol, J.L.; Van Vaerenbergh, P.; Volpe, A.M.; Clugnet, K.; Dely, A.; Goodhew, D


    About 80 experts attended this workshop, which brought together all accelerator communities: accelerator-driven systems, X-ray sources, medical and industrial accelerators, spallation source projects (American and European), nuclear physics, etc. With newly proposed accelerator applications such as nuclear waste transmutation and the replacement of nuclear power plants, reliability has now become a number-one priority for accelerator designers. Every part of an accelerator facility, from cryogenic systems to data storage via RF systems, is concerned by reliability. This aspect is now taken into account in the design/budget phase, especially for projects whose goal is to reach no more than 10 interruptions per year. This document gathers the slides, but not the proceedings, of the workshop.

  6. Accelerator and radiation physics

    Basu, Samita; Nandy, Maitreyee


    "Accelerator and radiation physics" encompasses radiation shielding design and strategies for hadron therapy accelerators, neutron facilities and laser based accelerators. A fascinating article describes detailed transport theory and its application to radiation transport. Detailed information on planning and design of a very high energy proton accelerator can be obtained from the article on radiological safety of J-PARC. Besides safety for proton accelerators, the book provides information on radiological safety issues for electron synchrotron and prevention and preparedness for radiological emergencies. Different methods for neutron dosimetry including LET based monitoring, time of flight spectrometry, track detectors are documented alongwith newly measured experimental data on radiation interaction with dyes, polymers, bones and other materials. Design of deuteron accelerator, shielding in beam line hutches in synchrotron and 14 MeV neutron generator, various radiation detection methods, their characteriza...

  7. Pentaquark searches with ALICE

    Bobulska, Dana


    In this report we present the results of a data analysis searching for possible invariant-mass signals from pentaquarks in ALICE data. The analysis was based on filtered data from real p-Pb events at √s_NN = 5.02 TeV collected in 2013. The motivation for this project was the recent discovery of pentaquark states by the LHCb collaboration (the c̄cuud resonance P_c^+) [1]. The search for similar, not yet observed pentaquarks is an interesting research topic [2]. In this analysis we searched for an s̄suud pentaquark resonance P_s^+ and its possible decay channel to a φ meson and a proton. The ALICE detector is well suited for the search for certain candidates thanks to its low material budget and strong PID capabilities. Additionally, we might expect the production of such particles in ALICE, as in heavy-ion and proton-ion collisions the thermal models describe well the particle yields and ratios [3]. Therefore it is reasonable to expect other species of hadrons, including possible pentaquarks, to be produced w...

  8. Autonomous Search

    Hamadi, Youssef; Saubion, Frédéric


    Decades of innovations in combinatorial problem solving have produced better and more complex algorithms. These new methods are better because they can solve larger problems and address new application domains. They are also more complex, which means that they are hard to reproduce and often harder to fine-tune to the peculiarities of a given problem. This last point has created a paradox in which efficient tools are out of reach of practitioners. Autonomous search (AS) is a new research field defined precisely to address the above challenge. Its major strength and originality consist in the

  9. Search in

    Gaona Román, Alejandro


    "Search in" is an art installation comprising sculpture and video, with a conceptual background concerning identity. It is a work that invites spectators to walk around it and enter it, thereby seeing themselves as part of the work, just as the concept of identity can be experienced both as the sensation of an "I" separate from the world and as an "I" that is part of society. It takes us on a journey from our beginnings as a society and as conscious beings up to the present day, the era of c...

  10. Power Converters for Accelerators

    Visintini, R


    Particle accelerators use a great variety of power converters to energize their sub-systems. While the total number of power converters usually depends on the size of the accelerator or combination of accelerators (including the experimental setup), their characteristics depend on the loads and on the particle-physics requirements. This paper provides an overview of the magnet power converters in use at several facilities worldwide.

  11. Miniaturization Techniques for Accelerators

    Spencer, James E.


    The possibility of laser-driven accelerators [1] suggests the need for new structures based on micromachining and integrated-circuit technology because of the comparable scales. Thus, we are exploring fully integrated structures including sources, optics (for both light and particles) and acceleration in a common format: an accelerator-on-chip (AOC). Tests suggest a number of preferred materials and techniques but no technical or fundamental roadblocks at scales of order 1 µm or larger.

  12. Naked singularities as particle accelerators

    Patil, Mandar (DOI: 10.1103/PhysRevD.82.104049)


    We investigate particle acceleration by naked singularities to arbitrarily high center-of-mass energies. Recently it has been suggested that black holes could be used as particle accelerators to probe Planck-scale physics. We show that naked singularities serve the same purpose and probably would do better than their black hole counterparts. We focus on the scenario of a self-similar gravitational collapse starting from regular initial data, leading to the formation of a globally naked singularity. When particles moving along timelike geodesics interact and collide near the Cauchy horizon, the collision energy in the center-of-mass frame can be arbitrarily high, thus offering a window onto Planck-scale physics.

  13. FFAGS for rapid acceleration

    Carol J. Johnstone and Shane Koscielniak


    When large transverse and longitudinal emittances are to be transported through a circular machine, extremely rapid acceleration holds the advantage that the beam becomes immune to nonlinear resonances because there is insufficient time for amplitudes to build up. Uncooled muon beams exhibit large emittances and require fast acceleration to avoid decay losses and would benefit from this style of acceleration. The approach here employs a fixed-field alternating gradient or FFAG magnet structure and a fixed frequency acceleration system. Acceptance is enhanced by the use only of linear lattice elements, and fixed-frequency rf enables the use of cavities with large shunt resistance and quality factor.




    Due to their finite lifetime, muons must be accelerated very rapidly. It is challenging to make the magnets ramp fast enough to accelerate in a synchrotron, and accelerating in a linac is very expensive. One can use a recirculating accelerator (like CEBAF), but one needs a different arc for each turn, and this limits the number of turns one can use to accelerate, and therefore requires significant amounts of RF to achieve the desired energy gain. An alternative method for muon acceleration is using a fixed field alternating gradient (FFAG) accelerator. Such an accelerator has a very large energy acceptance (a factor of two or three), allowing one to use the same arc with a magnetic field that is constant over time. Thus, one can in principle make as many turns as one can tolerate due to muon decay, therefore reducing the RF cost without increasing the arc cost. This paper reviews the current status of research into the design of FFAGs for muon acceleration. Several current designs are described and compared. General design considerations are also discussed.

  15. High Gradient Accelerator Research

    Temkin, Richard [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States). Dept. of Physics. Plasma Science and Fusion Center]


    The goal of the MIT program of research on high gradient acceleration is the development of advanced acceleration concepts that lead to a practical and affordable next-generation linear collider at the TeV energy level. Other, more near-term applications include accelerators for materials processing, medicine, defense, mining, security, and inspection. The specific goals of the MIT program are:
    • Pioneering theoretical research on advanced structures for high gradient acceleration, including photonic structures and metamaterial structures; evaluation of the wakefields in these advanced structures
    • Experimental research to demonstrate the properties of advanced structures both in low-power microwave cold test and in high-power, high-gradient test at megawatt power levels
    • Experimental research on microwave breakdown at high gradient, including studies of breakdown phenomena induced by RF electric fields and RF magnetic fields, and development of new diagnostics of the breakdown process
    • Theoretical research on the physics and engineering features of RF vacuum breakdown
    • Maintaining and improving the Haimson/MIT 17 GHz accelerator, the highest-frequency operational accelerator in the world and a unique facility for accelerator research
    • Providing the Haimson/MIT 17 GHz accelerator as a facility for outside users
    • Active participation in the US DOE High Gradient Collaboration, including joint work with SLAC and Los Alamos National Laboratory, and participation of MIT students in research at the national laboratories
    • Training the next generation of Ph.D. students in the field of accelerator physics.

  16. Interdisciplinary glossary — particle accelerators and medicine

    Dmitrieva, V. V.; Dyubkov, V. S.; Nikitaev, V. G.; Ulin, S. E.


    A general concept of a new interdisciplinary glossary, covering particle-accelerator terminology used in medicine as well as relevant medical concepts, is presented. Its structure and usage rules are described, and an example illustrating how to quickly search the Glossary for relevant information is considered. The website address where the Glossary can be accessed is given. The Glossary can be refined and supplemented.

  17. Senescence-accelerated OXYS rats

    Stefanova, Natalia A; Kozhevnikova, Oyuna S; Vitovtov, Anton O; Maksimova, Kseniya Yi; Logvinov, Sergey V; Rudnitskaya, Ekaterina A; Korbolina, Elena E; Muraleva, Natalia A; Kolosova, Nataliya G


    Senescence-accelerated OXYS rats are an experimental model of accelerated aging that was established from Wistar stock via selection for susceptibility to the cataractogenic effects of a galactose-rich diet and subsequent inbreeding of highly susceptible rats. Currently, we have the 102nd generation of OXYS rats with spontaneously developing cataract and accelerated senescence syndrome, i.e., early development of a phenotype similar to human geriatric disorders, including accelerated brain aging. In recent years, our group found strong evidence that OXYS rats are a promising model for studies of the mechanisms of brain aging and of neurodegenerative processes similar to those seen in Alzheimer disease (AD). Behavioral alterations and learning and memory deficits develop from the fourth week of age, i.e., simultaneously with the first signs of neurodegeneration detectable on magnetic resonance imaging and under a light microscope. In addition, impaired long-term potentiation has been demonstrated in OXYS rats by the age of 3 months. With age, neurodegenerative changes in the brain of OXYS rats become amplified. We have shown that this deterioration happens against a background of overproduction of amyloid precursor protein (AβPP), accumulation of β-amyloid (Aβ), and hyperphosphorylation of the tau protein in the hippocampus and cortex. The development of AMD-like retinopathy in OXYS rats is also accompanied by increased accumulation of Aβ in the retina. These published data suggest that the OXYS strain may serve as a spontaneous rat model of AD-like pathology and could help to decipher the pathogenesis of AD. PMID:24552807

  18. Similarity measures for protein ensembles

    Lindorff-Larsen, Kresten; Ferkinghoff-Borg, Jesper


    Analyses of similarities and changes in protein conformation can provide important information regarding protein function and evolution. Many scores, including the commonly used root mean square deviation, have therefore been developed to quantify the similarities of different protein conformatio...
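
    The root mean square deviation mentioned above has a compact definition: the square root of the mean squared distance between corresponding atoms of two superposed conformations. A minimal sketch follows; the coordinates are toy values, and real use would first optimally superpose the structures.

```python
import math

def rmsd(a, b):
    """RMSD between two equal-length lists of (x, y, z) atom
    coordinates, assumed already superposed (no alignment step)."""
    if len(a) != len(b):
        raise ValueError("conformations must have the same number of atoms")
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(a, b))
    return math.sqrt(sq / len(a))

# Two-atom toy conformations: the second atom shifts by 0.3 along y.
ref = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0)]
alt = [(0.0, 0.0, 0.0), (1.5, 0.3, 0.0)]
score = rmsd(ref, alt)
```

    The abstract's point is that RMSD is only one of many such scores; any function of the paired coordinates that is zero for identical conformations could slot into the same comparison loop.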

  19. A new adaptive fast motion estimation algorithm based on local motion similarity degree (LMSD)

    LIU Long; HAN Chongzhao; BAI Yan


    In the motion vector field adaptive search technique (MVFAST) and the predictive motion vector field adaptive search technique (PMVFAST), the size of the largest motion vector from the three adjacent blocks (left, top, top-right) is compared with a threshold to select among search schemes. However, a suitable search center and search pattern will not be selected by the adaptive search technique when the adjacent motion vectors are not coherent in a local region. This paper presents an efficient adaptive search algorithm. The motion vector variation degree (MVVD) is considered a reasonable factor for adaptive search selection. Using the relationship between the local motion similarity degree (LMSD) and the motion vector variation degree (MVVD), motion vectors are classified into three categories according to their LMSD; different proposed search schemes are then adopted for motion estimation. The experimental results show that the proposed algorithm provides a significant computational speedup compared with the MVFAST and PMVFAST algorithms, and offers similar, or even better, performance.
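
    The three-way selection described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the paper's algorithm: the similarity measure (spread of the neighbouring motion vectors around their mean) and the thresholds t_low/t_high are hypothetical placeholders.

```python
def local_motion_similarity(mvs):
    """Crude local-motion-similarity score for neighbouring motion
    vectors (left, top, top-right): mean distance of each vector from
    their average. 0 means perfectly coherent neighbours."""
    n = len(mvs)
    cx = sum(v[0] for v in mvs) / n
    cy = sum(v[1] for v in mvs) / n
    return sum(((v[0] - cx) ** 2 + (v[1] - cy) ** 2) ** 0.5
               for v in mvs) / n

def choose_search_scheme(mvs, t_low=1.0, t_high=4.0):
    """Map the similarity degree to one of three search schemes
    (thresholds are illustrative, not taken from the paper)."""
    s = local_motion_similarity(mvs)
    if s < t_low:
        return "small-pattern search around the predicted centre"
    elif s < t_high:
        return "medium-pattern search"
    return "large-pattern / full search"
```

    Coherent neighbours ([(2, 0)] three times) give a score of 0 and a small search pattern; widely scattered neighbours fall through to the large-pattern branch.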

  20. Functional Similarity and Interpersonal Attraction.

    Neimeyer, Greg J.; Neimeyer, Robert A.


    Students participated in dyadic disclosure exercises over a five-week period. Results indicated members of high functional similarity dyads evidenced greater attraction to one another than did members of low functional similarity dyads. "Friendship" pairs of male undergraduates displayed greater functional similarity than did…


  2. Asia honours accelerator physicists


    "Steve Myers of CERN and Jie Wei of Beijing's Tsinghua University are the first recipients of a new prize for particle physics. The pair were honoured for their contributions to numerous particle-accelerator projects - including CERN's Large Hadron Collider - by the Asian Committee for Future Accelerators (ACFA)..." (1 paragraph)


    Sessler, Andrew M.


    Diverse methods proposed for the acceleration of particles by means of collective fields are reviewed. A survey is made of the various currently active experimental programs devoted to investigating collective acceleration, and the present status of the research is briefly noted.

  4. Accelerators Beyond The Tevatron?

    Lach, Joseph; /Fermilab


    Following the successful operation of the Fermilab superconducting accelerator three new higher energy accelerators were planned. They were the UNK in the Soviet Union, the LHC in Europe, and the SSC in the United States. All were expected to start producing physics about 1995. They did not. Why?

  5. KEK digital accelerator

    Iwashita, T.; Adachi, T.; Takayama, K.; Leo, K. W.; Arai, T.; Arakida, Y.; Hashimoto, M.; Kadokura, E.; Kawai, M.; Kawakubo, T.; Kubo, Tomio; Koyama, K.; Nakanishi, H.; Okazaki, K.; Okamura, K.; Someya, H.; Takagi, A.; Tokuchi, A.; Wake, M.


    The High Energy Accelerator Research Organization KEK digital accelerator (KEK-DA) is a renovation of the KEK 500 MeV booster proton synchrotron, which was shut down in 2006. The existing 40 MeV drift tube linac and rf cavities have been replaced by an electron cyclotron resonance (ECR) ion source embedded in a 200 kV high-voltage terminal and induction acceleration cells, respectively. A DA is, in principle, capable of accelerating any species of ion in all possible charge states. The KEK-DA is characterized by specific accelerator components such as a permanent magnet X-band ECR ion source, a low-energy transport line, an electrostatic injection kicker, an extraction septum magnet operated in air, combined-function main magnets, and an induction acceleration system. The induction acceleration method, integrating modern pulse power technology and state-of-the-art digital control, is crucial for the rapid-cycle KEK-DA. The key issues of beam dynamics associated with low-energy injection of heavy ions are beam loss caused by electron capture and stripping as a result of interactions with residual gas molecules, and closed orbit distortion resulting from relatively high remanent fields in the bending magnets. Attractive applications of this accelerator in materials and biological sciences are discussed.


  7. Web Search Engines: Search Syntax and Features.

    Ojala, Marydee


    Presents a chart that explains the search syntax, features, and commands used by the 12 most widely used general Web search engines. Discusses Web standardization, expanded types of content searched, size of databases, and search engines that include both simple and advanced versions. (LRW)


  9. Accelerators, Beams And Physical Review Special Topics - Accelerators And Beams

    Siemann, R.H.; /SLAC


    Accelerator science and technology have evolved as accelerators became larger and important to a broad range of science. Physical Review Special Topics - Accelerators and Beams was established to serve the accelerator community as a timely, widely circulated, international journal covering the full breadth of accelerators and beams. The history of the journal and the innovations associated with it are reviewed.

  10. The Accelerated Kepler Problem

    Namouni, Fathi


    The accelerated Kepler problem is obtained by adding a constant acceleration to the classical two-body Kepler problem. This setting models the dynamics of a jet-sustaining accretion disk and its content of forming planets as the disk loses linear momentum through the asymmetric jet-counterjet system it powers. The dynamics of the accelerated Kepler problem is analyzed using physical as well as parabolic coordinates. The latter naturally separate the problem's Hamiltonian into two unidimensional Hamiltonians. In particular, we identify the origin of the secular resonance in the accelerated Kepler problem and determine analytically the radius of stability boundary of initially circular orbits that are of particular interest to the problem of radial migration in binary systems as well as to the truncation of accretion disks through stellar jet acceleration.

  11. On Accelerated Black Holes

    Letelier, P S; Letelier, Patricio S.; Oliveira, Samuel R.


    The C-metric is revisited and the global interpretation of some associated spacetimes is studied in detail, especially those with two event horizons, one for the black hole and another for the acceleration. We find that the spacetime of an accelerated Schwarzschild black hole is plagued by either conical singularities or a lack of smoothness and compactness of the black hole horizon. Using standard black hole thermodynamics, we show that accelerated black holes have a higher Hawking temperature than Unruh temperature. We also show that the usual upper bound on the product of the mass and acceleration parameters (< 1/√27) is just a coordinate artifact. The main results extend to accelerated Kerr black holes, and we find that they are not changed by the black hole rotation.

  12. Fundamentals of database indexing and searching

    Bhattacharya, Arnab


    Fundamentals of Database Indexing and Searching presents well-known database searching and indexing techniques. It focuses on similarity search queries, showing how to use distance functions to measure the notion of dissimilarity. After defining database queries and similarity search queries, the book organizes the most common and representative index structures according to their characteristics. The author first describes low-dimensional index structures, memory-based index structures, and hierarchical disk-based index structures. He then outlines useful distance measures and index structures
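
    Before any index structure is introduced, the baseline for a similarity search query is an exact linear scan under a chosen distance function; every index in such a book is an attempt to beat this loop. A minimal sketch (the Euclidean distance and toy points are illustrative):

```python
import heapq
import math

def euclidean(p, q):
    """A distance function quantifying dissimilarity between points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def knn_search(db, query, k=2, dist=euclidean):
    """Exact k-nearest-neighbour similarity search by linear scan:
    O(n) distance evaluations, no index required."""
    return heapq.nsmallest(k, db, key=lambda p: dist(query, p))

points = [(0, 0), (3, 4), (1, 1), (10, 0)]
nearest = knn_search(points, (1, 0), k=2)
```

    An index structure answers the same query while pruning most of the distance computations; the distance function stays the same, which is why the book treats distance measures and index structures as separate topics.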



  14. a Comparison of Semantic Similarity Models in Evaluating Concept Similarity

    Xu, Q. X.; Shi, W. Z.


    Semantic similarities are important in concept definition, recognition, categorization, interpretation, and integration. Many semantic similarity models have been established to evaluate the semantic similarities of objects or concepts. To find out the suitability and performance of different models in evaluating concept similarities, we compare four main types of models in this paper: the geometric model, the feature model, the network model, and the transformational model. Fundamental principles and main characteristics of these models are first introduced and compared. Land use and land cover concepts from NLCD92 are employed as examples in the case study. The results demonstrate that correlations between these models are very high, possibly because all these models are designed to simulate the similarity judgement of the human mind.
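
    Of the four model types, the feature model is the easiest to make concrete: it scores two concepts by their shared and distinctive feature sets (Tversky's ratio model is the classic form). A sketch, with made-up land-cover-style feature sets rather than actual NLCD92 definitions:

```python
def tversky(a, b, alpha=0.5, beta=0.5):
    """Feature-model similarity (Tversky ratio form) between two
    concepts represented as feature sets; alpha = beta = 0.5 reduces
    to the Dice coefficient."""
    common = len(a & b)
    if common == 0 and not (a or b):
        return 1.0  # two empty concepts: identical by convention
    return common / (common + alpha * len(a - b) + beta * len(b - a))

# Hypothetical land-cover concepts described by feature sets.
forest = {"vegetation", "trees", "natural"}
shrubland = {"vegetation", "shrubs", "natural"}
sim = tversky(forest, shrubland)
```

    Setting alpha ≠ beta makes the score asymmetric, which the feature model uses to capture directional judgements such as "a variant is more similar to the prototype than vice versa".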

  15. Learning Multi-modal Similarity

    McFee, Brian


    In many applications involving multi-media data, the definition of similarity between items is integral to several key tasks, e.g., nearest-neighbor retrieval, classification, and recommendation. Data in such regimes typically exhibits multiple modalities, such as acoustic and visual content of video. Integrating such heterogeneous data to form a holistic similarity space is therefore a key challenge to be overcome in many real-world applications. We present a novel multiple kernel learning technique for integrating heterogeneous data into a single, unified similarity space. Our algorithm learns an optimal ensemble of kernel transformations which conform to measurements of human perceptual similarity, as expressed by relative comparisons. To cope with the ubiquitous problems of subjectivity and inconsistency in multi-media similarity, we develop graph-based techniques to filter similarity measurements, resulting in a simplified and robust training procedure.

  16. Renewing the Respect for Similarity

    Shimon eEdelman


    Full Text Available In psychology, the concept of similarity has traditionally evoked a mixture of respect, stemming from its ubiquity and intuitive appeal, and concern, due to its dependence on the framing of the problem at hand and on its context. We argue for a renewed focus on similarity as an explanatory concept, by surveying established results and new developments in the theory and methods of similarity-preserving associative lookup and dimensionality reduction, critical components of many cognitive functions, as well as of intelligent data management in computer vision. We focus in particular on the growing family of algorithms that support associative memory by performing hashing that respects local similarity, and on the uses of similarity in representing structured objects and scenes. Insofar as these similarity-based ideas and methods are useful in cognitive modeling and in AI applications, they should be included in the core conceptual toolkit of computational neuroscience.
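
    Hashing that respects local similarity can be illustrated with the random-hyperplane scheme for cosine similarity: each hash bit records which side of a random hyperplane a vector falls on, so vectors with a small angle between them agree on most bits. A minimal sketch (the dimensions, seed, and test vectors are arbitrary choices for illustration):

```python
import random

def make_hash(dim, n_bits, seed=0):
    """Build a locality-sensitive hash from n_bits random hyperplanes:
    bit i is 1 iff the vector has a non-negative dot product with
    random Gaussian direction i."""
    rng = random.Random(seed)
    planes = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n_bits)]
    def h(v):
        return tuple(1 if sum(pi * vi for pi, vi in zip(p, v)) >= 0 else 0
                     for p in planes)
    return h

h = make_hash(dim=3, n_bits=16, seed=42)
a = (1.0, 0.9, 0.1)
b = (1.0, 1.0, 0.0)    # small angle to a
c = (-1.0, 0.2, -0.9)  # large angle to both

# Nearby vectors should share more hash bits than distant ones.
same_ab = sum(x == y for x, y in zip(h(a), h(b)))
same_ac = sum(x == y for x, y in zip(h(a), h(c)))
```

    The per-bit collision probability is 1 − θ/π for angle θ, which is what makes the code an associative-lookup key: bucketing by hash code retrieves mostly near neighbours.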

  17. Positive selection, relaxation, and acceleration in the evolution of the human and chimp genome.

    Leonardo Arbiza


    Full Text Available For years evolutionary biologists have been interested in searching for the genetic bases underlying humanness. Recent efforts at a large or a complete genomic scale have been conducted to search for positively selected genes in human and in chimp. However, recently developed methods allowing for a more sensitive and controlled approach in the detection of positive selection can be employed. Here, using 13,198 genes, we have deduced the sets of genes involved in rate acceleration, positive selection, and relaxation of selective constraints in human, in chimp, and in their ancestral lineage since the divergence from murids. Significant deviations from the strict molecular clock were observed in 469 human and in 651 chimp genes. The more stringent branch-site test of positive selection detected 108 human and 577 chimp positively selected genes. An important proportion of the positively selected genes did not show a significant acceleration in rates, and similarly, many of the accelerated genes did not show significant signals of positive selection. Functional differentiation of genes under rate acceleration, positive selection, and relaxation was not statistically significant between human and chimp with the exception of terms related to G-protein coupled receptors and sensory perception. Both of these were over-represented under relaxation in human in relation to chimp. Comparing differences between derived and ancestral lineages, a more conspicuous change in trends seems to have favored positive selection in the human lineage. Since most of the positively selected genes are different under the same functional categories between these species, we suggest that the individual roles of the alternative positively selected genes may be an important factor underlying biological differences between these species.

  18. Evidence and Search for Sterile Neutrinos at Accelerators

    W. C. Louis


    Full Text Available The LSND short-baseline neutrino experiment has published evidence for antineutrino oscillations at a mass scale of ~1 eV2. The MiniBooNE experiment, designed to test this evidence for oscillations at an order of magnitude higher neutrino energy and distance, observes excesses of events in both neutrino mode and antineutrino mode. While the MiniBooNE neutrino excess has a neutrino energy spectrum that is softer than expected from LSND, the MiniBooNE antineutrino excess is consistent with neutrino oscillations and with the LSND oscillation signal. When combined with oscillation measurements at the solar and atmospheric mass scales, assuming that the LSND and MiniBooNE signals are due to neutrino oscillations, these experiments imply the existence of more than three neutrino mass states and, therefore, one or more sterile neutrinos. Such sterile neutrinos, if proven to exist, would have a big impact on particle physics, nuclear physics, and astrophysics and would contribute to the dark matter of the universe. Future experiments under construction or proposed at Fermilab, ORNL, CERN, and in Japan will provide a definitive test of short-baseline neutrino oscillations and will have the capability of proving the existence of sterile neutrinos.

  19. Search for Krypton 81 at Alice Accelerator Facility

    Sabir, A.; Brissaud, I.; Kalifa, J.; Laurent, H.; Roynette, J. C.


    Measurement of the 81Kr concentration is a good clock for dating old groundwater because of its chemical stability and its atmospheric production. Unfortunately, its abundance in natural samples is very low. In this paper we report an experiment to measure the 81Kr concentration by means of the ALICE facility.

  20. Similarity Learning of Manifold Data.

    Chen, Si-Bao; Ding, Chris H Q; Luo, Bin


    Without constructing an adjacency graph for the neighborhood, we propose a method to learn similarity among sample points of a manifold in Laplacian embedding (LE), based on adding constraints of linear reconstruction and least-absolute-shrinkage-and-selection-operator-type minimization. Two algorithms and corresponding analyses are presented to learn similarity for mixed-signed and nonnegative data, respectively. The similarity learning method is further extended to kernel spaces. Experiments on both synthetic and real-world benchmark data sets demonstrate that the proposed LE with the new similarity gives better visualization and achieves higher accuracy in classification.

  1. Particle acceleration by plasma

    Ogata, A


    Plasma acceleration uses the potential of a plasma wave. It is classified by the method used to generate the plasma wave, such as laser wake-field acceleration and beat-wave acceleration. Another method, using an electron beam, is called plasma wake-field acceleration (or beam wake-field acceleration). In this paper, electron acceleration by laser wake-fields in gas plasma, ion sources based on laser irradiation of solid targets, and nano-ion-beam generation by a single plasma component in a trap are explained. Ions that escape from a laser-irradiated solid target can be used as the ion source of an accelerator. An experimental system using an 800 nm laser with 50 mJ pulse energy and 50 fs pulse width was studied. The laser intensity is 4×10^16 W/cm^2 at the focus. Metal and organic-substance target films were used. When the laser irradiated an Al target, particles were generated both forward and backward. A new finding is that neutral particles were obtained in the forward direction, because it...

  2. The miniature accelerator

    Antonella Del Rosso


    The image that most people have of CERN is of its enormous accelerators and their capacity to accelerate particles to extremely high energies. But thanks to some cutting-edge studies on beam dynamics and radiofrequency technology, along with innovative construction techniques, teams at CERN have now created the first module of a brand-new accelerator, which will be just 2 metres long. The potential uses of this miniature accelerator will include deployment in hospitals for the production of medical isotopes and the treatment of cancer. It’s a real David-and-Goliath story.   Serge Mathot, in charge of the construction of the "mini-RFQ", pictured with the first of the four modules that will make up the miniature accelerator. The miniature accelerator consists of a radiofrequency quadrupole (RFQ), a component found at the start of all proton accelerator chains around the world, from the smallest to the largest. The LHC is designed to produce very high-intensity beams ...

  3. Cosmic particle acceleration

    Zimbardo, Gaetano; Perri, Silvia [Universita della Calabria, Dipartimento di Fisica, 87036 Rende (Italy)


    The most popular mechanism for the acceleration of cosmic rays, which is thought to operate in supernova remnant shocks as well as at heliospheric shocks, is diffusive shock acceleration, a Fermi mechanism based on normal diffusion. On the other hand, in the last few years it has been shown that the transport of plasma particles in the presence of electric and magnetic turbulence can be superdiffusive rather than normally diffusive. The term 'superdiffusive' refers to the mean square displacement of particle positions growing superlinearly with time, as compared to the normal linear growth. In particular, superdiffusion is characterized by a non-Gaussian statistical process called a Lévy random walk. We show how diffusive shock acceleration is modified by superdiffusion, and how this yields new predictions for the cosmic-ray spectral index, for the acceleration time, and for the spatial profile of energetic particles. A comparison with observations of particle acceleration at heliospheric shocks and at supernova remnant shocks is made. We discuss how superdiffusive shock acceleration makes it possible to explain the observations of hard ion spectra at the solar wind termination shock detected by Voyager 2 and of hard radio spectra due to synchrotron emission of electrons accelerated at supernova remnant shocks, and how it can help to explain the observations of 'thin rims' in the X-ray synchrotron emission.
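
    The distinction between normal diffusion and superdiffusion is the exponent α in ⟨Δx²⟩ ∝ t^α: α = 1 is normal diffusion, α > 1 is superdiffusive. A sketch that recovers α as the log-log slope of a mean-square-displacement curve; the data here are synthetic, constructed to follow t^1.5 purely for illustration:

```python
import math

def msd_exponent(times, msd):
    """Least-squares slope of log(MSD) vs log(t). A slope of 1 means
    normal diffusion; a slope above 1 indicates superdiffusion
    (Levy-walk-like transport)."""
    xs = [math.log(t) for t in times]
    ys = [math.log(m) for m in msd]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

ts = [1.0, 2.0, 4.0, 8.0]
superdiffusive = [t ** 1.5 for t in ts]  # synthetic MSD ~ t^1.5
alpha = msd_exponent(ts, superdiffusive)
```

    Applied to observed particle displacements upstream of a shock, an exponent significantly above 1 is the signature that motivates replacing normal diffusive shock acceleration with the superdiffusive version.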

  4. Multicavity proton cyclotron accelerator

    J. L. Hirshfield


    Full Text Available A mechanism for acceleration of protons is described, in which energy gain occurs near cyclotron resonance as protons drift through a sequence of rotating-mode TE_{111} cylindrical cavities in a strong nearly uniform axial magnetic field. Cavity resonance frequencies decrease in sequence from one another with a fixed frequency interval Δf between cavities, so that synchronism can be maintained between the rf fields and proton bunches injected at intervals of 1/Δf. An example is presented in which a 122 mA, 1 MeV proton beam is accelerated to 961 MeV using a cascade of eight cavities in an 8.1 T magnetic field, with the first cavity resonant at 120 MHz and with Δf=8 MHz. Average acceleration gradient exceeds 40 MV/m, average effective shunt impedance is 223 MΩ/m, but maximum surface field in the cavities does not exceed 7.2 MV/m. These features occur because protons make many orbital turns in each cavity and thus experience acceleration from each cavity field many times. Longitudinal and transverse stability appear to be intrinsic properties of the acceleration mechanism, and an example to illustrate this is presented. This acceleration concept could be developed into a proton accelerator for a high-power neutron spallation source, such as that required for transmutation of nuclear waste or driving a subcritical fission burner, provided a number of significant practical issues can be addressed.
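
    The cavity-frequency schedule described above is simple enough to sketch directly. The sketch below assumes a strictly linear decrease from the first cavity's resonance, using the figures quoted in the abstract (eight cavities, first cavity at 120 MHz, Δf = 8 MHz); it is an illustration of the stated scheme, not code from the paper.

```python
# Cavity resonance schedule for the multicavity proton cyclotron example:
# eight cavities, the first resonant at 120 MHz, each subsequent cavity
# lower by the fixed interval df = 8 MHz; bunches injected every 1/df.
F1 = 120e6          # resonance frequency of the first cavity, Hz
DF = 8e6            # fixed frequency interval between cavities, Hz
N_CAVITIES = 8

frequencies = [F1 - n * DF for n in range(N_CAVITIES)]
injection_period = 1.0 / DF     # seconds between injected proton bunches

print([f / 1e6 for f in frequencies])   # 120.0, 112.0, ..., 64.0 (MHz)
print(injection_period * 1e9)           # 125.0 (ns)
```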

  5. Accelerating DSMC data extraction.

    Gallis, Michail A.; Piekos, Edward Stanley


    In many direct simulation Monte Carlo (DSMC) simulations, the majority of computation time is consumed after the flowfield reaches a steady state. This situation occurs when the desired output quantities are small compared to the background fluctuations. For example, gas flows in many microelectromechanical systems (MEMS) have mean speeds more than two orders of magnitude smaller than the thermal speeds of the molecules themselves. The current solution to this problem is to collect sufficient samples to achieve the desired resolution. This can be an arduous process because the error is inversely proportional to the square root of the number of samples so we must, for example, quadruple the samples to cut the error in half. This work is intended to improve this situation by employing more advanced techniques, from fields other than solely statistics, for determining the output quantities. Our strategy centers on exploiting information neglected by current techniques, which collect moments in each cell without regard to one another, to values in neighboring cells, or to their evolution in time. Unlike many previous acceleration techniques that modify the method itself, the techniques examined in this work strictly post-process, so they may be applied to any DSMC code without affecting its fidelity or generality. Many potential methods are drawn from successful applications in a diverse range of areas, from ultrasound imaging to financial market analysis. The most promising methods exploit relationships between variables in space, which always exist in DSMC due to the absence of shocks. Disparate techniques were shown to produce similar error reductions, suggesting that the results shown in this report may be typical of what is possible using these methods. Sample count reduction factors of approximately three to five were found to be typical, although factors exceeding ten were shown on some variables under some techniques.
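
    The √N relationship invoked above (quadruple the samples to halve the error) is just the standard error of a sample mean. A minimal illustration, with made-up numbers:

```python
import math

def standard_error(sigma, n_samples):
    """Statistical error of a sample mean: sigma / sqrt(N)."""
    return sigma / math.sqrt(n_samples)

# Illustrative DSMC-like situation: per-sample scatter dominated by
# thermal fluctuations (sigma), mean-flow signal much smaller.
sigma = 300.0                            # per-sample scatter (e.g. m/s), made up
print(standard_error(sigma, 10_000))     # 3.0
print(standard_error(sigma, 40_000))     # 1.5 -> quadrupling N halves the error
```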

  6. Secular Acceleration of Barnard's Star

    Bartlett, Jennifer L.; Ianna, P. A.


    Barnard's Star should have significant secular acceleration because it lies close to the Sun and has the highest known proper motion along with a large radial velocity. It will pass within about 1.4 pc in another 9,750 years. Secular changes in proper motion and radial velocity are essentially the Coriolis and centrifugal accelerations, respectively, arising from use of a rotating coordinate system defined by the Sun-star radius vector. Although stellar space velocities measured with respect to the Sun are essentially constant, these perspective effects arise with changing distance and viewing angle. Hipparcos-2 plus Nidever et al. (2002) predict a perspective change in the proper motion of 1.285±0.006 mas yr⁻² for Barnard's Star. Recent analysis of 900+ photographic plates between 1968 and 1998 with the 26.25-in (0.67-m) McCormick refractor detected a secular acceleration of 1.25±0.04 mas yr⁻², which agrees with the predicted value within the measurement errors. Earlier, Benedict et al. (1999) measured its secular acceleration to be 1.2±0.2 mas yr⁻² using 3 years of HST FGS observations. Similarly, a perspective change in radial velocity of 4.50±0.01 m s⁻¹ yr⁻¹ can be predicted for Barnard's Star. Kürster et al. (2003) detected variations in their observations of it that are largely attributable to secular acceleration along the line of sight with some contribution from stellar activity. Although secular acceleration effects have been limited for past studies of stellar motions, they can be significant for observations extending over decades or for high-precision measurements required to detect extrasolar planets. Future studies will need to consider this factor for the nearest stars and for those with large proper motions or radial velocities. NSF grant AST 98-20711; Litton Marine Systems; Peninsula Community Foundation Levinson Fund; UVa Governor's Fellowship, Dean's F&A Fellowship, and Graduate School of Arts and Sciences; and, US Naval Observatory
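
    The predicted perspective change in proper motion can be reproduced from the classical formula dμ/dt = −2.05×10⁻⁶ μ v_r π (in arcsec yr⁻², with μ in arcsec/yr, v_r in km/s and the parallax π in arcsec). The sketch below uses approximate published values for Barnard's Star; they are illustrative inputs, not numbers taken from this abstract:

```python
def perspective_acceleration(mu_arcsec_yr, v_r_km_s, parallax_arcsec):
    """Classical perspective (secular) acceleration of proper motion:
    dmu/dt = -2.05e-6 * mu * v_r * parallax  [arcsec/yr^2].
    The numerical constant folds in the km->pc and s->yr conversions."""
    return -2.05e-6 * mu_arcsec_yr * v_r_km_s * parallax_arcsec

# Approximate published values for Barnard's Star (illustrative):
mu = 10.36     # total proper motion, arcsec/yr
v_r = -110.5   # radial velocity, km/s (negative: approaching)
plx = 0.549    # parallax, arcsec

dmudt_mas = perspective_acceleration(mu, v_r, plx) * 1000.0  # mas/yr^2
print(dmudt_mas)   # ~1.29, close to the predicted 1.285 mas/yr^2
```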

  7. Statistical analysis in search for anisotropies for the observed UHECRs

    Ghosh, Soumavo


    Cosmic accelerators produce particles with energies in a wide range (PeV to EeV; 1 PeV = 10¹⁵ eV, 1 EeV = 10¹⁸ eV). The energy spectrum follows three power laws forming a `leg' structure, of which the `knee' part (˜3 PeV) is of Galactic origin, the `ankle' part is unassociated with the Galaxy, and the highest-energy part above the ankle shows evidence for an extragalactic origin. In the present work, various cross-correlation functions are studied between the samples of observed UHECRs and the catalogue of nearby galaxies to search for anisotropies, if any, in the arrival directions of UHECRs, in order to identify their possible sources. The robustness of the functions is studied through many random realizations of the original samples under consideration. A similar procedure is followed for the catalogues for comparison.
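
    The basic ingredient of such a cross-correlation analysis is the distribution of angular separations between UHECR arrival directions and catalogue positions. A generic sketch (the estimator, binning and data here are illustrative, not those of the paper):

```python
import math

def angular_separation(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees between two sky directions
    given as (RA, Dec) pairs in degrees."""
    r1, d1, r2, d2 = (math.radians(x) for x in (ra1, dec1, ra2, dec2))
    cos_sep = (math.sin(d1) * math.sin(d2)
               + math.cos(d1) * math.cos(d2) * math.cos(r1 - r2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

def cross_correlation_counts(events, catalogue, bin_edges):
    """Histogram of event-galaxy angular separations: the raw counts
    entering a cross-correlation estimator."""
    counts = [0] * (len(bin_edges) - 1)
    for ra_e, dec_e in events:
        for ra_g, dec_g in catalogue:
            sep = angular_separation(ra_e, dec_e, ra_g, dec_g)
            for i in range(len(counts)):
                if bin_edges[i] <= sep < bin_edges[i + 1]:
                    counts[i] += 1
                    break
    return counts

# Toy data: two "events" and two "galaxies" (degrees)
events = [(10.0, 0.0), (200.0, -45.0)]
catalogue = [(12.0, 0.0), (205.0, -44.0)]
print(cross_correlation_counts(events, catalogue, [0, 5, 30, 180]))  # [2, 0, 2]
```

The robustness check described in the abstract would then repeat the same computation on many randomized realizations of the event sample and compare the resulting count distributions.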

  8. Application of Accelerators and Storage Rings: Accelerators in Medicine

    Amaldi, U


    This document is part of Subvolume C 'Accelerators and Colliders' of Volume 21 'Elementary Particles' of Landolt-Börnstein - Group I 'Elementary Particles, Nuclei and Atoms'. It contains the Section '11.3 Accelerators in Medicine' of the Chapter '11 Application of Accelerators and Storage Rings' with the content: 11.3 Accelerators in Medicine 11.3.1 Accelerators and Radiopharmaceuticals 11.3.2 Accelerators and Cancer Therapy

  9. A study of Consistency in the Selection of Search Terms and Search Concepts: A Case Study in National Taiwan University

    Mu-hsuan Huang


    Full Text Available This article analyzes the consistency in the selection of search terms and search concepts among college and graduate students at National Taiwan University using the PsycLIT CD-ROM database. 31 students conducted pre-assigned searches, performing 59 searches that generated 609 search terms. The study finds that the consistency in the selection of first-level search terms is 22.14% and of second-level terms 35%. These results are similar to those of other studies. Regarding consistency in search concepts, both the overlap of retrieved articles and the overlap of articles judged relevant are lower than in other studies. [Article content in Chinese]

  10. Confronting Twin Paradox Acceleration

    Murphy, Thomas W.


    The resolution to the classic twin paradox in special relativity rests on the asymmetry of acceleration. Yet most students are not exposed to a satisfactory analysis of what exactly happens during the acceleration phase that results in the nonaccelerated observer's more rapid aging. The simple treatment presented here offers both graphical and quantitative solutions to the problem, leading to the correct result that the acceleration-induced age gap is 2Lβ years when the one-way distance L is expressed in light-years and velocity β ≡ v/c.
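
    The quoted result is a one-line computation; the trip parameters below are an illustrative example, not from the paper:

```python
def acceleration_age_gap(distance_ly, beta):
    """Age gap (years) attributable to the turnaround acceleration in the
    twin paradox: 2 * L * beta, with L the one-way distance in light-years
    and beta = v/c."""
    return 2.0 * distance_ly * beta

# Illustrative trip: 4 light-years one way at beta = 0.8
print(acceleration_age_gap(4.0, 0.8))   # 6.4 (years)
```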

  11. Improved Scatter Search Using Cuckoo Search

    Ahmed T.Sadiq Al-Obaidi


    Full Text Available The Scatter Search (SS) is a deterministic strategy that has been applied successfully to some combinatorial and continuous optimization problems. Cuckoo Search (CS) is a heuristic search algorithm inspired by the reproduction strategy of cuckoos. This paper presents an enhanced Scatter Search algorithm using the CS algorithm. The improvement provides Scatter Search with random exploration of the problem's search space and greater diversity and intensification for promising solutions. The original and improved Scatter Search have been tested on the Traveling Salesman Problem. A computational experiment with benchmark instances is reported. The results demonstrate that the improved Scatter Search algorithm produces better performance than the original Scatter Search algorithm. The improvement in the value of average fitness is 23.2% compared with the original SS. The developed algorithm has been compared with other algorithms for the same problem, and the result was competitive with some algorithms and weaker than others.
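
    For readers unfamiliar with CS, its two ingredients are Lévy-flight perturbations of candidate solutions and abandonment of a fraction of the worst "nests" each generation. The sketch below shows those ingredients on a continuous toy objective; it is a generic illustration, not the paper's hybrid SS/CS algorithm, which operates on the Traveling Salesman Problem:

```python
import math
import random

def levy_step(alpha=1.5):
    """Heavy-tailed step via Mantegna's algorithm, the approximation of a
    Levy-stable step commonly used in Cuckoo Search implementations."""
    sigma_u = (math.gamma(1 + alpha) * math.sin(math.pi * alpha / 2)
               / (math.gamma((1 + alpha) / 2) * alpha * 2 ** ((alpha - 1) / 2))
               ) ** (1 / alpha)
    u = random.gauss(0, sigma_u)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / alpha)

def cuckoo_search(f, n_nests=15, n_iter=200, pa=0.25, lo=-5.0, hi=5.0):
    """Minimal 1-D Cuckoo Search sketch: Levy flights generate new cuckoos;
    a fraction pa of the worst nests is abandoned every generation."""
    random.seed(1)
    nests = [random.uniform(lo, hi) for _ in range(n_nests)]
    for _ in range(n_iter):
        # Levy-flight move from a randomly chosen nest
        i = random.randrange(n_nests)
        candidate = min(hi, max(lo, nests[i] + 0.01 * levy_step()))
        j = random.randrange(n_nests)
        if f(candidate) < f(nests[j]):
            nests[j] = candidate
        # Abandon the worst pa-fraction of nests (best nests are kept)
        nests.sort(key=f)
        for k in range(int(n_nests * (1 - pa)), n_nests):
            nests[k] = random.uniform(lo, hi)
    return min(nests, key=f)

best = cuckoo_search(lambda x: (x - 2.0) ** 2)
print(best)   # close to the minimum at x = 2
```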

  12. Multiple-Goal Heuristic Search

    Davidov, D; 10.1613/jair.1940


    This paper presents a new framework for anytime heuristic search where the task is to achieve as many goals as possible within the allocated resources. We show the inadequacy of traditional distance-estimation heuristics for tasks of this type and present alternative heuristics that are more appropriate for multiple-goal search. In particular, we introduce the marginal-utility heuristic, which estimates the cost and the benefit of exploring a subtree below a search node. We developed two methods for online learning of the marginal-utility heuristic. One is based on local similarity of the partial marginal utility of sibling nodes, and the other generalizes marginal-utility over the state feature space. We apply our adaptive and non-adaptive multiple-goal search algorithms to several problems, including focused crawling, and show their superiority over existing methods.

  13. Attacks on Local Searching Tools

    Nielson, Seth James; Wallach, Dan S


    The Google Desktop Search is an indexing tool, currently in beta testing, designed to allow users fast, intuitive searching for local files. The principal interface is provided through a local web server which supports an interface similar to Google's normal web page. Indexing of local files occurs when the system is idle, and understands a number of common file types. An optional feature is that Google Desktop can integrate a short summary of local search results with web searches. This summary includes 30-40 character snippets of local files. We have uncovered a vulnerability that would release private local data to an unauthorized remote entity. Using two different attacks, we expose the small snippets of private local data to a remote third party.

  14. The Search for Directed Intelligence

    Lubin, Philip


    We propose a search for sources of directed energy systems such as those now becoming technologically feasible on Earth. Recent advances allow us to foresee capabilities that will radically change our ability to broadcast our presence. We show that systems of this type can be detected at vast distances, and indeed can be detected across the entire horizon. This profoundly changes the possibilities for searches for technologically advanced extraterrestrial civilizations. We show that even modest searches can be extremely effective at detecting or limiting many civilization classes. We propose a search strategy that will observe more than 10¹² stellar and planetary systems, with possible extensions to more than 10²⁰ systems, allowing us to test the hypothesis that other civilizations, similarly or more advanced and with this same capability, exist and are broadcasting.

  15. Dynamic similarity in erosional processes

    Scheidegger, A.E.


    A study is made of the dynamic similarity conditions obtaining in a variety of erosional processes. The pertinent equations for each type of process are written in dimensionless form; the similarity conditions can then easily be deduced. The processes treated are: raindrop action, slope evolution and river erosion. © 1963 Istituto Geofisico Italiano.

  16. Vibration control in accelerators

    Montag, C.


    In the vast majority of accelerator applications, ground vibration amplitudes are well below tolerable magnet jitter amplitudes. In these cases, it is necessary and sufficient to design a rigid magnet support structure that does not amplify ground vibration. Since accelerator beam lines are typically installed at an elevation of 1-2 m above ground level, special care has to be taken to avoid designing a support structure that acts like an inverted pendulum with a low resonance frequency, resulting in intolerable lateral vibration amplitudes of the accelerator components when excited by either ambient ground motion or vibration sources within the accelerator itself, such as cooling water pumps or helium flow in superconducting magnets. In cases where ground motion amplitudes already exceed the required jitter tolerances, for instance in future linear colliders, passive vibration damping or active stabilization may be considered.

  17. Compact particle accelerator

    Elizondo-Decanini, Juan M.


    A compact particle accelerator having an input portion configured to receive power to produce particles for acceleration, where the input portion includes a switch, is provided. In a general embodiment, a vacuum tube receives particles produced from the input portion at a first end, and a plurality of wafer stacks are positioned serially along the vacuum tube. Each of the plurality of wafer stacks include a dielectric and metal-oxide pair, wherein each of the plurality of wafer stacks further accelerate the particles in the vacuum tube. A beam shaper coupled to a second end of the vacuum tube shapes the particles accelerated by the plurality of wafer stacks into a beam and an output portion outputs the beam.

  18. Rejuvenating CERN's Accelerators


    In the coming years and especially in 2005, CERN's accelerators are going to receive an extensive renovation programme to ensure they will perform reliably and effectively when the LHC comes into service.

  19. Joint International Accelerator School

    CERN Accelerator School


    The CERN and US Particle Accelerator Schools recently organised a Joint International Accelerator School on Beam Loss and Accelerator Protection, held at the Hyatt Regency Hotel, Newport Beach, California, USA from 5-14 November 2014. This Joint School was the 13th in a series of such schools, which started in 1985 and also involves the accelerator communities in Japan and Russia.   Photo courtesy of Alfonse Pham, Michigan State University.   The school attracted 58 participants representing 22 different nationalities, with around half from Europe and the other half from Asia and the Americas. The programme comprised 26 lectures, each of 90 minutes, and 13 hours of case study. The students were given homework each day and had an opportunity to sit a final exam, which counted towards university credit. Feedback from the participants was extremely positive, praising the expertise and enthusiasm of the lecturers, as well as the high standard and quality of their lectures. Initial dis...

  20. Dielectric assist accelerating structure

    Satoh, D.; Yoshida, M.; Hayashizaki, N.


    A higher-order TM_{02n} mode accelerating structure is proposed based on a novel concept of dielectric loaded rf cavities. This accelerating structure consists of ultralow-loss dielectric cylinders and disks with irises which are periodically arranged in a metallic enclosure. Unlike conventional dielectric loaded accelerating structures, most of the rf power is stored in the vacuum space near the beam axis, leading to a significant reduction of the wall loss, much lower than that of conventional normal-conducting linac structures. This allows us to realize an extremely high quality factor and a very high shunt impedance at room temperature. A simulation of a 5 cell prototype design with an existing alumina ceramic indicates an unloaded quality factor of the accelerating mode over 120 000 and a shunt impedance exceeding 650 MΩ/m at room temperature.

  1. Coherent and incoherent nonparaxial self-accelerating Weber beams

    Zhang, Yiqi; Wen, Feng; Li, Changbiao; Zhang, Zhaoyang; Zhang, Yanpeng; Belić, Milivoj R


    We investigate the coherent and incoherent nonparaxial Weber beams, theoretically and numerically. We show that the superposition of coherent self-accelerating Weber beams with transverse displacement cannot display the nonparaxial accelerating Talbot effect. The reason is that their lobes do not accelerate in unison, which is a requirement for the appearance of the effect. The incoherent Weber beams, by contrast, naturally cannot display the accelerating Talbot effect, but can display the nonparaxial accelerating properties even though the transverse coherence length is smaller than the beam width, based on second-order coherence theory. Our research method applies directly to the nonparaxial Mathieu beams as well, and one obtains similar conclusions to those for the Weber beams, although this is not discussed in the paper. Our investigation identifies families of nonparaxial accelerating beams that do not exhibit the accelerating Talbot effect, and in addition broadens the understanding of coherence proper...

  2. LHCb GPU Acceleration Project

    AUTHOR|(SzGeCERN)744808; Campora Perez, Daniel Hugo; Neufeld, Niko; Vilasis Cardona, Xavier


    The LHCb detector is due to be upgraded for processing high-luminosity collisions, which will increase the load on its computation infrastructure from 100 GB/s to 4 TB/s, encouraging us to look for new ways of accelerating the Online reconstruction. The Coprocessor Manager is our new framework for integrating LHCb’s existing computation pipelines with massively parallel algorithms running on GPUs and other accelerators. This paper describes the system and analyzes its performance.

  3. Accelerating News Issue 2

    Kahle, K; Wildner, E


    In this summer issue we look at how developments in collimator materials could have applications in aerospace and beyond, and how Polish researchers are harnessing accelerators for medical and industrial uses. We see how the LHC luminosity upgrade is linking with European industry and US researchers, and how the neutrino oscillation community is progressing. We find out the mid-term status of TIARA-PP and how it is mapping European accelerator education resources.

  4. Accelerating Cosmologies from Compactification

    Townsend, P K; Townsend, Paul K.; Wohlfarth, Mattias N.R.


    A solution of the (4+n)-dimensional vacuum Einstein equations is found for which spacetime is compactified on a compact hyperbolic manifold of time-varying volume to a flat four-dimensional FLRW cosmology undergoing accelerated expansion in Einstein conformal frame. This shows that the `no-go' theorem forbidding acceleration in `standard' (time-independent) compactifications of string/M-theory does not apply to `cosmological' (time-dependent) hyperbolic compactifications.

  5. Similarity of samples and trimming

    Álvarez-Esteban, Pedro C; Cuesta-Albertos, Juan A; Matrán, Carlos; 10.3150/11-BEJ351


    We say that two probabilities are similar at level α if they are contaminated versions (up to an α fraction) of the same common probability. We show how this model is related to minimal distances between sets of trimmed probabilities. Empirical versions turn out to present an overfitting effect in the sense that trimming beyond the similarity level results in trimmed samples that are closer than expected to each other. We show how this can be combined with a bootstrap approach to assess similarity from two data samples.

  6. Biomedical accelerator mass spectrometry

    Freeman, Stewart P. H. T.; Vogel, John S.


    Ultrasensitive SIMS with accelerator based spectrometers has recently begun to be applied to biomedical problems. Certain very long-lived radioisotopes of very low natural abundances can be used to trace metabolism at environmental dose levels (≥ zmol in mg samples). ¹⁴C in particular can be employed to label a myriad of compounds. Competing technologies typically require super-environmental doses that can perturb the system under investigation, followed by uncertain extrapolation to the low dose regime. ⁴¹Ca and ²⁶Al are also used as elemental tracers. Given the sensitivity of the accelerator method, care must be taken to avoid contamination of the mass spectrometer and the apparatus employed in prior sample handling, including chemical separation. This infant field comprises the efforts of a dozen accelerator laboratories. The Center for Accelerator Mass Spectrometry has been particularly active. In addition to collaborating with groups further afield, we are researching the kinematics and binding of genotoxins in-house, and we support innovative uses of our capability in the disciplines of chemistry, pharmacology, nutrition and physiology within the University of California. The field can be expected to grow further given the numerous potential applications and the efforts of several groups and companies to integrate the accelerator technology more fully into biomedical research programs; the development of miniaturized accelerator systems and ion sources capable of interfacing to conventional HPLC and GMC, etc. apparatus for complementary chemical analysis is anticipated for biomedical laboratories.

  7. Accelerators for America's Future

    Bai, Mei


    The particle accelerator, a powerful tool to energize beams of charged particles to a desired speed and energy, has been the workhorse for investigating the fundamental structure of matter and the fundamental laws of nature. The best-known examples are the 2-mile long Stanford Linear Accelerator at SLAC, the high-energy proton and antiproton collider Tevatron at Fermilab, and the Large Hadron Collider currently in operation at CERN. During the less-than-a-century development of accelerator science and technology that led to a dazzling list of discoveries, particle accelerators have also found various applications beyond particle and nuclear physics research, and have become an indispensable part of the economy. Today, one can find a particle accelerator in almost every corner of our lives, ranging from the x-ray machine at airport security to radiation diagnostics and therapy in hospitals. This presentation will give a brief introduction to the applications of this powerful tool in fundamental research as well as in industry. Challenges in accelerator science and technology will also be briefly presented.

  8. Production of S-band Accelerating Structures

    Piel, C; Vogel, H; Vom Stein, P


    ACCEL currently produces accelerating structures for several scientific laboratories. Multi-cell cavities at S-band frequencies are required for the CLIC driver linac, the DLS and ASP pre-injector linacs and the MAMI-C microtron. Based on those projects, differences and similarities in design, production technologies and requirements will be addressed.

  9. Self-similar aftershock rates

    Davidsen, Jörn; Baiesi, Marco


    In many important systems exhibiting crackling noise—an intermittent avalanchelike relaxation response with power-law and, thus, self-similar distributed event sizes—the "laws" for the rate of activity after large events are not consistent with the overall self-similar behavior expected on theoretical grounds. This is particularly true for the case of seismicity, and a satisfying solution to this paradox has remained outstanding. Here, we propose a generalized description of the aftershock rates which is both self-similar and consistent with all other known self-similar features. Comparing our theoretical predictions with high-resolution earthquake data from Southern California we find excellent agreement, providing particularly clear evidence for a unified description of aftershocks and foreshocks. This may offer an improved framework for time-dependent seismic hazard assessment and earthquake forecasting.

  10. Unmixing of spectrally similar minerals

    Debba, Pravesh


    Full Text Available -bearing oxide/hydroxide/sulfate minerals in complex mixtures be obtained using hyperspectral data? Method of spectral unmixing Old method: problem Linear Spectral Mixture Analysis (LSMA...

  11. Self-similar aftershock rates

    Davidsen, Jörn


    In many important systems exhibiting crackling noise --- intermittent avalanche-like relaxation response with power-law and, thus, self-similar distributed event sizes --- the "laws" for the rate of activity after large events are not consistent with the overall self-similar behavior expected on theoretical grounds. This is in particular true for the case of seismicity and a satisfying solution to this paradox has remained outstanding. Here, we propose a generalized description of the aftershock rates which is both self-similar and consistent with all other known self-similar features. Comparing our theoretical predictions with high resolution earthquake data from Southern California we find excellent agreement, providing in particular clear evidence for a unified description of aftershocks and foreshocks. This may offer an improved way of time-dependent seismic hazard assessment and earthquake forecasting.

  12. Contextual Factors for Finding Similar Experts

    Hofmann, Katja; Balog, Krisztian; Bogers, Toine;


    Expertise-seeking research studies how people search for expertise and choose whom to contact in the context of a specific task. An important outcome is models that identify factors that influence expert finding. Expertise retrieval addresses the same problem, expert finding, but from a system......-seeking models, are rarely taken into account. In this article, we extend content-based expert-finding approaches with contextual factors that have been found to influence human expert finding. We focus on a task of science communicators in a knowledge-intensive environment, the task of finding similar experts......, given an example expert. Our approach combines expertise-seeking and retrieval research. First, we conduct a user study to identify contextual factors that may play a role in the studied task and environment. Then, we design expert retrieval models to capture these factors. We combine these with content...

  13. Diffusive Shock Acceleration and Reconnection Acceleration Processes

    Zank, G. P.; Hunana, P.; Mostafavi, P.; Le Roux, J. A.; Li, Gang; Webb, G. M.; Khabarova, O.; Cummings, A.; Stone, E.; Decker, R.


    Shock waves, as shown by simulations and observations, can generate high levels of downstream vortical turbulence, including magnetic islands. We consider a combination of diffusive shock acceleration (DSA) and downstream magnetic-island-reconnection-related processes as an energization mechanism for charged particles. Observations of electron and ion distributions downstream of interplanetary shocks and the heliospheric termination shock (HTS) are frequently inconsistent with the predictions of classical DSA. We utilize a recently developed transport theory for charged particles propagating diffusively in a turbulent region filled with contracting and reconnecting plasmoids and small-scale current sheets. Particle energization associated with the anti-reconnection electric field, a consequence of magnetic island merging, and magnetic island contraction, are considered. For the former only, we find that (i) the spectrum is a hard power law in particle speed, and (ii) the downstream solution is constant. For downstream plasmoid contraction only, (i) the accelerated spectrum is a hard power law in particle speed; (ii) the particle intensity for a given energy peaks downstream of the shock, and the distance to the peak location increases with increasing particle energy, and (iii) the particle intensity amplification for a particular particle energy, f(x, c/c₀)/f(0, c/c₀), is not 1, as predicted by DSA, but increases with increasing particle energy. The general solution combines both the reconnection-induced electric field and plasmoid contraction. The observed energetic particle intensity profile observed by Voyager 2 downstream of the HTS appears to support a particle acceleration mechanism that combines both DSA and magnetic-island-reconnection-related processes.

  14. Community Detection by Neighborhood Similarity

    LIU Xu; XIE Zheng; YI Dong-Yun


    Detection of the community structure in a network is important for understanding the structure and dynamics of the network. By exploring the neighborhood of vertices, a local similarity metric is proposed, which can be quickly computed. The resulting similarity matrix retains the same support as the adjacency matrix. Based on local similarity, an agglomerative hierarchical clustering algorithm is proposed for community detection. The algorithm is implemented by an efficient max-heap data structure and runs in nearly linear time, thus is capable of dealing with large sparse networks with tens of thousands of nodes. Experiments on synthesized and real-world networks demonstrate that our method is efficient to detect community structures, and the proposed metric is the most suitable one among all the tested similarity indices.
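
    The abstract does not spell out the metric, but a typical neighborhood-based local similarity is the Jaccard index over closed neighbor sets, sketched here as an illustration (not necessarily the paper's exact definition):

```python
def jaccard_neighbor_similarity(adj, u, v):
    """Local similarity of two vertices computed from their neighborhoods:
    Jaccard index over closed neighbor sets (an illustrative choice, not
    necessarily the paper's exact metric)."""
    nu = adj[u] | {u}
    nv = adj[v] | {v}
    return len(nu & nv) / len(nu | nv)

# Toy graph: two triangles joined by the single bridge edge 2-3
adj = {
    0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
    3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4},
}
print(jaccard_neighbor_similarity(adj, 0, 1))  # 1.0 (same community)
print(jaccard_neighbor_similarity(adj, 2, 3))  # lower: bridge endpoints
```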

  15. Similarity measures for protein ensembles

    Lindorff-Larsen, Kresten; Ferkinghoff-Borg, Jesper


    Analyses of similarities and changes in protein conformation can provide important information regarding protein function and evolution. Many scores, including the commonly used root mean square deviation, have therefore been developed to quantify the similarities of different protein conformations...... a synthetic example from molecular dynamics simulations. We then apply the algorithms to revisit the problem of ensemble averaging during structure determination of proteins, and find that an ensemble refinement method is able to recover the correct distribution of conformations better than standard single...
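
    The root mean square deviation mentioned above is straightforward to state for two conformations given as matched atom coordinates; note that real comparisons first superpose the structures (e.g. with the Kabsch algorithm), which this sketch omits:

```python
import math

def rmsd(coords_a, coords_b):
    """Root mean square deviation between two conformations given as
    matched lists of (x, y, z) atom positions. Real comparisons first
    superpose the structures; that step is omitted here for brevity."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
b = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0)]   # second atom displaced by 1 unit
print(rmsd(a, a))   # 0.0
print(rmsd(a, b))   # sqrt(1/2) ~ 0.707
```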

  16. Accelerating Inexact Newton Schemes for Large Systems of Nonlinear Equations

    Fokkema, D.R.; Sleijpen, G.L.G.; Vorst, H.A. van der


    Classical iteration methods for linear systems, such as Jacobi iteration, can be accelerated considerably by Krylov subspace methods like GMRES. In this paper, we describe how inexact Newton methods for nonlinear problems can be accelerated in a similar way and how this leads to a general framework

  17. Small type accelerator. Try for accelerator driven system

    Mori, Y


    The FFAG (fixed-field alternating gradient) accelerator for an accelerator-driven subcritical reactor, which aims to convert long-lived radioactive waste into short-lived radioactivity, is introduced. It is a ring accelerator. The required performance is: protons as the accelerated particle, 10 MW total beam power, about 1 GeV beam energy, >30% power efficiency, and a continuous beam. The characteristic feature of the FFAG accelerator is its constant magnetic field. The PoP (proof-of-principle) FFAG accelerator, of radial type, was first run in Japan in 2000. The orbit excursion is about some tens of centimetres. In principle, beam can be injected and extracted at any point of the ring. The 'multi-fish' acceleration scheme can bring beams to 100% duty by repeating the acceleration. Work on a 150 MeV FFAG accelerator has been in progress since 2001, aiming at practical use, for example in the treatment of cancer. (S.Y.)

  18. Relativistic mergers of black hole binaries have large, similar masses, low spins and are circular

    Amaro-Seoane, Pau; Chen, Xian


    Gravitational waves are a prediction of general relativity, and with ground-based detectors now running in their advanced configuration, we will soon be able to measure them directly for the first time. Binaries of stellar-mass black holes are among the most interesting sources for these detectors. Unfortunately, the many different parameters associated with the problem make it difficult to promptly produce a large set of waveforms for the search in the data stream. To reduce the number of templates to develop, one must restrict some of the physical parameters to a certain range of values predicted by either (electromagnetic) observations or theoretical modelling. In this work, we show that `hyperstellar' black holes (HSBs) with masses 30 ≲ MBH/M⊙ ≲ 100, i.e. black holes significantly larger than the nominal 10 M⊙, will have an associated low value for the spin, i.e. a < 0.5. We prove that this is true regardless of the formation channel, and that when two HSBs build a binary, each of the spin magnitudes is also low, and the binary members have similar masses. We also address the distribution of the eccentricities of HSB binaries in dense stellar systems using a large suite of three-body scattering experiments that include binary-single interactions and long-lived hierarchical systems with a highly accurate integrator, including relativistic corrections up to O(1/c^5). We find that most sources in the detector band will have nearly zero eccentricities. This correlation between large, similar masses, low spin and low eccentricity will help to accelerate the searches for gravitational-wave signals.

  19. Similarity of atoms in molecules

    Cioslowski, J.; Nanayakkara, A. (Florida State Univ., Tallahassee, FL (United States))


    Similarity of atoms in molecules is quantitatively assessed with a measure that employs electron densities within respective atomic basins. This atomic similarity measure does not rely on arbitrary assumptions concerning basis functions or 'atomic orbitals', is relatively inexpensive to compute, and has a straightforward interpretation. Inspection of similarities between pairs of carbon, hydrogen, and fluorine atoms in the CH₄, CH₃F, CH₂F₂, CHF₃, CF₄, C₂H₂, C₂H₄, and C₂H₆ molecules, calculated at the MP2/6-311G** level of theory, reveals that the atomic similarity is greatly reduced by a change in the number or the character of ligands (i.e. the atoms with nuclei linked through bond paths to the nucleus of the atom in question). On the other hand, atoms with formally identical ligands (i.e. ligands having the same nuclei and numbers of ligands) resemble each other to a large degree, with similarity indices greater than 0.95 for hydrogens and 0.99 for non-hydrogens. 19 refs., 6 tabs.

  20. Quantifying Similarity in Seismic Polarizations

    Eaton, D. W. S.; Jones, J. P.; Caffagni, E.


    Measuring similarity in seismic attributes can help identify tremor, low S/N signals, and converted or reflected phases, in addition to diagnosing site noise and sensor misalignment in arrays. Polarization analysis is a widely accepted method for studying the orientation and directional characteristics of seismic phases via. computed attributes, but similarity is ordinarily discussed using qualitative comparisons with reference values. Here we introduce a technique for quantitative polarization similarity that uses weighted histograms computed in short, overlapping time windows, drawing on methods adapted from the image processing and computer vision literature. Our method accounts for ambiguity in azimuth and incidence angle and variations in signal-to-noise (S/N) ratio. Using records of the Mw=8.3 Sea of Okhotsk earthquake from CNSN broadband sensors in British Columbia and Yukon Territory, Canada, and vertical borehole array data from a monitoring experiment at Hoadley gas field, central Alberta, Canada, we demonstrate that our method is robust to station spacing. Discrete wavelet analysis extends polarization similarity to the time-frequency domain in a straightforward way. Because histogram distance metrics are bounded by [0 1], clustering allows empirical time-frequency separation of seismic phase arrivals on single-station three-component records. Array processing for automatic seismic phase classification may be possible using subspace clustering of polarization similarity, but efficient algorithms are required to reduce the dimensionality.
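
    The bounded-distance property mentioned above can be sketched with normalized histograms. The intersection distance below is one common bounded metric from the image-processing literature; it is an illustrative stand-in, since the abstract does not name the paper's specific choice.

```python
def histogram_distance(h1, h2):
    # Normalize, then use histogram intersection: the result lies in
    # [0, 1], with 0 for identical distributions.  (A common bounded
    # metric, used here for illustration only.)
    p = [x / sum(h1) for x in h1]
    q = [x / sum(h2) for x in h2]
    return 1.0 - sum(min(a, b) for a, b in zip(p, q))

same = histogram_distance([2, 4, 4], [1, 2, 2])  # same shape -> ~0.0
disjoint = histogram_distance([1, 0], [0, 1])    # no overlap -> 1.0
```

    Because every distance falls in [0, 1], histograms from different windows or stations can be clustered directly, as the abstract describes.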

  1. Google Ajax Search API

    Fitzgerald, Michael


    Use the Google Ajax Search API to integrate web search, image search, local search, and other types of search into your web site by embedding a simple, dynamic search box to display search results in your own web pages using a few lines of JavaScript. For those who do not want to write code, the search wizards and solutions built with the Google Ajax Search API generate code to accomplish common tasks like adding local search results to a Google Maps API mashup, adding video search thumbnails to your web site, or adding a news reel with the latest up-to-date stories to your blog. More advanced users can

  2. Dielectric laser accelerators

    England, R. Joel; Noble, Robert J.; Bane, Karl; Dowell, David H.; Ng, Cho-Kuen; Spencer, James E.; Tantawi, Sami; Wu, Ziran; Byer, Robert L.; Peralta, Edgar; Soong, Ken; Chang, Chia-Ming; Montazeri, Behnam; Wolf, Stephen J.; Cowan, Benjamin; Dawson, Jay; Gai, Wei; Hommelhoff, Peter; Huang, Yen-Chieh; Jing, Chunguang; McGuinness, Christopher; Palmer, Robert B.; Naranjo, Brian; Rosenzweig, James; Travish, Gil; Mizrahi, Amit; Schachter, Levi; Sears, Christopher; Werner, Gregory R.; Yoder, Rodney B.


    The use of infrared lasers to power optical-scale lithographically fabricated particle accelerators is a developing area of research that has garnered increasing interest in recent years. The physics and technology of this approach, referred to as dielectric laser acceleration (DLA), are reviewed. In the DLA scheme operating at typical laser pulse lengths of 0.1 to 1 ps, the laser damage fluences for robust dielectric materials correspond to peak surface electric fields in the GV/m regime. The corresponding accelerating field enhancement represents a potential reduction in active length of the accelerator between 1 and 2 orders of magnitude. Power sources for DLA-based accelerators (lasers) are less costly than microwave sources (klystrons) for equivalent average power levels due to wider availability and private sector investment. Because of the high laser-to-particle coupling efficiency, required pulse energies are consistent with tabletop microjoule-class lasers. Combined with the very high (MHz) repetition rates these lasers can provide, the DLA approach appears promising for a variety of applications, including future high-energy physics colliders, compact light sources, and portable medical scanners and radiation therapy machines.

  3. Haptic Influence on Visual Search

    Marcia Grabowecky


    Full Text Available Information from different sensory modalities influences perception and attention in tasks such as visual search. We have previously reported identity-based crossmodal influences of audition on visual search (Iordanescu, Guzman-Martinez, Grabowecky, & Suzuki, 2008; Iordanescu, Grabowecky, Franconeri, Theeuwes, & Suzuki, 2010; Iordanescu, Grabowecky, & Suzuki, 2011). Here, we extend those results and demonstrate a novel crossmodal interaction between haptic shape information and visual attention. Manually explored, but unseen, shapes facilitated visual search for similarly shaped objects. This effect manifests as a reduction in both overall search times and initial saccade latencies when the haptic shape (e.g., a sphere) is consistent with a visual target (e.g., an orange), compared to when it is inconsistent with a visual target (e.g., a hockey puck). This haptic-visual interaction occurred even though the manually held shapes were not predictive of the visual target's shape or location, suggesting that the interaction occurs automatically. Furthermore, when the haptic shape was consistent with a distracter in the visual search array (instead of with the target), initial saccades toward the target were disrupted. Together, these results demonstrate a robust shape-specific haptic influence on visual search.

  4. Similarity measures for face recognition

    Vezzetti, Enrico


    Face recognition has several applications, including security (authentication and identification of device users and criminal suspects) and medicine (corrective surgery and diagnosis). Facial recognition programs rely on algorithms that can compare and compute the similarity between two sets of images. This eBook explains some of the similarity measures used in facial recognition systems in a single volume. Readers will learn about various measures, including Minkowski distances, Mahalanobis distances, Hausdorff distances, and cosine-based distances, among other methods. The book also summarizes errors that may occur in face recognition methods. Computer scientists "facing face" and looking to select and test different methods of computing similarities will benefit from this book. The book is also a useful tool for students undertaking computer vision courses.

  5. Plasma-based accelerator structures

    Schroeder, Carl B. [Univ. of California, Berkeley, CA (United States)


    Plasma-based accelerators have the ability to sustain extremely large accelerating gradients, with possible high-energy physics applications. This dissertation further develops the theory of plasma-based accelerators by addressing three topics: the performance of a hollow plasma channel as an accelerating structure, the generation of ultrashort electron bunches, and the propagation of laser pulses in underdense plasmas.

  6. Search Engine For Ebook Portal

    Prashant Kanade


    Full Text Available The purpose of this paper is to establish the textual analytics involved in developing a search engine for an ebook portal. We have extracted our dataset from Project Gutenberg using a robot harvester. Textual analytics is used for efficient search retrieval. The entire dataset is represented using the Vector Space Model, where each document is a vector in the vector space. Further, for computational purposes, we represent our dataset in the form of a term frequency-inverse document frequency (tf-idf) matrix. The first step involves obtaining the most coherent sequence of words for the search query entered. The entered query is processed using front-end algorithms; this includes spell checking, text segmentation, and language modeling. Back-end processing includes similarity modeling, clustering, indexing, and retrieval. The relationship between documents and words is established using cosine similarity, measured between the documents and words in vector space. The clustering performed is used to suggest books that are similar to the search query entered by the user. Lastly, the Lucene-based Elasticsearch engine is used for indexing the documents, which allows faster retrieval of data. Elasticsearch returns a dictionary and creates a tf-idf matrix. The processed query is compared with the dictionary obtained, and the tf-idf matrix is used to calculate a score for each match to give the most relevant results.
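
    The core retrieval step described above, tf-idf weighting plus cosine similarity in vector space, can be sketched as follows. The three-document corpus and the query are invented for illustration; the real portal delegates indexing and scoring to Elasticsearch.

```python
import math
from collections import Counter

docs = [
    "the adventures of sherlock holmes",
    "the return of sherlock holmes",
    "pride and prejudice",
]

vocab = sorted({w for d in docs for w in d.split()})
df = Counter(w for d in docs for w in set(d.split()))  # document frequency
N = len(docs)

def tfidf(text):
    # One tf-idf weight per vocabulary word for this piece of text.
    tf = Counter(text.split())
    return [tf[w] * math.log(N / df[w]) for w in vocab]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

vectors = [tfidf(d) for d in docs]
scores = [cosine(tfidf("sherlock holmes"), v) for v in vectors]
# Both Holmes titles score well above "pride and prejudice", which
# shares no terms with the query and therefore scores exactly 0.
```

    Clustering for "similar books" suggestions then operates on the same vectors, grouping documents whose pairwise cosine similarity is high.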

  7. What can we expect from future accelerators

    Panofsky, W.K.H.


    This talk gives a general but highly subjective overview of expectations for new accelerator development. An updated version of the Livingston chart demonstrates the exponential growth in time of the equivalent laboratory energy of accelerators. A similar Livingston chart pertaining only to electron-positron colliders also shows exponential growth, but in the past only one technology, electron-positron storage rings, has been responsible for this development. The question addressed is whether the type of exponential growth reflected by these two charts can be sustained in the future.

  8. Uniform Acceleration in General Relativity

    Friedman, Yaakov


    We extend de la Fuente and Romero's defining equation for uniform acceleration in a general curved spacetime from linear acceleration to the full Lorentz covariant uniform acceleration. In a flat spacetime background, we have explicit solutions. We use generalized Fermi-Walker transport to parallel transport the Frenet basis along the trajectory. In flat spacetime, we obtain velocity and acceleration transformations from a uniformly accelerated system to an inertial system. We obtain the time dilation between accelerated clocks. We apply our acceleration transformations to the motion of a charged particle in a constant electromagnetic field and recover the Lorentz-Abraham-Dirac equation.
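
    One of the explicit flat-spacetime solutions referred to above is classic hyperbolic motion under constant proper acceleration. The short numerical check below (in units with c = 1) verifies the defining hyperbola of the worldline and that the coordinate velocity stays below the speed of light; the function name and values are illustrative.

```python
import math

def hyperbolic_motion(a, tau):
    # Worldline of constant proper acceleration a, parametrized by
    # proper time tau (flat spacetime, c = 1):
    #   t(tau) = sinh(a * tau) / a,   x(tau) = (cosh(a * tau) - 1) / a
    t = math.sinh(a * tau) / a
    x = (math.cosh(a * tau) - 1) / a
    return t, x

a = 1.0
t, x = hyperbolic_motion(a, 2.0)
v = math.tanh(a * 2.0)            # coordinate velocity dx/dt, always < 1
lhs = (x + 1 / a) ** 2 - t ** 2   # invariant of the hyperbola: 1 / a**2
```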

  9. Cosmic Plasma Wakefield Acceleration

    Chen, P


    Recently we proposed a new cosmic acceleration mechanism based on the wakefields excited by Alfven shocks in a relativistically flowing plasma. In this paper we include some omitted details, and show that there exists a threshold condition for transparency below which the accelerated particle is collision-free and suffers little energy loss in the plasma medium. The stochastic encounters of the random accelerating-decelerating phases result in a power-law energy spectrum: f(ε) ∝ 1/ε². As an example, we discuss the possible production of super-GZK ultra high energy cosmic rays (UHECR) in the atmosphere of gamma ray bursts. The estimated event rate in our model agrees with that from UHECR observations.

  10. Superconducting Accelerator Magnets

    Mess, K H; Wolff, S


    The main topics of the book are the superconducting dipole and quadrupole magnets needed in high-energy accelerators and storage rings for protons, antiprotons or heavy ions. The basic principles of low-temperature superconductivity are outlined, with special emphasis on the effects which are relevant for accelerator magnets. Properties and fabrication methods of practical superconductors are described. Analytical methods for field calculation and multipole expansion are presented for coils without and with iron yoke. The effect of yoke saturation and geometric distortions on field quality is studied. Persistent magnetization currents in the superconductor and eddy currents in the copper part of the cable are analyzed in detail, and their influence on field quality and magnet performance is investigated. Superconductor stability, quench origins and propagation, and magnet protection are addressed. Some important concepts of accelerator physics are introduced which are needed to appreciate the demanding requirements ...

  11. Particle accelerator physics

    Wiedemann, Helmut


    Particle Accelerator Physics is an in-depth and comprehensive introduction to the field of high-energy particle acceleration and beam dynamics. Part I gathers the basic tools, recalling the essentials of electrostatics and electrodynamics as well as of particle dynamics in electromagnetic fields. Part II is an extensive primer in beam dynamics, followed in Part III by the introduction and description of the main beam parameters. Part IV is devoted to the treatment of perturbations in beam dynamics. Part V discusses the details of charged particle acceleration. Part VI and Part VII introduce the more advanced topics of coupled beam dynamics and the description of very intense beams. Part VIII is an exhaustive treatment of radiation from accelerated charges and introduces important sources of coherent radiation such as synchrotrons and free-electron lasers. Part IX collects the appendices gathering useful mathematical and physical formulae, parameters and units. Solutions to many end-of-chapter problems are given...

  12. Non-thermal Electron Acceleration in Low Mach Number Collisionless Shocks. I. Particle Energy Spectra and Acceleration Mechanism

    Guo, Xinyi; Sironi, Lorenzo; Narayan, Ramesh


    Electron acceleration to non-thermal energies is known to occur in low Mach number shocks. Diffusive shock acceleration, also known as first-order Fermi acceleration, cannot be directly invoked to explain the acceleration of electrons. Rather, an additional mechanism is required to pre-accelerate the electrons from thermal to supra-thermal energies, so they can then participate in the Fermi process. In this work, we use two- and three-dimensional particle-in-cell plasma simulations to study electron acceleration in low Mach number shocks. We focus on the particle energy spectra and the acceleration mechanism in a reference run with Ms = 3 and a quasi-perpendicular pre-shock magnetic field. We find that about 15% of the electrons can be efficiently accelerated, forming a non-thermal power-law tail in the energy spectrum with a slope of p ≈ 2.4. Initially, thermal electrons are energized at the shock front via shock drift acceleration (SDA). The accelerated electrons are then reflected back upstream, where their interaction with the incoming flow generates magnetic waves. In turn, the waves scatter the electrons propagating upstream back toward the shock for further energization via SDA. In summary, the self-generated waves allow for repeated cycles of SDA, similar to a sustained Fermi-like process. This mechanism offers a natural solution to the conflict between the bright radio synchrotron emission observed from the outskirts of galaxy clusters and the low electron acceleration efficiency usually expected in low Mach number shocks.

  13. Microelectromechanical acceleration-sensing apparatus

    Lee, Robb M. (Albuquerque, NM); Shul, Randy J. (Albuquerque, NM); Polosky, Marc A. (Albuquerque, NM); Hoke, Darren A. (Albuquerque, NM); Vernon, George E. (Rio Rancho, NM)


    An acceleration-sensing apparatus is disclosed which includes a moveable shuttle (i.e. a suspended mass) and a latch for capturing and holding the shuttle when an acceleration event is sensed above a predetermined threshold level. The acceleration-sensing apparatus provides a switch closure upon sensing the acceleration event and remains latched in place thereafter. Examples of the acceleration-sensing apparatus are provided which are responsive to an acceleration component in a single direction (i.e. a single-sided device) or to two oppositely-directed acceleration components (i.e. a dual-sided device). A two-stage acceleration-sensing apparatus is also disclosed which can sense two acceleration events separated in time. The acceleration-sensing apparatus of the present invention has applications, for example, in an automotive airbag deployment system.

  14. Distance learning for similarity estimation

    Yu, J.; Amores, J.; Sebe, N.; Radeva, P.; Tian, Q.


    In this paper, we present a general guideline to find a better distance measure for similarity estimation based on statistical analysis of distribution models and distance functions. A new set of distance measures are derived from the harmonic distance, the geometric distance, and their generalized

  15. Distance learning for similarity estimation.

    Yu, Jie; Amores, Jaume; Sebe, Nicu; Radeva, Petia; Tian, Qi


    In this paper, we present a general guideline to find a better distance measure for similarity estimation based on statistical analysis of distribution models and distance functions. A new set of distance measures are derived from the harmonic distance, the geometric distance, and their generalized variants according to the Maximum Likelihood theory. These measures can provide a more accurate feature model than the classical Euclidean and Manhattan distances. We also find that the feature elements are often from heterogeneous sources that may have different influence on similarity estimation. Therefore, the assumption of single isotropic distribution model is often inappropriate. To alleviate this problem, we use a boosted distance measure framework that finds multiple distance measures which fit the distribution of selected feature elements best for accurate similarity estimation. The new distance measures for similarity estimation are tested on two applications: stereo matching and motion tracking in video sequences. The performance of boosted distance measure is further evaluated on several benchmark data sets from the UCI repository and two image retrieval applications. In all the experiments, robust results are obtained based on the proposed methods.

  16. Revisiting Inter-Genre Similarity

    Sturm, Bob L.; Gouyon, Fabien


    We revisit the idea of ``inter-genre similarity'' (IGS) for machine learning in general, and music genre recognition in particular. We show analytically that the probability of error for IGS is higher than naive Bayes classification with zero-one loss (NB). We show empirically that IGS does...

  17. Comparison of hydrological similarity measures

    Rianna, Maura; Ridolfi, Elena; Manciola, Piergiorgio; Napolitano, Francesco; Russo, Fabio


    The traditional at-site approach to the statistical characterization and simulation of spatio-temporal precipitation fields has a major recognized drawback: because of the limited length of records, the estimation of rare events carries the uncertainty of at-site sample statistical inference. To overcome the lack of at-site observations, the regional frequency approach uses the idea of substituting space for time to estimate design floods. Conventional regional frequency analysis estimates quantile values at a specific site from a multi-site analysis. The main idea is that homogeneous sites, once pooled together, have similar probability distribution curves of extremes, except for a scaling factor. The method for pooling groups of sites can be based on geographical or climatological considerations. In this work the region of influence (ROI) pooling method is compared with an entropy-based one. The ROI is a flexible pooling-group approach which defines for each site its own "region" formed by a unique set of similar stations. The similarity is found through the Euclidean distance metric in the attribute space. Here an alternative approach based on entropy is introduced to cluster homogeneous sites. The core idea is that homogeneous sites share a redundant (i.e. similar) amount of information. Homogeneous sites are pooled through a hierarchical selection based on the mutual information index (i.e. a measure of redundancy). The method is tested on precipitation data in the Central Italy area.
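
    The redundancy idea behind the entropy-based pooling can be sketched with empirical mutual information between two discretized precipitation records; the binary wet/dry series below are invented for illustration, and real applications would first bin the observed data.

```python
import math
from collections import Counter

def mutual_information(x, y):
    # Empirical mutual information (in nats) between two discretized
    # series: high MI means redundant information, i.e. candidate
    # homogeneous sites under the pooling scheme described above.
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    return sum((c / n) * math.log((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

wet = [0, 0, 1, 1] * 25
mirrored = [1, 1, 0, 0] * 25     # deterministic function of wet
unrelated = [0, 1, 0, 1] * 25    # statistically independent of wet
# mutual_information(wet, mirrored) is log 2 (fully redundant), while
# mutual_information(wet, unrelated) is 0 (nothing shared).
```

    Hierarchical selection then pools the station pairs with the highest mutual information first.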

  18. Power Supplies for High Energy Particle Accelerators

    Dey, Pranab Kumar


    The on-going research and development projects with the Large Hadron Collider at CERN, Geneva, Switzerland have generated enormous enthusiasm and interest in the ultimate findings on the `God's Particle'. This paper attempts to unfold the power supply requirements and the methodology adopted to meet the stringent demands of such high energy particle accelerators during the initial stages of the search for the ultimate particles. An attempt has also been made to highlight the present status of power supply requirements in some high energy accelerators, with the view that precautionary measures drawn from earlier experience during design and development will be of help for the proposed third-generation synchrotron to be installed in India at huge cost.

  19. Acceleration Factor Harmonious Particle Swarm Optimizer

    Jie Chen; Feng Pan; Tao Cai


    A Particle Swarm Optimizer (PSO) exhibits good performance on optimization problems, although it cannot guarantee convergence to a global, or even local, minimum. However, there are some adjustable parameters and restrictive conditions which can affect the performance of the algorithm. In this paper, sufficient conditions for the asymptotic stability of the acceleration factor and inertia weight are deduced, and the admissible range of the inertia weight ω is extended to (-1, 1). Furthermore, a new adaptive PSO algorithm, Acceleration Factor Harmonious PSO (AFHPSO), is proposed and proved to be a global search algorithm. AFHPSO is used for the parameter design of a fuzzy controller for a linear-motor-driven servo system. The performance of the nonlinear model of the servo system demonstrates the effectiveness of the optimized fuzzy controller and AFHPSO.
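
    For reference, the standard PSO update that the inertia weight ω and the acceleration factors c1, c2 enter into is sketched below on a toy sphere function. This is the textbook velocity/position rule, not the AFHPSO variant itself; the parameter values are conventional choices, not taken from the paper.

```python
import random

def pso_minimize(f, dim=2, n=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]          # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n), key=pbest_val.__getitem__)
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm's best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia term (w) plus the two acceleration terms
                # (c1 toward the personal best, c2 toward the global best).
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

sphere = lambda p: sum(x * x for x in p)
best, val = pso_minimize(sphere)   # converges near the origin
```

    The stability analysis in the paper concerns exactly these w, c1, c2 coefficients: for poorly chosen values the velocities diverge instead of contracting toward the best positions.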

  20. Annotated bibliography on high-intensity linear accelerators. [240 citations]

    Jameson, R.A.; Roybal, E.U.


    A technical bibliography covering subjects important to the design of high-intensity beam transport systems and linear accelerators is presented. Space charge and emittance growth are stressed. Subject and author concordances provide cross-reference to detailed citations, which include an abstract and notes on the material. The bibliography resides in a computer database that can be searched for key words and phrases.

  1. Accelerated Peer-Review Journal Usage Technique for Undergraduates

    Wallace, J. D.


    The internet has given undergraduate students ever-increasing access to academic journals via search engines and online databases. However, students typically do not have the ability to use these journals effectively. This often poses a dilemma for instructors. The accelerated peer-review journal usage (APJU) technique provides a way for…


    Ramshankar Vijayalakshmi


    Full Text Available Recently, biopharmaceuticals called "biosimilars" or "follow-on protein products" by the European Medicines Agency (EMA) and the American regulatory agency (the Food and Drug Administration), respectively, have emerged as new chemotherapeutic agents. Biosimilars are extremely similar to the reference molecule but not identical, however close their similarities may be. A regulatory framework is therefore in place to assess applications for marketing authorisation of biosimilars. When a biosimilar is similar to the reference biopharmaceutical in terms of safety, quality, and efficacy, it can be registered. It is important to document data from clinical trials with a view to similar safety and efficacy. While the development time for a generic medicine is around 3 years, a biosimilar takes about 6-9 years. Generic medicines need to demonstrate bioequivalence only, unlike biosimilars, which need to undergo phase I and phase III clinical trials. In this review, different biosimilars that are already being used successfully in the field of oncology are discussed, along with their similarities, differences, and the guidelines to be followed before a clinically informed decision is taken. More importantly, the regulatory guidelines operational in India are discussed, with a workflow for making a biosimilar and the relevant dos and don'ts. For a large, populous country like India, where with improved treatments in all sectors, including oncology, the ageing population is increasing, more new, cheaper, and effective biosimilars are needed in the market. It is therefore important to understand the regulatory guidelines and the steps to come up with more biosimilars for the existing population; more information is also essential for practicing clinicians to translate these effectively into clinical practice.

  3. Accelerating time to benefit

    Svejvig, Per; Geraldi, Joana; Grex, Sara

    Despite the ubiquitous pressure for speed, our approaches to accelerating projects remain constrained to the old-fashioned understanding of the project as a vehicle to deliver products and services, not value. This article explores an attempt to accelerate time to benefit. We describe and deconstruct...... of the time. Although all cases valued speed and speed to benefit, and implemented most practices proposed by the methodology, only three of the five projects were more successful in decreasing time to benefit. Based on a multi-case study comparison between these five different projects and their respective......

  4. Accelerating News Issue 4

    Szeberenyi, A; Wildner, E


    In this winter issue, we are very pleased to announce the approval of EuCARD-2 by the European Commission. We look at the conclusions of EUROnu in proposing future neutrino facilities at CERN, a new milestone reached by CLIC and progress on the SPARC upgrade using C-band technology. We also report on recent events: second Joint HiLumi LHC-LARP Annual Meeting and workshop on Superconducting technologies for the Next Generation of Accelerators aiming at closer collaboration with industry. The launch of the Accelerators for Society brochure is also highlighted.

  5. Shielding high energy accelerators

    Stevenson, Graham Roger


    After introducing the subject of shielding high energy accelerators, point-source, line-of-sight models, and in particular the Moyer model, are discussed. Their use in the shielding of proton and electron accelerators is demonstrated and their limitations noted, especially in relation to the shielding in the forward direction provided by large, flat walls. The limitations of reducing problems to a cylindrical geometry description are stressed. Finally, the use of different estimators for predicting dose is discussed. It is suggested that dose calculated from track-length estimators will generally give the most satisfactory estimate. (9 refs).

  6. Slowed Search in the Context of Unimpaired Grouping in Autism: Evidence from Multiple Conjunction Search.

    Keehn, Brandon; Joseph, Robert M


    In multiple conjunction search, the target is not known in advance but is defined only with respect to the distractors in a given search array, thus reducing the contributions of bottom-up and top-down attentional and perceptual processes during search. This study investigated whether the superior visual search skills typically demonstrated by individuals with autism spectrum disorder (ASD) would be evident in multiple conjunction search. Thirty-two children with ASD and 32 age- and nonverbal IQ-matched typically developing (TD) children were administered a multiple conjunction search task. Contrary to findings from the large majority of studies on visual search in ASD, response times of individuals with ASD were significantly slower than those of their TD peers. Evidence of slowed performance in ASD suggests that the mechanisms responsible for superior ASD performance in other visual search paradigms are not available in multiple conjunction search. Although the ASD group failed to exhibit superior performance, they showed efficient search and intertrial priming levels similar to the TD group. Efficient search indicates that ASD participants were able to group distractors into distinct subsets. In summary, while demonstrating grouping and priming effects comparable to those exhibited by their TD peers, children with ASD were slowed in their performance on a multiple conjunction search task, suggesting that their usual superior performance in visual search tasks is specifically dependent on top-down and/or bottom-up attentional and perceptual processes.

  7. Enhancing Divergent Search through Extinction Events

    Lehman, Joel; Miikkulainen, Risto


    A challenge in evolutionary computation is to create representations as evolvable as those in natural evolution. This paper hypothesizes that extinction events, i.e. mass extinctions, can significantly increase evolvability, but only when combined with a divergent search algorithm, i.e. a search...... for the capacity to evolve. This hypothesis is tested through experiments in two evolutionary robotics domains. The results show that combining extinction events with divergent search increases evolvability, while combining them with convergent search offers no similar benefit. The conclusion is that extinction...

  8. Landscape similarity, retrieval, and machine mapping of physiographic units

    Jasiewicz, Jaroslaw; Netzel, Pawel; Stepinski, Tomasz F.


    We introduce landscape similarity - a numerical measure that assesses affinity between two landscapes on the basis of similarity between the patterns of their constituent landform elements. Such a similarity function provides core technology for a landscape search engine - an algorithm that parses the topography of a study area and finds all places with landscapes broadly similar to a landscape template. A landscape search can yield answers to a query in real time, enabling a highly effective means to explore large topographic datasets. In turn, a landscape search facilitates auto-mapping of physiographic units within a study area. The country of Poland serves as a test bed for these novel concepts. The topography of Poland is given by a 30 m resolution DEM. The geomorphons method is applied to this DEM to classify the topography into ten common types of landform elements. A local landscape is represented by a square tile cut out of a map of landform elements. A histogram of cell-pair features is used to succinctly encode the composition and texture of a pattern within a local landscape. The affinity between two local landscapes is assessed using the Wave-Hedges similarity function applied to the two corresponding histograms. For a landscape search the study area is organized into a lattice of local landscapes. During the search the algorithm calculates the similarity between each local landscape and a given query. Our landscape search for Poland is implemented as a GeoWeb application called TerraEx-Pl and is available at Given a sample, or a number of samples, from a target physiographic unit the landscape search delineates this unit using the principles of supervised machine learning. Repeating this procedure for all units yields a complete physiographic map. The application of this methodology to topographic data of Poland results in the delineation of nine physiographic units. The resultant map bears a close resemblance to a conventional
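    The histogram comparison step above can be sketched in a few lines. The function below uses one common form of the Wave-Hedges measure, d = Σᵢ |aᵢ − bᵢ| / max(aᵢ, bᵢ) over nonnegative bin counts, rescaled to a similarity in [0, 1]; whether this matches the exact variant used by the authors is an assumption.

    ```python
    def wave_hedges_similarity(h1, h2):
        """Similarity between two histograms of landform-element features.

        Uses the common Wave-Hedges form d = sum_i |a_i - b_i| / max(a_i, b_i)
        over nonnegative counts (bins where both counts are zero contribute
        nothing), converted to a similarity in [0, 1].
        """
        if len(h1) != len(h2):
            raise ValueError("histograms must have equal length")
        d = 0.0
        for a, b in zip(h1, h2):
            m = max(a, b)
            if m > 0:
                d += abs(a - b) / m
        return 1.0 - d / len(h1)
    ```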

  9. Photonic Crystal Laser Accelerator Structures

    Cowan, Benjamin M


    Photonic crystals have great potential for use as laser-driven accelerator structures. A photonic crystal is a dielectric structure arranged in a periodic geometry. Like a crystalline solid with its electronic band structure, the modes of a photonic crystal lie in a set of allowed photonic bands. Similarly, it is possible for a photonic crystal to exhibit one or more photonic band gaps, with frequencies in the gap unable to propagate in the crystal. Thus photonic crystals can confine an optical mode in an all-dielectric structure, eliminating the need for metals and their characteristic losses at optical frequencies. We discuss several geometries of photonic crystal accelerator structures. Photonic crystal fibers (PCFs) are optical fibers which can confine a speed-of-light optical mode in vacuum. Planar structures, both two- and three-dimensional, can also confine such a mode, and have the additional advantage that they can be manufactured using common microfabrication techniques such as those used for integrated circuits. This allows for a variety of possible materials, so that dielectrics with desirable optical and radiation-hardness properties can be chosen. We discuss examples of simulated photonic crystal structures to demonstrate the scaling laws and trade-offs involved, and touch on potential fabrication processes.

  10. An algorithm for online optimization of accelerators

    Huang, Xiaobiao [SLAC National Accelerator Lab., Menlo Park, CA (United States); Corbett, Jeff [SLAC National Accelerator Lab., Menlo Park, CA (United States); Safranek, James [SLAC National Accelerator Lab., Menlo Park, CA (United States); Wu, Juhao [SLAC National Accelerator Lab., Menlo Park, CA (United States)


    We developed a general algorithm for online optimization of accelerator performance, i.e., online tuning, using the performance measure as the objective function. This method, named robust conjugate direction search (RCDS), combines the conjugate direction set approach of Powell's method with a robust line optimizer that accounts for random noise in bracketing the minimum and uses a parabolic fit of data points that uniformly sample the bracketed zone. The method is much more robust against noise than traditional algorithms and is therefore suitable for online application. Simulation and experimental studies have been carried out to demonstrate the strength of the new algorithm.
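    The core of the robust line optimizer can be illustrated as follows: uniformly sample the bracketed interval, fit a parabola to the noisy measurements, and take the vertex of the fit as the one-dimensional minimum. This is a toy sketch of the idea only, not the published RCDS implementation, which also handles outlier rejection, bracket expansion, and noise estimation.

    ```python
    import numpy as np

    def robust_line_minimum(f, x_lo, x_hi, n_samples=9):
        """Noise-tolerant 1-D minimization sketch: fit a parabola to
        uniformly sampled (noisy) objective values and return its vertex,
        clipped to the bracket. Falls back to the best sample if the fit
        has no upward curvature."""
        xs = np.linspace(x_lo, x_hi, n_samples)
        ys = np.array([f(x) for x in xs])
        a, b, c = np.polyfit(xs, ys, 2)   # y ~ a x^2 + b x + c
        if a <= 0:                        # no minimum in the fit
            return float(xs[np.argmin(ys)])
        return float(np.clip(-b / (2 * a), x_lo, x_hi))
    ```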

  11. Influence of spatiotemporal coupling on the capture-and-acceleration-scenario vacuum electron acceleration by ultrashort pulsed laser beam

    Lu Da-Quan; Qian Lie-Jia; Li Yong-Zhong; Fan Dian-Yuan


    This paper investigates the properties of the ultrashort pulsed beam intended for capture-and-acceleration-scenario (CAS) vacuum electron acceleration. The results show that the spatiotemporal distribution of the phase velocity, the longitudinal component of the electric field and the acceleration quality factor are qualitatively similar to those of the continuous-wave Gaussian beam, and are only slightly influenced by the spatiotemporal coupling of the ultrashort pulsed beam. When the pulse is compressed to an ultrashort one in which the pulse duration T_FWHM < 5T_0, the variation of the maximum net energy gain due to the carrier-envelope phase is a crucial disadvantage in the CAS acceleration process.

  12. Sound Search Engine Concept


    Sound search is provided by the major search engines; however, indexing is text based, not sound based. We will establish a dedicated sound search service based on sound feature indexing. The current demo shows the concept of the sound search engine. The first engine will be released June...

  14. Web Search Engines

    Rajashekar, TB


    The World Wide Web is emerging as an all-in-one information source. Tools for searching Web-based information include search engines, subject directories and meta search tools. We take a look at key features of these tools and suggest practical hints for effective Web searching.

  15. Self-similarity Driven Demosaicking

    Antoni Buades


    Full Text Available Digital cameras record only one color component per pixel, red, green or blue. Demosaicking is the process by which one can infer a whole color matrix from such a matrix of values, thus interpolating the two missing color values per pixel. In this article we propose a demosaicking method based on the property of non-local self-similarity of images.

  16. Sparse Similarity-Based Fisherfaces

    Fagertun, Jens; Gomez, David Delgado; Hansen, Mads Fogtmann;


    In this work, the effect of introducing Sparse Principal Component Analysis within the Similarity-based Fisherfaces algorithm is examined. The technique aims at mimicking the human ability to discriminate faces by projecting the faces in a highly discriminative and easily interpretable way. Pixel...... obtain the same recognition results as the technique in a dense version using only a fraction of the input data. Furthermore, the presented results suggest that using SPCA in the technique offers robustness to occlusions....

  17. Roget's Thesaurus and Semantic Similarity

    Jarmasz, Mario


    We have implemented a system that measures semantic similarity using a computerized 1987 Roget's Thesaurus, and evaluated it by performing a few typical tests. We compare the results of these tests with those produced by WordNet-based similarity measures. One of the benchmarks is Miller and Charles' list of 30 noun pairs to which human judges had assigned similarity measures. We correlate these measures with those computed by several NLP systems. The 30 pairs can be traced back to Rubenstein and Goodenough's 65 pairs, which we have also studied. Our Roget's-based system gets correlations of .878 for the smaller and .818 for the larger list of noun pairs; this is quite close to the .885 that Resnik obtained when he employed humans to replicate the Miller and Charles experiment. We further evaluate our measure by using Roget's and WordNet to answer 80 TOEFL, 50 ESL and 300 Reader's Digest questions: the correct synonym must be selected amongst a group of four words. Our system gets 78.75%, 82.00% and 74.33% of ...
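    The correlations quoted above are ordinary Pearson coefficients between a system's similarity scores and human judgments. A minimal sketch of that evaluation step, using made-up scores rather than the actual Miller and Charles data:

    ```python
    import math

    def pearson(xs, ys):
        """Pearson correlation between two equal-length score lists,
        as used to compare a similarity measure against human ratings."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # Toy scores for five hypothetical noun pairs (not the real data):
    human  = [3.92, 3.84, 3.05, 1.18, 0.42]   # human judgments
    system = [0.95, 0.90, 0.71, 0.30, 0.12]   # thesaurus-based scores
    ```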

  18. Self-Similar Collisionless Shocks

    Katz, B; Waxman, E; Katz, Boaz; Keshet, Uri; Waxman, Eli


    Observations of gamma-ray burst afterglows suggest that the correlation length of magnetic field fluctuations downstream of relativistic non-magnetized collisionless shocks grows with distance from the shock to scales much larger than the plasma skin depth. We argue that this indicates that the plasma properties are described by a self-similar solution, and derive constraints on the scaling properties of the solution. For example, we find that the scaling of the characteristic magnetic field amplitude with distance from the shock is B \\propto D^{s_B} with -1 \\propto x^{2s_B} (for x>>D). We show that the plasma may be approximated as a combination of two self-similar components: a kinetic component of energetic particles and an MHD-like component representing "thermal" particles. We argue that the latter may be considered as infinitely conducting, in which case s_B=0 and the scalings are completely determined (e.g. dn/dE \\propto E^{-2} and B \\propto D^0). Similar claims apply to non- relativistic shocks such a...

  19. Combined generating-accelerating buncher for compact linear accelerators

    Savin, E. A.; Matsievskiy, S. V.; Sobenin, N. P.; Sokolov, I. D.; Zavadtsev, A. A.


    The method of power extraction from a modulated electron beam described in a previous article [1] has been applied to a compact standing-wave electron linear accelerator feeding system, which does not require any connecting waveguides between the power source and the accelerator itself [2]. Generating and accelerating bunches meet in the hybrid accelerating cell operating in the TM020 mode; thus the accelerating module is placed on the axis of the generating module, which consists of pulsed high-voltage electron sources and electron dumps. This combination makes the accelerator very compact, which is very valuable for modern applications such as portable inspection sources. Simulations and cold tests of the geometry are presented.

  20. The CERN accelerator complex

    Christiane Lefèvre


    The LHC is the last ring (dark grey line) in a complex chain of particle accelerators. The smaller machines are used in a chain to help boost the particles to their final energies and provide beams to a whole set of smaller experiments, which also aim to uncover the mysteries of the Universe.

  1. The CERN Accelerator School


      Introduction to accelerator physics This course will take place in Istanbul, Turkey, from 18 to 30 September 2016. It is now open for registration, and further information can be found here:

  2. The CERN Accelerator School


    Introduction to accelerator physics This course will take place in Budapest, Hungary, from 2 to 14 October 2016. It is now open for registration and further information can be found at: and

  3. The CERN accelerator complex

    De Melis, Cinzia


    The LHC is the last ring (dark blue line) in a complex chain of particle accelerators. The smaller machines are used in a chain to help boost the particles to their final energies and provide beams to a whole set of smaller experiments, which also aim to uncover the mysteries of the Universe.

  4. Acceleration and Special Relativity

    Yahalomi, E M


    The integration of acceleration over time before reaching the uniform velocity turns out to be the source of all the special relativity effects. It explains physical phenomena like clock comparisons. The equations for space-time, mass and energy are presented. This phenomenon complements the explanation for the twins paradox. A universal reference frame is obtained.

  5. Prospects for Accelerator Technology

    Todd, Alan


    Accelerator technology today is a greater than US$5 billion per annum business. Development of higher-performance technology with improved reliability that delivers reduced system size and life cycle cost is expected to significantly increase the total accelerator technology market and open up new application sales. Potential future directions are identified and pitfalls in new market penetration are considered. Both of the present big market segments, medical radiation therapy units and semiconductor ion implanters, are approaching the "maturity" phase of their product cycles, where incremental development rather than paradigm shifts is the norm, but they should continue to dominate commercial sales for some time. It is anticipated that large discovery-science accelerators will continue to provide a specialty market beset by the unpredictable cycles resulting from the scale of the projects themselves, coupled with external political and economic drivers. Although fraught with differing market entry difficulties, the security and environmental markets, together with new, as yet unrealized, industrial material processing applications, are expected to provide the bulk of future commercial accelerator technology growth.

  6. The CERN accelerator complex

    Haffner, Julie


    The LHC is the last ring (dark grey line) in a complex chain of particle accelerators. The smaller machines are used in a chain to help boost the particles to their final energies and provide beams to a whole set of smaller experiments, which also aim to uncover the mysteries of the Universe.

  7. SPS accelerating cavity


    One of the SPS acceleration cavities (200 MHz, travelling wave structure). On the ceiling one sees the coaxial transmission line which feeds the power from the amplifier, located in a surface building above, to the upstream end of the cavity. See 7603195 for more details, 7411032 for the travelling wave structure, and also 8104138, 8302397.

  8. The CERN accelerator complex

    Mobs, Esma Anais


    The LHC is the last ring (dark blue line) in a complex chain of particle accelerators. The smaller machines are used in a chain to help boost the particles to their final energies and provide beams to a whole set of smaller experiments, which also aim to uncover the mysteries of the Universe.

  9. Atmospheric and accelerator neutrinos

    Suzuki, Yoichiro [Kamioka Observatory, Institute for Cosmic Ray Research, University of Tokyo Higashi-Mozumi, Kamioka, Hida-City, Gifu 506-1205 (Japan)


    Results from the atmospheric neutrino measurements are presented. Evidence for ν_τ appearance in the atmospheric neutrino events was shown by statistical methods. The long-baseline oscillation experiment using man-made neutrinos has confirmed the atmospheric neutrino oscillation. The future accelerator experiments are briefly discussed.

  10. SPS accelerating cavity

    CERN PhotoLab


    One of the SPS accelerating cavities (200 MHz, travelling wave structure). The power that is fed into the upstream end of the cavity is extracted at the downstream end and sent into a dump load. See 7603195 for more details, 7411032 for the travelling wave structure, and also 8011289, 8302397.

  11. Angular Accelerating White Light

    Dudley, Angela L


    Full Text Available Shaping XVI, 958104, San Diego, California, United States, 09 August 2015 Angular Accelerating White Light Angela Dudley*a,b, Christian Vetterc , Alexander Szameitc , and Andrew Forbesa,b a CSIR National Laser Centre, PO Box 395, Pretoria 0001...

  12. Large Neighborhood Search

    Pisinger, David; Røpke, Stefan


    Heuristics based on large neighborhood search have recently shown outstanding results in solving various transportation and scheduling problems. Large neighborhood search methods explore a complex neighborhood by use of heuristics. Using large neighborhoods makes it possible to find better...... candidate solutions in each iteration and hence traverse a more promising search path. Starting from the large neighborhood search method, we give an overview of very large scale neighborhood search methods and discuss recent variants and extensions like variable depth search and adaptive large neighborhood...... search....
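    The destroy-and-repair loop at the heart of large neighborhood search can be sketched generically. The skeleton below accepts only improving candidates, omitting the acceptance criteria and adaptive operator weights that practical variants add; the `destroy` and `repair` callbacks are problem-specific placeholders.

    ```python
    import random

    def large_neighborhood_search(initial, cost, destroy, repair,
                                  iterations=1000, seed=0):
        """Generic LNS skeleton: repeatedly destroy part of the incumbent
        solution, repair it heuristically, and keep the candidate if it
        improves the objective."""
        rng = random.Random(seed)
        best = initial
        best_cost = cost(best)
        for _ in range(iterations):
            candidate = repair(destroy(best, rng), rng)
            c = cost(candidate)
            if c < best_cost:
                best, best_cost = candidate, c
        return best, best_cost
    ```

    A toy instance is a tour of points on a line: `destroy` removes one random element and `repair` reinserts it at the cheapest position.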

  13. Heavy ion acceleration in the Breakout Afterburner regime

    Petrov, G M; Thomas, A G R; Krushelnick, K; Beg, F N


    A theoretical study of heavy ion acceleration from an ultrathin (20 nm) gold foil irradiated by sub-picosecond lasers is presented. Using two-dimensional particle-in-cell simulations we identified two highly efficient ion acceleration schemes. By varying the laser pulse duration we observed a transition from Radiation Pressure Acceleration to the Breakout Afterburner regime akin to light ions. The underlying physics and ion acceleration regimes are similar to those of light ions; however, nuances of the acceleration process make the acceleration of heavy ions more challenging. Two laser systems are studied in detail: the Texas Petawatt Laser and the Trident laser, the former having pulse duration 180 fs, intermediate between very short femtosecond pulses and picosecond pulses. Both laser systems generated directional gold ion beams (~10 degrees half-angle) with fluxes in excess of 10^11 ions/sr and normalized energy >10 MeV/nucleon.

  14. The human ocular torsion position response during yaw angular acceleration.

    Smith, S T; Curthoys, I S; Moore, S T


    Recent results by Wearne [(1993) Ph.D. thesis] using the scleral search-coil method of measuring eye position indicate that changes in ocular torsion position (OTP) occur during yaw angular acceleration about an earth vertical axis. The present set of experiments, using an image processing method of eye movement measurement free from the possible confound of search coil slippage, demonstrates the generality and repeatability of this phenomenon and examines its possible causes. The change in torsion position is not a linear vestibulo-ocular reflex (LVOR) response to interaural linear acceleration stimulation of the otoliths, but rather the effect is dependent on the characteristics of the angular acceleration stimulus, commencing at the onset and decaying at the offset of the angular acceleration. In the experiments reported here, the magnitude of the angular acceleration stimulus was varied and the torsion position response showed corresponding variations. We consider that the change in torsion position observed during angular acceleration is most likely to be due to activity of the semicircular canals.

  15. Rotational invariant similarity measurement for content-based image indexing

    Ro, Yong M.; Yoo, Kiwon


    We propose a similarity matching technique for content-based image retrieval. The proposed technique is invariant to image rotation. Since image contents for indexing and retrieval may be arbitrarily extracted from a still image or a key frame of video, rotation invariance of the image feature description is important for general application of content-based image indexing and retrieval. In this paper, we propose a rotation-invariant similarity measurement incorporating texture features based on the human visual system (HVS). To reduce computational complexity, we employ hierarchical similarity distance searching. To verify the method, experiments with the MPEG-7 data set are performed.
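    One standard way to obtain rotation invariance from orientation-binned texture features is sketched below, under the assumption (not stated in the abstract) that rotating the image cyclically shifts the feature histogram: take the minimum distance over all cyclic shifts.

    ```python
    def rotation_invariant_distance(h1, h2):
        """Rotation-invariant comparison of two equal-length orientation
        histograms: minimum L1 distance over all cyclic shifts of h2.
        A generic illustration of the idea, not the authors' HVS-based
        descriptor."""
        n = len(h1)
        def l1(a, b):
            return sum(abs(x - y) for x, y in zip(a, b))
        return min(l1(h1, h2[s:] + h2[:s]) for s in range(n))
    ```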

  16. GPUs as Storage System Accelerators

    Al-Kiswany, Samer; Ripeanu, Matei


    Massively multicore processors, such as Graphics Processing Units (GPUs), provide, at a comparable price, a one order of magnitude higher peak performance than traditional CPUs. This drop in the cost of computation, as any order-of-magnitude drop in the cost per unit of performance for a class of system components, triggers the opportunity to redesign systems and to explore new ways to engineer them to recalibrate the cost-to-performance relation. This project explores the feasibility of harnessing GPUs' computational power to improve the performance, reliability, or security of distributed storage systems. In this context, we present the design of a storage system prototype that uses GPU offloading to accelerate a number of computationally intensive primitives based on hashing, and introduce techniques to efficiently leverage the processing power of GPUs. We evaluate the performance of this prototype under two configurations: as a content addressable storage system that facilitates online similarity detectio...
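    The content-addressable configuration mentioned above rests on hashing primitives: chunks are keyed by a cryptographic digest, so identical data is detected by a dictionary lookup. The CPU-side sketch below shows only this dedup logic; the paper's contribution is offloading the hashing itself to the GPU.

    ```python
    import hashlib

    class ContentAddressableStore:
        """Minimal content-addressable store: chunks are indexed by their
        SHA-256 digest, so identical chunks are stored once and duplicates
        are detected by hash lookup."""

        def __init__(self):
            self._chunks = {}

        def put(self, data: bytes):
            """Store a chunk; return (digest, was_duplicate)."""
            digest = hashlib.sha256(data).hexdigest()
            duplicate = digest in self._chunks
            if not duplicate:
                self._chunks[digest] = data
            return digest, duplicate

        def get(self, digest):
            return self._chunks[digest]
    ```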

  17. Neurodegeneration in accelerated aging.

    Scheibye-Knudsen, Moren


    The growing proportion of elderly people represents an increasing economic burden, not least because of age-associated diseases that pose a significant cost to the health service. Finding possible interventions for age-associated disorders therefore has wide-ranging implications. A number of genetically defined accelerated aging diseases have been characterized that can aid in our understanding of aging. Interestingly, all these diseases are associated with defects in the maintenance of our genome. A subset of these disorders, Cockayne syndrome, Xeroderma pigmentosum group A and ataxia-telangiectasia, show neurological involvement reminiscent of what is seen in primary human mitochondrial diseases. Mitochondria are the power plants of the cell, converting energy stored in oxygen, sugar, fat, and protein into ATP, the energetic currency of our body. Emerging evidence has linked this organelle to aging, and finding mitochondrial dysfunction in accelerated aging disorders thereby strengthens the mitochondrial theory of aging. This theory states that an accumulation of damage to the mitochondria may underlie the process of aging. Indeed, it appears that some accelerated aging disorders that show neurodegeneration also have mitochondrial dysfunction. The mitochondrial alterations may be secondary to defects in nuclear DNA repair. Indeed, nuclear DNA damage may lead to increased energy consumption, alterations in mitochondrial ATP production and defects in mitochondrial recycling, a process termed mitophagy. These changes may be caused by activation of poly-ADP-ribose polymerase 1 (PARP1), an enzyme that responds to DNA damage. Upon activation PARP1 utilizes key metabolites that attenuate pathways that are normally protective for the cell. Notably, pharmacological inhibition of PARP1 or reconstitution of the metabolites rescues the changes caused by PARP1 hyperactivation and in many cases reverses the phenotypes associated with accelerated aging. This implies that modulation...

  18. Hot self-similar relativistic MHD flows

    Zakamska, Nadia L; Blandford, Roger D


    We consider axisymmetric relativistic jets with a toroidal magnetic field and an ultrarelativistic equation of state, with the goal of studying the lateral structure of jets whose pressure is matched to the pressure of the medium through which they propagate. We find all self-similar steady-state solutions of the relativistic MHD equations for this setup. One of the solutions is the case of a parabolic jet being accelerated by the pressure gradient as it propagates through a medium with pressure declining as p(z)\\propto z^{-2}. As the jet material expands due to internal pressure gradients, it runs into the ambient medium resulting in a pile-up of material along the jet boundary, while the magnetic field acts to produce a magnetic pinch along the axis of the jet. Such jets can be in a lateral pressure equilibrium only if their opening angle \\theta_j at distance z is smaller than about 1/\\gamma, where \\gamma is the characteristic bulk Lorentz-factor at this distance; otherwise, different parts of the jet canno...

  19. Assessing protein kinase target similarity

    Gani, Osman A; Thakkar, Balmukund; Narayanan, Dilip


    : focussed chemical libraries, drug repurposing, polypharmacological design, to name a few. Protein kinase target similarity is easily quantified by sequence, and its relevance to ligand design includes broad classification by key binding sites, evaluation of resistance mutations, and the use of surrogate......" of sequence and crystal structure information, with statistical methods able to identify key correlates to activity but also here, "the devil is in the details." Examples from specific repurposing and polypharmacology applications illustrate these points. This article is part of a Special Issue entitled...

  1. The Search Performance Evaluation and Prediction in Exploratory Search

    Liu, Fei


    The exploratory search for complex search tasks requires an effective search behavior model to evaluate and predict user search performance. Few studies have investigated the relationship between user search behavior and search performance in exploratory search. This research adopts a mixed approach combining search system development, user search experiment, search query log analysis, and multivariate regression analysis to resolve the knowledge gap. Through this study, it is shown that expl...

  2. Nonlinear dynamics in particle accelerators

    Dilão, Rui


    This book is an introductory course in accelerator physics at the level of graduate students. It has been written for a large audience which includes users of accelerator facilities, accelerator physicists and engineers, and undergraduates aiming to learn the basic principles of the construction, operation and applications of accelerators. The new concepts of dynamical systems developed in the last twenty years give the theoretical setting to analyse the stability of particle beams in accelerators. In this book a common language for both accelerator physics and dynamical systems is integrated and dev

  3. Mechanisms for similarity based cooperation

    Traulsen, A.


    Cooperation based on similarity has been discussed since Richard Dawkins introduced the term “green beard” effect. In these models, individuals cooperate based on an arbitrary signal (or tag) such as the famous green beard. Here, two different models for such tag based cooperation are analysed. As neutral drift is important in both models, a finite population framework is applied. The first model, which we term “cooperative tags”, considers a situation in which groups of cooperators are formed by some joint signal. Defectors adopting the signal and exploiting the group can lead to a breakdown of cooperation. In this case, conditions are derived under which the average abundance of the more cooperative strategy exceeds 50%. The second model considers a situation in which individuals start defecting towards others that are not similar to them. This situation is termed “defective tags”. It is shown that in this case, individuals using tags to cooperate exclusively with their own kind dominate over unconditional cooperators.

  4. UV photolysis for accelerating pyridine biodegradation.

    Zhang, Yongming; Chang, Ling; Yan, Ning; Tang, Yingxia; Liu, Rui; Rittmann, Bruce E


    Pyridine, a nitrogen-containing heterocyclic compound, is slowly biodegradable, and coupling biodegradation with UV photolysis is a potential means to accelerate its biotransformation and mineralization. The initial steps of pyridine biodegradation involve mono-oxygenation reactions that have molecular oxygen and an intracellular electron carrier as cosubstrates. We employed an internal circulation baffled biofilm reactor for pyridine biodegradation following three protocols: direct biodegradation (B), biodegradation after photolysis (P+B), and biodegradation with succinic acid added (B+S). Succinic acid was the main UV-photolysis product from pyridine, and its catabolic oxidation generates internal electron carriers that may accelerate the initial steps of pyridine biodegradation. Compared with direct biodegradation of pyridine (B), the removal rate for the same concentration of photolyzed pyridine (P+B) was higher by 15 to 43%, depending on the initial pyridine concentrations (increasing through the range of 130 to 310 mg/L). Adding succinic acid alone (B+S) gave results similar to P+B, which supports that succinic acid was the main agent for accelerating the pyridine biodegradation rate. In addition, protocols P+B and B+S were similar in terms of increasing pyridine mineralization over 10 h: 84% and 87%, respectively, which were higher than with protocol B (72%). The positive impact of succinic acid, whether added directly or produced via UV photolysis, confirms that its catabolism, which produced intracellular electron carriers, accelerated the initial steps of pyridine biotransformation.

  5. White Noise in Quantum Random Walk Search Algorithm

    MA Lei; DU Jiang-Feng; LI Yun; LI Hui; KWEK L. C.; OH C. H.


    The quantum random walk is a possible approach to constructing new quantum search algorithms. It has been shown by Shenvi et al. [Phys. Rev. A 67 (2003) 052307] that such an algorithm can perform an oracle search on a database of N items with O(√N) calls to the oracle, yielding a speedup similar to other quantum search algorithms.
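    The O(√N) oracle-call scaling can be checked with a two-amplitude simulation of a Grover-style search, which achieves the same scaling the quantum-walk algorithm does; this illustrates the scaling only, not the quantum-walk construction of Shenvi et al.

    ```python
    import math

    def grover_success_probability(n_items, iterations):
        """Track the amplitude on the single marked item vs. each unmarked
        item under Grover iterations (oracle sign flip, then inversion
        about the mean) and return the success probability."""
        a = 1 / math.sqrt(n_items)   # amplitude on the marked item
        b = 1 / math.sqrt(n_items)   # amplitude on each unmarked item
        for _ in range(iterations):
            a = -a                                   # oracle call
            mean = (a + (n_items - 1) * b) / n_items
            a, b = 2 * mean - a, 2 * mean - b        # inversion about the mean
        return a * a
    ```

    With N = 64 items, about π/4·√64 ≈ 6 oracle calls already drive the success probability close to 1, whereas a classical scan needs O(N) lookups.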

  6. Dynamic Search and Working Memory in Social Recall

    Hills, Thomas T.; Pachur, Thorsten


    What are the mechanisms underlying search in social memory (e.g., remembering the people one knows)? Do the search mechanisms involve dynamic local-to-global transitions similar to semantic search, and are these transitions governed by the general control of attention, associated with working memory span? To find out, we asked participants to…

  8. The Search for Another Earth


    Is there life anywhere else in the vast cosmos? Are there planets similar to the Earth? For centuries, these questions baffled curious minds. Either a positive or negative answer, if found one day, would carry a deep philosophical significance for our very existence in the universe. Although the search for extra-terrestrial intelligence was initiated decades ago, a systematic scientific and global quest towards achieving a convincing answer began in 1995 with the discovery of the first confirmed planet orbiting around the solar-type star 51 Pegasi. Since then, astronomers have discovered many exoplanets using two main techniques, radial velocity and transit measurements. In the first part of this article, we shall describe the different astronomical methods through which the extrasolar planets of various kinds are discovered. In the second part of the article we shall discuss the various kinds of exoplanets, in particular about the habitable planets discovered till date and the present status of our search for a habitable planet similar to the Earth.

  9. Interneurons targeting similar layers receive synaptic inputs with similar kinetics.

    Cossart, Rosa; Petanjek, Zdravko; Dumitriu, Dani; Hirsch, June C; Ben-Ari, Yehezkel; Esclapez, Monique; Bernard, Christophe


    GABAergic interneurons play diverse and important roles in controlling neuronal network dynamics. They are characterized by an extreme heterogeneity morphologically, neurochemically, and physiologically, but a functionally relevant classification is still lacking. Present taxonomy is essentially based on their postsynaptic targets, but a physiological counterpart to this classification has not yet been determined. Using a quantitative analysis based on multidimensional clustering of morphological and physiological variables, we now demonstrate a strong correlation between the kinetics of glutamate and GABA miniature synaptic currents received by CA1 hippocampal interneurons and the laminar distribution of their axons: neurons that project to the same layer(s) receive synaptic inputs with similar kinetics distributions. In contrast, the kinetics distributions of GABAergic and glutamatergic synaptic events received by a given interneuron do not depend upon its somatic location or dendritic arborization. Although the mechanisms responsible for this unexpected observation are still unclear, our results suggest that interneurons may be programmed to receive synaptic currents with specific temporal dynamics depending on their targets and the local networks in which they operate.

  10. Acceleration of microparticle

    Shibata, H


    A microparticle (dust) ion source has been installed at the high-voltage terminal of the 3.75 MV single-ended Van de Graaff electrostatic accelerator, and a beam line for microparticle experiments has been built at the High Fluence Irradiation Facility (HIT) of the Research Center for Nuclear Science and Technology, the University of Tokyo. Microparticle acceleration has been successful in obtaining expected velocities of 1-20 km/s or more for micron- or submicron-sized particles. Development of in situ dust detectors and analyzers on board satellites and spacecraft in the expected mass and velocity range of micrometeoroids, and investigation of hypervelocity impact phenomena by using time-of-flight mass spectrometry, impact flash or luminescence measurement, and scanning electron or laser microscope observation for metals, ceramics, polymers and semiconductors bombarded by micron-sized particles, were started three years ago. (author)

  11. Accelerating cavity from LEP

    This is an accelerating cavity from LEP, with a layer of niobium on the inside. Operating at 4.2 degrees above absolute zero, the niobium is superconducting and carries an accelerating field of 6 million volts per metre with negligible losses. Each cavity has a surface of 6 m2. The niobium layer is only 1.2 microns thick, ten times thinner than a hair. Such a large area had never been coated to such a high accuracy. A speck of dust could ruin the performance of the whole cavity so the work had to be done in an extremely clean environment. These challenging requirements pushed European industry to new achievements. 256 of these cavities are now used in LEP to double the energy of the particle beams.

  12. Particle accelerator physics

    Wiedemann, Helmut


    This book by Helmut Wiedemann is a well-established, classic text, providing an in-depth and comprehensive introduction to the field of high-energy particle acceleration and beam dynamics. The present 4th edition has been significantly revised, updated and expanded. The newly conceived Part I is an elementary introduction to the subject matter for undergraduate students. Part II gathers the basic tools in preparation of a more advanced treatment, summarizing the essentials of electrostatics and electrodynamics as well as of particle dynamics in electromagnetic fields. Part III is an extensive primer in beam dynamics, followed, in Part IV, by an introduction and description of the main beam parameters and including a new chapter on beam emittance and lattice design. Part V is devoted to the treatment of perturbations in beam dynamics. Part VI then discusses the details of charged particle acceleration. Parts VII and VIII introduce the more advanced topics of coupled beam dynamics and describe very intense bea...


  13. Superdiffusive shock acceleration

    Perri, S.; Zimbardo, G. [Dipartimento di Fisica, Universita della Calabria, Ponte P. Bucci Cubo 31C, I-87036 Rende (Italy)]


    The theory of diffusive shock acceleration is extended to the case of superdiffusive transport, i.e., when the mean square deviation grows proportionally to t^α, with α > 1. Superdiffusion can be described by a statistical process called the Lévy random walk, in which the propagator is not Gaussian but exhibits power-law tails. By using the propagator appropriate for the Lévy random walk, it is found that the energy spectral indices of particles are harder than those obtained when normal diffusion is assumed, with the spectral index decreasing as α increases. A new scaling for the acceleration time is also found, allowing substantially shorter times than in the case of normal diffusion. Within this framework we can explain a number of observations of flat spectra in various astrophysical and heliospheric contexts, for instance the Crab Nebula and the termination shock of the solar wind.
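
    In the abstract's notation, superdiffusive transport is defined by the growth of the mean square deviation,

```latex
\langle \Delta x^{2}(t) \rangle \propto t^{\alpha}, \qquad \alpha > 1 ,
```

    with α = 1 recovering normal diffusion; the Lévy-walk propagator then has power-law rather than Gaussian tails, which is what hardens the predicted spectra and shortens the acceleration time.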

  14. Accelerating time to benefit

    Svejvig, Per; Geraldi, Joana; Grex, Sara

    Despite the ubiquitous pressure for speed, our approaches to accelerating projects remain constrained to the old-fashioned understanding of the project as a vehicle to deliver products and services, not value. This article explores an attempt to accelerate time to benefit. We describe and deconstruct the implementation of a large intervention undertaken in five project-based organizations in Denmark – the Project Half Double, where the same project methodology has been applied in five projects, each of them in five distinct organizations in Denmark, as a bold attempt to realize double the benefit in half the time. Although all cases valued speed and speed to benefit, and implemented most practices proposed by the methodology, only three of the five projects were more successful in decreasing time to benefit. Based on a multi-case study comparison between these five different projects and their respective…

  15. Accelerating QDP++ using GPUs

    Winter, Frank


    Graphic Processing Units (GPUs) are getting increasingly important as target architectures in scientific High Performance Computing (HPC). NVIDIA established CUDA as a parallel computing architecture controlling and making use of the compute power of GPUs. CUDA provides sufficient support for C++ language elements to enable the Expression Template (ET) technique in the device memory domain. QDP++ is a C++ vector class library suited for quantum field theory which provides vector data types and expressions and forms the basis of the lattice QCD software suite Chroma. In this work accelerating QDP++ expression evaluation to a GPU was successfully implemented leveraging the ET technique and using Just-In-Time (JIT) compilation. The Portable Expression Template Engine (PETE) and the C API for CUDA kernel arguments were used to build the bridge between host and device memory domains. This provides the possibility to accelerate Chroma routines to a GPU which are typically not subject to special optimisation. As an ...

  16. Accelerators for Cancer Therapy

    Lennox, Arlene J.


    The vast majority of radiation treatments for cancerous tumors are given using electron linacs that provide both electrons and photons at several energies. Design and construction of these linacs are based on mature technology that is rapidly becoming more and more standardized and sophisticated. The use of hadrons such as neutrons, protons, alphas, or carbon, oxygen and neon ions is relatively new. Accelerators for hadron therapy are far from standardized, but the use of hadron therapy as an alternative to conventional radiation has led to significant improvements and refinements in conventional treatment techniques. This paper presents the rationale for radiation therapy, describes the accelerators used in conventional and hadron therapy, and outlines the issues that must still be resolved in the emerging field of hadron therapy.

  17. Hardware Accelerated Simulated Radiography

    Laney, D; Callahan, S; Max, N; Silva, C; Langer, S; Frank, R


    We present the application of hardware accelerated volume rendering algorithms to the simulation of radiographs as an aid to scientists designing experiments, validating simulation codes, and understanding experimental data. The techniques presented take advantage of 32 bit floating point texture capabilities to obtain validated solutions to the radiative transport equation for X-rays. An unsorted hexahedron projection algorithm is presented for curvilinear hexahedra that produces simulated radiographs in the absorption-only regime. A sorted tetrahedral projection algorithm is presented that simulates radiographs of emissive materials. We apply the tetrahedral projection algorithm to the simulation of experimental diagnostics for inertial confinement fusion experiments on a laser at the University of Rochester. We show that the hardware accelerated solution is faster than the current technique used by scientists.
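
    As a minimal illustration of the absorption-only regime these projection algorithms solve (not the paper's GPU implementation), the Beer-Lambert law can be evaluated per pixel by summing attenuation along each ray through a voxel grid:

```python
import math

def absorption_radiograph(mu, dz=1.0, i0=1.0):
    """Beer-Lambert line integrals along the z axis of a voxel grid.

    mu is a nested list mu[x][y][z] of attenuation coefficients;
    the result is a 2D image of transmitted intensities i0 * exp(-integral mu dz).
    """
    return [[i0 * math.exp(-sum(col) * dz) for col in row] for row in mu]

# 2x2x3 phantom: one dense column, one empty column.
mu = [[[0.5, 0.5, 0.5], [0.0, 0.0, 0.0]],
      [[0.1, 0.1, 0.1], [0.2, 0.0, 0.0]]]
img = absorption_radiograph(mu)
# the dense column transmits exp(-1.5), about 22% of the incident intensity
```

    Hardware-accelerated renderers evaluate the same integral, but with the voxel traversal done in texture hardware rather than in a Python loop.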


    Sessler, A.M.


    But a glance at the Livingston chart, Fig. 1, of accelerator particle energy as a function of time shows that the energy has steadily, exponentially, increased. Equally significant is the fact that this increase is the envelope of diverse technologies. If one is to stay on, or even near, the Livingston curve in future years, then new acceleration techniques need to be developed. What are the new acceleration methods? In these two lectures I would like to sketch some of these new ideas. I am well aware that they will probably not result in high energy accelerators within this or the next decade, but conversely, it is likely that these ideas will form the basis for the accelerators of the next century. Anyway, the ideas are stimulating and suffice to show that accelerator physicists are not just 'engineers', but genuine scientists deserving to be welcomed into the company of high energy physicists. I believe that outsiders will find this field surprisingly fertile and certainly fun. To put it more personally, I very much enjoy working in this field and lecturing on it. There are a number of review articles which should be consulted for references to the original literature. In addition there are three books on the subject. Given this material, I feel free not to completely reference the material in the remainder of this article; consultation of the review articles and books will be adequate as an introduction to the literature, for references abound (hundreds are given). Lastly, by way of introduction, I should like to quote from the end of Ref. 2, for I think the remarks made there are most germane. Remember that the talk was addressed to accelerator physicists: 'Finally, it is often said, I think by physicists who are not well-informed, that accelerator builders have used up their capital and now are bereft of ideas, and as a result, high energy physics will eventually--rather soon, in fact--come to a halt. After all, one can't build too many




    Full Text Available Similarities between the accounting of companies and the accounting of territorial administrative units are the following: both organize double-entry accounting; both share the accounting method, in terms of fundamental theoretical principles as well as specific practical tools. The differences between the accounting of companies and that of territorial administrative units refer to: the accounting of territorial administrative units includes, besides general (financial) accounting, budgetary accounting, and the accounts system of budgetary accounting is completely different from that of companies; financial statements of territorial administrative units whose leaders are not main authorizing officers are submitted to the hierarchically superior body (not to the MPF); the accounts of territorial administrative units are opened at the treasury and financial institutions, accounts at commercial banks being prohibited; equity accounts in territorial administrative units are structured into groups of funds; long-term debts have a specific structure in territorial administrative units (internal local public debt and external local public debt).

  20. Performance Indexes: Similarities and Differences

    André Machado Caldeira


    Full Text Available The investor of today is more rigorous in monitoring a financial asset portfolio. He no longer thinks only in terms of expected return (one dimension), but in terms of risk-return (two dimensions). This new perception is more complex, since risk measurement can vary according to one's perception: some use the standard deviation, while others disagree with this measure and propose alternatives. In addition to this difficulty, there is the problem of how to combine these two dimensions. The objective of this essay is to study the main performance indexes through an empirical study in order to verify the differences and similarities for some selected assets. A performance index proposed in Caldeira (2005) is also included in this analysis.
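
    The best-known index of the risk-return kind discussed above is the Sharpe ratio; the sketch below is a generic illustration only (the essay's own index set, including the Caldeira (2005) index, is not reproduced here):

```python
import math

def sharpe_ratio(returns, risk_free=0.0):
    """Two-dimensional performance index: mean excess return per unit of risk,
    with risk measured as the (population) standard deviation of returns."""
    n = len(returns)
    mean = sum(returns) / n
    variance = sum((r - mean) ** 2 for r in returns) / n
    return (mean - risk_free) / math.sqrt(variance)

ratio = sharpe_ratio([0.1, 0.2, 0.0, 0.1])  # ≈ 1.414
```

    Indexes that disagree with standard deviation as the risk measure swap the denominator for, e.g., downside deviation, while keeping the same two-dimensional structure.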

  1. Features Based Text Similarity Detection

    Kent, Chow Kok


    As the Internet helps us cross cultural borders by providing access to different information, plagiarism issues are bound to arise. As a result, plagiarism detection becomes more demanding in overcoming this issue. Different plagiarism detection tools have been developed based on various detection techniques. Nowadays, the fingerprint matching technique plays an important role in those detection tools. However, in handling large articles, fingerprint matching has some weaknesses, especially in space and time consumption. In this paper, we propose a new approach to detecting plagiarism which integrates the fingerprint matching technique with four key features to assist in the detection process. These features are capable of choosing the main points or key sentences in the articles to be compared. The selected sentences then undergo the fingerprint matching process in order to detect the similarity between them. Hence, time and space usage for the comparison process is r...
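
    The fingerprint matching the abstract refers to can be sketched with hashed k-grams; this toy version (a hypothetical scheme, not the paper's exact method) scores two sentences by the Jaccard overlap of their fingerprint sets:

```python
def fingerprints(text, k=5):
    # Normalize (lowercase, drop whitespace), then hash every k-gram.
    t = "".join(text.lower().split())
    return {hash(t[i:i + k]) for i in range(len(t) - k + 1)}

def similarity(a, b, k=5):
    """Jaccard similarity of the two texts' fingerprint sets, in [0, 1]."""
    fa, fb = fingerprints(a, k), fingerprints(b, k)
    union = fa | fb
    return len(fa & fb) / len(union) if union else 0.0
```

    Selecting only key sentences before fingerprinting, as the paper proposes, shrinks the fingerprint sets and hence the space and time spent in this comparison step.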

  2. Future Accelerator Magnet Needs

    Devred, Arnaud; Yamamoto, A


    Superconducting magnet technology is continually evolving in order to meet the demanding needs of new accelerators and to provide necessary upgrades for existing machines. A variety of designs are now under development, including high fields and gradients, rapid cycling and novel coil configurations. This paper presents a summary of R&D programs in the EU, Japan and the USA. A performance comparison between NbTi and Nb3Sn along with fabrication and cost issues are also discussed.

  3. Accelerated plate tectonics.

    Anderson, D L


    The concept of a stressed elastic lithospheric plate riding on a viscous asthenosphere is used to calculate the recurrence interval of great earthquakes at convergent plate boundaries, the separation of decoupling and lithospheric earthquakes, and the migration pattern of large earthquakes along an arc. It is proposed that plate motions accelerate after great decoupling earthquakes and that most of the observed plate motions occur during short periods of time, separated by periods of relative quiescence.

  4. LEP copper accelerating cavities

    Laurent Guiraud


    These copper cavities were used to generate the radio frequency electric field that was used to accelerate electrons and positrons around the 27-km Large Electron-Positron (LEP) collider at CERN, which ran from 1989 to 2000. The copper cavities were gradually replaced from 1996 with new superconducting cavities allowing the collision energy to rise from 90 GeV to 200 GeV by mid-1999.

  5. Measurement and Improvement of Subject Specialists Performance Searching Chemical Abstracts Online as Available on SDC Search Systems.

    Copeland, Richard F.; And Others

    The first phase of a project to design a prompting system to help semi-experienced end users to search Chemical Abstracts online, this study focused on the differences and similarities in the search approaches used by experienced users and those with less expertise. Four online searches on topics solicited from chemistry professors in small…

  6. French nuclear physics accelerator opens

    Dumé, Belle


    A new €140m particle accelerator for nuclear physics located at the French Large Heavy Ion National Accelerator (GANIL) in Caen was inaugurated last month in a ceremony attended by French president François Hollande.

  7. Accelerator mass spectrometry.

    Hellborg, Ragnar; Skog, Göran


    In this overview the technique of accelerator mass spectrometry (AMS) and its use are described. AMS is a highly sensitive method of counting atoms. It is used to detect very low concentrations of natural isotopic abundances (typically in the range between 10(-12) and 10(-16)) of both radionuclides and stable nuclides. The main advantages of AMS compared to conventional radiometric methods are the use of smaller samples (mg and even sub-mg size) and shorter measuring times (less than 1 hr). The equipment used for AMS is almost exclusively based on the electrostatic tandem accelerator, although some of the newest systems are based on a slightly different principle. Dedicated accelerators as well as older "nuclear physics machines" can be found in the 80 or so AMS laboratories in existence today. The most widely used isotope studied with AMS is 14C. Besides radiocarbon dating this isotope is used in climate studies, biomedicine applications and many other fields. More than 100,000 14C samples are measured per year. Other isotopes studied include 10Be, 26Al, 36Cl, 41Ca, 59Ni, 129I, U, and Pu. Although these measurements are important, the number of samples of these other isotopes measured each year is estimated to be less than 10% of the number of 14C samples.

  8. Berkeley Proton Linear Accelerator

    Alvarez, L. W.; Bradner, H.; Franck, J.; Gordon, H.; Gow, J. D.; Marshall, L. C.; Oppenheimer, F. F.; Panofsky, W. K. H.; Richman, C.; Woodyard, J. R.


    A linear accelerator, which increases the energy of protons from a 4 Mev Van de Graaff injector to a final energy of 31.5 Mev, has been constructed. The accelerator consists of a cavity 40 feet long and 39 inches in diameter, excited at resonance in a longitudinal electric mode with a radio-frequency power of about 2.2 × 10^6 watts peak at 202.5 Mc. Acceleration is made possible by the introduction of 46 axial "drift tubes" into the cavity, which is designed such that the particles traverse the distance between the centers of successive tubes in one cycle of the r.f. power. The protons are longitudinally stable as in the synchrotron, and are stabilized transversely by the action of converging fields produced by focusing grids. The electrical cavity is constructed like an inverted airplane fuselage and is supported in a vacuum tank. Power is supplied by 9 high-powered oscillators fed from a pulse generator of the artificial transmission line type.
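
    The synchronism condition described above fixes the drift-tube spacing: a proton must cross one cell in one RF cycle, so the cell length is βλ = βc/f. A quick numerical check (illustrative values only) at the 202.5 Mc frequency:

```python
import math

C = 299_792_458.0      # speed of light, m/s
MP_MEV = 938.272       # proton rest energy, MeV

def cell_length(t_mev, f_hz=202.5e6):
    """Drift-tube spacing L = beta * c / f so one RF cycle spans one cell."""
    gamma = 1.0 + t_mev / MP_MEV              # relativistic factor at kinetic energy T
    beta = math.sqrt(1.0 - 1.0 / gamma ** 2)  # v/c
    return beta * C / f_hz

# ≈ 0.136 m at the 4 MeV injection energy, ≈ 0.374 m at the 31.5 MeV exit
```

    The spacing therefore grows along the cavity with the proton velocity, which is why the drift tubes lengthen toward the output end.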

  9. Optimizing accelerator technology

    Katarina Anthony


    A new EU-funded research and training network, oPAC, is bringing together 22 universities, research centres and industry partners to optimize particle accelerator technology. CERN is one of the network’s main partners and will host 5 early-stage researchers in the BE department.   A diamond detector that will be used for novel beam diagnostics applications in the oPAC project based at CIVIDEC. (Image courtesy of CIVIDEC.) As one of the largest Marie Curie Initial Training Networks ever funded by the EU – to the tune of €6 million – oPAC extends well beyond the particle physics community. “Accelerator physics has become integral to research in almost every scientific discipline – be it biology and life science, medicine, geology and material science, or fundamental physics,” explains Carsten P. Welsch, oPAC co-ordinator based at the University of Liverpool. “By optimizing the operation of accelerators, all of these...

  10. Particle Accelerator Focus Automation

    Lopes José


    Full Text Available The Laboratório de Aceleradores e Tecnologias de Radiação (LATR) at the Campus Tecnológico e Nuclear of Instituto Superior Técnico (IST) has a horizontal electrostatic particle accelerator based on the Van de Graaff machine, which is used for research in the area of material characterization. This machine produces alpha (He+) and proton (H+) beams of some μA of current at up to 2 MeV/q energies. Beam focusing is obtained using a cylindrical lens of the Einzel type, assembled near the high voltage terminal. This paper describes the developed system that automatically focuses the ion beam, using a personal computer running the LabVIEW software, a multifunction input/output board and signal conditioning circuits. The focusing procedure consists of a scanning method to find the lens bias voltage which maximizes the beam current measured on a beam stopper target, which is used as feedback for the scanning cycle. This system, as part of a wider start-up and shut-down automation system built for this particle accelerator, brings great advantages to the operation of the accelerator by making it faster and easier to operate, requiring less human presence, and adding the possibility of total remote control in safe conditions.

  11. Particle Accelerator Focus Automation

    Lopes, José; Rocha, Jorge; Redondo, Luís; Cruz, João


    The Laboratório de Aceleradores e Tecnologias de Radiação (LATR) at the Campus Tecnológico e Nuclear of Instituto Superior Técnico (IST) has a horizontal electrostatic particle accelerator based on the Van de Graaff machine, which is used for research in the area of material characterization. This machine produces alpha (He+) and proton (H+) beams of some μA of current at up to 2 MeV/q energies. Beam focusing is obtained using a cylindrical lens of the Einzel type, assembled near the high voltage terminal. This paper describes the developed system that automatically focuses the ion beam, using a personal computer running the LabVIEW software, a multifunction input/output board and signal conditioning circuits. The focusing procedure consists of a scanning method to find the lens bias voltage which maximizes the beam current measured on a beam stopper target, which is used as feedback for the scanning cycle. This system, as part of a wider start-up and shut-down automation system built for this particle accelerator, brings great advantages to the operation of the accelerator by making it faster and easier to operate, requiring less human presence, and adding the possibility of total remote control in safe conditions.
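
    The scanning procedure both records describe (sweep the lens bias, read the beam-stopper current, keep the maximizer) can be sketched hardware-free; set_voltage and read_current below are hypothetical stand-ins for the LabVIEW I/O calls:

```python
def autofocus(set_voltage, read_current, v_min, v_max, steps=50):
    """Return the Einzel-lens bias voltage that maximizes measured beam current."""
    best_v, best_i = v_min, float("-inf")
    for n in range(steps + 1):
        v = v_min + (v_max - v_min) * n / steps
        set_voltage(v)       # apply bias via the analog output
        i = read_current()   # beam-stopper current as feedback
        if i > best_i:
            best_v, best_i = v, i
    return best_v

# Synthetic bench test: a current response peaked at 7.2 (arbitrary units)
# is recovered by the scan.
state = {"v": 0.0}
best = autofocus(lambda v: state.update(v=v),
                 lambda: -(state["v"] - 7.2) ** 2,
                 v_min=0.0, v_max=10.0)
```

    A real controller would add settling delays and possibly a second, finer scan around the first maximum; the feedback structure is the same.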

  12. Introduction to Microwave Linear Accelerators

    Whittum, David H


    The elements of microwave linear accelerators are introduced starting with the principles of acceleration and accelerating structures. Considerations for microwave structure modeling and design are developed from an elementary point of view. Basic elements of microwave electronics are described for application to the accelerator circuit and instrumentation. Concepts of beam physics are explored together with examples of common beamline instruments. Charged particle optics and lattice diagnostics are introduced. Considerations for fixed-target and colliding-beam experimentation are summarized.

  13. Plasma accelerator experiments in Yugoslavia

    Purić, J.; Astashynski, V. M.; Kuraica, M. M.; Dojčinović, I. P.


    An overview is given of the results obtained in the plasma accelerator experiments in Belgrade, using quasi-stationary high-current plasma accelerators constructed within the framework of the Yugoslavia-Belarus Joint Project. So far, the following plasma accelerators have been realized: the Magnetoplasma Compressor type (MPC); the MPC Yu type; a one-stage Erosive Plasma Dynamic System (EPDS); and, in the final stage of construction, a two-stage Quasi-Stationary High-Current Plasma Accelerator (QHPA).

  14. MMDB and VAST+: tracking structural similarities between macromolecular complexes.

    Madej, Thomas; Lanczycki, Christopher J; Zhang, Dachuan; Thiessen, Paul A; Geer, Renata C; Marchler-Bauer, Aron; Bryant, Stephen H


    The computational detection of similarities between protein 3D structures has become an indispensable tool for the detection of homologous relationships, the classification of protein families and functional inference. Consequently, numerous algorithms have been developed that facilitate structure comparison, including rapid searches against a steadily growing collection of protein structures. To this end, NCBI's Molecular Modeling Database (MMDB), which is based on the Protein Data Bank (PDB), maintains a comprehensive and up-to-date archive of protein structure similarities computed with the Vector Alignment Search Tool (VAST). These similarities have been recorded on the level of single proteins and protein domains, comprising in excess of 1.5 billion pairwise alignments. Here we present VAST+, an extension to the existing VAST service, which summarizes and presents structural similarity on the level of biological assemblies or macromolecular complexes. VAST+ simplifies structure neighboring results and shows, for macromolecular complexes tracked in MMDB, lists of similar complexes ranked by the extent of similarity. VAST+ replaces the previous VAST service as the default presentation of structure neighboring data in NCBI's Entrez query and retrieval system. MMDB and VAST+ can be accessed via

  15. A similarity-based data warehousing environment for medical images.

    Teixeira, Jefferson William; Annibal, Luana Peixoto; Felipe, Joaquim Cezar; Ciferri, Ricardo Rodrigues; Ciferri, Cristina Dutra de Aguiar


    A core issue of the decision-making process in the medical field is to support the execution of analytical (OLAP) similarity queries over images in data warehousing environments. In this paper, we focus on this issue. We propose imageDWE, a non-conventional data warehousing environment that enables the storage of intrinsic features taken from medical images in a data warehouse and supports OLAP similarity queries over them. To comply with this goal, we introduce the concept of perceptual layer, which is an abstraction used to represent an image dataset according to a given feature descriptor in order to enable similarity search. Based on this concept, we propose the imageDW, an extended data warehouse with dimension tables specifically designed to support one or more perceptual layers. We also detail how to build an imageDW and how to load image data into it. Furthermore, we show how to process OLAP similarity queries composed of a conventional predicate and a similarity search predicate that encompasses the specification of one or more perceptual layers. Moreover, we introduce an index technique to improve the OLAP query processing over images. We carried out performance tests over a data warehouse environment that consolidated medical images from exams of several modalities. The results demonstrated the feasibility and efficiency of our proposed imageDWE to manage images and to process OLAP similarity queries. The results also demonstrated that the use of the proposed index technique guaranteed a great improvement in query processing.
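
    An OLAP similarity query of the kind described, one conventional predicate plus a k-nearest-neighbor similarity predicate over a perceptual layer's feature vectors, can be sketched as follows (illustrative schema only; the actual imageDW uses dedicated dimension tables and an index to avoid this linear scan):

```python
import math

def olap_similarity_query(fact_rows, modality, query_features, k=2):
    """Filter by a conventional predicate, then rank by feature-vector distance."""
    selected = [r for r in fact_rows if r["modality"] == modality]
    selected.sort(key=lambda r: math.dist(r["features"], query_features))
    return selected[:k]

exams = [
    {"id": 1, "modality": "MRI", "features": [0.1, 0.9]},
    {"id": 2, "modality": "CT",  "features": [0.1, 0.8]},
    {"id": 3, "modality": "MRI", "features": [0.7, 0.2]},
    {"id": 4, "modality": "MRI", "features": [0.2, 0.8]},
]
hits = olap_similarity_query(exams, "MRI", [0.0, 1.0])
# the two MRI exams whose descriptors are closest to the query: ids 1 and 4
```

    Each perceptual layer corresponds to one choice of feature descriptor, i.e., one such "features" column per layer.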

  16. Analysis of a librarian-mediated literature search service.

    Friesen, Carol; Lê, Mê-Linh; Cooke, Carol; Raynard, Melissa


    Librarian-mediated literature searching is a key service provided at medical libraries. This analysis outlines ten years of data on 19,248 literature searches and describes information on the volume and frequency of search requests, time spent per search, databases used, and professional designations of the patron requestors. Combined with information on best practices for expert searching and evaluations of similar services, these findings were used to form recommendations on the improvement and standardization of a literature search service at a large health library system.

  17. Combination of visual and textual similarity retrieval from medical documents.

    Eggel, Ivan; Müller, Henning


    Medical visual information retrieval has been an active research area over the past ten years as an increasing number of images are produced digitally and have become available in patient records, scientific literature, and other medical documents. Most visual retrieval systems concentrate on images only, but it has become apparent that the retrieval of similar images alone is of limited interest; rather, the retrieval of similar documents is an important domain. Most medical institutions, as well as the World Health Organization (WHO), produce many complex documents. Searching them, including a visual search, can help find important information and also facilitates the reuse of document content and images. The work described in this paper is based on a proposal of the WHO, which produces large amounts of documents from studies but also for training. The majority of these documents are in complex formats such as PDF, Microsoft Word, Excel, or PowerPoint. The goal is to create an information retrieval system that allows easy addition of documents and search by keywords and visual content. For text retrieval, Lucene is used, and for image retrieval, the GNU Image Finding Tool (GIFT). A Web 2.0 interface allows for easy upload as well as simple searching.
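
    One simple way to merge the two engines' rankings, which the abstract leaves open, is late fusion of max-normalized scores; this weighted-sum sketch is an assumption for illustration, not necessarily the system's actual merging rule:

```python
def fuse_scores(text_scores, visual_scores, w_text=0.5):
    """Late fusion of text (e.g. Lucene) and visual (e.g. GIFT) retrieval scores.

    Each input maps document id -> raw score; scores are max-normalized per
    engine, then combined as a weighted sum and returned as a ranked list.
    """
    def normalize(scores):
        if not scores:
            return {}
        top = max(scores.values()) or 1.0
        return {doc: s / top for doc, s in scores.items()}

    t, v = normalize(text_scores), normalize(visual_scores)
    docs = set(t) | set(v)
    fused = {d: w_text * t.get(d, 0.0) + (1 - w_text) * v.get(d, 0.0) for d in docs}
    return sorted(fused.items(), key=lambda kv: kv[1], reverse=True)

ranked = fuse_scores({"a": 2.0, "b": 1.0}, {"b": 4.0, "c": 2.0})
# "b" ranks first: it scores in both the text and the visual index
```

    The weight w_text lets a query lean toward keyword relevance or visual similarity.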

  18. Accelerating in de Sitter spacetimes

    Cotaescu, Ion I


    We propose a definition of uniform accelerated frames in de Sitter spacetimes exploiting the Nachtmann group theoretical method of introducing coordinates on these manifolds. Requiring the transformation between the static frame and the accelerated one to depend continuously on acceleration in order to recover the well-known Rindler approach in the flat limit, we obtain a result with a reasonable physical meaning.

  19. Acceleration and improvement of dental implants’ osseointegration. Current perspective

    Gregory VENETIS


    Full Text Available The possibility to accelerate osseointegration and/or to improve bone quality around an implant is the subject of the present literature review. The following keywords were searched on PubMed: accelerate, improvement, osseointegration. The publication date span was set from 2009 to 2013. Combinations of the search terms retrieved the following results: (a) accelerate and osseointegration: 78 papers; (b) improvement and osseointegration: 206 papers. A supplementary search on the surgical techniques available for alveolar ridge augmentation over the last 10 years found 457 papers. After a systematic review of the above papers, the following are concluded: 1. Guided bone regeneration (GBR) is the most thoroughly examined technique of bone growth around a dental implant placed into poor alveolar bone. 2. Trends in dental implant surface treatment aim at less invasive and more sophisticated techniques, with the use of nanotechnology. 3. Implant surface coating with adhesion peptides and/or inorganic calcium compounds in thin layers may amplify the biochemistry of osseointegration. 4. Other, non-biochemical methods are being tested experimentally to inhibit or decrease the alveolar bone loss around an implant. 5. The systemic administration of osteoclastic inhibitors, such as bisphosphonates or strontium, seems to accelerate the initial stage of osseointegration. These findings represent an approximate prediction of the future development of osseointegration research and pose questions for further study.

  20. Minimizing Head Acceleration in Soccer: A Review of the Literature.

    Caccese, Jaclyn B; Kaminski, Thomas W


    Physicians and healthcare professionals are often asked for recommendations on how to keep athletes safe during contact sports such as soccer. With an increase in concussion awareness and concern about repetitive subconcussion, many parents and athletes are interested in mitigating head acceleration in soccer, so we conducted a literature review on factors that affect head acceleration in soccer. We searched electronic databases and reference lists to find studies using the keywords 'soccer' OR 'football' AND 'head acceleration'. Because of a lack of current research in soccer heading biomechanics, this review was limited to 18 original research studies. Low head-neck segment mass predisposes athletes to high head acceleration, but head-neck-torso alignment during heading and follow-through after contact can be used to decrease head acceleration. Additionally, improvements in symmetric neck flexor and extensor strength and neuromuscular neck stiffness can decrease head acceleration. Head-to-head impacts and unanticipated ball contacts result in the highest head acceleration. Ball contacts at high velocity may also be dangerous. The risk of concussive impacts may be lessened through the use of headgear, but headgear may also cause athletes to play more recklessly because they feel a sense of increased security. Young, but physically capable, athletes should be taught proper heading technique in a controlled setting, using a carefully planned progression of the skill.

  1. X-Band Dielectric-Loaded RF-Driven Accelerator Structures: Theoretical and Experimental Investigations

    Zou, P


    An important area of application of high-power radio frequency (RF) and microwave sources is particle acceleration. A major challenge for the current worldwide research and development effort in linear accelerators is the search for a compact and affordable very-high-energy accelerator technology for the next generation of supercolliders. It has been recognized for some time that dielectric loaded accelerator structures are attractive candidates for the next generation of very-high-energy linear accelerators, because they possess several distinct advantages over conventional metallic iris-loaded accelerator structures. However, some fundamental issues, such as RF breakdown in the dielectric, Joule heating, and vacuum properties of dielectric materials, are still the subjects of intense investigation, requiring validation by experiments conducted at high power levels. An X-band traveling-wave accelerator based on a dielectric-lined waveguide has been designed and constructed. Numerical calculation, bench measuremen...

  2. Staging and laser acceleration of ions in underdense plasma

    Ting, Antonio; Hafizi, Bahman; Helle, Michael; Chen, Yu-Hsin; Gordon, Daniel; Kaganovich, Dmitri; Polyanskiy, Mikhail; Pogorelsky, Igor; Babzien, Markus; Miao, Chenlong; Dover, Nicholas; Najmudin, Zulfikar; Ettlinger, Oliver


    Accelerating ions from rest in a plasma requires extra considerations because of their heavy mass. Low phase velocity fields or quasi-electrostatic fields are often necessary, either by operating above or near the critical density or by applying other slow wave generating mechanisms. Solid targets have been a favorite and have generated many good results. High density gas targets have also been reported to produce energetic ions. It is interesting to consider acceleration of ions in laser-driven plasma configurations that will potentially allow continuous acceleration in multiple consecutive stages. The plasma will be derived from gaseous targets, producing plasma densities slightly below the critical plasma density (underdense) for the driving laser. Such a plasma is experimentally robust, being repeatable and relatively transparent to externally injected ions from a previous stage. When optimized, multiple stages of this underdense laser plasma acceleration mechanism can progressively accelerate the ions to a high final energy. For a light mass ion such as the proton, relativistic velocities could be reached, making it suitable for further acceleration by high phase velocity plasma accelerators to energies appropriate for High Energy Physics applications. Negatively charged ions such as antiprotons could be similarly accelerated in this multi-staged ion acceleration scheme.

  3. A New Generalized Similarity-Based Topic Distillation Algorithm

    ZHOU Hongfang; DANG Xiaohui


    The procedure of hypertext induced topic search (HITS) based on a semantic relation model is analyzed, and the topic drift of the HITS algorithm is shown to arise because Web pages are projected onto a wrong latent semantic basis. A new concept, generalized similarity, is introduced and, based on this, a new topic distillation algorithm GSTDA (generalized similarity based topic distillation algorithm) is presented to improve the quality of topic distillation. GSTDA is applied not only to avoid topic drift, but also to explore topics related to the user query. Experimental results on 10 queries show that GSTDA reduces the topic drift rate by 10% to 58% compared to that of the HITS (hypertext induced topic search) algorithm, and discovers several related topics for queries that have multiple meanings.
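    As background for the topic-drift discussion, the classic HITS hub/authority iteration can be sketched in a few lines; the link graph below is a made-up toy example, and GSTDA itself is not reproduced here:

```python
# Minimal HITS (hypertext induced topic search) iteration on a toy link graph.
# The graph is illustrative only; the paper's GSTDA algorithm is not shown.
links = {  # page -> pages it links to
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
pages = list(links)
auth = {p: 1.0 for p in pages}
hub = {p: 1.0 for p in pages}

for _ in range(50):
    # authority score of p: sum of hub scores of pages linking to p
    auth = {p: sum(hub[q] for q in pages if p in links[q]) for p in pages}
    # hub score of p: sum of authority scores of pages p links to
    hub = {p: sum(auth[q] for q in links[p]) for p in pages}
    # normalize so the scores stay bounded across iterations
    na, nh = sum(auth.values()), sum(hub.values())
    auth = {p: v / na for p, v in auth.items()}
    hub = {p: v / nh for p, v in hub.items()}

print(sorted(auth, key=auth.get, reverse=True))  # pages ranked by authority
```

    Topic drift in HITS shows up when the dominant eigenvector of this iteration reflects a tightly linked off-topic community rather than the query topic, which is the failure mode the abstract attributes to projection onto a wrong latent semantic basis.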

  4. Searching Databases with Keywords

    Shan Wang; Kun-Long Zhang


    Traditionally, the SQL query language is used to search the data in databases. However, it is inappropriate for end-users, since it is complex and hard to learn. End-users need to search databases with keywords, as they do in web search engines. This paper presents a survey of work on keyword search in databases. It also includes a brief introduction to the SEEKER system that the authors have developed.

  5. Integrated vs. Federated Search

    Løvschall, Kasper


    Talk on the differences and similarities between integrated and federated search in a library context. Given at the theme day "Integrated Search - samsøgning i alle kilder" at Danmarks Biblioteksskole on 22 January 2009.

  6. Routing Optimization Based on Taboo Search Algorithm for Logistic Distribution

    Hongxue Yang


    Full Text Available Along with the widespread application of electronic commerce in modern business, logistic distribution has become increasingly important. More and more enterprises recognize that logistic distribution plays an important role in the process of production and sales, and that a good routing for logistic distribution can cut down transport cost and improve efficiency. To this end, a routing optimization based on taboo search for logistic distribution is proposed in this paper. Taboo search is a metaheuristic that guides a local search, here used for logistic optimization. The taboo search is employed to accelerate convergence, and the aspiration criterion is combined with the heuristic algorithm to solve the routing optimization. Simulation results demonstrate that the optimal routing in logistic distribution can be quickly obtained by the taboo search algorithm.
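    The interplay of tabu moves and an aspiration criterion described above can be sketched as follows; the delivery points, swap neighborhood, tenure, and iteration count are illustrative assumptions, not the paper's actual formulation:

```python
# A minimal tabu (taboo) search sketch for a small routing problem: find a
# short closed delivery tour. All parameters below are illustrative only.
points = [(0, 0), (3, 1), (1, 4), (5, 2), (2, 6)]

def tour_length(tour):
    # total Euclidean length of the closed tour
    return sum(
        ((points[a][0] - points[b][0]) ** 2 + (points[a][1] - points[b][1]) ** 2) ** 0.5
        for a, b in zip(tour, tour[1:] + tour[:1])
    )

def tabu_search(iters=100, tenure=4):
    current = list(range(len(points)))   # start from an arbitrary tour
    best, best_len = current[:], tour_length(current)
    tabu = []                            # recently applied swap moves
    for _ in range(iters):
        candidates = []
        for i in range(len(points)):
            for j in range(i + 1, len(points)):
                nbr = current[:]
                nbr[i], nbr[j] = nbr[j], nbr[i]
                length = tour_length(nbr)
                # aspiration criterion: a tabu move is allowed if it beats the best
                if (i, j) not in tabu or length < best_len:
                    candidates.append((length, (i, j), nbr))
        length, move, current = min(candidates)  # best admissible neighbor
        tabu.append(move)                # forbid repeating this move...
        if len(tabu) > tenure:
            tabu.pop(0)                  # ...for `tenure` iterations
        if length < best_len:
            best, best_len = current[:], length
    return best, best_len

best, best_len = tabu_search()
```

    The tabu list is what lets the search escape local minima: it may accept an uphill move, but cannot immediately undo it, while the aspiration criterion ensures a genuinely record-breaking move is never blocked.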

  7. How doctors search

    Lykke, Marianne; Price, Susan; Delcambre, Lois


    to context-specific aspects of the main topic of the documents. We have tested the model in an interactive searching study with family doctors with the purpose to explore doctors’ querying behaviour, how they applied the means for specifying a search, and how these features contributed to the search outcome...

  8. The Information Search

    Doraiswamy, Uma


    This paper in the form of story discusses a college student's information search process. In this story we see Kuhlthau's information search process: initiation, selection, exploration, formulation, collection, and presentation. Katie is a student who goes in search of information for her class research paper. Katie's class readings, her interest…

  9. Search and the city

    P.A. Gautier; C.N. Teulings


    We develop a model of an economy with several regions, which differ in scale. Within each region, workers have to search for a job-type that matches their skill. They face a trade-off between match quality and the cost of extended search. This trade-off differs between regions, because search is mor

  10. Muon Acceleration - RLA and FFAG

    Bogacz, Alex


    Various acceleration schemes for muons are presented. The overall goals of the acceleration systems, large-acceptance acceleration to 25 GeV and 'beam shaping', can be accomplished by various fixed-field accelerators at different stages. They involve three superconducting linacs: a single-pass linear Pre-accelerator followed by a pair of multi-pass Recirculating Linear Accelerators (RLA) and finally a non-scaling FFAG ring. The present baseline acceleration scenario has been optimized to take maximum advantage of the appropriate acceleration scheme at a given stage. The solenoid-based Pre-accelerator offers very large acceptance and facilitates correction of energy gain across the bunch and significant longitudinal compression through induced synchrotron motion. However, far off-crest acceleration reduces the effective acceleration gradient and adds complexity through the requirement of individual RF phase control for each cavity. The RLAs offer very efficient usage of high-gradient superconducting RF and the ability to adjust path length after each linac pass through individual return arcs with uniformly periodic FODO optics suitable for chromatic compensation of emittance dilution with sextupoles. However, they require spreader/recombiner switchyards at both linac ends and a significant total length of the arcs. The non-scaling Fixed Field Alternating Gradient (FFAG) ring combines compactness with very large chromatic acceptance (twice the injection energy) and allows for a large number of passes through the RF (at least eight, possibly as high as 15).

  11. VLHC accelerator physics

    Michael Blaskiewicz et al.


    A six-month design study for a future high-energy hadron collider was initiated by the Fermilab director in October 2000. The request was to study a staged approach where a large-circumference tunnel is built that initially would house a low-field (~2 T) collider with center-of-mass energy greater than 30 TeV and a peak (initial) luminosity of 10^34 cm^-2 s^-1. The tunnel was to be scoped, however, to support a future upgrade to a center-of-mass energy greater than 150 TeV with a peak luminosity of 2 x 10^34 cm^-2 s^-1 using high-field (~10 T) superconducting magnet technology. In a collaboration with Brookhaven National Laboratory and Lawrence Berkeley National Laboratory, a report of the Design Study was produced by Fermilab in June 2001. The Design Study focused on a Stage 1, 20 x 20 TeV collider using a 2-in-1 transmission line magnet and leads to a Stage 2, 87.5 x 87.5 TeV collider using 10 T Nb3Sn magnet technology. The article that follows is a compilation of accelerator physics designs and computational results which contributed to the Design Study. Many of the parameters found in this report evolved during the study, and thus slight differences between this text and the Design Study report can be found. The present text, however, presents the major accelerator physics issues of the Very Large Hadron Collider as examined by the Design Study collaboration and provides a basis for discussion and further studies of VLHC accelerator parameters and design philosophies.

  12. APT accelerator. Topical report

    Lawrence, G.; Rusthoi, D. [comps., eds.]


    The Accelerator Production of Tritium (APT) project, sponsored by Department of Energy Defense Programs (DOE/DP), involves the preconceptual design of an accelerator system to produce tritium for the nation's stockpile of nuclear weapons. Tritium is an isotope of hydrogen used in nuclear weapons, and must be replenished because of radioactive decay (its half-life is approximately 12 years). Because the annual production requirement for tritium has greatly decreased since the end of the Cold War, an alternative approach to reactors for tritium production, based on a linear accelerator, is now being seriously considered. The annual tritium requirement at the time this study was undertaken (1992-1993) was 3/8 that of the 1988 goal, usually stated as 3/8-Goal. Continued reduction in the number of weapons in the stockpile has led to a revised (lower) production requirement today (March 1995). The production requirement needed to maintain the reduced stockpile, as stated in the recent Nuclear Posture Review (summer 1994), is approximately 3/16-Goal, half the previous level. The Nuclear Posture Review also requires that the production plant be designed to accommodate a production increase (surge) to 3/8-Goal capability within five years, to allow recovery from a possible extended outage of the tritium plant. A multi-laboratory team, collaborating with several industrial partners, has developed a preconceptual APT design for the 3/8-Goal, operating at 75% capacity. The team has presented APT as a promising alternative to the reactor concepts proposed for Complex-21. Given the requirements of a reduced weapons stockpile, APT offers both significant safety, environmental, and production-flexibility advantages in comparison with reactor systems, and the prospect of successful development in time to meet the US defense requirements of the 21st Century.

  13. FACT: functional annotation transfer between proteins with similar feature architectures.

    Koestler, Tina; von Haeseler, Arndt; Ebersberger, Ingo


    The increasing number of sequenced genomes provides the basis for exploring the genetic and functional diversity within the tree of life. Only a tiny fraction of the encoded proteins undergoes a thorough experimental characterization. For the remainder, bioinformatics annotation tools are the only means to infer their function. Exploiting significant sequence similarities to already characterized proteins, commonly taken as evidence for homology, is the prevalent method to deduce functional equivalence. Such methods fail when homologs are too diverged, or when they have assumed a different function. Finally, due to convergent evolution, functional equivalence is not necessarily linked to common ancestry. Therefore complementary approaches are required to identify functional equivalents. We present the Feature Architecture Comparison Tool to search for functionally equivalent proteins. FACT uses the similarity between feature architectures of two proteins, i.e., the arrangements of functional domains, secondary structure elements and compositional properties, as a proxy for their functional equivalence. A scoring function measures feature architecture similarities, which enables searching for functional equivalents in entire proteomes. Our evaluation of 9,570 EC classified enzymes revealed that FACT, using the full feature set, outperformed the existing architecture-based approaches by identifying significantly more functional equivalents as highest scoring proteins. We show that FACT can identify functional equivalents that share no significant sequence similarity. However, when the highest scoring protein of FACT is also the protein with the highest local sequence similarity, it is in 99% of the cases functionally equivalent to the query. We demonstrate the versatility of FACT by identifying a missing link in the yeast glutathione metabolism and also by searching for the human GolgA5 equivalent in Trypanosoma brucei. FACT facilitates a

  14. Accelerated Innovation Pilot

    Davis, Jeffrey


    Opportunities: I. Engage NASA team (examples): a) Research and technology calls - provide suggestions to AES, HRP, OCT. b) Use NASA@Work to solicit other ideas (possibly before R+D calls). II. Stimulate collaboration (examples): a) NHHPC. b) Wharton Mack Center for Technological Innovation (Feb 2013). c) International - DLR - :envihab (July 2013). d) Accelerated research models - NSF, Myelin Repair Foundation. III. Engage public: prizes (open platforms: InnoCentive, NTL; Rice Business Plan, etc.). IV. Use same methods to engage STEM.

  15. Accelerating abelian gauge dynamics

    Adler, Stephen Louis


    In this paper, we suggest a new acceleration method for Abelian gauge theories based on linear transformations to variables which weight all length scales equally. We measure the autocorrelation time for the Polyakov loop and the plaquette at β=1.0 in the U(1) gauge theory in four dimensions, for the new method and for standard Metropolis updates. We find a dramatic improvement for the new method over the Metropolis method. Computing the critical exponent z for the new method remains an important open issue.

  16. 2014 CERN Accelerator Schools


    A specialised school on Power Converters will be held in Baden, Switzerland, from 7 to 14 May 2014. Please note that the deadline for applications is 7 FEBRUARY 2014. A course on Introduction to Accelerator Physics will be held in Prague, Czech Republic, from 31 August to 12 September 2014. Applications are now open for this school; the application deadline is 25 APRIL 2014. Further information on these schools and other CAS events can be found on the CAS website and on the Indico page. For further information please contact

  17. Hardware Accelerated Power Estimation

    Coburn, Joel; Raghunathan, Anand


    In this paper, we present power emulation, a novel design paradigm that utilizes hardware acceleration for the purpose of fast power estimation. Power emulation is based on the observation that the functions necessary for power estimation (power model evaluation, aggregation, etc.) can be implemented as hardware circuits. Therefore, we can enhance any given design with "power estimation hardware", map it to a prototyping platform, and exercise it with any given test stimuli to obtain power consumption estimates. Our empirical studies with industrial designs reveal that power emulation can achieve significant speedups (10X to 500X) over state-of-the-art commercial register-transfer level (RTL) power estimation tools.

  18. Medical applications of accelerators

    Rossi, Sandro


    At present, about five thousand accelerators are devoted to biomedical applications. They are mainly used in radiotherapy, research, and medical radioisotope production. In this framework, oncological hadron-therapy deserves particular attention, since it represents a field in rapid evolution thanks to the joint efforts of laboratories with long experience in particle physics. This is the case at CERN, where the design of an optimised synchrotron for medical applications has been pursued. These lectures present these activities, with particular attention to new developments which are scientifically interesting and/or economically promising.


    Jensen, Jens Stissing; Koch, Christian


    By viewing the construction industry as a technological innovation system (TIS) this paper discusses possible initiatives to accelerate nanotechnological innovations. The point of departure is a recent report on the application of nano-technology in the Danish construction industry, which concludes... of the system are furthermore poorly equipped at identifying potentials within high-tech areas. In order to exploit the potentials of nano-technology it is thus argued that an alternative TIS needs to be established. Initiatives should identify and support “incubation rooms” or market niches in order...

  20. Faceted Semantic Search for Personalized Social Search

    Mas, Massimiliano Dal


    Today's social networks (like Facebook, Twitter, LinkedIn, ...) need to deal with vagueness and ontological indeterminacy. This paper analyzes the prototyping of a faceted semantic search for personalized social search using the "joint meaning" in a community environment. User searches in a "collaborative" environment defined by folksonomies can be supported by the most common features of faceted semantic search. A solution for context-aware personalized search is based on "joint meaning", understood as a joint construal by the creators of the contents and the user of the contents, using the faceted taxonomy with the Semantic Web. A proof-of-concept prototype shows how the proposed methodological approach can also be applied to existing presentation components, built with different languages and/or component technologies.

  1. Keep Searching and You’ll Find

    Laursen, Keld


    triggers for different kinds of search. It argues that the initial focus on local search was a consequence, in part, of the attention in evolutionary economics to path-dependent behavior, but that as localized behavior was increasingly accepted as the standard mode, studies began to question whether local search was the best solution in all cases. More recently, the literature has focused on the trade-offs being created by firms having to balance local and non-local search. We account also for the apparent “variety paradox” in the stylized fact that organizations within the same industry tend to follow different search strategies, but end up with very similar technological profiles in fast-growing technologies. The article concludes by highlighting what we have learnt from the literature and suggesting some new avenues for research.

  2. Time-dependent particle acceleration in a Fermi reservoir

    Litvinenko, Y. E.


    Context. A steady model was presented by Burn, in which energy conservation is used to constrain the parameters of stochastic Fermi acceleration. A steady model, however, is unlikely to be adequate for particle acceleration in impulsive solar flares. Aims: This paper describes a time-dependent model for particle acceleration in a Fermi reservoir. Methods: The calculation is based on the original formulation of stochastic acceleration by Fermi, with additional physically motivated assumptions, similar to those of the steady analysis, about the turbulent and particle energy densities within the reservoir. The problem is reduced to an integro-differential equation that possesses an analytical solution. Results: The model predicts the formation of a power-law differential energy spectrum N(E) ~ E^-2 that is observable outside the reservoir. The predicted spectral index is independent of the parameters of the model. The results may help in understanding particle acceleration in solar flares and other astrophysical applications.
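    As general background (not the paper's own integro-differential derivation), the textbook Fermi argument shows how repeated stochastic energy gains produce such a power law: if each acceleration cycle multiplies a particle's energy by $1+\xi$ and the particle escapes the reservoir with probability $P_{\rm esc}$ per cycle, then after $k$ cycles $E_k = E_0(1+\xi)^k$ while $N_k = N_0(1-P_{\rm esc})^k$ particles remain, and eliminating $k$ gives

```latex
% Textbook Fermi power-law argument (background, not the paper's derivation).
N(>E) \propto \left(\frac{E}{E_0}\right)^{-\gamma},
\qquad
\gamma = \frac{\ln\left[1/(1-P_{\rm esc})\right]}{\ln(1+\xi)},
\qquad
N(E) = -\frac{dN(>E)}{dE} \propto E^{-(\gamma+1)} .
```

    A differential spectrum near $E^{-2}$ corresponds to $\gamma \approx 1$, i.e., escape and energy-gain rates of comparable magnitude.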

  3. Accelerated Chemical Reactions and Organic Synthesis in Leidenfrost Droplets.

    Bain, Ryan M; Pulliam, Christopher J; Thery, Fabien; Cooks, R Graham


    Leidenfrost levitated droplets can be used to accelerate chemical reactions in processes that appear similar to reaction acceleration in charged microdroplets produced by electrospray ionization. Reaction acceleration in Leidenfrost droplets is demonstrated for a base-catalyzed Claisen-Schmidt condensation, hydrazone formation from precharged and neutral ketones, and for the Katritzky pyrylium into pyridinium conversion under various reaction conditions. Comparisons with bulk reactions gave intermediate acceleration factors (2-50). By keeping the volume of the Leidenfrost droplets constant, it was shown that interfacial effects contribute to acceleration; this was confirmed by decreased reaction rates in the presence of a surfactant. The ability to multiplex Leidenfrost microreactors, to extract product into an immiscible solvent during reaction, and to use Leidenfrost droplets as reaction vessels to synthesize milligram quantities of product is also demonstrated.

  4. Dense plasma focus (DPF) accelerated non-radio-isotopic radiological source

    Rusnak, Brian; Tang, Vincent


    A non-radio-isotopic radiological source using a dense plasma focus (DPF) to produce an intense z-pinch plasma from a gas, such as helium, and which accelerates charged particles, such as generated from the gas or injected from an external source, into a target positioned along an acceleration axis and of a type known to emit ionizing radiation when impinged by the type of accelerated charged particles. In a preferred embodiment, helium gas is used to produce a DPF-accelerated He2+ ion beam to a beryllium target, to produce neutron emission having a similar energy spectrum as a radio-isotopic AmBe neutron source. Furthermore, multiple DPFs may be stacked to provide staged acceleration of charged particles for enhancing energy, tunability, and control of the source.

  5. The Maximum Energy of Accelerated Particles in Relativistic Collisionless Shocks

    Sironi, Lorenzo; Arons, Jonathan


    The afterglow emission from gamma-ray bursts (GRBs) is usually interpreted as synchrotron radiation from electrons accelerated at the GRB external shock, which propagates with relativistic velocities into the magnetized interstellar medium. By means of multi-dimensional particle-in-cell simulations, we investigate the acceleration performance of weakly magnetized relativistic shocks, in the magnetization range 0 <= sigma <= 1e-1. We find that shocks propagating in electron-positron plasmas are efficient particle accelerators if the magnetization is sigma<1e-3. For electron-ion plasmas, the transition to efficient acceleration occurs for sigma<3e-5. Here, the acceleration process proceeds similarly for the two species, since the electrons enter the shock nearly in equipartition with the ions, as a result of strong pre-heating in the self-generated upstream turbulence. In both...

  6. Keyword Search in Databases

    Yu, Jeffrey Xu; Chang, Lijun


    It has become highly desirable to provide users with flexible ways to query/search information in databases that are as simple as the keyword search offered by web search engines like Google. This book surveys the recent developments in keyword search over databases, and focuses on finding structural information among objects in a database using a set of keywords. Such structural information to be returned can be either trees or subgraphs representing how the objects that contain the required keywords are interconnected in a relational database or in an XML database. The structural keyword search is completely different from

  7. Scaling, similarity, and the fourth paradigm for hydrology

    Peters-Lidard, Christa D.; Clark, Martyn; Samaniego, Luis; Verhoest, Niko E. C.; van Emmerik, Tim; Uijlenhoet, Remko; Achieng, Kevin; Franz, Trenton E.; Woods, Ross


    In this synthesis paper addressing hydrologic scaling and similarity, we posit that the search for universal laws of hydrology is hindered by our focus on computational simulation (the third paradigm), and assert that it is time for hydrology to embrace a fourth paradigm of data-intensive science. Advances in information-based hydrologic science, coupled with an explosion of hydrologic data and advances in parameter estimation and modeling, have laid the foundation for a data-driven framework for scrutinizing hydrological scaling and similarity hypotheses. We summarize important scaling and similarity concepts (hypotheses) that require testing; describe a mutual information framework for testing these hypotheses; describe boundary condition, state, flux, and parameter data requirements across scales to support testing these hypotheses; and discuss some challenges to overcome while pursuing the fourth hydrological paradigm. We call upon the hydrologic sciences community to develop a focused effort towards adopting the fourth paradigm and to apply it to outstanding challenges in scaling and similarity.

  8. Searching and Indexing Genomic Databases via Kernelization

    Travis eGagie


    Full Text Available The rapid advance of DNA sequencing technologies has yielded databases of thousands of genomes. To search and index these databases effectively, it is important that we take advantage of the similarity between those genomes. Several authors have recently suggested searching or indexing only one reference genome and the parts of the other genomes where they differ. In this paper we survey the twenty-year history of this idea and discuss its relation to kernelization in parameterized complexity.
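    The reference-plus-differences idea surveyed above can be illustrated with a toy sketch; the sequences, the diff encoding, and the naive scan below are hypothetical, not an actual kernelized index:

```python
# Toy illustration of kernelized genome storage/search: keep one reference
# genome plus, for each other genome, only the positions where it differs.
# Sequences and the diff encoding are made up for illustration.
reference = "ACGTACGTACGT"
genomes = {
    "g1": {3: "A"},           # differs from the reference at position 3
    "g2": {0: "T", 7: "C"},   # differs at positions 0 and 7
}

def reconstruct(name):
    """Rebuild a full genome from the reference and its stored differences."""
    seq = list(reference)
    for pos, base in genomes[name].items():
        seq[pos] = base
    return "".join(seq)

def search(pattern):
    """Report which genomes contain `pattern` (naive scan for illustration)."""
    hits = {name for name in genomes if pattern in reconstruct(name)}
    if pattern in reference:
        hits.add("reference")
    return sorted(hits)

print(search("ACGT"))
```

    A real kernelized index would not reconstruct each genome per query; it would index the reference once and handle only the neighborhoods of the stored differences separately, which is the space/time saving the survey discusses.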

  9. Control problems in very large accelerators

    Crowley-Milling, M.C.


    There is no fundamental difference of kind in the control requirements between a small and a large accelerator since they are built of the same types of components, which individually have similar control inputs and outputs. The main difference is one of scale; the large machine has many more components of each type, and the distances involved are much greater. Both of these factors must be taken into account in determining the optimum way of carrying out the control functions. Small machines should use standard equipment and software for control as much as possible, as special developments for small quantities cannot normally be justified if all costs are taken into account. On the other hand, the very great number of devices needed for a large machine means that, if special developments can result in simplification, they may make possible an appreciable reduction in the control equipment costs. It is the purpose of this report to look at the special control problems of large accelerators, which the author shall arbitrarily define as those with a length of circumference in excess of 10 km, and point out where special developments, or the adoption of developments from outside the accelerator control field, can be of assistance in minimizing the cost of the control system. Most of the first part of this report was presented as a paper to the 1985 Particle Accelerator Conference. It has now been extended to include a discussion on the special case of the controls for the SSC.

  10. Electron Acceleration at Pulsar Wind Termination Shocks

    Giacchè, S.; Kirk, John G.


    We study the acceleration of electrons and positrons at an electromagnetically modified, ultrarelativistic shock in the context of pulsar wind nebulae. We simulate the outflow produced by an obliquely rotating pulsar in proximity of its termination shock with a two-fluid code that uses a magnetic shear wave to mimic the properties of the wind. We integrate electron trajectories in the test-particle limit in the resulting background electromagnetic fields to analyze the injection mechanism. We find that the shock-precursor structure energizes and reflects a sizable fraction of particles, which becomes available for further acceleration. We investigate the subsequent first-order Fermi process sustained by small-scale magnetic fluctuations with a Monte Carlo code. We find that the acceleration proceeds in two distinct regimes: when the gyroradius r_g exceeds the wavelength of the shear λ, the process is remarkably similar to first-order Fermi acceleration at relativistic, parallel shocks. This regime corresponds to a low-density wind that allows the propagation of superluminal waves. When r_g < λ, which corresponds to the scenario of driven reconnection, the spectrum is softer.

  11. Acceleration without Temperature

    Doria, Alaric


    We show that while some non-uniformly accelerating observers (NUAOs) do indeed see a Bose-Einstein distribution of particles for the expectation value of the number operator in the Minkowski vacuum state, the density matrix is non-thermal and therefore a definition of temperature is not warranted. This is due to the fact that our NUAOs do not see event horizons in the spacetime. More specifically, the Minkowski vacuum state is perceived by our NUAOs as a single-mode squeezed state as opposed to the two-mode squeezed state characteristic of uniformly accelerating observers. Both single and two-mode squeezed states are pure quantum states; however, tracing over degrees of freedom in one of the modes of the two-mode squeezed state reduces the pure density matrix to a thermal density matrix. It is this property in the two-mode squeezed state that allows one to consistently define a temperature. In the single-mode case, an equivalent tracing is neither required nor available.

  12. Particle acceleration mechanisms

    Petrosyan, V


    We review the possible mechanisms for production of the non-thermal electrons which are responsible for non-thermal radiation in clusters of galaxies. Our primary focus is on non-thermal bremsstrahlung and inverse Compton scattering, which produce hard X-ray emission. We briefly review acceleration mechanisms and point out that in most astrophysical situations, and in particular for the intracluster medium, shocks, turbulence and plasma waves play a crucial role. We consider two scenarios for production of non-thermal radiation. The first is hard X-ray emission due to non-thermal bremsstrahlung by nonrelativistic particles. Non-thermal tails are produced by accelerating electrons from the background plasma with an initial Maxwellian distribution. However, these tails are accompanied by significant heating and are present only for a short time of <10^6 yr, which is also the time over which the tail will be thermalised. Such non-thermal tails, even if possible, can only explain the hard X-ray but not the radio emission...

  13. Accelerator School Success


    Accelerator specialists don't grow on trees: training them is the job of the CERN Accelerator School (CAS). Group photo during visit to the Daresbury Laboratory. CAS and the CCLRC Daresbury Laboratory jointly organised a specialised school on Power Converters in Warrington, England from 12-18 May 2004. The last CAS Power Converter course was in 1990, so there was plenty of ground to cover. The challenging programme proposed a review of the state of the art and the latest developments in the field, including 30 hours of tuition. The school also included a visit to the CCLRC Daresbury laboratory, a one-day excursion to Liverpool and Chester and a themed (Welsh medieval) dinner at the school's closure. A record attendance of 91 students of more than 20 different nationalities included not only participants from Europe and North America but also from Armenia, Taiwan, India, Turkey, Iran and for the first time, fee-paying students from China and Australia. European industry showed a welcome and solid interest in...

  14. Evaluating search effectiveness of some selected search engines ...

    Evaluating search effectiveness of some selected search engines. ... seek for information on the World Wide Web (WWW) using a variety of search engines.

  15. Judging the Capability of Search Engines and Search Terms

    Anna Kaushik


    ... The present study aims to judge the capability of five selected search engines and search terms on the basis of the first ten results, and to identify the most appropriate search term and search engine...

  16. Mechanisms of Plasma Acceleration in Coronal Jets

    Soto, N.; Reeves, K.; Savcheva, A. S.


    Jets are small explosions that occur frequently on the Sun, possibly driven by the local reconfiguration of the magnetic field, or reconnection. There are two types of coronal jets: standard jets and blowout jets. The purpose of this project is to determine which mechanisms accelerate plasma in two different jets, one that occurred on January 17, 2015 on the disk of the Sun and another on October 24, 2015 at the limb. Two possible acceleration mechanisms are chromospheric evaporation and magnetic acceleration. Using SDO/AIA, Hinode/XRT and IRIS data, we create height-time plots and calculate the velocities of each wavelength for both jets. We calculate the potential magnetic field of the jet and the general region around it to gain a more detailed understanding of its structure, and determine whether the jet is likely to be a standard or blowout jet. Finally, we calculate the magnetic field strength for different heights along the jet spire, and use differential emission measures to calculate the plasma density. Once we have these two values, we calculate the Alfven speed. When analyzing our results we look for certain patterns in the velocities. If the plasma in a jet is accelerated by chromospheric evaporation, we expect the velocities to increase as a function of temperature, which is what we observed in the October 24th jet. The magnetic models for this jet also show the Eiffel Tower shaped structure characteristic of standard jets, which tend to have plasma accelerated by this mechanism. On the other hand, if the acceleration mechanism were magnetic acceleration, we would expect the velocities to be similar regardless of temperature. For the January 17th jet, we saw that along the spire the velocities were approximately 200 km/s in all wavelengths, but the velocities of hot plasma detected at the base were closer to the Alfven speed, which was estimated to be about 2,000 km/s. These observations suggest that the plasma in the January 17th jet is...
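The Alfven-speed estimate at the end follows from the standard formula v_A = B / sqrt(mu_0 * rho). A small sketch with assumed coronal field and density values (not the authors' measured inputs), chosen only to show how a figure of order 2,000 km/s can arise:

```python
import math

MU_0 = 4 * math.pi * 1e-7   # vacuum permeability, T*m/A
M_P = 1.673e-27             # proton mass, kg

def alfven_speed(b_field, n_e):
    """Alfven speed v_A = B / sqrt(mu_0 * rho) for a hydrogen plasma,
    approximating the mass density as rho = n_e * m_p."""
    rho = n_e * M_P
    return b_field / math.sqrt(MU_0 * rho)

# Illustrative (assumed) coronal values: B = 10 G = 1e-3 T, n_e = 1e14 m^-3
v_a = alfven_speed(1e-3, 1e14)
print(f"{v_a / 1e3:.0f} km/s")   # prints "2181 km/s", comparable to the ~2,000 km/s estimate
```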

  17. Combined Particle Acceleration in Solar Flares and Associated CME Shocks

    Petrosian, Vahe


    I will review some observations of the characteristics of accelerated electrons seen near Earth (as SEPs) and those producing flare radiation in the low corona and chromosphere. The similarities and differences between the numbers, spectral distribution, etc. of the two populations can shed light on the mechanism and sites of the acceleration. I will show that in some events the origin of both populations appears to be the flare site, while in others, with harder SEP spectra, in addition to acceleration at the flare site there appears to be a need for a second-stage re-acceleration in the associated fast Coronal Mass Ejection (CME) environment. This scenario can also describe a similar dichotomy that exists between the so-called impulsive, highly enriched (3He and heavy ions) and softer SEP ion events, and the stronger, more gradual SEP events with near-normal ionic abundances and harder spectra. I will also describe under what conditions such hardening can be achieved.

  18. Acceleration in Linear and Circular Motion

    Kellington, S. H.; Docherty, W.


    Describes the construction of a simple accelerometer and explains its use in demonstrating acceleration, deceleration, constant speed, measurement of acceleration, acceleration and the inclined plane and angular and radial acceleration. (GS)

  19. Automatic Planning of External Search Engine Optimization

    Vita Jasevičiūtė


    Full Text Available This paper describes an investigation of an external search engine optimization (SEO) action planning tool, dedicated to automatically extracting a small set of the most important keywords for each month over a whole-year period. The keywords in the set are extracted according to externally measured parameters, such as the average number of searches over the year and for every month individually. Additionally, the position of the optimized web site for each keyword is taken into account. The generated optimization plan is similar to the optimization plans prepared manually by SEO professionals and can be successfully used as a support tool for web site search engine optimization.
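The selection rule the abstract describes, favoring keywords with high search volume where the site does not yet rank well, can be sketched as follows. The scoring formula, data, and function names here are illustrative assumptions, not the paper's actual algorithm:

```python
def plan_keywords(monthly_volumes, site_positions, top_n=2):
    """Hypothetical monthly keyword planner.
    monthly_volumes: {month: {keyword: average searches that month}}
    site_positions:  {keyword: current ranking position (1 = best)}
    Returns {month: [keywords]}, preferring high volume and poor current rank
    (unranked keywords are treated as position 100)."""
    plan = {}
    for month, volumes in monthly_volumes.items():
        scored = sorted(
            volumes,
            key=lambda kw: volumes[kw] * site_positions.get(kw, 100),
            reverse=True,
        )
        plan[month] = scored[:top_n]
    return plan

# Illustrative data: "ski deals" has lower volume but a much worse position,
# so it is the better optimization target this month.
volumes = {"Jan": {"cheap flights": 9000, "ski deals": 4000}}
positions = {"cheap flights": 3, "ski deals": 40}
print(plan_keywords(volumes, positions, top_n=1))  # → {'Jan': ['ski deals']}
```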

  20. 2014 Joint International Accelerator School: Beam Loss and Accelerator Protection

    JAS - Joint US-CERN-Japan-Russia Accelerator School


    Many particle accelerators operate with very high beam power and very high energy stored in particle beams as well as in magnet systems. In the future, the beam power in high-intensity accelerators will increase further. The protection of the accelerator equipment from the consequences of an uncontrolled release of this energy is essential. This was the motivation for organizing a first school on beam losses and accelerator protection (in general referred to as machine protection). During the school, the methods and technologies to identify, mitigate, monitor and manage the technical risks associated with the operation of accelerators with high-power beams or subsystems with large stored energy were presented. At the completion of the school the participants should have been able to understand the physical phenomena that can damage machine subsystems or interrupt operations, and to analyze an accelerator facility to produce a register of technical risks and the corresponding risk mitigation and management strategie...