WorldWideScience

Sample records for search space reduction

  1. Optimal Fungal Space Searching Algorithms.

    Science.gov (United States)

    Asenova, Elitsa; Lin, Hsin-Yu; Fu, Eileen; Nicolau, Dan V; Nicolau, Dan V

    2016-10-01

    Previous experiments have shown that fungi use an efficient natural algorithm for searching the space available for their growth in micro-confined networks, e.g., mazes. This natural "master" algorithm, which comprises two "slave" sub-algorithms, i.e., collision-induced branching and directional memory, has been shown to be more efficient than alternatives with one, the other, or both sub-algorithms turned off. In contrast, the present contribution compares the performance of the fungal natural algorithm against several standard artificial homologues. It was found that the space-searching fungal algorithm consistently outperforms uninformed algorithms, such as Depth-First-Search (DFS). Furthermore, while the natural algorithm is inferior to informed ones, such as A*, this gap does not grow appreciably with the size of the maze. These findings suggest that a systematic effort to harvest the natural space-searching algorithms used by microorganisms is warranted and possibly overdue. These natural algorithms, if efficient, can be reverse-engineered into graph and tree search strategies.
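
    A minimal Python sketch of the comparison the abstract describes, contrasting an uninformed depth-first search with an informed A* search on a toy grid maze; the maze layout, coordinates and function names are illustrative only and are not taken from the paper:

      import heapq

      # Toy maze: 0 = open cell, 1 = wall (layout is illustrative only).
      MAZE = [[0, 0, 1, 0],
              [1, 0, 1, 0],
              [0, 0, 0, 0],
              [0, 1, 1, 0]]

      def neighbors(cell):
          r, c = cell
          for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
              nr, nc = r + dr, c + dc
              if 0 <= nr < len(MAZE) and 0 <= nc < len(MAZE[0]) and MAZE[nr][nc] == 0:
                  yield (nr, nc)

      def dfs_expanded(start, goal):
          """Uninformed depth-first search; returns the number of cells expanded."""
          stack, seen, expanded = [start], {start}, 0
          while stack:
              cell = stack.pop()
              expanded += 1
              if cell == goal:
                  return expanded
              for nxt in neighbors(cell):
                  if nxt not in seen:
                      seen.add(nxt)
                      stack.append(nxt)
          return None

      def astar_expanded(start, goal):
          """Informed A* search with a Manhattan-distance heuristic."""
          h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])
          frontier, best_g, expanded = [(h(start), 0, start)], {start: 0}, 0
          while frontier:
              _, g, cell = heapq.heappop(frontier)
              expanded += 1
              if cell == goal:
                  return expanded
              for nxt in neighbors(cell):
                  if g + 1 < best_g.get(nxt, float("inf")):
                      best_g[nxt] = g + 1
                      heapq.heappush(frontier, (g + 1 + h(nxt), g + 1, nxt))
          return None

      print(dfs_expanded((0, 0), (3, 3)), astar_expanded((0, 0), (3, 3)))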

  2. Searching Fragment Spaces with feature trees.

    Science.gov (United States)

    Lessel, Uta; Wellenzohn, Bernd; Lilienthal, Markus; Claussen, Holger

    2009-02-01

    Virtual combinatorial chemistry easily produces billions of compounds, for which conventional virtual screening cannot be performed even with the fastest methods available. An efficient solution for such a scenario is the generation of Fragment Spaces, which encode huge numbers of virtual compounds by their fragments/reagents and rules of how to combine them. Similarity-based searches can be performed in such spaces without ever fully enumerating all virtual products. Here we describe the generation of a huge Fragment Space encoding about 5 × 10^11 compounds based on established in-house synthesis protocols for combinatorial libraries, i.e., we encode practically evaluated combinatorial chemistry protocols in a machine-readable form, rendering them accessible to in silico search methods. We show how such searches in this Fragment Space can be integrated as a first step in an overall workflow. It reduces the extremely huge number of virtual products by several orders of magnitude so that the resulting list of molecules becomes more manageable for further, more elaborate and time-consuming analysis steps. Results of a case study are presented and discussed, which lead to some general conclusions for an efficient expansion of the chemical space to be screened in pharmaceutical companies.

  3. ODF Maxima Extraction in Spherical Harmonic Representation via Analytical Search Space Reduction

    Science.gov (United States)

    Aganj, Iman; Lenglet, Christophe; Sapiro, Guillermo

    2015-01-01

    By revealing complex fiber structure through the orientation distribution function (ODF), q-ball imaging has recently become a popular reconstruction technique in diffusion-weighted MRI. In this paper, we propose an analytical dimension reduction approach to ODF maxima extraction. We show that by expressing the ODF, or any antipodally symmetric spherical function, in the common fourth order real and symmetric spherical harmonic basis, the maxima of the two-dimensional ODF lie on an analytically derived one-dimensional space, from which we can detect the ODF maxima. This method reduces the computational complexity of the maxima detection, without compromising the accuracy. We demonstrate the performance of our technique on both artificial and human brain data. PMID:20879302

  4. Efficient and accurate nearest neighbor and closest pair search in high-dimensional space

    KAUST Repository

    Tao, Yufei

    2010-07-01

    Nearest Neighbor (NN) search in high-dimensional space is an important problem in many applications. From the database perspective, a good solution needs to have two properties: (i) it can be easily incorporated in a relational database, and (ii) its query cost should increase sublinearly with the dataset size, regardless of the data and query distributions. Locality-Sensitive Hashing (LSH) is a well-known methodology fulfilling both requirements, but its current implementations either incur expensive space and query cost, or abandon its theoretical guarantee on the quality of query results. Motivated by this, we improve LSH by proposing an access method called the Locality-Sensitive B-tree (LSB-tree) to enable fast, accurate, high-dimensional NN search in relational databases. The combination of several LSB-trees forms an LSB-forest that has strong quality guarantees, but improves dramatically the efficiency of the previous LSH implementation having the same guarantees. In practice, the LSB-tree itself is also an effective index which consumes linear space, supports efficient updates, and provides accurate query results. In our experiments, the LSB-tree was faster than: (i) iDistance (a famous technique for exact NN search) by two orders of magnitude, and (ii) MedRank (a recent approximate method with nontrivial quality guarantees) by one order of magnitude, and meanwhile returned much better results. As a second step, we extend our LSB technique to solve another classic problem, called Closest Pair (CP) search, in high-dimensional space. The long-term challenge for this problem has been to achieve subquadratic running time at very high dimensionalities, which most of the existing solutions fail to do. We show that, using an LSB-forest, CP search can be accomplished in (worst-case) time significantly lower than the quadratic complexity, yet still ensuring very good quality. In practice, accurate answers can be found using just two LSB-trees, thus giving a substantial
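
    The LSB-tree itself is involved; the Python sketch below only shows the basic locality-sensitive hashing idea the paper builds on - hash points so that nearby points tend to share a bucket, then search the query's bucket instead of the whole dataset. Random-hyperplane hashing, squared Euclidean distance and all names are illustrative assumptions, not the paper's construction:

      import random

      def make_lsh(dim, n_bits, seed=0):
          """Random-hyperplane LSH: each bit is the sign of a dot product."""
          rng = random.Random(seed)
          planes = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n_bits)]
          return lambda v: tuple(int(sum(p * x for p, x in zip(plane, v)) >= 0)
                                 for plane in planes)

      def build_index(points, h):
          buckets = {}
          for i, p in enumerate(points):
              buckets.setdefault(h(p), []).append(i)
          return buckets

      def approx_nn(query, points, buckets, h):
          """Search only the query's bucket; fall back to a full scan if it is empty."""
          candidates = buckets.get(h(query)) or range(len(points))
          dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
          return min(candidates, key=lambda i: dist(points[i], query))

      pts = [[random.random() for _ in range(16)] for _ in range(1000)]
      h = make_lsh(dim=16, n_bits=8)
      idx = build_index(pts, h)
      print(approx_nn(pts[0], pts, idx, h))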

  5. Optimization of the graph model of the water conduit network, based on the approach of search space reducing

    Science.gov (United States)

    Korovin, Iakov S.; Tkachenko, Maxim G.

    2018-03-01

    In this paper we present a heuristic approach that improves the efficiency of methods used for creating an efficient architecture of water distribution networks. The essence of the approach is a procedure that reduces the search space by limiting the range of available pipe diameters that can be used for each edge of the network graph. In order to perform the reduction, two opposite boundary scenarios for the distribution of flows are analysed, after which the resulting range is further narrowed by applying a flow rate limitation for each edge of the network. The first boundary scenario provides the most uniform distribution of the flow in the network; the opposite scenario creates the network with the highest possible flow level. The parameters of both distributions are calculated by optimizing systems of quadratic functions in a confined space, which can be performed effectively at small time cost. This approach was used to modify the genetic algorithm (GA). The proposed GA provides a variable number of variants for each gene, according to the number of diameters in the list, taking into account the flow restrictions. The proposed approach was applied to the evaluation of a well-known test network - the Hanoi water distribution network [1], and the results were compared with a classical GA with an unlimited search space. On the test data, the proposed approach significantly reduced the search space and provided faster and clearer convergence in comparison with the classical version of GA.
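
    A minimal Python sketch of the search-space restriction described above: each gene draws only from its edge-specific, pre-narrowed list of admissible pipe diameters. The cost function, which would wrap a hydraulic simulation plus pipe cost, is left abstract, and all names and parameter values are illustrative rather than taken from the paper:

      import random

      def random_design(allowed):
          """allowed[i] is the reduced list of pipe diameters admissible for edge i."""
          return [random.choice(options) for options in allowed]

      def crossover(a, b):
          cut = random.randrange(1, len(a))          # assumes at least two edges
          return a[:cut] + b[cut:]

      def mutate(design, allowed, rate=0.05):
          return [random.choice(allowed[i]) if random.random() < rate else d
                  for i, d in enumerate(design)]

      def evolve(allowed, cost, generations=200, pop_size=50):
          """Simple elitist GA over the reduced, per-edge diameter lists."""
          pop = [random_design(allowed) for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=cost)                     # lower cost is better
              parents = pop[: pop_size // 2]
              children = [mutate(crossover(random.choice(parents), random.choice(parents)),
                                 allowed)
                          for _ in range(pop_size - len(parents))]
              pop = parents + children
          return min(pop, key=cost)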

  6. Putting Continuous Metaheuristics to Work in Binary Search Spaces

    Directory of Open Access Journals (Sweden)

    Broderick Crawford

    2017-01-01

    Full Text Available In the real world, there are a number of optimization problems whose search space is restricted to take binary values; however, there are many continuous metaheuristics with good results in continuous search spaces. These algorithms must be adapted to solve binary problems. This paper surveys articles focused on the binarization of metaheuristics designed for continuous optimization.
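
    A common binarization device surveyed in this line of work is the transfer function: the continuous update (for example, a PSO velocity or DE step) is squashed into [0, 1] and used as a bit-flip probability. A minimal Python sketch with an S-shaped (sigmoid) transfer function; this particular choice is only one of the options such surveys cover:

      import math, random

      def sigmoid_binarize(velocity, rng=random):
          """Map a vector of continuous updates to a binary solution:
          each component's sigmoid value is used as the probability of a 1-bit."""
          return [1 if rng.random() < 1.0 / (1.0 + math.exp(-v)) else 0
                  for v in velocity]

      # Example: a continuous metaheuristic step produced these real values.
      print(sigmoid_binarize([-2.0, 0.0, 3.5]))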

  7. Radon transformation on reductive symmetric spaces: Support theorems

    DEFF Research Database (Denmark)

    Kuit, Job Jacob

    2013-01-01

    We introduce a class of Radon transforms for reductive symmetric spaces, including the horospherical transforms, and derive support theorems for these transforms. A reductive symmetric space is a homogeneous space G/H for a reductive Lie group G of the Harish-Chandra class, where H is an open sub...... is based on the relation between the Radon transform and the Fourier transform on G/H, and a Paley–Wiener-shift type argument. Our results generalize the support theorem of Helgason for the Radon transform on a Riemannian symmetric space....

  8. Exploration of Stellarator Configuration Space with Global Search Methods

    International Nuclear Information System (INIS)

    Mynick, H.E.; Pomphrey, N.; Ethier, S.

    2001-01-01

    An exploration of stellarator configuration space z for quasi-axisymmetric stellarator (QAS) designs is discussed, using methods which provide a more global view of that space. To this end, we have implemented a ''differential evolution'' (DE) search algorithm in an existing stellarator optimizer, which is much less prone to become trapped in local, suboptimal minima of the cost function chi than the local search methods used previously. This search algorithm is complemented by mapping studies of chi over z aimed at gaining insight into the results of the automated searches. We find that a wide range of the attractive QAS configurations previously found fall into a small number of classes, with each class corresponding to a basin of chi(z). We develop maps on which these earlier stellarators can be placed, the relations among them seen, and understanding gained into the physics differences between them. It is also found that, while still large, the region of z space containing practically realizable QAS configurations is much smaller than earlier supposed

  9. Parameter-space metric of semicoherent searches for continuous gravitational waves

    International Nuclear Information System (INIS)

    Pletsch, Holger J.

    2010-01-01

    Continuous gravitational-wave (CW) signals such as emitted by spinning neutron stars are an important target class for current detectors. However, the enormous computational demand prohibits fully coherent broadband all-sky searches for prior unknown CW sources over wide ranges of parameter space and for yearlong observation times. More efficient hierarchical ''semicoherent'' search strategies divide the data into segments much shorter than one year, which are analyzed coherently; then detection statistics from different segments are combined incoherently. To optimally perform the incoherent combination, understanding of the underlying parameter-space structure is requisite. This problem is addressed here by using new coordinates on the parameter space, which yield the first analytical parameter-space metric for the incoherent combination step. This semicoherent metric applies to broadband all-sky surveys (also embedding directed searches at fixed sky position) for isolated CW sources. Furthermore, the additional metric resolution attained through the combination of segments is studied. From the search parameters (sky position, frequency, and frequency derivatives), solely the metric resolution in the frequency derivatives is found to significantly increase with the number of segments.

  10. Colored Range Searching in Linear Space

    DEFF Research Database (Denmark)

    Grossi, Roberto; Vind, Søren Juhl

    2014-01-01

    In colored range searching, we are given a set of n colored points in d ≥ 2 dimensions to store, and want to support orthogonal range queries taking colors into account. In the colored range counting problem, a query must report the number of distinct colors found in the query range, while...... an answer to the colored range reporting problem must report the distinct colors in the query range. We give the first linear space data structure for both problems in two dimensions (d = 2) with o(n) worst case query time. We also give the first data structure obtaining almost-linear space usage and o...
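
    For reference, a brute-force Python baseline for the problem as stated above - counting and reporting the distinct colors inside a query rectangle in O(n) per query - which the paper's linear-space structure improves to o(n) worst-case query time:

      def colored_range_count(points, x1, x2, y1, y2):
          """Number of distinct colors inside the axis-aligned query rectangle."""
          return len({c for (x, y, c) in points if x1 <= x <= x2 and y1 <= y <= y2})

      def colored_range_report(points, x1, x2, y1, y2):
          """The distinct colors themselves."""
          return {c for (x, y, c) in points if x1 <= x <= x2 and y1 <= y <= y2}

      pts = [(1, 1, "red"), (2, 3, "blue"), (5, 5, "red"), (4, 2, "green")]
      print(colored_range_count(pts, 0, 4, 0, 4), colored_range_report(pts, 0, 4, 0, 4))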

  11. Application of Conformational Space Search in Drug Action | Adikwu ...

    African Journals Online (AJOL)

    The role of conformational space in drug action is presented. Two examples of molecules in different therapeutic groups are presented. Conformational space search will lead to isolating the exact conformation with the desired medicinal properties. Many conformations of a plant isolate may exist which are active, weakly ...

  12. Gravitational wave searches using the DSN (Deep Space Network)

    International Nuclear Information System (INIS)

    Nelson, S.J.; Armstrong, J.W.

    1988-01-01

    The Deep Space Network Doppler spacecraft link is currently the only method available for broadband gravitational wave searches in the 0.01 to 0.001 Hz frequency range. The DSN's role in the worldwide search for gravitational waves is described by first summarizing from the literature current theoretical estimates of gravitational wave strengths and time scales from various astrophysical sources. Current and future detection schemes for ground based and space based detectors are then discussed. Past, present, and future planned or proposed gravitational wave experiments using DSN Doppler tracking are described. Lastly, some major technical challenges to improve gravitational wave sensitivities using the DSN are discussed

  13. Search of extra space dimensions with ATLAS

    Indian Academy of Sciences (India)

    Search of extra space dimensions with ATLAS. AMBREESH GUPTA (for the ATLAS Collaboration). 5640 South Ellis Avenue, Enrico Fermi Institute, University of Chicago, Chicago, IL 60637, USA. Abstract. If extra spatial dimensions were to exist, they could provide a solution to the hierarchy problem. The studies done by the ...

  14. MFV Reductions of MSSM Parameter Space

    CERN Document Server

    AbdusSalam, S.S.; Quevedo, F.

    2015-01-01

    The 100+ free parameters of the minimal supersymmetric standard model (MSSM) make it computationally difficult to compare systematically with data, motivating the study of specific parameter reductions such as the cMSSM and pMSSM. Here we instead study the reductions of parameter space implied by using minimal flavour violation (MFV) to organise the R-parity conserving MSSM, with a view towards systematically building in constraints on flavour-violating physics. Within this framework the space of parameters is reduced by expanding soft supersymmetry-breaking terms in powers of the Cabibbo angle, leading to a 24-, 30- or 42-parameter framework (which we call MSSM-24, MSSM-30, and MSSM-42 respectively), depending on the order kept in the expansion. We provide a Bayesian global fit to data of the MSSM-30 parameter set to show that this is manageable with current tools. We compare the MFV reductions to the 19-parameter pMSSM choice and show that the pMSSM is not contained as a subset. The MSSM-30 analysis favours...

  15. Search-Order Independent State Caching

    DEFF Research Database (Denmark)

    Evangelista, Sami; Kristensen, Lars Michael

    2009-01-01

    State caching is a memory reduction technique used by model checkers to alleviate the state explosion problem. It has traditionally been coupled with a depth-first search to ensure termination.We propose and experimentally evaluate an extension of the state caching method for general state...... exploring algorithms that are independent of the search order (i.e., search algorithms that partition the state space into closed (visited) states, open (to visit) states and unmet states)....
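
    A minimal Python sketch of the general setting: exploration that partitions states into closed (cached/visited), open (to visit) and unmet states, with a bounded cache of closed states. For simplicity the sketch assumes an acyclic state space so that termination is not an issue; handling revisits safely under an arbitrary search order is exactly what the cited extension addresses. All names are illustrative:

      import random

      def explore(initial, successors, cache_size):
          """Bounded-cache state-space exploration; returns the number of expansions."""
          open_states = [initial]            # any removal policy works (DFS, BFS, random)
          closed = set()                     # the cache of visited states
          expanded = 0
          while open_states:
              s = open_states.pop()
              expanded += 1
              for t in successors(s):
                  if t not in closed:        # unmet or evicted states get (re)opened
                      open_states.append(t)
              if len(closed) >= cache_size:  # evict to respect the memory bound
                  closed.remove(random.choice(tuple(closed)))
              closed.add(s)
          return expanded

      # Tiny acyclic example: integer states, successors increment or double up to 8.
      succ = lambda n: [m for m in (n + 1, n * 2) if n < m <= 8]
      print(explore(1, succ, cache_size=4))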

  16. Exoplanet Searches by Future Deep Space Missions

    Directory of Open Access Journals (Sweden)

    Maccone C.

    2011-02-01

    Full Text Available The search for exoplanets could benefit from gravitational lensing if we could get to 550 AU from the Sun and beyond. This is because the gravitational lens of the Sun would highly intensify there any weak electromagnetic wave reaching the solar system from distant planets in the Galaxy (see Maccone 2009. The gravitational lens of the Sun, however, has a drawback: the solar Corona. Electrons in the Corona make electromagnetic waves diverge and this pushes the focus out to distances higher than 550 AU. Jupiter is the second larger mass in the solar system after the Sun, but in this focal game not only the mass matters: rather, what really matters is the ratio between the radius of the body squared and the mass of the body. In this regard, Jupiter qualifies as the second best choice for a space mission, requiring the spacecraft to reach 6,077 AU. In this paper, we study the benefit of exoplanet searches by deep space missions.

  17. A Paley-Wiener theorem for reductive symmetric spaces

    NARCIS (Netherlands)

    Ban, E.P. van den; Schlichtkrull, H.

    2006-01-01

    Let X = G/H be a reductive symmetric space and K a maximal compact subgroup of G. The image under the Fourier transform of the space of K-finite compactly supported smooth functions on X is characterized.

  18. Space-time least-squares Petrov-Galerkin projection in nonlinear model reduction.

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Youngsoo [Sandia National Laboratories (SNL-CA), Livermore, CA (United States). Extreme-scale Data Science and Analytics Dept.; Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Carlberg, Kevin Thomas [Sandia National Laboratories (SNL-CA), Livermore, CA (United States). Extreme-scale Data Science and Analytics Dept.

    2017-09-01

    Our work proposes a space-time least-squares Petrov-Galerkin (ST-LSPG) projection method for model reduction of nonlinear dynamical systems. In contrast to typical nonlinear model-reduction methods that first apply Petrov-Galerkin projection in the spatial dimension and subsequently apply time integration to numerically resolve the resulting low-dimensional dynamical system, the proposed method applies projection in space and time simultaneously. To accomplish this, the method first introduces a low-dimensional space-time trial subspace, which can be obtained by computing tensor decompositions of state-snapshot data. The method then computes discrete-optimal approximations in this space-time trial subspace by minimizing the residual arising after time discretization over all space and time in a weighted ℓ2-norm. This norm can be defined to enable complexity reduction (i.e., hyper-reduction) in time, which leads to space-time collocation and space-time GNAT variants of the ST-LSPG method. Advantages of the approach relative to typical spatial-projection-based nonlinear model reduction methods such as Galerkin projection and least-squares Petrov-Galerkin projection include: (1) a reduction of both the spatial and temporal dimensions of the dynamical system, (2) the removal of spurious temporal modes (e.g., unstable growth) from the state space, and (3) error bounds that exhibit slower growth in time. Numerical examples performed on model problems in fluid dynamics demonstrate the ability of the method to generate orders-of-magnitude computational savings relative to spatial-projection-based reduced-order models without sacrificing accuracy.
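
    A toy Python/NumPy illustration of the space-time idea on a small linear heat-equation model: space-time snapshots of training runs give a low-dimensional space-time trial basis, and the reduced coordinates are found by minimizing the full time-discrete residual in an (unweighted) l2-norm. This is only a sketch of the projection step under those simplifying assumptions; the weighted norm, hyper-reduction variants and error bounds of ST-LSPG are not reproduced, and all names and parameter values are illustrative:

      import numpy as np
      from scipy.optimize import least_squares

      # Full-order model (illustrative): explicit heat equation u_{k+1} = A u_k.
      n, nt, dt = 50, 100, 1e-4
      x = np.linspace(0.0, 1.0, n)
      dx = x[1] - x[0]
      L = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
           + np.diag(np.ones(n - 1), -1)) / dx ** 2
      A = np.eye(n) + dt * L

      def initial(center):
          return np.exp(-((x - center) ** 2) / 0.01)

      def full_trajectory(u0):
          U = [u0]
          for _ in range(nt):
              U.append(A @ U[-1])
          return np.concatenate(U[1:])           # states stacked over time, length n*nt

      # Space-time trial basis from a few training trajectories (tensor/SVD in the paper).
      train = np.column_stack([full_trajectory(initial(c)) for c in (0.3, 0.4, 0.5, 0.6, 0.7)])
      Phi = np.linalg.svd(train, full_matrices=False)[0][:, :3]

      def st_residual(w, u0):
          """Time-discrete residual of the stacked space-time vector w."""
          W = w.reshape(nt, n)
          prev = np.vstack([u0, W[:-1]])
          return (W - prev @ A.T).ravel()

      # ST-LSPG-style step (unweighted l2 here): minimize the full space-time
      # residual over the reduced coordinates y of the subspace spanned by Phi.
      u0 = initial(0.45)                          # unseen test condition
      sol = least_squares(lambda y: st_residual(Phi @ y, u0), x0=np.zeros(3))
      print("reduced space-time coordinates:", sol.x)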

  19. Dependent Space and Attribute Reduction on Fuzzy Information System

    Directory of Open Access Journals (Sweden)

    Shu Chang

    2017-01-01

    Full Text Available From the equivalence relation RBδ on the discourse domain U, we can derive an equivalence relation Rδ on the attribute set A. From the equivalence relation Rδ on A, we can derive a congruence relation on the attribute power set P(A) and establish an object dependent space. We then discuss the reduction method for fuzzy information systems on the object dependent space. Finally, the example in this paper demonstrates the feasibility and effectiveness of the reduction method based on the congruence relation Tδ, providing an insight into the link between equivalence relations and congruence relations of dependent spaces in rough sets. In this way, the paper provides theoretical support for the combined use of reduction methods, so it is of practical value.

  20. Radon transformation on reductive symmetric spaces: support theorems

    NARCIS (Netherlands)

    Kuit, J.J.|info:eu-repo/dai/nl/313872589

    2011-01-01

    In this thesis we introduce a class of Radon transforms for reductive symmetric spaces, including the horospherical transforms, and study some of their properties. In particular we obtain a generalization of Helgason's support theorem for the horospherical transform on a Riemannian symmetric space.

  1. Coset space dimensional reduction of gauge theories

    Energy Technology Data Exchange (ETDEWEB)

    Kapetanakis, D. (Physik Dept., Technische Univ. Muenchen, Garching (Germany)); Zoupanos, G. (CERN, Geneva (Switzerland))

    1992-10-01

    We review the attempts to construct unified theories defined in higher dimensions which are dimensionally reduced over coset spaces. We employ the coset space dimensional reduction scheme, which permits the detailed study of the resulting four-dimensional gauge theories. In the context of this scheme we present the difficulties and the suggested ways out in the attempts to describe the observed interactions in a realistic way. (orig.).

  2. Coset space dimensional reduction of gauge theories

    International Nuclear Information System (INIS)

    Kapetanakis, D.; Zoupanos, G.

    1992-01-01

    We review the attempts to construct unified theories defined in higher dimensions which are dimensionally reduced over coset spaces. We employ the coset space dimensional reduction scheme, which permits the detailed study of the resulting four-dimensional gauge theories. In the context of this scheme we present the difficulties and the suggested ways out in the attempts to describe the observed interactions in a realistic way. (orig.)

  3. Certain integrable system on a space associated with a quantum search algorithm

    International Nuclear Information System (INIS)

    Uwano, Y.; Hino, H.; Ishiwatari, Y.

    2007-01-01

    On thinking up a Grover-type quantum search algorithm for an ordered tuple of multiqubit states, a gradient system associated with the negative von Neumann entropy is studied on the space of regular relative configurations of multiqubit states (SR2CMQ). The SR2CMQ emerges, through a geometric procedure, from the space of ordered tuples of multiqubit states for the quantum search. The aim of this paper is to give a brief report on the integrability of the gradient dynamical system together with quantum information geometry of the underlying space, SR2CMQ, of that system

  4. Search for Antimatter in Space

    CERN Document Server

    2002-01-01

    PAMELA is a cosmic ray space experiment that will be installed on board the Russian satellite Resurs-DK1, whose launch is scheduled at the end of 2002. The duration of the mission will be at least three years in a high latitude orbit at an altitude ranging between 350 and 600 km. The observational objectives of the PAMELA experiment are the measurement of the spectra of antiprotons, positrons, particles and nuclei in a wide range of energies, the search for antinuclei and the study of the cosmic ray fluxes during a portion of a solar cycle. The main scientific objectives can be schematically summarized as follows: a) measurement of the antiproton spectrum in the energy range 80 MeV-190 GeV; b) measurement of the positron spectrum in the energy range 50 MeV-270 GeV; c) search for antinuclei with a sensitivity of the order of 10^-8 in the antihelium/helium ratio; d) measurement of the nuclei spectra (from H to C) in the energy range 100 MeV/n - 200 GeV/n; e) energy spectrum of the electroni...

  5. Feed-Forward Neural Networks and Minimal Search Space Learning

    Czech Academy of Sciences Publication Activity Database

    Neruda, Roman

    2005-01-01

    Roč. 4, č. 12 (2005), s. 1867-1872 ISSN 1109-2750 R&D Projects: GA ČR GA201/05/0557 Institutional research plan: CEZ:AV0Z10300504 Keywords: search space * feed-forward networks * genetic algorithms Subject RIV: BA - General Mathematics

  6. Exploration of the search space of the in-core fuel management problem by knowledge-based techniques

    International Nuclear Information System (INIS)

    Galperin, A.

    1995-01-01

    The process of generating reload configuration patterns is presented as a search procedure. The search space of the problem is found to contain ∼10^12 possible problem states. If the computational resources and execution time necessary to evaluate a single solution are taken into account, this problem may be described as a ''large space search problem.'' Understanding of the structure of the search space, i.e., the distribution of the optimal (or nearly optimal) solutions, is necessary to choose an appropriate search method and to adequately utilize domain heuristic knowledge. A worth function is developed based on two performance parameters: cycle length and power peaking factor. A series of numerical experiments was carried out; 300,000 patterns were generated in 40 sessions. All these patterns were analyzed by simulating the power production cycle and by evaluating the two performance parameters. The worth function was calculated and plotted. Analysis of the worth function reveals quite a complicated search space structure. The fine structure shows an extremely large number of local peaks: about one peak per hundred configurations. The direct implication of this discovery is that within a search space of 10^12 states, there are ∼10^10 local optima. Further consideration of the worth function shape shows that the distribution of the local optima forms a contour with much slower variations, where ''better'' or ''worse'' groups of patterns are spaced within a few thousand or tens of thousands of configurations, and finally very broad subregions of the whole space display variations of the worth function, where optimal regions include tens of thousands of patterns and are separated by hundreds of thousands and millions

  7. PAPR Reduction of OFDM Signals by Novel Global Harmony Search in PTS Scheme

    Directory of Open Access Journals (Sweden)

    Hojjat Salehinejad

    2012-01-01

    Full Text Available The orthogonal frequency division multiplexing (OFDM modulation technique is one of the key strategies for multiuser signal transmission especially in smart grids and wind farms. This paper introduces an approach for peak-to-average power ratio (PAPR reduction of such signals based on novel global harmony search (NGHS and partial transmit sequence (PTS schemes. In PTS technique, the data block to be transmitted is partitioned into disjoint subblocks, which are combined using phase factors to minimize PAPR. The PTS requires an exhaustive search over all combinations of allowed phase factors. Therefore, with respect to the fast implementation and simplicity of NGHS technique, we could achieve significant reduction of PAPR.
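
    A small Python/NumPy sketch of the underlying PTS step that the harmony-search variant accelerates: partition the block into disjoint sub-blocks, weight each sub-block's time-domain signal by a phase factor, and keep the combination with the lowest PAPR. The exhaustive loop below is the combinatorial search that makes a metaheuristic attractive; the partitioning scheme, block size and phase alphabet are illustrative choices, not the paper's:

      import itertools
      import numpy as np

      def papr_db(x):
          """Peak-to-average power ratio of a complex time-domain signal, in dB."""
          p = np.abs(x) ** 2
          return 10 * np.log10(p.max() / p.mean())

      def pts_exhaustive(symbols, n_sub=4, phases=(1, -1, 1j, -1j)):
          """Exhaustive PTS: try every phase-factor combination over the sub-blocks."""
          n = len(symbols)
          subblocks = []
          for i in range(n_sub):
              sb = np.zeros(n, dtype=complex)
              sb[i::n_sub] = symbols[i::n_sub]       # interleaved partitioning
              subblocks.append(np.fft.ifft(sb))      # per-sub-block time-domain signal
          best = None
          for w in itertools.product(phases, repeat=n_sub):
              candidate = sum(wi * sb for wi, sb in zip(w, subblocks))
              val = papr_db(candidate)
              if best is None or val < best[0]:
                  best = (val, w)
          return best                                # (PAPR in dB, phase factors)

      qpsk = np.random.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=64)
      print(pts_exhaustive(qpsk))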

  8. Similarity searching and scaffold hopping in synthetically accessible combinatorial chemistry spaces.

    Science.gov (United States)

    Boehm, Markus; Wu, Tong-Ying; Claussen, Holger; Lemmen, Christian

    2008-04-24

    Large collections of combinatorial libraries are an integral element in today's pharmaceutical industry. It is of great interest to perform similarity searches against all virtual compounds that are synthetically accessible by any such library. Here we describe the successful application of a new software tool CoLibri on 358 combinatorial libraries based on validated reaction protocols to create a single chemistry space containing over 10^12 possible products. Similarity searching with FTrees-FS allows the systematic exploration of this space without the need to enumerate all product structures. The search result is a set of virtual hits which are synthetically accessible by one or more of the existing reaction protocols. Grouping these virtual hits by their synthetic protocols allows the rapid design and synthesis of multiple follow-up libraries. Such library ideas support hit-to-lead design efforts for tasks like follow-up from high-throughput screening hits or scaffold hopping from one hit to another attractive series.

  9. The Surface Extraction from TIN based Search-space Minimization (SETSM) algorithm

    Science.gov (United States)

    Noh, Myoung-Jong; Howat, Ian M.

    2017-07-01

    Digital Elevation Models (DEMs) provide critical information for a wide range of scientific, navigational and engineering activities. Submeter resolution, stereoscopic satellite imagery with high geometric and radiometric quality, and wide spatial coverage are becoming increasingly accessible for generating stereo-photogrammetric DEMs. However, low contrast and repeatedly-textured surfaces, such as snow and glacial ice at high latitudes, and mountainous terrains challenge existing stereo-photogrammetric DEM generation techniques, particularly without a-priori information such as existing seed DEMs or the manual setting of terrain-specific parameters. To utilize these data for fully-automatic DEM extraction at a large scale, we developed the Surface Extraction from TIN-based Search-space Minimization (SETSM) algorithm. SETSM is fully automatic (i.e. no search parameter settings are needed) and uses only the sensor model Rational Polynomial Coefficients (RPCs). SETSM adopts a hierarchical, combined image- and object-space matching strategy utilizing weighted normalized cross-correlation with both original distorted and geometrically corrected images for overcoming ambiguities caused by foreshortening and occlusions. In addition, SETSM optimally minimizes search-spaces to extract optimal matches over problematic terrains by iteratively updating object surfaces within a Triangulated Irregular Network, and utilizes a geometric-constrained blunder and outlier detection in object space. We prove the ability of SETSM to mitigate typical stereo-photogrammetric matching problems over a range of challenging terrains. SETSM is the primary DEM generation software for the US National Science Foundation's ArcticDEM project.

  10. Make Yourself At Home! Adolescents in Search of the Queer Spaces of Home

    Directory of Open Access Journals (Sweden)

    Kokkola, Lydia

    2014-09-01

    Full Text Available Home is often assumed to be a safe place, a place to which children can return after their adventures Away. For many gay and lesbian teens, both fictional and in real life, however, the space they share with their family of origin is not a place where they can feel at home. The heterosexual family home is often so hostile to queerly desiring teens that they are forced to leave in search of a place where they can feel at home. The queer spaces they enter in their search are usually considered risky spaces – public spaces, urban spaces, the bar and the street – unhomely spaces. In these temporary, in-between spaces, the queerly desiring teens in the novels examined in this paper form new family structures. Although all the Anglophone novels discussed in this article end on moments of uplift and hope for the future, the association of the queerly desiring youngster with risky spaces suggests that the queer teens are themselves unheimlich (uncanny).

  11. The Space Physics of Life: Searching for Biosignatures on Habitable Icy Worlds Affected by Space Weathering

    Science.gov (United States)

    Cooper, John F.

    2006-01-01

    Accessible surfaces of the most likely astrobiological habitats (Mars, Europa, Titan) in the solar system beyond Earth are exposed to various chemical and hydrologic weathering processes directly or indirectly induced by interaction with the overlying space environment. These processes can be both beneficial, through provision of chemical compounds and energy, and destructive, through chemical dissociation or burial, to the detectable presence of biosignatures. Orbital, suborbital, and surface platforms carrying astrobiological instrumentation must survive, and preferably exploit, space environment interactions to reach these habitats and search for evidence of life or its precursors. Experience from Mars suggests that any detection of biosignatures must be accompanied by characterization of the local chemical environment and energy sources, including irradiation by solar ultraviolet photons and energetic particles from the space environment. Orbital and suborbital surveys of surface chemistry and astrobiological potential in the context of the space environment should precede targeted in-situ measurements to maximize the probability of biosignature detection through site selection. The Space Physics of Life (SPOL) investigation has recently been proposed to the NASA Astrobiology Institute and is briefly described in this presentation. SPOL is the astrobiologically relevant study of the interactions and relationships of potentially or previously inhabited bodies of the solar system with the surrounding environments. This requires an interdisciplinary effort in space physics, planetary science, and radiation biology. The proposed investigation addresses the search for habitable environments, chemical resources to support life, and techniques for detection of organic and inorganic signs of life in the context of the space environment.

  12. Pylogeny: an open-source Python framework for phylogenetic tree reconstruction and search space heuristics

    Directory of Open Access Journals (Sweden)

    Alexander Safatli

    2015-06-01

    Full Text Available Summary. Pylogeny is a cross-platform library for the Python programming language that provides an object-oriented application programming interface for phylogenetic heuristic searches. Its primary function is to permit both heuristic search and analysis of the phylogenetic tree search space, as well as to enable the design of novel algorithms to search this space. To this end, the framework supports the structural manipulation of phylogenetic trees, in particular using rearrangement operators such as NNI, SPR, and TBR, the scoring of trees using parsimony and likelihood methods, the construction of a tree search space graph, and the programmatic execution of a few existing heuristic programs. The library supports a range of common phylogenetic file formats and can be used for both nucleotide and protein data. Furthermore, it is also capable of supporting GPU likelihood calculation on nucleotide character data through the BEAGLE library. Availability. Existing development and source code is available for contribution and for download by the public from GitHub (http://github.com/AlexSafatli/Pylogeny). A stable release of this framework is available for download through PyPi (Python Package Index) at http://pypi.python.org/pypi/pylogeny.

  13. State Space Reduction for Model Checking Agent Programs

    NARCIS (Netherlands)

    S.-S.T.Q. Jongmans (Sung-Shik); K.V. Hindriks; M.B. van Riemsdijk; L. Dennis; O. Boissier; R.H. Bordini (Rafael)

    2012-01-01

    State space reduction techniques have been developed to increase the efficiency of model checking in the context of imperative programming languages. Unfortunately, these techniques cannot straightforwardly be applied to agents: the nature of states in the two programming paradigms

  14. Normalizations of Eisenstein integrals for reductive symmetric spaces

    NARCIS (Netherlands)

    van den Ban, E.P.; Kuit, Job

    2017-01-01

    We construct minimal Eisenstein integrals for a reductive symmetric space G/H as matrix coefficients of the minimal principal series of G. The Eisenstein integrals thus obtained include those from the σ-minimal principal series. In addition, we obtain related Eisenstein integrals, but with

  15. A learning heuristic for space mapping and searching self-organizing systems using adaptive mesh refinement

    Science.gov (United States)

    Phillips, Carolyn L.

    2014-09-01

    In a complex self-organizing system, small changes in the interactions between the system's components can result in different emergent macrostructures or macrobehavior. In chemical engineering and material science, such spontaneously self-assembling systems, using polymers, nanoscale or colloidal-scale particles, DNA, or other precursors, are an attractive way to create materials that are precisely engineered at a fine scale. Changes to the interactions can often be described by a set of parameters. Different contiguous regions in this parameter space correspond to different ordered states. Since these ordered states are emergent, often experiment, not analysis, is necessary to create a diagram of ordered states over the parameter space. By issuing queries to points in the parameter space (e.g., performing a computational or physical experiment), ordered states can be discovered and mapped. Queries can be costly in terms of resources or time, however. In general, one would like to learn the most information using the fewest queries. Here we introduce a learning heuristic for issuing queries to map and search a two-dimensional parameter space. Using a method inspired by adaptive mesh refinement, the heuristic iteratively issues batches of queries to be executed in parallel based on past information. By adjusting the search criteria, different types of searches (for example, a uniform search, exploring boundaries, sampling all regions equally) can be flexibly implemented. We show that this method will densely search the space, while preferentially targeting certain features. Using numerical examples, including a study simulating the self-assembly of complex crystals, we show how this heuristic can discover new regions and map boundaries more accurately than a uniformly distributed set of queries.
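
    A serial Python sketch of the refinement idea: query the corners of a parameter-space cell and subdivide only where the observed ordered-state labels disagree, so queries concentrate near phase boundaries. The batched, parallel issuing of queries and the adjustable search criteria described in the abstract are omitted; the query interface, labels and stopping rule are illustrative:

      def map_parameter_space(query, xmin, xmax, ymin, ymax, max_depth=5):
          """Adaptively refine a 2D cell: if the four corner queries do not all return
          the same label, split the cell into quadrants and recurse.
          query(x, y) stands for one (possibly expensive) experiment; shared corners
          are re-queried here for simplicity (memoize query to avoid that)."""
          corners = [(xmin, ymin), (xmin, ymax), (xmax, ymin), (xmax, ymax)]
          results = {c: query(*c) for c in corners}
          if max_depth == 0 or len(set(results.values())) == 1:
              return results                         # homogeneous cell: stop refining
          xm, ym = (xmin + xmax) / 2, (ymin + ymax) / 2
          for (x0, x1, y0, y1) in ((xmin, xm, ymin, ym), (xmin, xm, ym, ymax),
                                   (xm, xmax, ymin, ym), (xm, xmax, ym, ymax)):
              results.update(map_parameter_space(query, x0, x1, y0, y1, max_depth - 1))
          return results

      # Toy "phase diagram": two ordered states separated by the line y = x.
      labelled = map_parameter_space(lambda x, y: "A" if y > x else "B", 0, 1, 0, 1)
      print(len(labelled), "queried points")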

  16. Hiding and Searching Strategies of Adult Humans in a Virtual and a Real-Space Room

    Science.gov (United States)

    Talbot, Katherine J.; Legge, Eric L. G.; Bulitko, Vadim; Spetch, Marcia L.

    2009-01-01

    Adults searched for or cached three objects in nine hiding locations in a virtual room or a real-space room. In both rooms, the locations selected by participants differed systematically between searching and hiding. Specifically, participants moved farther from origin and dispersed their choices more when hiding objects than when searching for…

  17. Minimizing Warehouse Space through Inventory Reduction at Reckitt Benckiser

    OpenAIRE

    KILINC, IZGI SELEN

    2009-01-01

    This dissertation describes a ten-week internship at a pharmaceutical plant of Reckitt Benckiser for the Warehouse Stock Reduction Project. Due to foreseeable growth by the factory, there is increasing pressure to make better use of the existing warehouse space by reducing the existing stock level by 50%. Therefore, this study aims to identify the opportunities to reduce the physical stock held in raw/pack materials in the warehouse and save space for additional manufacturing resources. The analysis demo...

  18. Evolved finite state controller for hybrid system in reduced search space

    DEFF Research Database (Denmark)

    Dupuis, Jean-Francois; Fan, Zhun

    2009-01-01

    This paper presents an evolutionary methodology to automatically generate finite state automata (FSA) controllers to control hybrid systems. The proposed approach reduces the search space using an invariant analysis of the system. FSA controllers for a case study of a two-tank system have been...

  19. Min-Max Spaces and Complexity Reduction in Min-Max Expansions

    Energy Technology Data Exchange (ETDEWEB)

    Gaubert, Stephane, E-mail: Stephane.Gaubert@inria.fr [Ecole Polytechnique, INRIA and CMAP (France); McEneaney, William M., E-mail: wmceneaney@ucsd.edu [University of California San Diego, Dept. of Mech. and Aero. Eng. (United States)

    2012-06-15

    Idempotent methods have been found to be extremely helpful in the numerical solution of certain classes of nonlinear control problems. In those methods, one uses the fact that the value function lies in the space of semiconvex functions (in the case of maximizing controllers), and approximates this value using a truncated max-plus basis expansion. In some classes, the value function is actually convex, and then one specifically approximates with suprema (i.e., max-plus sums) of affine functions. Note that the space of convex functions is a max-plus linear space, or moduloid. In extending those concepts to game problems, one finds a different function space, and different algebra, to be appropriate. Here we consider functions which may be represented using infima (i.e., min-max sums) of max-plus affine functions. It is natural to refer to the class of functions so represented as the min-max linear space (or moduloid) of max-plus hypo-convex functions. We examine this space, the associated notion of duality and min-max basis expansions. In using these methods for solution of control problems, and now games, a critical step is complexity-reduction. In particular, one needs to find reduced-complexity expansions which approximate the function as well as possible. We obtain a solution to this complexity-reduction problem in the case of min-max expansions.

  20. Dimensional reduction from entanglement in Minkowski space

    International Nuclear Information System (INIS)

    Brustein, Ram; Yarom, Amos

    2005-01-01

    Using a quantum field theoretic setting, we present evidence for dimensional reduction of any sub-volume of Minkowski space. First, we show that correlation functions of a class of operators restricted to a sub-volume of D-dimensional Minkowski space scale as its surface area. A simple example of such area scaling is provided by the energy fluctuations of a free massless quantum field in its vacuum state. This is reminiscent of area scaling of entanglement entropy but applies to quantum expectation values in a pure state, rather than to statistical averages over a mixed state. We then show, in a specific case, that fluctuations in the bulk have a lower-dimensional representation in terms of a boundary theory at high temperature. (author)

  1. A Novel Method Using Abstract Convex Underestimation in Ab-Initio Protein Structure Prediction for Guiding Search in Conformational Feature Space.

    Science.gov (United States)

    Hao, Xiao-Hu; Zhang, Gui-Jun; Zhou, Xiao-Gen; Yu, Xu-Feng

    2016-01-01

    To address the searching problem of protein conformational space in ab-initio protein structure prediction, a novel method using abstract convex underestimation (ACUE) based on the framework of an evolutionary algorithm was proposed. Computing such conformations, essential to associate structural and functional information with gene sequences, is challenging due to the high dimensionality and rugged energy surface of the protein conformational space. As a consequence, the dimension of the protein conformational space should be reduced to a proper level. In this paper, the high-dimensional original conformational space was converted into a feature space whose dimension is considerably reduced by a feature extraction technique. The underestimate space could then be constructed according to abstract convex theory. Thus, the entropy effect caused by searching in the high-dimensional conformational space could be avoided through such conversion. The tight lower bound estimate information was obtained to guide the searching direction, and the invalid searching area in which the global optimal solution is not located could be eliminated in advance. Moreover, instead of expensively calculating the energy of conformations in the original conformational space, the estimate value is employed to judge if the conformation is worth exploring, thereby reducing the evaluation time, lowering the computational cost and making the searching process more efficient. Additionally, fragment assembly and the Monte Carlo method are combined to generate a series of metastable conformations by sampling in the conformational space. The proposed method provides a novel technique to solve the searching problem of protein conformational space. Twenty small-to-medium structurally diverse proteins were tested, and the proposed ACUE method was compared with It Fix, HEA, Rosetta and the developed method LEDE without underestimate information. Test results show that the ACUE method can more rapidly and more

  2. Quasi-steady State Reduction of Molecular Motor-Based Models of Directed Intermittent Search

    KAUST Repository

    Newby, Jay M.

    2010-02-19

    We present a quasi-steady state reduction of a linear reaction-hyperbolic master equation describing the directed intermittent search for a hidden target by a motor-driven particle moving on a one-dimensional filament track. The particle is injected at one end of the track and randomly switches between stationary search phases and mobile nonsearch phases that are biased in the anterograde direction. There is a finite possibility that the particle fails to find the target due to an absorbing boundary at the other end of the track. Such a scenario is exemplified by the motor-driven transport of vesicular cargo to synaptic targets located on the axon or dendrites of a neuron. The reduced model is described by a scalar Fokker-Planck (FP) equation, which has an additional inhomogeneous decay term that takes into account absorption by the target. The FP equation is used to compute the probability of finding the hidden target (hitting probability) and the corresponding conditional mean first passage time (MFPT) in terms of the effective drift velocity V, diffusivity D, and target absorption rate λ of the random search. The quasi-steady state reduction determines V, D, and λ in terms of the various biophysical parameters of the underlying motor transport model. We first apply our analysis to a simple 3-state model and show that our quasi-steady state reduction yields results that are in excellent agreement with Monte Carlo simulations of the full system under physiologically reasonable conditions. We then consider a more complex multiple motor model of bidirectional transport, in which opposing motors compete in a "tug-of-war", and use this to explore how ATP concentration might regulate the delivery of cargo to synaptic targets. © 2010 Society for Mathematical Biology.
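
    A plausible form of the reduced model, sketched here only from the quantities named in the abstract (effective drift velocity V, diffusivity D, target absorption rate λ); the exact coefficient formulas, boundary conditions and the derivation from the motor-transport model come from the quasi-steady-state reduction in the paper and are not reproduced:

      \[
        \frac{\partial p(x,t)}{\partial t}
          = -V\,\frac{\partial p(x,t)}{\partial x}
            + D\,\frac{\partial^{2} p(x,t)}{\partial x^{2}}
            - \lambda(x)\,p(x,t),
      \]

      where λ(x) equals the target absorption rate λ over the hidden target and vanishes elsewhere (the inhomogeneous decay term), with an absorbing boundary at the far end of the track.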

  3. A proposed heuristic methodology for searching reloading pattern

    International Nuclear Information System (INIS)

    Choi, K. Y.; Yoon, Y. K.

    1993-01-01

    A new heuristic method for loading pattern search has been developed to overcome shortcomings of the algorithmic approach. To reduce the size of the vast solution space, general shuffling rules, a regionwise shuffling method, and a pattern grouping method were introduced. The entropy theory was applied to classify possible loading patterns into groups with similarity between them. The pattern search program was implemented with use of the PROLOG language. A two-group nodal code MEDIUM-2D was used for analysis of the power distribution in the core. The above-mentioned methodology has been tested and shown to be effective in reducing the solution space down to a few hundred pattern groups. Burnable poison rods were then arranged in each pattern group in accordance with burnable poison distribution rules, which led to a further reduction of the solution space to several scores of acceptable pattern groups. The methods of maximizing cycle length (MCL) and minimizing the power-peaking factor (MPF) were applied to search for specific useful loading patterns from the acceptable pattern groups. Thus, several specific loading patterns that have a low power-peaking factor and a large cycle length were successfully found in the selected pattern groups. (Author)

  4. The principal series for a reductive symmetric space, II. Eisenstein integrals.

    NARCIS (Netherlands)

    Ban, E.P. van den

    1991-01-01

    In this paper we develop a theory of Eisenstein integrals related to the principal series for a reductive symmetric space G/H. Here G is a real reductive group of Harish-Chandra's class, σ an involution of G and H an open subgroup of the group G^σ of fixed points for σ. The group G itself is a

  5. Analytic families of eigenfunctions on a reductive symmetric space

    NARCIS (Netherlands)

    Ban, E.P. van den; Schlichtkrull, H.

    2000-01-01

    In harmonic analysis on a reductive symmetric space X an important role is played by families of generalized eigenfunctions for the algebra D(X) of invariant differential operators. Such families arise for instance as matrix coefficients of representations that come in series, such as the (generalized)

  6. Identification of Fuzzy Inference Systems by Means of a Multiobjective Opposition-Based Space Search Algorithm

    Directory of Open Access Journals (Sweden)

    Wei Huang

    2013-01-01

    Full Text Available We introduce a new category of fuzzy inference systems with the aid of a multiobjective opposition-based space search algorithm (MOSSA. The proposed MOSSA is essentially a multiobjective space search algorithm improved by using an opposition-based learning that employs a so-called opposite numbers mechanism to speed up the convergence of the optimization algorithm. In the identification of fuzzy inference system, the MOSSA is exploited to carry out the parametric identification of the fuzzy model as well as to realize its structural identification. Experimental results demonstrate the effectiveness of the proposed fuzzy models.
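
    The opposite-numbers mechanism mentioned above is simple to state; a minimal Python sketch follows (the embedding into the multiobjective space search algorithm itself is not shown, and the variable names are illustrative):

      def opposite(solution, lower, upper):
          """Opposition-based learning: the opposite of x in [a, b] is a + b - x.
          Evaluating a candidate and its opposite in the same iteration is the
          usual way this mechanism is used to speed up convergence."""
          return [lo + hi - x for x, lo, hi in zip(solution, lower, upper)]

      print(opposite([0.2, 7.0, -1.0], lower=[0, 0, -5], upper=[1, 10, 5]))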

  7. Partial Transmit Sequence Optimization Using Improved Harmony Search Algorithm for PAPR Reduction in OFDM

    Directory of Open Access Journals (Sweden)

    Mangal Singh

    2017-12-01

    Full Text Available This paper considers the use of the Partial Transmit Sequence (PTS technique to reduce the Peak‐to‐Average Power Ratio (PAPR of an Orthogonal Frequency Division Multiplexing signal in wireless communication systems. Search complexity is very high in the traditional PTS scheme because it involves an extensive random search over all combinations of allowed phase vectors, and it increases exponentially with the number of phase vectors. In this paper, a suboptimal metaheuristic algorithm for phase optimization based on an improved harmony search (IHS is applied to explore the optimal combination of phase vectors that provides improved performance compared with existing evolutionary algorithms such as the harmony search algorithm and firefly algorithm. IHS enhances the accuracy and convergence rate of the conventional algorithms with very few parameters to adjust. Simulation results show that an improved harmony search‐based PTS algorithm can achieve a significant reduction in PAPR using a simple network structure compared with conventional algorithms.

  8. Reduction rules-based search algorithm for opportunistic replacement strategy of multiple life-limited parts

    Directory of Open Access Journals (Sweden)

    Xuyun FU

    2018-01-01

    Full Text Available The opportunistic replacement of multiple Life-Limited Parts (LLPs) is a problem that exists widely in industry. The replacement strategy for LLPs has a great impact on the total maintenance cost of much equipment. This article focuses on finding a quick and effective algorithm for this problem. To improve the algorithm efficiency, six reduction rules are suggested from the perspectives of solution feasibility, determination of the replacement of LLPs, determination of the maintenance occasion and solution optimality. Based on these six reduction rules, a search algorithm is proposed. This search algorithm can identify one or several optimal solutions. A numerical experiment shows that these six reduction rules are effective, and the time consumed by the algorithm is less than 38 s if the total life of the equipment is shorter than 55000 and the number of LLPs is less than 11. A specific case shows that the algorithm can obtain optimal solutions which are much better than the result of the traditional method in 10 s, and it can provide support for determining to-be-replaced LLPs when determining the maintenance workscope of an aircraft engine. Therefore, the algorithm is applicable to engineering applications concerning opportunistic replacement of multiple LLPs in aircraft engines.

  9. Enhancing the Search in MOLAP Sparse Data 

    Directory of Open Access Journals (Sweden)

    Joseph Zalaket

    2012-11-01

    Full Text Available Multidimensional on-line analytical processing (MOLAP) systems deal better with dense data than relational ones (ROLAP) do. In the presence of sparse data, MOLAP systems become memory consuming, which may limit and slow down data processing tasks. Many compression techniques have been proposed to deal with the sparsity of data in MOLAP systems. One of these techniques is bitmap compression, which allows a significant reduction of the memory space used for data processing. In this article, we propose an extension to the bitmap compression technique by storing the compressed data as bits in multiple efficient data structures based on a new indexing strategy instead of the linear structure. Compared with the classical bitmap, the proposed enhancement not only allows space reduction but also reduces the search time through the compressed data. We present some algorithms that allow maintaining and searching within the compressed structure without the need for decompression. We demonstrate that the complexity of the proposed algorithms varies from logarithmic to constant, compared with the linear complexity of the classical bitmap technique.
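
    The abstract's point - replacing a linear scan of a compressed bitmap with an indexed lookup - can be illustrated with a very small Python sketch that stores only the positions of non-empty cells in sorted order and answers membership by binary search. This is an illustration of the idea, not the article's actual multi-structure index:

      import bisect

      class SparseCellIndex:
          """Store only the positions of non-empty cells of a sparse MOLAP cube,
          sorted, so membership is a binary search instead of a linear bitmap scan."""
          def __init__(self, nonempty_positions):
              self.positions = sorted(nonempty_positions)

          def contains(self, pos):
              i = bisect.bisect_left(self.positions, pos)   # O(log n)
              return i < len(self.positions) and self.positions[i] == pos

      index = SparseCellIndex([3, 17, 42, 1001])
      print(index.contains(42), index.contains(5))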

  10. Efficient and accurate nearest neighbor and closest pair search in high-dimensional space

    KAUST Repository

    Tao, Yufei; Yi, Ke; Sheng, Cheng; Kalnis, Panos

    2010-01-01

    Nearest Neighbor (NN) search in high-dimensional space is an important problem in many applications. From the database perspective, a good solution needs to have two properties: (i) it can be easily incorporated in a relational database, and (ii

  11. PETRA, MSVAT-SPACE and SEMAC sequences for metal artefact reduction in dental MR imaging

    International Nuclear Information System (INIS)

    Hilgenfeld, Tim; Heil, Alexander; Bendszus, Martin; Prager, Marcel; Heiland, Sabine; Schwindling, Franz Sebastian; Rammelsberg, Peter; Nittka, Mathias; Grodzki, David

    2017-01-01

    Dental MRI is often impaired by artefacts due to metallic dental materials. Several sequences were developed to reduce susceptibility artefacts. Here, we evaluated a set of sequences for artefact reduction for dental MRI for the first time. Artefact volume, signal-to-noise ratio (SNR) and image quality were assessed on a 3-T MRI for pointwise encoding time reduction with radial acquisition (PETRA), multiple-slab acquisition with view angle tilting gradient, based on a sampling perfection with application-optimised contrasts using different flip angle evolution (SPACE) sequence (MSVAT-SPACE), slice-encoding for metal-artefact correction (SEMAC) and compared to a standard SPACE and a standard turbo-spin-echo (TSE) sequence. Field-of-view and acquisition times were chosen to enable in vivo application. Two implant-supported prostheses were tested (porcelain fused to metal non-precious alloy and monolithic zirconia). Smallest artefact was measured for TSE sequences with no difference between the standard TSE and the SEMAC. MSVAT-SPACE reduced artefacts about 56% compared to the standard SPACE. Effect of the PETRA was dependent on sample used. Image quality and SNR were comparable for all sequences except PETRA, which yielded poor results. There is no benefit in terms of artefact reduction for SEMAC compared to standard TSE. Usage of MSVAT-SPACE is advantageous since artefacts are reduced and higher resolution is achieved. (orig.)

  12. PETRA, MSVAT-SPACE and SEMAC sequences for metal artefact reduction in dental MR imaging

    Energy Technology Data Exchange (ETDEWEB)

    Hilgenfeld, Tim; Heil, Alexander; Bendszus, Martin [Heidelberg University Hospital, Department of Neuroradiology, Heidelberg (Germany); Prager, Marcel; Heiland, Sabine [Heidelberg University Hospital, Department of Neuroradiology, Heidelberg (Germany); Heidelberg University Hospital, Section of Experimental Radiology, Heidelberg (Germany); Schwindling, Franz Sebastian; Rammelsberg, Peter [Heidelberg University Hospital, Department of Prosthodontics, Heidelberg (Germany); Nittka, Mathias; Grodzki, David [Siemens Healthcare GmbH, Erlangen (Germany)

    2017-12-15

    Dental MRI is often impaired by artefacts due to metallic dental materials. Several sequences have been developed to reduce susceptibility artefacts. Here, we evaluated a set of sequences for artefact reduction in dental MRI for the first time. Artefact volume, signal-to-noise ratio (SNR) and image quality were assessed on a 3-T MRI for pointwise encoding time reduction with radial acquisition (PETRA), multiple-slab acquisition with view angle tilting gradient based on a sampling perfection with application-optimised contrasts using different flip angle evolution (SPACE) sequence (MSVAT-SPACE), and slice-encoding for metal-artefact correction (SEMAC), and compared with a standard SPACE and a standard turbo-spin-echo (TSE) sequence. Field-of-view and acquisition times were chosen to enable in vivo application. Two implant-supported prostheses were tested (porcelain fused to metal non-precious alloy and monolithic zirconia). The smallest artefact was measured for the TSE sequences, with no difference between the standard TSE and the SEMAC. MSVAT-SPACE reduced artefacts by about 56% compared with the standard SPACE. The effect of PETRA was dependent on the sample used. Image quality and SNR were comparable for all sequences except PETRA, which yielded poor results. There is no benefit in terms of artefact reduction for SEMAC compared with standard TSE. Usage of MSVAT-SPACE is advantageous since artefacts are reduced and higher resolution is achieved. (orig.)

  13. Fast space-varying convolution using matrix source coding with applications to camera stray light reduction.

    Science.gov (United States)

    Wei, Jianing; Bouman, Charles A; Allebach, Jan P

    2014-05-01

    Many imaging applications require the implementation of space-varying convolution for accurate restoration and reconstruction of images. Here, we use the term space-varying convolution to refer to linear operators whose impulse response has slow spatial variation. In addition, these space-varying convolution operators are often dense, so direct implementation of the convolution operator is typically computationally impractical. One such example is the problem of stray light reduction in digital cameras, which requires the implementation of a dense space-varying deconvolution operator. However, other inverse problems, such as iterative tomographic reconstruction, can also depend on the implementation of dense space-varying convolution. While space-invariant convolution can be efficiently implemented with the fast Fourier transform, this approach does not work for space-varying operators. So direct convolution is often the only option for implementing space-varying convolution. In this paper, we develop a general approach to the efficient implementation of space-varying convolution, and demonstrate its use in the application of stray light reduction. Our approach, which we call matrix source coding, is based on lossy source coding of the dense space-varying convolution matrix. Importantly, by coding the transformation matrix, we not only reduce the memory required to store it; we also dramatically reduce the computation required to implement matrix-vector products. Our algorithm is able to reduce computation by approximately factoring the dense space-varying convolution operator into a product of sparse transforms. Experimental results show that our method can dramatically reduce the computation required for stray light reduction while maintaining high accuracy.
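    A toy sketch of the matrix-source-coding idea under stated assumptions: the dense operator is expressed in a transform domain where most entries are small, lossily thresholded to a sparse matrix, and then applied as a product of sparse/fast transforms. The DCT basis, the threshold rule, and the Gaussian test kernel are illustrative choices, not the authors' actual coder.

```python
import numpy as np
from scipy.sparse import csr_matrix

def dct_matrix(n):
    """Orthonormal DCT-II basis, used here as a generic sparsifying transform."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    T = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    T[0, :] /= np.sqrt(2.0)
    return T

def encode_operator(A, keep=0.05):
    """'Source-code' the dense operator: transform it, then keep only the largest entries."""
    n = A.shape[0]
    T = dct_matrix(n)
    B = T @ A @ T.T                      # operator in the transform domain
    thresh = np.quantile(np.abs(B), 1.0 - keep)
    B[np.abs(B) < thresh] = 0.0          # lossy coding step
    return T, csr_matrix(B)

def apply_encoded(T, B_sparse, x):
    """Approximate y = A x as a product of sparse/fast transforms."""
    return T.T @ (B_sparse @ (T @ x))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 256
    # Smoothly varying dense kernel as a stand-in for a space-varying stray-light operator.
    u = np.linspace(0, 1, n)
    A = np.exp(-((u[:, None] - u[None, :]) ** 2) / 0.02) / n
    x = rng.standard_normal(n)
    T, B = encode_operator(A, keep=0.05)
    err = np.linalg.norm(apply_encoded(T, B, x) - A @ x) / np.linalg.norm(A @ x)
    print(f"relative error using 5% of coefficients: {err:.3e}")
```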

  14. A Framework for Similarity Search with Space-Time Tradeoffs using Locality Sensitive Filtering

    DEFF Research Database (Denmark)

    Christiani, Tobias Lybecker

    2017-01-01

    We present a framework for similarity search based on Locality-Sensitive Filtering (LSF), generalizing the Indyk-Motwani (STOC 1998) Locality-Sensitive Hashing (LSH) framework to support space-time tradeoffs. Given a family of filters, defined as a distribution over pairs of subsets of space that satisfies certain locality-sensitivity properties, we can construct a dynamic data structure that solves the approximate near neighbor problem in $d$-dimensional space with query time $dn^{\rho_q + o(1)}$, update time $dn^{\rho_u + o(1)}$, and space usage $dn + n^{1 + \rho_u + o(1)}$, where $n$ denotes the number of points in the data structure. The space-time tradeoff is tied to the tradeoff between query time and update time (insertions/deletions), controlled by the exponents $\rho_q, \rho_u$ that are determined by the filter family. Locality-sensitive filtering was introduced by Becker et al. (SODA...

  15. The Fate of DDH Hips Showing Cartilaginous or Fibrous Tissue-filled Joint Spaces Following Primary Reduction.

    Science.gov (United States)

    Kim, Hui Taek; Lee, Tae Hoon; Ahn, Tae Young; Jang, Jae Hoon

    Because the use of magnetic resonance imaging is still not universal for patients with developmental dysplasia of the hip, orthopaedists do not generally distinguish widened joint spaces which are "empty" after primary treatment (and therefore still reducible) from those which are filled and much more difficult to treat. To date no studies have focused on the latter hips. We treated and observed the outcomes for 19 hips which showed filled joint spaces after primary treatment. We retrospectively reviewed 19 cases of developmental dysplasia of the hip: (1) which showed a widened joint space on radiographs after primary treatment; and (2) whose magnetic resonance imaging showed that the widened joint space was accompanied by acetabular cartilage hypertrophy and/or was filled with fibrous tissue. All patients were over 1 year old at the time of primary reduction (reduction was closed in 4 patients, open in 6, and open with pelvic osteotomy in 9). Thirteen patients received at least 1 secondary treatment. Final results were classified using a modified Severin classification. Final outcomes were satisfactory in 10 (52.6%) and unsatisfactory in 9 (47.4%). The widened joint spaces gradually filled with bone, resulting in a shallow acetabulum in the patients with unsatisfactory results. Of 9 patients who underwent combined pelvic osteotomy at the time of primary reduction, results were satisfactory in 6 (66.7%), whereas all patients who had only closed or open primary reduction had unsatisfactory results. Combined pelvic osteotomy at the time of primary reduction is advisable in hips with widened joint spaces. However, hips with filled joint spaces after primary treatment often have unsatisfactory results even after additional pelvic and/or femoral osteotomy. Level IV-prognostic study.

  16. The economic benefits of rainwater-runoff reduction by urban green spaces: a case study in Beijing, China.

    Science.gov (United States)

    Zhang, Biao; Xie, Gaodi; Zhang, Canqiang; Zhang, Jing

    2012-06-15

    Urbanization involves the replacement of vegetated surfaces with impervious built surfaces, and it often results in an increase in the rate and volume of rainwater surface runoff. Urban green spaces play a positive role in rainwater-runoff reduction. However, few studies have explored the benefits of rainwater-runoff reduction by urban green spaces. Based on inventory data of urban green spaces in Beijing, this paper evaluated the economic benefits of rainwater-runoff reduction by urban green spaces, using the rainwater-runoff-coefficient method as well as economic valuation methods. The results showed that 2494 cubic meters of potential runoff were reduced per hectare of green area and a total volume of 154 million cubic meters of rainwater was stored in these urban green spaces, which almost corresponds to the annual water needs of the urban ecological landscape in Beijing. The total economic benefit was 1.34 billion RMB in 2009 (RMB: Chinese currency, US$1=RMB6.83), which is equivalent to three-quarters of the maintenance cost of Beijing's green spaces; the value of rainwater-runoff reduction was 21.77 thousand RMB per hectare. In addition, the benefits in different districts and counties were ranked in the same order as their urban green areas, and the average benefits per hectare of green space showed different trends, which may be related to the impervious surface index in different regions. This research will contribute to an understanding of the role that Beijing's green spaces play in rainwater regulation and in the creation and scientific management of urban green spaces. Copyright © 2012 Elsevier Ltd. All rights reserved.
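    As a rough consistency check (not from the paper), dividing the reported totals by the per-hectare figures implies a green-space area of about $6\times10^{4}$ ha in both cases:

$$\frac{154\times10^{6}\ \mathrm{m^{3}}}{2494\ \mathrm{m^{3}\,ha^{-1}}} \approx 6.2\times10^{4}\ \mathrm{ha}, \qquad \frac{1.34\times10^{9}\ \mathrm{RMB}}{2.177\times10^{4}\ \mathrm{RMB\,ha^{-1}}} \approx 6.2\times10^{4}\ \mathrm{ha}.$$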

  17. PLANE MATCHING WITH OBJECT-SPACE SEARCHING USING INDEPENDENTLY RECTIFIED IMAGES

    Directory of Open Access Journals (Sweden)

    H. Takeda

    2012-07-01

    Full Text Available In recent years, the social situation in cities has changed significantly, for example through redevelopment following the massive earthquake and large-scale urban development. Numerical simulations can be used to study these phenomena, and such simulations require the construction of high-definition three-dimensional city models that accurately reflect the real world. Progress in sensor technology allows us to easily obtain multi-view images; however, the existing multi-image matching techniques are inadequate. In this paper, we propose a new technique for multi-image matching. Since the existing method of feature searching is complicated, we have developed a rectification method that can be applied independently to each image and does not depend on the stereo pair. The focus of our study is the object-space searching method, which produces mismatches due to occlusion or distortion of wall textures in the images. Our proposed technique can also match building wall surfaces. The proposed technique has several advantages, and its usefulness is demonstrated through an experiment using actual images.

  18. Search for space charge effects in the ICARUS T600 LAr-TPC

    Science.gov (United States)

    Torti, Marta

    2016-11-01

    Space charge in a Liquid Argon Time Projection Chamber is due to the accumulation of positive ions, produced by ionizing tracks crossing the detector, which slowly flow toward the cathode. As a consequence, electric field distortions may arise, thus hindering the possibility to produce faithful 3D images of the ionizing events. The presence of space charge becomes relevant for large TPCs operating at the surface or at shallow depths, where the cosmic ray flux is high. These effects could affect the next phase of the ICARUS T600 detector, which will be deployed at shallow depth as the Far Detector of the Short Baseline Neutrino experiment at FNAL, dedicated to sterile neutrino searches. In 2001, the first ICARUS T600 module (T300) operated at the surface in Pavia (Italy), recording cosmic ray data. In this work, a sample of cosmic muon tracks from the 2001 run was analyzed and results on space charge effects in LAr-TPCs are shown.

  19. Search for space charge effects in the ICARUS T600 LAr-TPC

    International Nuclear Information System (INIS)

    Torti, Marta

    2016-01-01

    Space charge in a Liquid Argon Time Projection Chamber is due to the accumulation of positive ions, produced by ionizing tracks crossing the detector, which slowly flow toward the cathode. As a consequence, electric field distortions may arise, thus hindering the possibility to produce faithful 3D images of the ionizing events. The presence of space charge becomes relevant for large TPCs operating at the surface or at shallow depths, where the cosmic ray flux is high. These effects could affect the next phase of the ICARUS T600 detector, which will be deployed at shallow depth as the Far Detector of the Short Baseline Neutrino experiment at FNAL, dedicated to sterile neutrino searches. In 2001, the first ICARUS T600 module (T300) operated at the surface in Pavia (Italy), recording cosmic ray data. In this work, a sample of cosmic muon tracks from the 2001 run was analyzed and results on space charge effects in LAr-TPCs are shown.

  20. N-Dimensional LLL Reduction Algorithm with Pivoted Reflection

    Directory of Open Access Journals (Sweden)

    Zhongliang Deng

    2018-01-01

    Full Text Available The Lenstra-Lenstra-Lovász (LLL) lattice reduction algorithm and many of its variants have been widely used in cryptography, multiple-input-multiple-output (MIMO) communication systems and carrier phase positioning in global navigation satellite systems (GNSS) to solve the integer least squares (ILS) problem. In this paper, we propose an n-dimensional LLL reduction algorithm (n-LLL), expanding the Lovász condition in the LLL algorithm to n-dimensional space in order to obtain a further reduced basis. We also introduce pivoted Householder reflection into the algorithm to optimize the reduction time. For an m-order positive definite matrix, analysis shows that the n-LLL reduction algorithm will converge within finitely many steps and always produce better results than the original LLL reduction algorithm for n > 2. The simulations clearly show that n-LLL is better than the original LLL in reducing the condition number of an ill-conditioned input matrix, with a 39% improvement on average for typical cases, which can significantly reduce the search space for solving the ILS problem. The simulation results also show that the pivoted reflection significantly reduces the number of swaps in the algorithm, by 57%, making n-LLL a more practical reduction algorithm.
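    For reference (standard LLL background, not a statement taken from this paper): with Gram-Schmidt vectors $\mathbf{b}_k^{*}$, coefficients $\mu_{k,k-1}$, and parameter $\delta \in (1/4, 1]$, the two-vector Lovász condition that n-LLL generalizes to n-dimensional space reads

$$\|\mathbf{b}_{k}^{*}\|^{2} \;\ge\; \bigl(\delta - \mu_{k,k-1}^{2}\bigr)\,\|\mathbf{b}_{k-1}^{*}\|^{2}.$$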

  1. An efficient implementation of maximum likelihood identification of LTI state-space models by local gradient search

    NARCIS (Netherlands)

    Bergboer, N.H.; Verdult, V.; Verhaegen, M.H.G.

    2002-01-01

    We present a numerically efficient implementation of the nonlinear least squares and maximum likelihood identification of multivariable linear time-invariant (LTI) state-space models. This implementation is based on a local parameterization of the system and a gradient search in the resulting

  2. A New Approach to Reducing Search Space and Increasing Efficiency in Simulation Optimization Problems via the Fuzzy-DEA-BCC

    Directory of Open Access Journals (Sweden)

    Rafael de Carvalho Miranda

    2014-01-01

    Full Text Available The development of discrete-event simulation software was one of the most successful interfaces between operational research and computation. As a result, research has been focused on the development of new methods and algorithms with the purpose of increasing simulation optimization efficiency and reliability. This study aims to define optimum variation intervals for each decision variable through a proposed approach which combines data envelopment analysis with Fuzzy logic (Fuzzy-DEA-BCC), seeking to improve the distinction of decision-making units in the face of uncertainty. In this study, Taguchi's orthogonal arrays were used to generate the necessary quantity of DMUs, and the output variables were generated by the simulation. Two study objects were utilized as examples of mono- and multiobjective problems. Results confirmed the reliability and applicability of the proposed method, as it enabled a significant reduction in search space and computational demand when compared to conventional simulation optimization techniques.

  3. Pure state consciousness and its local reduction to neuronal space

    Science.gov (United States)

    Duggins, A. J.

    2013-01-01

    The single neuronal state can be represented as a vector in a complex space, spanned by an orthonormal basis of integer spike counts. In this model a scalar element of experience is associated with the instantaneous firing rate of a single sensory neuron over repeated stimulus presentations. Here the model is extended to composite neural systems that are tensor products of single neuronal vector spaces. Depiction of the mental state as a vector on this tensor product space is intended to capture the unity of consciousness. The density operator is introduced as its local reduction to the single neuron level, from which the firing rate can again be derived as the objective correlate of a subjective element. However, the relational structure of perceptual experience only emerges when the non-local mental state is considered. A metric of phenomenal proximity between neuronal elements of experience is proposed, based on the cross-correlation function of neurophysiology, but constrained by the association of theoretical extremes of correlation/anticorrelation in inseparable 2-neuron states with identical and opponent elements respectively.
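    As a hedged illustration (standard quantum formalism, not quoted from the paper) of the "local reduction" referred to above: for a composite state $|\psi\rangle$ on the tensor product of single-neuron spaces, the single-neuron density operator and the associated firing-rate correlate are

$$\rho_{1} \;=\; \operatorname{Tr}_{2,\dots,N}\!\bigl(|\psi\rangle\langle\psi|\bigr), \qquad \langle \hat{n}_{1} \rangle \;=\; \operatorname{Tr}\bigl(\rho_{1}\,\hat{n}_{1}\bigr),$$

where $\hat{n}_{1}$ is the spike-count observable of the single neuron.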

  4. Effective Image Database Search via Dimensionality Reduction

    DEFF Research Database (Denmark)

    Dahl, Anders Bjorholm; Aanæs, Henrik

    2008-01-01

    Image search using the bag-of-words image representation is investigated further in this paper. This approach has shown promising results for large scale image collections, making it relevant for Internet applications. The steps involved in the bag-of-words approach are feature extraction, vocabulary building, and searching with a query image. It is important to keep the computational cost low through all steps. In this paper we focus on the efficiency of the technique. To do that we substantially reduce the dimensionality of the features by the use of PCA and addition of color. Building of the visual vocabulary is typically done using k-means. We investigate a clustering algorithm based on the leader follower principle (LF-clustering), in which the number of clusters is not fixed. The adaptive nature of LF-clustering is shown to improve the quality of the visual vocabulary using this...
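    A minimal sketch of leader-follower clustering as described above (the threshold and update rate are illustrative assumptions, not the paper's settings): each feature either joins and nudges the nearest existing cluster centre or, if too far from all of them, founds a new cluster, so the vocabulary size adapts to the data.

```python
import numpy as np

def lf_cluster(features, tau=0.5, lr=0.05):
    """Leader-follower clustering: the number of centres is not fixed in advance.

    features : (n, d) array of descriptors
    tau      : distance threshold for opening a new cluster (assumed value)
    lr       : learning rate pulling a centre toward its newest member
    """
    centres = [features[0].copy()]
    for x in features[1:]:
        d = [np.linalg.norm(x - c) for c in centres]
        j = int(np.argmin(d))
        if d[j] <= tau:
            centres[j] += lr * (x - centres[j])   # follower: nudge the leader
        else:
            centres.append(x.copy())              # new leader
    return np.array(centres)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Three synthetic descriptor clouds standing in for local image features.
    feats = np.vstack([rng.normal(m, 0.1, size=(100, 8)) for m in (0.0, 1.0, 2.0)])
    rng.shuffle(feats)
    vocab = lf_cluster(feats, tau=0.8)
    print("visual vocabulary size:", len(vocab))
```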

  5. A trust-based sensor allocation algorithm in cooperative space search problems

    Science.gov (United States)

    Shen, Dan; Chen, Genshe; Pham, Khanh; Blasch, Erik

    2011-06-01

    Sensor allocation is an important and challenging problem within the field of multi-agent systems. The sensor allocation problem involves deciding how to assign a number of targets or cells to a set of agents according to some allocation protocol. Generally, in order to make efficient allocations, we need to design mechanisms that consider both the task performers' costs for the service and the associated probability of success (POS). In our problem, the costs are the sensor resources used, and the POS is the target tracking performance. Usually, POS may be perceived differently by different agents because they typically have different standards or means of evaluating the performance of their counterparts (other sensors in the search and tracking problem). Given this, we turn to the notion of trust to capture such subjective perceptions. In our approach, we develop a trust model to construct a novel mechanism that motivates sensor agents to limit their greediness or selfishness. Then we model the sensor allocation optimization problem as a trust-in-loop negotiation game and solve it using a sub-game perfect equilibrium. Numerical simulations are performed to demonstrate the trust-based sensor allocation algorithm in cooperative space situation awareness (SSA) search problems.

  6. I-SG : Interactive Search Grouping - Search result grouping using Independent Component Analysis

    DEFF Research Database (Denmark)

    Lauritsen, Thomas; Kolenda, Thomas

    2002-01-01

    We present a computationally simple and efficient approach to unsupervised grouping of the search results from any search engine. Along with each group, a set of keywords is found to annotate the contents. This approach leads to an interactive search through a hierarchical structure that is built online. It is the user's task to improve the search by expanding the search query using the topic keywords representing the desired groups. In doing so the search engine limits the space of possible search results, virtually moving down in the search hierarchy, and so refines the search.

  7. Antenna concepts for interstellar search systems

    International Nuclear Information System (INIS)

    Basler, R.P.; Johnson, G.L.; Vondrak, R.R.

    1977-01-01

    An evaluation is made of microwave receiving systems designed to search for signals from extraterrestrial intelligence. Specific design concepts are analyzed parametrically to determine whether the optimum antenna system location is on earth, in space, or on the moon. Parameters considered include the hypothesized number of transmitting civilizations, the number of stars that must be searched to give any desired probability of receiving a signal, the antenna collecting area, the search time, the search range, and the cost. This analysis suggests that search systems based on the moon are not cost-competitive; that if the search is extended only a few hundred light years from the earth, a Cyclops-type array on earth may be the most cost-effective system; that for a search extending to 500 light years or more, a substantial cost and search-time advantage can be achieved with a large spherical reflector in space with multiple feeds; that radio frequency interference shields can be provided for space systems; and that costs can range from a few hundred million to tens of billions of dollars, depending on the parameter values assumed.

  8. The N=4 supersymmetric E8 gauge theory and coset space dimensional reduction

    International Nuclear Information System (INIS)

    Olive, D.; West, P.

    1983-01-01

    Reasons are given to suggest that the N=4 supersymmetric E8 gauge theory be considered as a serious candidate for a physical theory. The symmetries of this theory are broken by a scheme based on coset space dimensional reduction. The resulting theory possesses four conventional generations of low-mass fermions together with their mirror particles. (orig.)

  9. Coset space dimension reduction of gauge theories

    International Nuclear Information System (INIS)

    Farakos, K.; Kapetanakis, D.; Koutsoumbas, G.; Zoupanos, G.

    1989-01-01

    A very interesting approach in the attempts to unify all the interactions is to consider that a unification takes place in higher than four dimensions. The most ambitious program based on the old Kaluza-Klein idea is not able to reproduce the low energy chiral nature of the weak interactions. A suggested way out was the introduction of Yang-Mills fields in the higher dimensional theory. From the particle physics point of view the most important question is how such a theory behaves in four dimensions and in particular at low energies. Therefore most of our efforts concern studies of the properties of an attractive scheme, the Coset-Space-Dimensional-Reduction (C.S.D.R.) scheme, which permits the study of the effective four dimensional theory coming from a gauge theory defined in higher dimensions. Here we summarize the C.S.D.R. procedure and the main theorems which are obeyed, and present a realistic model which is the result of the model building efforts that take into account all the C.S.D.R. properties. (orig./HSI)

  10. Reductive Lie-admissible algebras applied to H-spaces and connections

    International Nuclear Information System (INIS)

    Sagle, A.A.

    1982-01-01

    An algebra A with multiplication xy is Lie-admissible if the vector space A with the new multiplication [x,y] = xy - yx is a Lie algebra; we denote this Lie algebra by A^-. Thus, an associative algebra is Lie-admissible but a Cayley algebra is not Lie-admissible. In this paper we show how Lie-admissible algebras arise from Lie groups and their application to differential geometry on Lie groups via the following theorem. Let A be an n-dimensional Lie-admissible algebra over the reals. Let G be a Lie group with multiplication function μ and with Lie algebra g which is isomorphic to A^-. Then there exists a coordinate system at the identity e in G which represents μ by a function F: g×g→g defined locally at the origin, such that the second derivative, F_2, at the origin defines on the vector space g the structure of a nonassociative algebra (g, F_2). Furthermore this algebra is isomorphic to A and (g, F_2)^- is isomorphic to A^-. Thus, roughly, any Lie-admissible algebra is isomorphic to an algebra obtained from a Lie algebra via a change of coordinates in the Lie group. Lie algebras arise by using canonical coordinates and the Campbell-Hausdorff formula. Applications of this show that any G-invariant pseudo-Riemannian connection on G is completely determined by a suitable Lie-admissible algebra. These results extend to H-spaces, reductive Lie-admissible algebras and connections on homogeneous H-spaces. Thus, alternative and other non-Lie-admissible algebras can be utilized

  11. [Development of domain specific search engines].

    Science.gov (United States)

    Takai, T; Tokunaga, M; Maeda, K; Kaminuma, T

    2000-01-01

    As cyberspace explodes at a pace that nobody has ever imagined, it becomes very important to search cyberspace efficiently and effectively. One solution to this problem is search engines. A lot of commercial search engines have already been put on the market. However, these search engines return results so cumbersome that domain-specific experts cannot tolerate them. Using dedicated hardware and a commercial software product called OpenText, we have tried to develop several domain-specific search engines. These engines are for our institute's Web contents, drugs, chemical safety, endocrine disruptors, and emergency response to chemical hazards.

  12. An Efficient VQ Codebook Search Algorithm Applied to AMR-WB Speech Coding

    Directory of Open Access Journals (Sweden)

    Cheng-Yu Yeh

    2017-04-01

    Full Text Available The adaptive multi-rate wideband (AMR-WB) speech codec is widely used in modern mobile communication systems for high speech quality in handheld devices. Nonetheless, a major disadvantage is that vector quantization (VQ) of immittance spectral frequency (ISF) coefficients takes a considerable computational load in AMR-WB coding. Accordingly, a binary search space-structured VQ (BSS-VQ) algorithm is adopted to efficiently reduce the complexity of ISF quantization in AMR-WB. This search algorithm works through a fast locating technique combined with lookup tables, such that an input vector is efficiently assigned to a subspace where relatively few codeword searches need to be executed. In terms of overall search performance, this work is experimentally validated as a superior search algorithm relative to multiple triangular inequality elimination (MTIE), TIE with dynamic and intersection mechanisms (DI-TIE), and the equal-average equal-variance equal-norm nearest neighbor search (EEENNS) approach. With a full search algorithm as a benchmark for overall search load comparison, this work provides an 87% search load reduction at a quantization accuracy threshold of 0.96, a figure far beyond the 55% of MTIE, 76% of the EEENNS approach, and 83% of DI-TIE.
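    A schematic sketch of the idea (the partitioning rule and table construction below are simplifications, not the codec's actual BSS-VQ tables): the input vector is first mapped to a small subspace via a cheap lookup, and the exhaustive nearest-codeword search is then restricted to the codewords registered for that subspace.

```python
import numpy as np

class TwoStageVQSearch:
    """Coarse subspace lookup followed by a local full search."""

    def __init__(self, codebook, n_bins=16):
        self.codebook = codebook
        # Coarse key: quantize the first component into equal-width bins (illustrative rule).
        lo, hi = codebook[:, 0].min(), codebook[:, 0].max()
        self.edges = np.linspace(lo, hi, n_bins + 1)
        keys = np.clip(np.digitize(codebook[:, 0], self.edges) - 1, 0, n_bins - 1)
        # Lookup table: bin -> candidate codeword indices (neighbouring bins included).
        self.table = {b: np.flatnonzero(np.abs(keys - b) <= 1) for b in range(n_bins)}

    def quantize(self, x):
        b = int(np.clip(np.digitize(x[0], self.edges) - 1, 0, len(self.table) - 1))
        cand = self.table[b]
        if cand.size == 0:                                   # fallback to a full search
            cand = np.arange(len(self.codebook))
        d = np.sum((self.codebook[cand] - x) ** 2, axis=1)   # search only the subspace
        return cand[int(np.argmin(d))]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cb = rng.standard_normal((256, 16))          # stand-in for an ISF sub-codebook
    vq = TwoStageVQSearch(cb)
    x = rng.standard_normal(16)
    print("selected codeword index:", vq.quantize(x))
```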

  13. Reduction of potato's hydric soil erosion using space technology

    Science.gov (United States)

    Guyot, E.; Rios, V.; Zelaya, D.; Rios, E.; Lepen, F.; Padilla, P.; Soria, F.

    The potato crop is economically important in Tucumán's agricultural gross product (PBI), ranking fourth. The potato production area is a fragile agro-ecosystem; its geographic location in the Pedemonte agro-ecological region makes it essential to manage hydric (water) erosion. Therefore, the aim of this work is to improve potato crop irrigation management by merging satellite information with farm practices. Space technology made it possible to obtain a digital soil model using single- and dual-frequency differential GPS signals and a total station. The irrigation practices were carried out according to FAO irrigation-management guidelines and satellite image software (ENVI). Preliminary results of this experience allowed the crop's growth to be followed through a multitemporal study, and the farm's irrigation practices to be reprogrammed in order to reduce hydric erosion and economically heighten its productivity for the next period

  14. Space based microlensing planet searches

    Directory of Open Access Journals (Sweden)

    Tisserand Patrick

    2013-04-01

    Full Text Available The discovery of extra-solar planets is arguably the most exciting development in astrophysics during the past 15 years, rivalled only by the detection of dark energy. Two projects unite the communities of exoplanet scientists and cosmologists: the proposed ESA M-class mission EUCLID and the large space mission WFIRST, top ranked by the Astronomy 2010 Decadal Survey report. The latter states that: "Space-based microlensing is the optimal approach to providing a true statistical census of planetary systems in the Galaxy, over a range of likely semi-major axes". They also add: "This census, combined with that made by the Kepler mission, will determine how common Earth-like planets are over a wide range of orbital parameters". We will present a status report of the results obtained by microlensing on exoplanets and the new objectives of the next generation of ground-based wide field imager networks. We will finally discuss the fantastic prospects offered by space-based microlensing on the horizon 2020–2025.

  15. Radon reduction in crawl-space houses

    International Nuclear Information System (INIS)

    Osborne, M.C.; Moore, D.G.; Southerlan, R.E.; Brennan, T.; Pyle, B.E.

    1989-01-01

    This paper gives results of an EPA study of radon-mitigation alternatives for crawl-space houses, carried out in several houses in Nashville, TN. Application of one of these alternative mitigation options, suction under a polyethylene membrane, has been successful in significantly reducing radon levels in both the crawl space and the house. The large radon concentrations measured under unvented plastic ground covers and moisture barriers found in many crawl spaces show that these covered regions can act as radon-rich reservoirs capable of contaminating a crawl space and house during periods of depressurization. With the exhaust components of the mitigation system in place, radon levels below the plastic decreased by more than 95% under both passive and active suction conditions. Based on the study, the design of a cost-effective sub-plastic suction passive radon mitigation system for crawl spaces seems promising.

  16. Open Search Environments: The Free Alternative to Commercial Search Services

    Directory of Open Access Journals (Sweden)

    Adrian O'Riordan

    2014-06-01

    Full Text Available Open search systems present a free and less restricted alternative to commercial search services. This paper explores the space of open search technology, looking in particular at the issue of interoperability. A description of current protocols and formats for engineering open search applications is presented. The suitability of these technologies and issues around their adoption and operation are discussed. The open search approach is proving an especially fitting choice in applications involving the harvesting of resources and information integration. Principal among the technological solutions are OpenSearch and SRU. OpenSearch and SRU implement a federated model to enable existing and new search engines and search clients to communicate. Applications and instances where OpenSearch and SRU can be combined are presented. Other relevant technologies such as OpenURL, Apache Solr, and OAI-PMH are also discussed. The deployment of these freely licensed open standards in digital library applications is now a genuine alternative to commercial or proprietary systems.

  17. Long-Term International Space Station (ISS) Risk Reduction Activities

    Science.gov (United States)

    Fodroci, M. P.; Gafka, G. K.; Lutomski, M. G.; Maher, J. S.

    2012-01-01

    As the assembly of the ISS nears completion, it is worthwhile to step back and review some of the actions pursued by the Program in recent years to reduce risk and enhance the safety and health of ISS crewmembers, visitors, and space flight participants. While the initial ISS requirements and design were intended to provide the best practicable levels of safety, it is always possible to further reduce risk - given the determination, commitment, and resources to do so. The following is a summary of some of the steps taken by the ISS Program Manager, by our International Partners, by hardware and software designers, by operational specialists, and by safety personnel to continuously enhance the safety of the ISS, and to reduce risk to all crewmembers. While years of work went into the development of ISS requirements, there are many things associated with risk reduction in a Program like the ISS that can only be learned through actual operational experience. These risk reduction activities can be divided into roughly three categories: (1) areas that were initially noncompliant which have subsequently been brought into compliance or near compliance (i.e., Micrometeoroid and Orbital Debris [MMOD] protection, acoustics); (2) areas where initial design requirements were eventually considered inadequate and were subsequently augmented (i.e., Toxicity Hazard Level-4 [THL] materials, emergency procedures, emergency equipment, control of drag-throughs); and (3) areas where risks were initially underestimated, and have subsequently been addressed through additional mitigation (i.e., Extravehicular Activity [EVA] sharp edges, plasma shock hazards). Due to the hard work and cooperation of many parties working together across the span of more than a decade, the ISS is now a safer and healthier environment for our crew, in many cases exceeding the risk reduction targets inherent in the intent of the original design. It will provide a safe and stable platform for utilization and discovery for years

  18. One visual search, many memory searches: An eye-tracking investigation of hybrid search.

    Science.gov (United States)

    Drew, Trafton; Boettcher, Sage E P; Wolfe, Jeremy M

    2017-09-01

    Suppose you go to the supermarket with a shopping list of 10 items held in memory. Your shopping expedition can be seen as a combination of visual search and memory search. This is known as "hybrid search." There is a growing interest in understanding how hybrid search tasks are accomplished. We used eye tracking to examine how manipulating the number of possible targets (the memory set size [MSS]) changes how observers (Os) search. We found that dwell time on each distractor increased with MSS, suggesting a memory search was being executed each time a new distractor was fixated. Meanwhile, although the rate of refixation increased with MSS, it was not nearly enough to suggest a strategy that involves repeatedly searching visual space for subgroups of the target set. These data provide a clear demonstration that hybrid search tasks are carried out via a "one visual search, many memory searches" heuristic in which Os examine items in the visual array once with a very low rate of refixations. For each item selected, Os activate a memory search that produces logarithmic response time increases with increased MSS. Furthermore, the percentage of distractors fixated was strongly modulated by the MSS: More items in the MSS led to a higher percentage of fixated distractors. Searching for more potential targets appears to significantly alter how Os approach the task, ultimately resulting in more eye movements and longer response times.

  19. Optimum Design of Braced Steel Space Frames including Soil-Structure Interaction via Teaching-Learning-Based Optimization and Harmony Search Algorithms

    OpenAIRE

    Ayse T. Daloglu; Musa Artar; Korhan Ozgan; Ali İ. Karakas

    2018-01-01

    Optimum design of braced steel space frames including soil-structure interaction is studied by using harmony search (HS) and teaching-learning-based optimization (TLBO) algorithms. A three-parameter elastic foundation model is used to incorporate the soil-structure interaction effect. A 10-storey braced steel space frame example taken from literature is investigated according to four different bracing types for the cases with/without soil-structure interaction. X, V, Z, and eccentric V-shaped...

  20. Separable Reduction and Supporting Properties of Fréchet-Like Normals in Banach Spaces

    Czech Academy of Sciences Publication Activity Database

    Fabian, Marián; Mordukhovich, B. S.

    1999-01-01

    Roč. 51, č. 1 (1999), s. 26-48 ISSN 0008-414X R&D Projects: GA AV ČR IAA1019702; GA ČR GA201/98/1449 Institutional research plan: CEZ:AV0Z1019905; CEZ:AV0Z1019905 Keywords : nonsmooth analysis * Banach spaces * separable reduction Subject RIV: BA - General Mathematics Impact factor: 0.357, year: 1999

  1. An adaptive random search for short term generation scheduling with network constraints.

    Directory of Open Access Journals (Sweden)

    J A Marmolejo

    Full Text Available This paper presents an adaptive random search approach to address a short-term generation scheduling problem with network constraints, which determines the startup and shutdown schedules of thermal units over a given planning horizon. In this model, we consider the transmission network through capacity limits and line losses. The mathematical model is stated in the form of a Mixed Integer Non Linear Problem with binary variables. The proposed heuristic is a population-based method that generates a set of new potential solutions via a random search strategy. The random search is based on the Markov Chain Monte Carlo method. The main feature of the proposed method is that the noise level of the random search is adaptively controlled in order to explore and exploit the entire search space. In order to improve the solutions, we consider coupling a local search into the random search process. Several test systems are presented to evaluate the performance of the proposed heuristic. We use a commercial optimizer to compare the quality of the solutions provided by the proposed method. The solution of the proposed algorithm showed a significant reduction in computational effort with respect to the full-scale outer approximation commercial solver. Numerical results show the potential and robustness of our approach.
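    A generic sketch of an adaptive random search of this flavour (the acceptance rule, target acceptance rate, and test function are illustrative assumptions, not the paper's scheduling model): candidate solutions are perturbed with Gaussian noise whose level is tuned from the observed acceptance rate, Metropolis-style.

```python
import numpy as np

def adaptive_random_search(f, x0, iters=5000, sigma=1.0, target_accept=0.3, seed=0):
    """Minimize f by MCMC-style random search with an adaptively controlled noise level."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, float)
    fx = f(x)
    best_x, best_f = x.copy(), fx
    accepted = 0
    for t in range(1, iters + 1):
        cand = x + rng.normal(0.0, sigma, size=x.shape)    # random-walk proposal
        fc = f(cand)
        # Metropolis acceptance with a fixed "temperature" of 1 (assumption).
        if fc < fx or rng.random() < np.exp(fx - fc):
            x, fx = cand, fc
            accepted += 1
            if fc < best_f:
                best_x, best_f = cand.copy(), fc
        if t % 100 == 0:                                   # adapt the noise level
            rate = accepted / 100.0
            sigma *= 1.1 if rate > target_accept else 0.9  # explore more / exploit more
            accepted = 0
    return best_x, best_f

if __name__ == "__main__":
    rosenbrock = lambda z: (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2
    x, fmin = adaptive_random_search(rosenbrock, [-1.5, 2.0])
    print("best point:", np.round(x, 3), "objective:", round(fmin, 6))
```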

  2. An introduction to data reduction: space-group determination, scaling and intensity statistics.

    Science.gov (United States)

    Evans, Philip R

    2011-04-01

    This paper presents an overview of how to run the CCP4 programs for data reduction (SCALA, POINTLESS and CTRUNCATE) through the CCP4 graphical interface ccp4i and points out some issues that need to be considered, together with a few examples. It covers determination of the point-group symmetry of the diffraction data (the Laue group), which is required for the subsequent scaling step, examination of systematic absences, which in many cases will allow inference of the space group, putting multiple data sets on a common indexing system when there are alternatives, the scaling step itself, which produces a large set of data-quality indicators, estimation of |F| from intensity and finally examination of intensity statistics to detect crystal pathologies such as twinning. An appendix outlines the scoring schemes used by the program POINTLESS to assign probabilities to possible Laue and space groups.

  3. Search Path Evaluation Incorporating Object Placement Structure

    National Research Council Canada - National Science Library

    Baylog, John G; Wettergren, Thomas A

    2007-01-01

    This report describes a computationally robust approach to search path performance evaluation where the objects of search interest exhibit structure in the way in which they occur within the search space...

  4. The New Horizons and Hubble Space Telescope search for rings, dust, and debris in the Pluto-Charon system

    Science.gov (United States)

    Lauer, Tod R.; Throop, Henry B.; Showalter, Mark R.; Weaver, Harold A.; Stern, S. Alan; Spencer, John R.; Buie, Marc W.; Hamilton, Douglas P.; Porter, Simon B.; Verbiscer, Anne J.; Young, Leslie A.; Olkin, Cathy B.; Ennico, Kimberly; New Horizons Science Team

    2018-02-01

    We conducted an extensive search for dust or debris rings in the Pluto-Charon system before, during, and after the New Horizons encounter in July 2015. Methodologies included attempting to detect features in back-scattered light during the approach to Pluto (phase angle α ∼ 15°), in situ detection of impacting particles, a search for stellar occultations near the time of closest approach, and forward-scattered light imaging during departure (α ∼ 165°). An extensive search using the Hubble Space Telescope (HST) prior to the encounter also contributed to the final ring limits. No rings, debris, or dust features were observed, but our new detection limits provide a substantially improved picture of the environment throughout the Pluto-Charon system. Searches for rings in back-scattered light covered the range 35,000-250,000 km from the system barycenter, a zone that starts interior to the orbit of Styx, the innermost minor satellite, and extends out to four times the orbital radius of Hydra, the outermost known satellite. We obtained our firmest limits using data from the New Horizons LORRI camera in the inner half of this region. Our limits on the normal I/F of an unseen ring depend on the radial scale of the rings: 2 × 10^-8 (3σ) for 1500 km wide rings, 1 × 10^-8 for 6000 km rings, and 7 × 10^-9 for 12,000 km rings. Beyond ∼ 100,000 km from Pluto, HST observations limit normal I/F to ∼ 8 × 10^-8. Searches for dust features in forward-scattered light extended from the surface of Pluto to the Pluto-Charon Hill sphere (r_Hill = 6.4 × 10^6 km). No evidence for rings or dust clouds was detected to normal I/F limits of ∼ 8.9 × 10^-7 on ∼ 10^4 km scales. Four stellar occultation observations also probed the space interior to Hydra, but again no dust or debris was detected. The Student Dust Counter detected one particle impact 3.6 × 10^6 km from Pluto, but this is consistent with the interplanetary space environment established during the cruise of New Horizons.

  5. Reduction of biselenites into polyselenides in interlayer space of layered double hydroxides

    Science.gov (United States)

    Kim, Myeong Shin; Lee, Yongju; Park, Yong-Min; Cha, Ji-Hyun; Jung, Duk-Young

    2018-06-01

    A selenous acid (H2SeO3) precursor was intercalated as biselenite (HSeO3-) ions into the interlayer gallery of carbonated magnesium aluminum layered double hydroxide (MgAl-LDH) in aqueous solution. Reduction of the selenous ions by aqueous hydrazine solution produced polyselenide-intercalated LDHs, which were subsequently exchanged with iodide through a redox reaction under iodine vapor. The polyselenide-containing LDHs adsorbed iodine vapor spontaneously and triiodide was incorporated in the interlayer space, followed by formation of a selenium polycrystalline phase. The two-dimensional framework of MgAl-LDH is strong enough to resist both the reducing power of hydrazine and the oxidizing conditions of iodine. The SEM data demonstrated that the shapes of the LDH polycrystals changed little after the above redox reactions. The polyselenide and iodide LDH products were analyzed by XRD, infrared and Raman spectra, which strongly suggested a horizontal arrangement of the polyselenide and triiodide in the gallery space of the LDHs.

  6. Foraging in Semantic Fields: How We Search Through Memory.

    Science.gov (United States)

    Hills, Thomas T; Todd, Peter M; Jones, Michael N

    2015-07-01

    When searching for concepts in memory--as in the verbal fluency task of naming all the animals one can think of--people appear to explore internal mental representations in much the same way that animals forage in physical space: searching locally within patches of information before transitioning globally between patches. However, the definition of the patches being searched in mental space is not well specified. Do we search by activating explicit predefined categories (e.g., pets) and recall items from within that category (categorical search), or do we activate and recall a connected sequence of individual items without using categorical information, with each item recalled leading to the retrieval of an associated item in a stream (associative search), or both? Using semantic representations in a search of associative memory framework and data from the animal fluency task, we tested competing hypotheses based on associative and categorical search models. Associative, but not categorical, patch transitions took longer to make than position-matched productions, suggesting that categorical transitions were not true transitions. There was also clear evidence of associative search even within categorical patch boundaries. Furthermore, most individuals' behavior was best explained by an associative search model without the addition of categorical information. Thus, our results support a search process that does not use categorical information, but for which patch boundaries shift with each recall and local search is well described by a random walk in semantic space, with switches to new regions of the semantic space when the current region is depleted. Copyright © 2015 Cognitive Science Society, Inc.

  7. Harmonic analysis on reductive symmetric spaces

    NARCIS (Netherlands)

    Ban, E.P. van den; Schlichtkrull, H.

    2000-01-01

    We give a relatively non-technical survey of some recent advances in the Fourier theory for semisimple symmetric spaces. There are three major results: an inversion formula for the Fourier transform, a Paley-Wiener theorem, which describes the Fourier image of the space of compactly supported

  8. Ringed Seal Search for Global Optimization via a Sensitive Search Model.

    Directory of Open Access Journals (Sweden)

    Younes Saadi

    Full Text Available The efficiency of a metaheuristic algorithm for global optimization is based on its ability to search for and find the global optimum. However, a good search often requires a balance between exploration and exploitation of the search space. In this paper, a new metaheuristic algorithm called Ringed Seal Search (RSS) is introduced. It is inspired by the natural behavior of the seal pup. The algorithm mimics the seal pup's movement behavior and its ability to search for and choose the best lair to escape predators. The scenario starts once the seal mother gives birth to a new pup in a birthing lair that is constructed for this purpose. The seal pup strategy consists of searching for and selecting the best lair by performing a random walk to find a new lair. Affected by the seals' sensitivity to external noise emitted by predators, the random walk of the seal pup takes two different search states, a normal state and an urgent state. In the normal state, the pup performs an intensive search between closely adjacent lairs; this movement is modeled via a Brownian walk. In an urgent state, the pup leaves the proximity area and performs an extensive search to find a new lair among sparse targets; this movement is modeled via a Levy walk. The switch between these two states is triggered by the random noise emitted by predators. The algorithm keeps switching between normal and urgent states until the global optimum is reached. Tests and validations were performed using fifteen benchmark test functions to compare the performance of RSS with other baseline algorithms. The results show that RSS is more efficient than Genetic Algorithm, Particle Swarm Optimization and Cuckoo Search in terms of convergence rate to the global optimum. RSS shows an improvement in terms of balance between exploration (extensive) and exploitation (intensive) of the search space. RSS can efficiently mimic seal pup behavior to find the best lair and provide a new algorithm to be
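    A schematic sketch of the two-state walk described above (the step-size constants, the predator-noise model, and the heavy-tailed Cauchy approximation of the Levy walk are assumptions for illustration, not the paper's exact formulation):

```python
import numpy as np

def ringed_seal_search(f, dim=2, iters=20000, bounds=(-5.0, 5.0), seed=0):
    """Two-state random-walk search: Brownian steps normally, heavy-tailed steps when 'noise' strikes."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    best_x = rng.uniform(lo, hi, dim)             # current best lair
    best_f = f(best_x)
    urgent = False
    for _ in range(iters):
        if urgent:
            step = 0.5 * rng.standard_cauchy(dim)   # Levy-like extensive search
        else:
            step = 0.05 * rng.standard_normal(dim)  # Brownian intensive search
        cand = np.clip(best_x + step, lo, hi)
        fc = f(cand)
        if fc < best_f:                             # move to a better lair
            best_x, best_f = cand, fc
        # Predator noise arrives at random and flips the state to 'urgent'; otherwise calm down.
        urgent = rng.random() < 0.05 if not urgent else rng.random() < 0.8
    return best_x, best_f

if __name__ == "__main__":
    sphere = lambda z: float(np.sum(z ** 2))
    x, fmin = ringed_seal_search(sphere)
    print("best lair:", np.round(x, 4), "objective:", f"{fmin:.2e}")
```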

  9. A search for space energy alternatives

    Science.gov (United States)

    Gilbreath, W. P.; Billman, K. W.

    1978-01-01

    This paper takes a look at a number of schemes for converting radiant energy in space to useful energy for man. These schemes are possible alternatives to the currently most studied solar power satellite concept. Possible primary collection and conversion devices discussed include the space particle flux devices, solar windmills, photovoltaic devices, photochemical cells, photoemissive converters, heat engines, dielectric energy conversion, electrostatic generators, plasma solar collectors, and thermionic schemes. Transmission devices reviewed include lasers and masers.

  10. GeneLab Phase 2: Integrated Search Data Federation of Space Biology Experimental Data

    Science.gov (United States)

    Tran, P. B.; Berrios, D. C.; Gurram, M. M.; Hashim, J. C. M.; Raghunandan, S.; Lin, S. Y.; Le, T. Q.; Heher, D. M.; Thai, H. T.; Welch, J. D.; hide

    2016-01-01

    The GeneLab project is a science initiative to maximize the scientific return of omics data collected from spaceflight and from ground simulations of microgravity and radiation experiments, supported by a data system for a public bioinformatics repository and collaborative analysis tools for these data. The mission of GeneLab is to maximize the utilization of the valuable biological research resources aboard the ISS by collecting genomic, transcriptomic, proteomic and metabolomic (so-called omics) data to enable the exploration of the molecular network responses of terrestrial biology to space environments using a systems biology approach. All GeneLab data are made available to a worldwide network of researchers through its open-access data system. GeneLab is currently being developed by NASA to support Open Science biomedical research in order to enable the human exploration of space and improve life on earth. Open access to Phase 1 of the GeneLab Data Systems (GLDS) was implemented in April 2015. Download volumes have grown steadily, mirroring the growth in curated space biology research data sets (61 as of June 2016), now exceeding 10 TB/month, with over 10,000 file downloads since the start of Phase 1. For the period April 2015 to May 2016, most frequently downloaded were data from studies of Mus musculus (39) followed closely by Arabidopsis thaliana (30), with the remaining downloads roughly equally split across 12 other organisms (each 10 of total downloads). GLDS Phase 2 is focusing on interoperability, supporting data federation, including integrated search capabilities, of GLDS-housed data sets with external data sources, such as gene expression data from NIH NCBI's Gene Expression Omnibus (GEO), proteomic data from EBI's PRIDE system, and metagenomic data from Argonne National Laboratory's MG-RAST. GEO and MG-RAST employ specifications for investigation metadata that are different from those used by the GLDS and PRIDE (e.g., ISA-Tab). The GLDS Phase 2 system

  11. Gain reduction due to space charge at high counting rates in multiwire proportional chambers

    International Nuclear Information System (INIS)

    Smith, G.C.; Mathieson, E.

    1986-10-01

    Measurements with a small MWPC of gas gain reduction, due to ion space charge at high counting rates, have been compared with theoretical predictions. The quantity ln(q/q_0)/(q/q_0), where q/q_0 is the relative reduced avalanche charge, has been found to be closely proportional to count rate, as predicted. The constant of proportionality is in good agreement with calculations made with a modified version of the original, simplified theory

  12. LHCb Exotica and Higgs searches

    CERN Multimedia

    Lucchesi, Donatella

    2016-01-01

    The unique phase space coverage and features of the LHCb detector at the LHC makes it an ideal environment to probe complementary New Physics parameter regions. In particular, recently developed jet tagging algorithms are ideal for searches involving $b$ and $c$ jets. This poster will review different jet-related exotica searches together with the efforts in the search for a Higgs boson decaying to a pair of heavy quarks.

  13. Adaptive Large Neighbourhood Search

    DEFF Research Database (Denmark)

    Røpke, Stefan

    Large neighborhood search is a metaheuristic that has gained popularity in recent years. The heuristic repeatedly moves from solution to solution by first partially destroying the solution and then repairing it. The best solution observed during this search is presented as the final solution. This tutorial introduces the large neighborhood search metaheuristic and the variant adaptive large neighborhood search that dynamically tunes parameters of the heuristic while it is running. Both heuristics belong to a broader class of heuristics that search a solution space using very large neighborhoods. The tutorial also presents applications of the adaptive large neighborhood search, mostly related to vehicle routing problems for which the heuristic has been extremely successful. We discuss how the heuristic can be parallelized and thereby take advantage of modern desktop computers
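    A compact sketch of the destroy-and-repair loop with adaptive operator weights, applied to a toy travelling-salesman tour for concreteness (the operators, scoring, and accept-if-better rule are common textbook choices, not the tutorial's specific setup):

```python
import random, math

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]]) for i in range(len(tour)))

def destroy_random(tour, k=3):
    removed = random.sample(tour, k)                       # destroy: drop k cities
    return [c for c in tour if c not in removed], removed

def repair_greedy(partial, removed, pts):
    tour = partial[:]
    for c in removed:                                      # repair: cheapest insertion
        best_i = min(range(len(tour) + 1),
                     key=lambda i: tour_length(tour[:i] + [c] + tour[i:], pts))
        tour.insert(best_i, c)
    return tour

def alns(pts, iters=2000, seed=1):
    random.seed(seed)
    destroyers = [lambda t: destroy_random(t, 2), lambda t: destroy_random(t, 4)]
    weights = [1.0, 1.0]                                   # adaptive weights over destroy operators
    cur = list(range(len(pts)))
    best, best_len = cur[:], tour_length(cur, pts)
    for _ in range(iters):
        d = random.choices(range(len(destroyers)), weights)[0]
        partial, removed = destroyers[d](cur)
        cand = repair_greedy(partial, removed, pts)
        if tour_length(cand, pts) < tour_length(cur, pts):  # accept-if-better rule
            cur = cand
            weights[d] += 0.2                              # reward the operator that improved
        weights[d] *= 0.999                                # slow decay keeps weights adaptive
        if tour_length(cur, pts) < best_len:
            best, best_len = cur[:], tour_length(cur, pts)
    return best, best_len

if __name__ == "__main__":
    random.seed(0)
    cities = [(random.random(), random.random()) for _ in range(20)]
    tour, length = alns(cities)
    print("tour length:", round(length, 3))
```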

  14. Searching for Sterile Neutrinos with MINOS

    Energy Technology Data Exchange (ETDEWEB)

    Timmons, Ashley [Manchester U.

    2016-01-01

    This document presents the latest results for a 3+1 sterile neutrino search using the $10.56 \times 10^{20}$ protons-on-target data set taken from 2005-2012. By searching for oscillations driven by a large mass splitting, MINOS is sensitive to the existence of sterile neutrinos through any energy-dependent deviations in a charged-current sample, as well as by looking for any relative deficit in neutral-current events between the far and near detectors. This document will discuss the novel analysis that enabled a search for sterile neutrinos, setting a limit in the previously unexplored regions of the parameter space $\{\Delta m^{2}_{41}, \sin^2\theta_{24}\}$. The results presented can be compared to the parameter space suggested by LSND and MiniBooNE and complement other previous experimental searches for sterile neutrinos in the electron neutrino appearance channel.

  15. Dark-matter QCD-axion searches

    International Nuclear Information System (INIS)

    Rosenberg, Leslie J

    2010-01-01

    The axion is a hypothetical elementary particle appearing in a simple and elegant extension to the Standard Model of particle physics that cancels otherwise huge CP-violating effects in QCD; this extension has a broken U(1) axial symmetry, where the resulting Goldstone boson is the axion. A light axion of mass 10^-6 to 10^-3 eV (the so-called "invisible axion") would couple extraordinarily weakly to normal matter and radiation and would therefore be extremely difficult to detect in the laboratory. However, such an axion would be a compelling dark-matter candidate and is therefore a target of a number of searches. Compared to other dark-matter candidates, the plausible range of axion dark-matter couplings and masses is narrowly constrained. This restricted search space allows for 'definitive' searches, where non-observation would seriously impugn the dark-matter QCD-axion hypothesis. Axion searches employ a wide range of technologies and techniques, from astrophysical observations to laboratory electromagnetic signal detection. For some experiments, sensitivities have reached likely dark-matter axion couplings and masses. This is a brief and selective overview of axion searches. With only very limited space, I briefly describe just two of the many experiments that are searching for dark-matter axions.

  16. Semantic Search of Web Services

    Science.gov (United States)

    Hao, Ke

    2013-01-01

    This dissertation addresses semantic search of Web services using natural language processing. We first survey various existing approaches, focusing on the fact that the high cost of current semantic annotation frameworks results in limited use of semantic search for large scale applications. We then propose a vector space model based service…
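    A minimal sketch of vector-space-model matching as it might apply to service descriptions (the toy corpus, plain term-frequency weighting, and cosine ranking are generic information-retrieval ingredients, not the dissertation's actual system):

```python
import math
from collections import Counter

def tf_vector(text):
    """Term-frequency vector of a whitespace-tokenized, lower-cased text."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_services(query, services):
    q = tf_vector(query)
    scored = [(cosine(q, tf_vector(desc)), name) for name, desc in services.items()]
    return sorted(scored, reverse=True)

if __name__ == "__main__":
    # Hypothetical service descriptions used only for illustration.
    services = {
        "WeatherService": "get current weather forecast temperature for a city",
        "CurrencyService": "convert currency exchange rate between two countries",
        "GeoService": "find latitude longitude coordinates for a city address",
    }
    for score, name in rank_services("weather temperature in a city", services):
        print(f"{score:.3f}  {name}")
```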

  17. The PAMELA space mission for antimatter and dark matter searches in space

    International Nuclear Information System (INIS)

    Boezio, M.; Bruno, A.; Adriani, O.; Barbarino, G. C.; Bazilevskaya, G. A.; Bellotti, R.; Bogomolov, E. A.; Bongi, M.; Bonvicini, V.; Borisov, S.; Bottai, S.; Cafagna, F.; Campana, D.; Carbone, R.; Carlson, P.; Casolino, M.; Castellini, G.; Consiglio, L.; De Pascale, M. P.; De Santis, C.

    2012-01-01

    The PAMELA satellite-borne experiment has presented new results on cosmic-ray antiparticles that can be interpreted in terms of DM annihilation or pulsar contribution. The instrument was launched from the Baikonur cosmodrome and it has been collecting data since July 2006. The combination of a permanent magnet silicon strip spectrometer and a silicon-tungsten imaging calorimeter allows precision studies of the charged cosmic radiation to be conducted over a wide energy range with high statistics. The primary scientific goal is the measurement of the antiproton and positron energy spectrum in order to search for exotic sources. PAMELA is also searching for primordial antinuclei (anti-helium), and testing cosmic-ray propagation models through precise measurements of the antiparticle energy spectrum and precision studies of light nuclei and their isotopes. This talk illustrates the most recent scientific results obtained by the PAMELA experiment.

  18. The PAMELA space mission for antimatter and dark matter searches in space

    Energy Technology Data Exchange (ETDEWEB)

    Boezio, M., E-mail: Mirko.Boezio@ts.infn.it [INFN, Sezione di Trieste (Italy); Bruno, A., E-mail: Alessandro.Bruno@ba.infn.it [University of Bari, Department of Physics (Italy); Adriani, O. [University of Florence, Department of Physics (Italy); Barbarino, G. C. [University of Naples 'Federico II', Department of Physics (Italy); Bazilevskaya, G. A. [Lebedev Physical Institute (Russian Federation); Bellotti, R. [University of Bari, Department of Physics (Italy); Bogomolov, E. A. [Ioffe Physical Technical Institute (Russian Federation); Bongi, M. [INFN, Sezione di Florence (Italy); Bonvicini, V. [INFN, Sezione di Trieste (Italy); Borisov, S. [INFN, Sezione di Rome 'Tor Vergata' (Italy); Bottai, S. [INFN, Sezione di Florence (Italy); Cafagna, F. [University of Bari, Department of Physics (Italy); Campana, D.; Carbone, R. [INFN, Sezione di Naples (Italy); Carlson, P. [KTH, Department of Physics, and the Oskar Klein Centre for Cosmoparticle Physics (Sweden); Casolino, M. [INFN, Sezione di Rome 'Tor Vergata' (Italy); Castellini, G. [IFAC (Italy); Consiglio, L. [INFN, Sezione di Naples (Italy); De Pascale, M. P.; De Santis, C. [INFN, Sezione di Rome 'Tor Vergata' (Italy); and others

    2012-12-15

    The PAMELA satellite-borne experiment has presented new results on cosmic-ray antiparticles that can be interpreted in terms of DM annihilation or pulsar contribution. The instrument was launched from the Baikonur cosmodrome and it has been collecting data since July 2006. The combination of a permanent magnet silicon strip spectrometer and a silicon-tungsten imaging calorimeter allows precision studies of the charged cosmic radiation to be conducted over a wide energy range with high statistics. The primary scientific goal is the measurement of the antiproton and positron energy spectrum in order to search for exotic sources. PAMELA is also searching for primordial antinuclei (anti-helium), and testing cosmic-ray propagation models through precise measurements of the antiparticle energy spectrum and precision studies of light nuclei and their isotopes. This talk illustrates the most recent scientific results obtained by the PAMELA experiment.

  19. Positioning Reduction of Deep Space Probes Based on VLBI Tracking

    Science.gov (United States)

    Qiao, S. B.

    2011-11-01

    The application of the Kalman filter to the positioning reduction of deep space probes is investigated and related software systems are developed. In summary, this dissertation makes progress in the positioning reduction of deep space probes tracked by VLBI, covering algorithm study, software development, and the processing of real observations, although further study is still needed.

  20. Self-calibration for lab-μCT using space-time regularized projection-based DVC and model reduction

    Science.gov (United States)

    Jailin, C.; Buljac, A.; Bouterf, A.; Poncelet, M.; Hild, F.; Roux, S.

    2018-02-01

    An online calibration procedure for x-ray lab-CT is developed using projection-based digital volume correlation. An initial reconstruction of the sample is positioned in the 3D space for every angle so that its projection matches the initial one. This procedure allows a space-time displacement field to be estimated for the scanned sample, which is regularized with (i) rigid body motions in space and (ii) modal time shape functions computed using model reduction techniques (i.e. proper generalized decomposition). The result is an accurate identification of the position of the sample adapted for each angle, which may deviate from the desired perfect rotation required for standard reconstructions. An application of this procedure to a 4D in situ mechanical test is shown. The proposed correction leads to a much improved tomographic reconstruction quality.

  1. Present status of the 4-m ILMT data reduction pipeline: application to space debris detection and characterization

    Science.gov (United States)

    Pradhan, Bikram; Delchambre, Ludovic; Hickson, Paul; Akhunov, Talat; Bartczak, Przemyslaw; Kumar, Brajesh; Surdej, Jean

    2018-04-01

    The 4-m International Liquid Mirror Telescope (ILMT), located at the ARIES Observatory (Devasthal, India) at a latitude of +29° 22' 26", has been designed to scan a band of sky about half a degree wide in Time Delayed Integration (TDI) mode. A dedicated data-reduction and analysis pipeline is therefore needed to process online the large amount of optical data produced. This requirement has led to the development of the 4-m ILMT data reduction pipeline, a new software package built in Python to simplify the large number of tasks involved in reducing the acquired TDI images. This software provides astronomers with specially designed data reduction functions and astrometry and photometry calibration tools. In this paper we discuss the various reduction and calibration steps followed to reduce TDI images obtained in May 2015 with the Devasthal 1.3-m telescope. We report the detection and characterization of nine space debris objects present in the TDI frames.

  2. Search Engine For Ebook Portal

    Directory of Open Access Journals (Sweden)

    Prashant Kanade

    2017-05-01

    Full Text Available The purpose of this paper is to establish the textual analytics involved in developing a search engine for an ebook portal. We have extracted our dataset from Project Gutenberg using a robot harvester. Textual analytics is used for efficient search retrieval. The entire dataset is represented using the Vector Space Model, where each document is a vector in the vector space. Further, for computational purposes, we represent our dataset as a Term Frequency-Inverse Document Frequency (tf-idf) matrix. The first step involves obtaining the most coherent sequence of words in the search query entered. The entered query is processed using front-end algorithms, which include a spell checker, text segmentation and language modeling. Back-end processing includes similarity modeling, clustering, indexing and retrieval. The relationship between documents and words is established using cosine similarity measured between the documents and words in the vector space. Clustering is used to suggest books that are similar to the search query entered by the user. Lastly, the Lucene-based Elasticsearch engine is used for indexing the documents, which allows faster retrieval of data. Elasticsearch returns a dictionary and creates a tf-idf matrix. The processed query is compared with the dictionary obtained, and the tf-idf matrix is used to calculate a score for each match to give the most relevant results.
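    The retrieval core described above can be sketched in a few lines. The snippet below is a minimal illustration assuming scikit-learn is available; the toy corpus and query are placeholders rather than Project Gutenberg data, and the spell-checking, clustering and Elasticsearch indexing steps are omitted.

```python
# Minimal sketch of the tf-idf / cosine-similarity retrieval step described
# above. The toy corpus and query are placeholders, not Project Gutenberg data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "a tale of adventure on the high seas",
    "an introduction to machine learning and statistics",
    "poems about the sea, ships and sailors",
]

vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(corpus)        # documents as tf-idf vectors

query_vec = vectorizer.transform(["sea adventure"])  # query in the same vector space
scores = cosine_similarity(query_vec, doc_matrix)[0]

# Rank documents by similarity to the query (highest first).
ranking = sorted(enumerate(scores), key=lambda kv: kv[1], reverse=True)
for idx, score in ranking:
    print(f"{score:.3f}  {corpus[idx]}")
```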

  3. Does linear separability really matter? Complex visual search is explained by simple search

    Science.gov (United States)

    Vighneshvel, T.; Arun, S. P.

    2013-01-01

    Visual search in real life involves complex displays with a target among multiple types of distracters, but in the laboratory, it is often tested using simple displays with identical distracters. Can complex search be understood in terms of simple searches? This link may not be straightforward if complex search has emergent properties. One such property is linear separability, whereby search is hard when a target cannot be separated from its distracters using a single linear boundary. However, evidence in favor of linear separability is based on testing stimulus configurations in an external parametric space that need not be related to their true perceptual representation. We therefore set out to assess whether linear separability influences complex search at all. Our null hypothesis was that complex search performance depends only on classical factors such as target-distracter similarity and distracter homogeneity, which we measured using simple searches. Across three experiments involving a variety of artificial and natural objects, differences between linearly separable and nonseparable searches were explained using target-distracter similarity and distracter heterogeneity. Further, simple searches accurately predicted complex search regardless of linear separability (r = 0.91). Our results show that complex search is explained by simple search, refuting the widely held belief that linear separability influences visual search. PMID:24029822

  4. Development and evaluation of a biomedical search engine using a predicate-based vector space model.

    Science.gov (United States)

    Kwak, Myungjae; Leroy, Gondy; Martinez, Jesse D; Harwell, Jeffrey

    2013-10-01

    Although biomedical information available in articles and patents is increasing exponentially, we continue to rely on the same information retrieval methods and use very few keywords to search millions of documents. We are developing a fundamentally different approach for finding much more precise and complete information with a single query, using predicates instead of keywords for both query and document representation. Predicates are triples that are more complex data structures than keywords and contain more structured information. To make optimal use of them, we developed a new predicate-based vector space model and query-document similarity function with adjusted tf-idf weighting and a boost function. Using a test bed of 107,367 PubMed abstracts, we evaluated the first essential function: retrieving information. Cancer researchers provided 20 realistic queries, for which the top 15 abstracts were retrieved using a predicate-based (new) and keyword-based (baseline) approach. Each abstract was evaluated, double-blind, by cancer researchers on a 0-5 point scale to calculate precision (0 versus higher) and relevance (0-5 score). Precision was significantly higher for the predicate-based approach, indicating that predicates are more suitable for searching than keywords and laying the foundation for rich and sophisticated information search. Copyright © 2013 Elsevier Inc. All rights reserved.
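    To make the representational shift concrete, the rough sketch below treats each document and the query as a bag of (subject, relation, object) triples and ranks documents by a plain cosine over triple counts. The triples are invented, and the paper's adjusted tf-idf weighting and boost function are not reproduced.

```python
# Rough sketch of the representational idea only: documents and queries are
# bags of (subject, relation, object) triples rather than keywords, compared
# with a plain cosine over triple counts. All triples here are hypothetical.
from collections import Counter
from math import sqrt

docs = {
    "abstract_1": [("drugX", "inhibits", "geneA"), ("geneA", "regulates", "geneB")],
    "abstract_2": [("drugX", "inhibits", "geneC"), ("geneC", "binds", "proteinP")],
}
query = [("drugX", "inhibits", "geneA")]

def cosine(a, b):
    ca, cb = Counter(a), Counter(b)
    dot = sum(ca[t] * cb[t] for t in ca)
    norm = sqrt(sum(v * v for v in ca.values())) * sqrt(sum(v * v for v in cb.values()))
    return dot / (norm or 1.0)

ranked = sorted(docs, key=lambda name: cosine(query, docs[name]), reverse=True)
for name in ranked:
    print(name, round(cosine(query, docs[name]), 3))
```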

  5. (abstract) Spacecraft Doppler Tracking with the Deep Space Network in the Search for Gravitational Waves

    Science.gov (United States)

    Asmar, Sami; Renzetti, Nicholas

    1994-01-01

    The Deep Space Network generates accurate radio science data observables for investigators who use radio links between spacecraft and the Earth to examine small changes in the phase and/or amplitude of the signal to study a wide variety of structures and phenomena in space. Several such studies are directed at aspects of the theory of general relativity such as gravitational redshift and gravitational waves. A gravitational wave is a propagating, polarized gravitational field, a ripple in the curvature of space-time. In Einstein's theory of general relativity, the waves are propagating solutions of the Einstein field equations. Their amplitudes are dimensionless strain amplitudes that change the fractional difference in distance between test masses and the rates at which separated clocks keep time. Predicted by all relativistic theories of gravity, they are extremely weak (the ratio of gravitational forces to electrical forces is about 10^-40) and are generated at detectable levels only by astrophysical sources - very massive sources under violent dynamical conditions. The waves have never been detected, but searches in the low-frequency band using Doppler tracking of many spacecraft have been conducted and others are being planned. Upper limits have been placed on the gravitational wave strength, with the best sensitivity to date for periodic waves being 7 x 10^-15.

  6. Content-based Music Search and Recommendation System

    Science.gov (United States)

    Takegawa, Kazuki; Hijikata, Yoshinori; Nishida, Shogo

    Recently, the volume of music data on the Internet has increased rapidly. This has increased the user's cost of finding music data suiting their preference in such a large data set. We propose a content-based music search and recommendation system. This system has an interface for searching and finding music data and an interface for editing a user profile, which is necessary for music recommendation. By exploiting the visualization of the feature space of music and the visualization of the user profile, the user can search music data and edit the user profile. Furthermore, by exploiting the information that can be acquired from each visualized object in a mutually complementary manner, we make it easier for the user to search music data and edit the user profile. Concretely, the system gives the user information obtained from the user profile when searching music data, and information obtained from the feature space of music when editing the user profile.

  7. Design of combinatorial libraries for the exploration of virtual hits from fragment space searches with LoFT.

    Science.gov (United States)

    Lessel, Uta; Wellenzohn, Bernd; Fischer, J Robert; Rarey, Matthias

    2012-02-27

    A case study is presented illustrating the design of a focused CDK2 library. The scaffold of the library was detected by a feature trees search in a fragment space based on reactions from combinatorial chemistry. For the design the software LoFT (Library optimizer using Feature Trees) was used. The special feature called FTMatch was applied to restrict the parts of the queries where the reagents are permitted to match. This way a 3D scoring function could be simulated. Results were compared with alternative designs by GOLD docking and ROCS 3D alignments.

  8. Reductions in dead space ventilation with nasal high flow depend on physiological dead space volume: metabolic hood measurements during sleep in patients with COPD and controls.

    Science.gov (United States)

    Biselli, Paolo; Fricke, Kathrin; Grote, Ludger; Braun, Andrew T; Kirkness, Jason; Smith, Philip; Schwartz, Alan; Schneider, Hartmut

    2018-05-01

    Nasal high flow (NHF) reduces minute ventilation and ventilatory loads during sleep, but the mechanisms are not clear. We hypothesised that NHF reduces ventilation in proportion to physiological but not anatomical dead space. 11 subjects (five controls and six chronic obstructive pulmonary disease (COPD) patients) underwent polysomnography with transcutaneous carbon dioxide (CO2) monitoring under a metabolic hood. During stable non-rapid eye movement stage 2 sleep, subjects received NHF (20 L·min-1) intermittently for periods of 5-10 min. We measured CO2 production and calculated dead space ventilation. Controls and COPD patients responded similarly to NHF. NHF reduced minute ventilation (from 5.6±0.4 to 4.8±0.4 L·min-1) and dead space ventilation (from 2.5±0.4 to 1.6±0.4 L·min-1). The reduction in dead space ventilation correlated with the baseline physiological dead space fraction (r2=0.36), not with anatomical dead space volume. During sleep, NHF decreases minute ventilation due to an overall reduction in dead space ventilation in proportion to the extent of the baseline physiological dead space fraction. Copyright ©ERS 2018.

  9. Trade-space Analysis for Constellations

    Science.gov (United States)

    Le Moigne, J.; Dabney, P.; de Weck, O. L.; Foreman, V.; Grogan, P.; Holland, M. P.; Hughes, S. P.; Nag, S.

    2016-12-01

    Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment, including instrument and spacecraft miniaturization, scalable launchers, secondary launches as well as hosted payloads, there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a priori science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: "How many spacecraft should be included in the constellation? Which design has the best cost/risk value?" The main goals of TAT-C are to: handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships; explore the variables trade space for pre-defined science, cost and risk goals and pre-defined metrics; and optimize cost and performance across multiple instruments and platforms rather than one at a time. This paper describes the overall architecture of TAT-C, including: a User Interface (UI) interacting with multiple users - scientists, mission designers or program managers; and an Executive Driver gathering requirements from the UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator, first with inputs from the Knowledge Base, then, in collaboration with the Orbit & Coverage, Reduction & Metrics, and Cost & Risk modules, generating multiple potential architectures and their associated characteristics. TAT-C leverages the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. The current version of TAT-C includes uniform Walker constellations as well as ad hoc constellations, and its cost model represents an aggregate model

  10. Searching Trajectories by Regions of Interest

    KAUST Repository

    Shang, Shuo

    2017-03-22

    With the increasing availability of moving-object tracking data, trajectory search is increasingly important. We propose and investigate a novel query type named trajectory search by regions of interest (TSR query). Given an argument set of trajectories, a TSR query takes a set of regions of interest as a parameter and returns the trajectory in the argument set with the highest spatial-density correlation to the query regions. This type of query is useful in many popular applications such as trip planning and recommendation, and location based services in general. TSR query processing faces three challenges: how to model the spatial-density correlation between query regions and data trajectories, how to effectively prune the search space, and how to effectively schedule multiple so-called query sources. To tackle these challenges, a series of new metrics are defined to model spatial-density correlations. An efficient trajectory search algorithm is developed that exploits upper and lower bounds to prune the search space and that adopts a query-source selection strategy, as well as integrates a heuristic search strategy based on priority ranking to schedule multiple query sources. The performance of TSR query processing is studied in extensive experiments based on real and synthetic spatial data.

  11. Searching Trajectories by Regions of Interest

    KAUST Repository

    Shang, Shuo; Chen, Lisi; Jensen, Christian S.; Wen, Ji-Rong; Kalnis, Panos

    2017-01-01

    With the increasing availability of moving-object tracking data, trajectory search is increasingly important. We propose and investigate a novel query type named trajectory search by regions of interest (TSR query). Given an argument set of trajectories, a TSR query takes a set of regions of interest as a parameter and returns the trajectory in the argument set with the highest spatial-density correlation to the query regions. This type of query is useful in many popular applications such as trip planning and recommendation, and location based services in general. TSR query processing faces three challenges: how to model the spatial-density correlation between query regions and data trajectories, how to effectively prune the search space, and how to effectively schedule multiple so-called query sources. To tackle these challenges, a series of new metrics are defined to model spatial-density correlations. An efficient trajectory search algorithm is developed that exploits upper and lower bounds to prune the search space and that adopts a query-source selection strategy, as well as integrates a heuristic search strategy based on priority ranking to schedule multiple query sources. The performance of TSR query processing is studied in extensive experiments based on real and synthetic spatial data.
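    The pruning idea above can be illustrated with a small, self-contained sketch. The score below (a sum of exp(-distance) terms between each query region centre and the nearest trajectory point) is only a stand-in with the same monotone structure as a spatial-density correlation, not the paper's metric; the bounding-box bound and the best-first visiting order are likewise illustrative assumptions.

```python
# Toy sketch of bound-based pruning for a TSR-style query. A trajectory's
# bounding box gives a cheap upper bound on its score, so trajectories whose
# bound cannot beat the best exact score found so far are skipped.
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def dist_point_bbox(p, box):
    xmin, ymin, xmax, ymax = box
    dx = max(xmin - p[0], 0.0, p[0] - xmax)
    dy = max(ymin - p[1], 0.0, p[1] - ymax)
    return math.hypot(dx, dy)

def bbox(traj):
    xs, ys = zip(*traj)
    return (min(xs), min(ys), max(xs), max(ys))

def exact_score(traj, regions):
    # Stand-in "spatial-density correlation": closer regions contribute more.
    return sum(math.exp(-min(dist(p, r) for p in traj)) for r in regions)

def upper_bound(box, regions):
    # Point-to-box distance never exceeds the true point distance,
    # so this sum is always >= exact_score.
    return sum(math.exp(-dist_point_bbox(r, box)) for r in regions)

def tsr_query(trajectories, regions):
    best_name, best_score = None, -1.0
    order = sorted(trajectories,
                   key=lambda n: upper_bound(bbox(trajectories[n]), regions),
                   reverse=True)
    for name in order:
        if upper_bound(bbox(trajectories[name]), regions) <= best_score:
            break  # no remaining trajectory can beat the current best
        score = exact_score(trajectories[name], regions)
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score

trajs = {"t1": [(0, 0), (1, 1), (2, 2)], "t2": [(5, 5), (6, 5), (7, 6)]}
print(tsr_query(trajs, regions=[(1, 1), (2, 1)]))
```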

  12. Fourier inversion on a reductive symmetric space

    NARCIS (Netherlands)

    Ban, E.P. van den

    1999-01-01

    Let X be a semisimple symmetric space. In previous papers, [8] and [9], we have defined an explicit Fourier transform for X and shown that this transform is injective on the space C_c^∞(X) of compactly supported smooth functions on X. In the present paper, which is a continuation of these papers, we

  13. The Space Technology-7 Disturbance Reduction System Precision Control Flight Validation Experiment Control System Design

    Science.gov (United States)

    O'Donnell, James R.; Hsu, Oscar C.; Maghami, Peirman G.; Markley, F. Landis

    2006-01-01

    As originally proposed, the Space Technology-7 Disturbance Reduction System (DRS) project, managed out of the Jet Propulsion Laboratory, was designed to validate technologies required for future missions such as the Laser Interferometer Space Antenna (LISA). The two technologies to be demonstrated by DRS were Gravitational Reference Sensors (GRSs) and Colloidal MicroNewton Thrusters (CMNTs). Control algorithms being designed by the Dynamic Control System (DCS) team at the Goddard Space Flight Center would control the spacecraft so that it flew about a freely-floating GRS test mass, keeping it centered within its housing. For programmatic reasons, the GRSs were descoped from DRS. The primary goals of the new mission are to validate the performance of the CMNTs and to demonstrate precise spacecraft position control. DRS will fly as a part of the European Space Agency (ESA) LISA Pathfinder (LPF) spacecraft along with a similar ESA experiment, the LISA Technology Package (LTP). With no GRS, the DCS attitude and drag-free control systems make use of the sensor being developed by ESA as a part of the LTP. The control system is designed to maintain the spacecraft's position with respect to the test mass to within 10 nm/√Hz over the DRS science frequency band of 1 to 30 mHz.

  14. Finite frequency shear wave splitting tomography: a model space search approach

    Science.gov (United States)

    Mondal, P.; Long, M. D.

    2017-12-01

    Observations of seismic anisotropy provide key constraints on past and present mantle deformation. A common method for characterizing upper mantle anisotropy is to measure shear wave splitting parameters (delay time and fast direction). However, the interpretation is not straightforward, because splitting measurements represent an integration of structure along the ray path. A tomographic approach that allows for localization of anisotropy is desirable; however, tomographic inversion for anisotropic structure is a daunting task, since 21 parameters are needed to describe general anisotropy. Such a large parameter space does not allow a straightforward application of tomographic inversion. Building on previous work on finite frequency shear wave splitting tomography, this study aims to develop a framework for SKS splitting tomography with a new parameterization of anisotropy and a model space search approach. We reparameterize the full elastic tensor, reducing the number of parameters to three (a measure of strength based on symmetry considerations for olivine, plus the dip and azimuth of the fast symmetry axis). We compute Born-approximation finite frequency sensitivity kernels relating model perturbations to splitting intensity observations. The strong dependence of the sensitivity kernels on the starting anisotropic model, and thus the strong non-linearity of the inverse problem, makes a linearized inversion infeasible. Therefore, we implement a Markov Chain Monte Carlo technique in the inversion procedure. We have performed tests with synthetic data sets to evaluate computational costs and infer the resolving power of our algorithm for synthetic models with multiple anisotropic layers. Our technique can resolve anisotropic parameters on length scales of ~50 km for realistic station and event configurations for dense broadband experiments. We are proceeding towards applications to real data sets, with an initial focus on the High Lava Plains of Oregon.
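    As a rough illustration of the model space search step, the sketch below runs a generic Metropolis-Hastings sampler over a single three-parameter anisotropy model (strength, dip, azimuth) fitted to synthetic splitting-intensity data. The forward relation is an invented stand-in, not the Born-approximation kernels of the study.

```python
# Generic Metropolis-Hastings sketch of a "model space search" over three
# parameters. The forward model is a made-up toy relation, not the study's
# finite frequency sensitivity kernels.
import numpy as np

rng = np.random.default_rng(0)

def forward(model, backazimuths):
    strength, dip, azimuth = model
    # Hypothetical toy mapping from model parameters to splitting intensity.
    return strength * np.cos(dip) * np.sin(2 * (backazimuths - azimuth))

baz = np.linspace(0, 2 * np.pi, 40)
true_model = np.array([0.8, 0.3, 1.0])
data = forward(true_model, baz) + 0.05 * rng.standard_normal(baz.size)

def log_likelihood(model, sigma=0.05):
    resid = data - forward(model, baz)
    return -0.5 * np.sum((resid / sigma) ** 2)

def metropolis(n_steps=20000, step=0.05):
    current = np.array([0.5, 0.0, 0.0])
    logl = log_likelihood(current)
    samples = []
    for _ in range(n_steps):
        proposal = current + step * rng.standard_normal(3)
        logl_prop = log_likelihood(proposal)
        if np.log(rng.random()) < logl_prop - logl:   # accept/reject step
            current, logl = proposal, logl_prop
        samples.append(current.copy())
    return np.array(samples)

chain = metropolis()
print("posterior mean:", chain[len(chain) // 2:].mean(axis=0))  # drop burn-in
```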

  15. Compact state-space models for complex superconducting radio-frequency structures based on model order reduction and concatenation methods

    International Nuclear Information System (INIS)

    Flisgen, Thomas

    2015-01-01

    The modeling of large chains of superconducting cavities with couplers is a challenging task in computational electrical engineering. The direct numerical treatment of these structures can easily lead to problems with more than ten million degrees of freedom. Problems of this complexity are typically solved with the help of parallel programs running on supercomputing infrastructures. However, these infrastructures are expensive to purchase, to operate, and to maintain. The aim of this thesis is to introduce and to validate an approach which allows for modeling large structures on a standard workstation. The novel technique is called State-Space Concatenations and is based on the decomposition of the complete structure into individual segments. The radio-frequency properties of the generated segments are described by a set of state-space equations which either emerge from analytical considerations or from numerical discretization schemes. The model order of these equations is reduced using dedicated model order reduction techniques. In a final step, the reduced-order state-space models of the segments are concatenated in accordance with the topology of the complete structure. The concatenation is based on algebraic continuity constraints of electric and magnetic fields on the decomposition planes and results in a compact state-space system of the complete radio-frequency structure. Compared to the original problem, the number of degrees of freedom is drastically reduced, i.e. a problem with more than ten million degrees of freedom can be reduced on a standard workstation to a problem with less than one thousand degrees of freedom. The final state-space system allows for determining frequency-domain transfer functions, field distributions, resonances, and quality factors of the complete structure in a convenient manner. This thesis presents the theory of the state-space concatenation approach and discusses several validation and application examples. The examples
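    The core reduction step for a single segment can be sketched as a Galerkin projection of a full state-space model (A, B, C) onto an orthonormal basis. The example below builds the basis from a Krylov sequence and is only a generic illustration under that assumption; the dedicated reduction and concatenation machinery of the thesis is not reproduced.

```python
# Minimal sketch of projection-based model order reduction: the full
# state-space (A, B, C) is projected onto an orthonormal basis V built from
# Krylov vectors A^{-1}B, A^{-2}B, ..., which reproduces the low-frequency
# behaviour of the full model with far fewer degrees of freedom.
import numpy as np

rng = np.random.default_rng(1)
n = 200                                             # full model order
A = -np.eye(n) + 0.05 * rng.standard_normal((n, n)) # stable full system (toy)
B = rng.standard_normal((n, 1))
C = rng.standard_normal((1, n))

k = 10
vec = np.linalg.solve(A, B)                         # A^{-1} B
cols = [vec]
for _ in range(k - 1):
    vec = np.linalg.solve(A, vec)                   # next Krylov vector
    cols.append(vec)
V, _, _ = np.linalg.svd(np.hstack(cols), full_matrices=False)  # orthonormal basis

Ar, Br, Cr = V.T @ A @ V, V.T @ B, C @ V            # reduced-order model (k x k)

def tf(A_, B_, C_, s):
    """Transfer function C (sI - A)^{-1} B evaluated at complex frequency s."""
    return (C_ @ np.linalg.solve(s * np.eye(A_.shape[0]) - A_, B_)).item()

for s in (0.2j, 1.0j):
    print(s, tf(A, B, C, s), tf(Ar, Br, Cr, s))
```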

  16. Two-agent cooperative search using game models with endurance-time constraints

    Science.gov (United States)

    Sujit, P. B.; Ghose, Debasish

    2010-07-01

    In this article, the problem of two Unmanned Aerial Vehicles (UAVs) cooperatively searching an unknown region is addressed. The search region is discretized into hexagonal cells and each cell is assumed to possess an uncertainty value. The UAVs have to cooperatively search these cells taking limited endurance, sensor and communication range constraints into account. Due to limited endurance, the UAVs need to return to the base station for refuelling and also need to select a base station when multiple base stations are present. This article proposes a route planning algorithm that takes endurance time constraints into account and uses game theoretical strategies to reduce the uncertainty. The route planning algorithm selects only those cells that ensure the agent will return to any one of the available bases. A set of paths are formed using these cells which the game theoretical strategies use to select a path that yields maximum uncertainty reduction. We explore non-cooperative Nash, cooperative and security strategies from game theory to enhance the search effectiveness. Monte-Carlo simulations are carried out which show the superiority of the game theoretical strategies over greedy strategy for different look ahead step length paths. Within the game theoretical strategies, non-cooperative Nash and cooperative strategy perform similarly in an ideal case, but Nash strategy performs better than the cooperative strategy when the perceived information is different. We also propose a heuristic based on partitioning of the search space into sectors to reduce computational overhead without performance degradation.

  17. Discrete symmetries and coset space dimensional reduction

    International Nuclear Information System (INIS)

    Kapetanakis, D.; Zoupanos, G.

    1989-01-01

    We consider the discrete symmetries of all the six-dimensional coset spaces and we apply them to gauge theories defined in ten dimensions which are dimensionally reduced over these homogeneous spaces. Particular emphasis is given to the consequences of the discrete symmetries for the particle content as well as for the symmetry breaking à la Hosotani of the resulting four-dimensional theory. (orig.)

  18. Assessment of Technology Readiness Level of a Carbon Dioxide Reduction Assembly (CRA) for use on International Space Station

    Science.gov (United States)

    Murdoch, Karen; Smith, Fred; Perry, Jay; Green, Steve

    2004-01-01

    When technologies are traded for incorporation into vehicle systems to support a specific mission scenario, they are often assessed in terms of Technology Readiness Level (TRL). TRL is based on three major categories: Core Technology Components, Ancillary Hardware and System Maturity, and Control and Control Integration. This paper describes the Technology Readiness Level assessment of the Carbon Dioxide Reduction Assembly (CRA) for use on the International Space Station. A team comprising the NASA Johnson Space Center, Marshall Space Flight Center, Southwest Research Institute and Hamilton Sundstrand Space Systems International has been working on various aspects of the CRA to bring its TRL from 4/5 up to 6. This paper describes the work currently being done in the three major categories. Specific details are given on technology development of the Core Technology Components, including the reactor, phase separator and CO2 compressor.

  19. Structure Optimal Design of Electromagnetic Levitation Load Reduction Device for Hydroturbine Generator Set

    Directory of Open Access Journals (Sweden)

    Qingyan Wang

    2015-01-01

    Full Text Available The thrust bearing is the part with the highest failure rate in a hydroturbine generator set, primarily due to heavy axial load. Such heavy loads often cause oil film destruction, bearing friction, and even burning. It is therefore necessary to study the load and methods for reducing it. The dynamic thrust is an important factor influencing the axial load and the design of the electromagnetic load reduction device. Therefore, in this paper, the hydraulic thrust is analyzed accurately in combination with the structural features of a vertical turbine. Then, taking the turbine model HL-220-LT-550 as an example, the electromagnetic levitation load reduction device is designed and its mathematical model is built, with the aim of minimizing excitation loss and total mass under the constraints of installation space, connection layout, and heat dissipation. Particle swarm optimization (PSO) is employed to search for the optimum solution; finally, the result is verified by the finite element method (FEM), which demonstrates that the optimized structure is more effective.

  20. On High Dimensional Searching Spaces and Learning Methods

    DEFF Research Database (Denmark)

    Yazdani, Hossein; Ortiz-Arroyo, Daniel; Choros, Kazimierz

    2017-01-01

    , and similarity functions and discuss the pros and cons of using each of them. Conventional similarity functions evaluate objects in the vector space. Contrarily, Weighted Feature Distance (WFD) functions compare data objects in both feature and vector spaces, preventing the system from being affected by some...

  1. Randomized Approaches for Nearest Neighbor Search in Metric Space When Computing the Pairwise Distance Is Extremely Expensive

    Science.gov (United States)

    Wang, Lusheng; Yang, Yong; Lin, Guohui

    Finding the closest object for a query in a database is a classical problem in computer science. For some modern biological applications, computing the similarity between two objects might be very time consuming. For example, it takes a long time to compute the edit distance between two whole chromosomes and the alignment cost of two 3D protein structures. In this paper, we study the nearest neighbor search problem in metric space, where the pair-wise distance between two objects in the database is known and we want to minimize the number of distances computed on-line between the query and objects in the database in order to find the closest object. We have designed two randomized approaches for indexing metric space databases, where objects are purely described by their distances with each other. Analysis and experiments show that our approaches only need to compute O(logn) objects in order to find the closest object, where n is the total number of objects in the database.
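    A minimal sketch of the pivot/triangle-inequality idea is given below: pairwise distances inside the database are precomputed offline, and online the bound d(q, x) >= |d(q, p) - d(p, x)| eliminates objects that cannot beat the best candidate found so far. Random 2-D points with the Euclidean metric stand in for the expensive biological distances; the paper's actual randomized indexing schemes are not reproduced.

```python
# Sketch of a pivot / triangle-inequality strategy: precomputed in-database
# distances plus a few online distances to randomly chosen pivots prune away
# most candidates. Euclidean points are placeholders for expensive objects.
import random, math

random.seed(7)
db = [(random.random(), random.random()) for _ in range(500)]
pair = [[math.dist(a, b) for b in db] for a in db]   # precomputed offline

def nearest(query):
    candidates = set(range(len(db)))
    best_idx, best_d = None, float("inf")
    online_calls = 0
    while candidates:
        i = random.choice(tuple(candidates))         # random next pivot
        candidates.remove(i)
        d_qi = math.dist(query, db[i])               # "expensive" online distance
        online_calls += 1
        if d_qi < best_d:
            best_idx, best_d = i, d_qi
        # Triangle inequality: |d(q,i) - d(i,j)| is a lower bound on d(q,j),
        # so any j whose bound reaches best_d can be discarded unseen.
        candidates = {j for j in candidates if abs(d_qi - pair[i][j]) < best_d}
    return best_idx, best_d, online_calls

idx, d, calls = nearest((0.3, 0.4))
print(f"nearest object {idx} at distance {d:.3f} using {calls} online distances")
```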

  2. The role of crowding in parallel search: Peripheral pooling is not responsible for logarithmic efficiency in parallel search.

    Science.gov (United States)

    Madison, Anna; Lleras, Alejandro; Buetti, Simona

    2018-02-01

    Recent results from our laboratory showed that, in fixed-target parallel search tasks, reaction times increase in a logarithmic fashion with set size, and the slope of this logarithmic function is modulated by lure-target similarity. These results were interpreted as being consistent with a processing architecture where early vision (stage one) processes elements in the display in exhaustive fashion with unlimited capacity and with a limitation in resolution. Here, we evaluate the contribution of crowding to our recent logarithmic search slope findings, considering the possibility that peripheral pooling of features (as observed in crowding) may be responsible for logarithmic efficiency. Factors known to affect the strength of crowding were varied, specifically: item spacing and similarity. The results from three experiments converge on the same pattern of results: reaction times increased logarithmically with set size and were modulated by lure-target similarity even when crowding was minimized within displays through an inter-item spacing manipulation. Furthermore, we found logarithmic search efficiencies were overall improved in displays where crowding was minimized compared to displays where crowding was possible. The findings from these three experiments suggest that logarithmic efficiency in efficient search is not the result of peripheral pooling of features. That said, the presence of crowding does tend to reduce search efficiency, even in "pop-out" search situations.

  3. New Search Space Reduction Algorithm for Vertical Reference Trajectory Optimization

    Directory of Open Access Journals (Sweden)

    Alejandro MURRIETA-MENDOZA

    2016-06-01

    Full Text Available Burning the fuel required to sustain a given flight releases pollution such as carbon dioxide and nitrogen oxides, and the amount of fuel consumed is also a significant expense for airlines. It is desirable to reduce fuel consumption to reduce both pollution and flight costs. To increase fuel savings in a given flight, one option is to compute the most economical vertical reference trajectory (or flight plan. A deterministic algorithm was developed using a numerical aircraft performance model to determine the most economical vertical flight profile considering take-off weight, flight distance, step climb and weather conditions. This algorithm is based on linear interpolations of the performance model using the Lagrange interpolation method. The algorithm downloads the latest available forecast from Environment Canada according to the departure date and flight coordinates, and calculates the optimal trajectory taking into account the effects of wind and temperature. Techniques to avoid unnecessary calculations are implemented to reduce the computation time. The costs of the reference trajectories proposed by the algorithm are compared with the costs of the reference trajectories proposed by a commercial flight management system using the fuel consumption estimated by the FlightSim® simulator made by Presagis®.
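    The interpolation step can be illustrated with a short sketch: performance-table values are interpolated at an arbitrary aircraft weight using the Lagrange formula. The table below is invented for illustration and is not the numerical performance model used by the algorithm.

```python
# Hedged sketch of the interpolation step: tabulated performance values are
# interpolated at an arbitrary weight with the Lagrange formula. The table
# values are invented placeholders, not a real aircraft performance model.
def lagrange(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if i != j:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# Hypothetical grid: aircraft weight (kg) -> fuel flow (kg/h) at fixed
# altitude and speed.
weights = [50000.0, 55000.0, 60000.0, 65000.0]
fuel_flow = [2100.0, 2250.0, 2420.0, 2610.0]

print(lagrange(weights, fuel_flow, 57500.0))   # interpolated fuel flow
```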

  4. Mass Reduction: The Weighty Challenge for Exploration Space Flight

    Science.gov (United States)

    Kloeris, Vickie L.

    2014-01-01

    Meeting nutritional and acceptability requirements is critical for the food system for an exploration class space mission. However, this must be achieved within the constraints of available resources such as water, crew time, stowage volume, launch mass and power availability. Due to resource constraints, exploration class missions are not expected to have refrigerators or freezers for food storage, and current per person food mass must be reduced to improve mission feasibility. The Packaged Food Mass Reduction Trade Study (Stoklosa, 2009) concluded that the mass of the current space food system can be effectively reduced by decreasing the water content of certain foods and offering nutrient dense substitutes, such as meal replacement bars and beverages. Target nutrient ranges were established based on the nutritional content of the current breakfast and lunch meals in the ISS standard menu. A market survey of available commercial products produced no viable options for meal replacement bar or beverage products. New prototypes for both categories were formulated to meet target nutrient ranges. Samples of prototype products were packaged in high barrier packaging currently used for ISS and underwent an accelerated shelf life study at 31 °C and 41 °C (50% RH) for 24 weeks. Samples were assessed at the following time points: initial, 6 weeks, 12 weeks, and 24 weeks. Testing at each time point included the following: color, texture, water activity, acceptability, and hexanal analysis (for food bars only). Proof of concept prototypes demonstrated that meal replacement food bars and beverages can deliver a comparable macronutrient profile while reducing the overall mass when compared to the ISS Standard Menu. Future work suggestions for meal replacement bars: reformulation to include ingredients that reduce hardness and reduce browning to increase shelf life; micronutrient analysis and potential fortification; sensory evaluation studies including satiety tests and

  5. Learning Search Algorithms: An Educational View

    Directory of Open Access Journals (Sweden)

    Ales Janota

    2014-12-01

    Full Text Available Artificial intelligence methods find practical use in many applications, including the maritime industry. The paper concentrates on the methods of uninformed and informed search, potentially usable in solving complex problems based on the state space representation. The problem of introducing search algorithms to newcomers has both technical and psychological dimensions. The authors show how it is possible to cope with both of them through the design and use of specialized authoring systems. A typical example of searching a path through a maze is used to demonstrate how to test, observe and compare properties of various search strategies. Performance of the search methods is evaluated based on common criteria.
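    A compact illustration of the uninformed-versus-informed contrast on a toy maze is sketched below: breadth-first search and A* with a Manhattan-distance heuristic both return a shortest path, but A* typically expands fewer cells. The maze layout is an arbitrary example, not taken from the paper's authoring system.

```python
# Uninformed (BFS) vs. informed (A*) search on a small toy maze.
from heapq import heappush, heappop
from collections import deque

MAZE = ["S..#....",
        ".#.#.##.",
        ".#...#..",
        ".####.#.",
        "......#G"]
ROWS, COLS = len(MAZE), len(MAZE[0])
start, goal = (0, 0), (ROWS - 1, COLS - 1)

def neighbours(cell):
    r, c = cell
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < ROWS and 0 <= nc < COLS and MAZE[nr][nc] != "#":
            yield (nr, nc)

def bfs():
    frontier, seen, expanded = deque([(start, 0)]), {start}, 0
    while frontier:
        cell, dist = frontier.popleft()
        expanded += 1
        if cell == goal:
            return dist, expanded
        for nxt in neighbours(cell):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, dist + 1))

def astar():
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan heuristic
    frontier, best, expanded = [(h(start), 0, start)], {start: 0}, 0
    while frontier:
        _, dist, cell = heappop(frontier)
        expanded += 1
        if cell == goal:
            return dist, expanded
        for nxt in neighbours(cell):
            if dist + 1 < best.get(nxt, float("inf")):
                best[nxt] = dist + 1
                heappush(frontier, (dist + 1 + h(nxt), dist + 1, nxt))

print("BFS (path length, cells expanded):", bfs())
print("A*  (path length, cells expanded):", astar())
```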

  6. Space biology research development

    Science.gov (United States)

    Bonting, Sjoerd L.

    1993-01-01

    The purpose of the Search for Extraterrestrial Intelligence (SETI) Institute is to conduct and promote research related activities regarding the search for extraterrestrial life, particularly intelligent life. Such research encompasses the broad discipline of 'Life in the Universe', including all scientific and technological aspects of astronomy and the planetary sciences, chemical evolution, the origin of life, biological evolution, and cultural evolution. The primary purpose was to provide funding for the Principal Investigator to collaborate with the personnel of the SETI Institute and the NASA-Ames Research center in order to plan and develop space biology research on and in connection with Space Station Freedom; to promote cooperation with the international partners in the space station; to conduct a study on the use of biosensors in space biology research and life support system operation; and to promote space biology research through the initiation of an annual publication 'Advances in Space Biology and Medicine'.

  7. Searching for God: Illness-Related Mortality Threats and Religious Search Volume in Google in 16 Nations.

    Science.gov (United States)

    Pelham, Brett W; Shimizu, Mitsuru; Arndt, Jamie; Carvallo, Mauricio; Solomon, Sheldon; Greenberg, Jeff

    2018-03-01

    We tested predictions about religiosity and terror management processes in 16 nations. Specifically, we examined weekly variation in Google search volume in each nation for 12 years (all weeks for which data were available). In all 16 nations, higher than usual weekly Google search volume for life-threatening illnesses (cancer, diabetes, and hypertension) predicted increases in search volume for religious content (e.g., God, Jesus, prayer) in the following week. This effect held up after controlling for (a) recent past and annual variation in religious search volume, (b) increases in search volume associated with religious holidays, and (c) variation in searches for a non-life-threatening illness ("sore throat"). Terror management threat reduction processes appear to occur across the globe. Furthermore, they may occur over much longer periods than those studied in the laboratory. Managing fears of death via religious belief regulation appears to be culturally pervasive.

  8. Logistics Reduction and Repurposing Technology for Long Duration Space Missions

    Science.gov (United States)

    Broyan, James Lee, Jr.; Chu, Andrew; Ewert, Michael K.

    2014-01-01

    One of NASA's Advanced Exploration Systems (AES) projects is the Logistics Reduction and Repurposing (LRR) project, which has the goal of reducing logistics resupply items through direct and indirect means. Various technologies under development in the project will reduce the launch mass of consumables and their packaging, enable reuse and repurposing of items, and make logistics tracking more efficient. Repurposing also reduces the trash burden onboard spacecraft and indirectly reduces launch mass by one manifest item having two purposes rather than two manifest items each having only one purpose. This paper provides the status of each of the LRR technologies in their third year of development under AES. Advanced clothing systems (ACSs) are being developed to enable clothing to be worn longer, directly reducing launch mass. ACS has completed a ground exercise clothing study in preparation for an International Space Station technology demonstration in 2014. Development of launch packaging containers and other items that can be repurposed on-orbit as part of habitation outfitting has resulted in a logistics-to-living (L2L) concept. L2L has fabricated and evaluated several multi-purpose cargo transfer bags for potential reuse on-orbit. Autonomous logistics management is using radio frequency identification (RFID) to track items and thus reduce crew time for logistics functions. An RFID dense reader prototype is under construction and plans for integrated testing are being made. A heat melt compactor (HMC) second generation unit for processing trash into compact and stable tiles is nearing completion. The HMC prototype compaction chamber has been completed and system development testing is under way. Research has been conducted on the conversion of trash-to-gas (TtG) for high levels of volume reduction and for use in propulsion systems. A steam reformation system was selected for further system definition of the TtG technology.

  9. The role of space and time in object-based visual search

    NARCIS (Netherlands)

    Schreij, D.B.B.; Olivers, C.N.L.

    2013-01-01

    Recently we have provided evidence that observers more readily select a target from a visual search display if the motion trajectory of the display object suggests that the observer has dealt with it before. Here we test the prediction that this object-based memory effect on search breaks down if

  10. Expedite random structure searching using objects from Wyckoff positions

    Science.gov (United States)

    Wang, Shu-Wei; Hsing, Cheng-Rong; Wei, Ching-Ming

    2018-02-01

    Random structure searching has been proved to be a powerful approach to search for and find the global minimum and the metastable structures. A true random sampling is in principle needed, yet it would be highly time-consuming and/or practically impossible to find the global minimum for complicated systems in their high-dimensional configuration space. Thus the implementation of reasonable constraints, such as adopting system symmetries to reduce the independent dimensions of the structural space and/or imposing chemical information to reach and relax into low-energy regions, is the most essential issue in the approach. In this paper, we propose the concept of an "object", which is either an atom or composed of a set of atoms (such as molecules or carbonates) carrying a symmetry defined by one of the Wyckoff positions of a space group; this allows the search for the global minimum of a complicated system to be confined to a greatly reduced structural space and become accessible in practice. We examined several representative materials, including the Cd3As2 crystal, solid methanol, high-pressure carbonates (FeCO3), and the Si(111)-7 × 7 reconstructed surface, to demonstrate the power and the advantages of using the "object" concept in random structure searching.

  11. Strong Production SUSY Searches at ATLAS and CMS

    CERN Document Server

    Marshall, Z L

    2015-01-01

    The results of searches for strongly-produced supersymmetry at the Large Hadron Collider by the ATLAS and CMS collaborations are presented. Several of the historically strongest zero- and one-lepton final state searches have been updated to include multi-bin fits and combinations. In addition, new two-lepton final state search results are shown from CMS and ATLAS, which show 2.6 and 3.0 standard deviation excesses, respectively, above the standard model expectation, albeit in different regions of phase space. Both collaborations have also shown new searches that cover areas left uncovered by previous searches, including searches for light stops and searches for stealth supersymmetry.

  12. Visualization of Pulsar Search Data

    Science.gov (United States)

    Foster, R. S.; Wolszczan, A.

    1993-05-01

    The search for periodic signals from rotating neutron stars, or pulsars, has been a computationally taxing problem for astronomers for more than twenty-five years. Over this time interval, increases in computational capability have allowed ever more sensitive searches covering a larger parameter space. The volume of input data and the general presence of radio frequency interference typically produce numerous spurious signals. Visualization of the search output and enhanced real-time processing of significant candidate events allow the pulsar searcher to optimally process and search for new radio pulsars. The pulsar search algorithm and visualization system presented in this paper currently run on serial RISC-based workstations, a traditional vector-based supercomputer, and a massively parallel computer. The serial software algorithm and its modifications for massively parallel computing are described. Four successive searches for millisecond-period radio pulsars using the Arecibo telescope at 430 MHz have resulted in the successful detection of new long-period and millisecond-period radio pulsars.

  13. Organic chemistry in space

    Science.gov (United States)

    Johnson, R. D.

    1977-01-01

    Organic cosmochemistry, organic materials in space exploration, and biochemistry of man in space are briefly surveyed. A model of Jupiter's atmosphere is considered, and the search for organic molecules in the solar system and in interstellar space is discussed. Materials and analytical techniques relevant to space exploration are indicated, and the blood and urine analyses performed on Skylab are described.

  14. Complete Mapping of Complex Disulfide Patterns with Closely-Spaced Cysteines by In-Source Reduction and Data-Dependent Mass Spectrometry

    DEFF Research Database (Denmark)

    Cramer, Christian N; Kelstrup, Christian D; Olsen, Jesper V

    2017-01-01

    Mapping of disulfide bonds is an essential part of protein characterization to ensure correct cysteine pairings. For this, mass spectrometry (MS) is the most widely used technique due to fast and accurate characterization. However, MS-based disulfide mapping is challenged when multiple disulfide bonds are present in complicated patterns. This includes the presence of disulfide bonds in nested patterns and closely spaced cysteines. Unambiguous mapping of such disulfide bonds typically requires advanced MS approaches. In this study, we exploited in-source reduction (ISR) of disulfide bonds during the electrospray ionization process to facilitate disulfide bond assignments. We successfully developed a LC-ISR-MS/MS methodology to use as an online and fully automated partial reduction procedure. Postcolumn partial reduction by ISR provided fast and easy identification of peptides involved in disulfide bonding...

  15. A Hybrid Optimization Framework with POD-based Order Reduction and Design-Space Evolution Scheme

    Science.gov (United States)

    Ghoman, Satyajit S.

    The main objective of this research is to develop an innovative multi-fidelity multi-disciplinary design, analysis and optimization suite that integrates certain solution generation codes and newly developed innovative tools to improve the overall optimization process. The research performed herein is divided into two parts: (1) the development of an MDAO framework by integration of variable fidelity physics-based computational codes, and (2) enhancements to such a framework by incorporating innovative features extending its robustness. The first part of this dissertation describes the development of a conceptual Multi-Fidelity Multi-Strategy and Multi-Disciplinary Design Optimization Environment (M3 DOE), in the context of aircraft wing optimization. M3 DOE provides the user a capability to optimize configurations with a choice of (i) the level of fidelity desired, (ii) the use of a single-step or multi-step optimization strategy, and (iii) a combination of a series of structural and aerodynamic analyses. The modularity of M3 DOE allows it to be a part of other inclusive optimization frameworks. M3 DOE is demonstrated within the context of shape and sizing optimization of the wing of a Generic Business Jet aircraft. Two different optimization objectives, viz. dry weight minimization and cruise range maximization, are studied by conducting one low-fidelity and two high-fidelity optimization runs to demonstrate the application scope of M3 DOE. The second part of this dissertation describes the development of an innovative hybrid optimization framework that extends the robustness of M3 DOE by employing a proper orthogonal decomposition-based design-space order reduction scheme combined with the evolutionary algorithm technique. The POD method of extracting dominant modes from an ensemble of candidate configurations is used for the design-space order reduction. The snapshot of candidate population is updated iteratively using evolutionary algorithm technique of
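    The POD step described above amounts to a singular value decomposition of a snapshot ensemble. The sketch below is a generic illustration with random placeholder snapshots rather than M3 DOE output; mapping evolutionary-algorithm candidates through the retained modes is shown only schematically.

```python
# Minimal sketch of proper orthogonal decomposition for design-space order
# reduction: dominant modes are extracted from a snapshot ensemble via an SVD,
# and candidates are then described by a few modal coefficients. Snapshot data
# here are random placeholders.
import numpy as np

rng = np.random.default_rng(3)
n_vars, n_snapshots = 400, 30          # design variables per candidate, ensemble size
snapshots = rng.standard_normal((n_vars, n_snapshots))

mean = snapshots.mean(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)

k = 5                                   # keep the k most energetic POD modes
modes = U[:, :k]
energy = (s[:k] ** 2).sum() / (s ** 2).sum()
print(f"{k} modes capture {100 * energy:.1f}% of the snapshot energy")

# A new candidate generated in the reduced space (e.g. by mutating modal
# coefficients in an evolutionary algorithm) maps back to the full space.
coeffs = rng.standard_normal(k)
candidate = mean[:, 0] + modes @ coeffs
print(candidate.shape)
```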

  16. Effects of spot size and spot spacing on lateral penumbra reduction when using a dynamic collimation system for spot scanning proton therapy

    International Nuclear Information System (INIS)

    Hyer, Daniel E; Hill, Patrick M; Wang, Dongxu; Smith, Blake R; Flynn, Ryan T

    2014-01-01

    The purpose of this work was to investigate the reduction in lateral dose penumbra that can be achieved when using a dynamic collimation system (DCS) for spot scanning proton therapy as a function of two beam parameters: spot size and spot spacing. This is an important investigation as both values impact the achievable dose distribution and a wide range of values currently exist depending on delivery hardware. Treatment plans were created both with and without the DCS for in-air spot sizes (σ_air) of 3, 5, 7, and 9 mm as well as spot spacing intervals of 2, 4, 6 and 8 mm. Compared to un-collimated treatment plans, the plans created with the DCS yielded a reduction in the mean dose to normal tissue surrounding the target of 26.2–40.6% for spot sizes of 3–9 mm, respectively. Increasing the spot spacing resulted in a decrease in the time penalty associated with using the DCS that was approximately proportional to the reduction in the number of rows in the raster delivery pattern. We conclude that dose distributions achievable when using the DCS are comparable to those only attainable with much smaller initial spot sizes, suggesting that the goal of improving high dose conformity may be achieved by either utilizing a DCS or by improving beam line optics. (note)

  17. Four-Dimensional Golden Search

    Energy Technology Data Exchange (ETDEWEB)

    Fenimore, Edward E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-25

    The Golden search technique is a method to search a multiple-dimension space to find the minimum. It basically subdivides the possible ranges of parameters until it brackets, to within an arbitrarily small distance, the minimum. It has the advantages that (1) the function to be minimized can be non-linear, (2) it does not require derivatives of the function, (3) the convergence criterion does not depend on the magnitude of the function. Thus, if the function is a goodness of fit parameter such as chi-square, the convergence does not depend on the noise being correctly estimated or the function correctly following the chi-square statistic. And, (4) the convergence criterion does not depend on the shape of the function. Thus, long shallow surfaces can be searched without the problem of premature convergence. As with many methods, the Golden search technique can be confused by surfaces with multiple minima.
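    The one-dimensional building block of the technique is the classic golden-section search, sketched below; extending it to multiple dimensions (for example by cycling through the parameters) is not shown here.

```python
# One-dimensional golden-section search: the bracket [a, b] is repeatedly
# narrowed using the golden ratio until it is smaller than `tol`.
import math

def golden_section(f, a, b, tol=1e-8):
    invphi = (math.sqrt(5.0) - 1.0) / 2.0      # 1/phi ~ 0.618
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    fc, fd = f(c), f(d)
    while (b - a) > tol:
        if fc < fd:                            # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:                                  # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    return 0.5 * (a + b)

# Example: minimize a chi-square-like function of one parameter.
print(golden_section(lambda x: (x - 1.7) ** 2 + 3.0, 0.0, 5.0))   # ~1.7
```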

  18. Searching for directly decaying gluinos at the Tevatron

    International Nuclear Information System (INIS)

    Alwall, Johan; Le, My-Phuong; Lisanti, Mariangela; Wacker, Jay G.

    2008-01-01

    This Letter describes how to perform searches over the complete kinematically-allowed parameter space for new pair-produced color octet particles that each subsequently decay into two jets plus missing energy at the Tevatron. This Letter shows that current searches can miss otherwise discoverable spectra of particles due to CMSSM-motivated cuts. Optimizing the H_T and missing-E_T cuts expands the sensitivity of these searches

  19. On dimensional reduction over coset spaces

    International Nuclear Information System (INIS)

    Kapetanakis, D.; Zoupanos, G.

    1990-01-01

    Gauge theories defined in higher dimensions can be dimensionally reduced over coset spaces giving definite predictions for the resulting four-dimensional theory. We present the most interesting features of these theories as well as an attempt to construct a model with realistic low energy behaviour within this framework. (author)

  20. Speeding Up Maximal Causality Reduction with Static Dependency Analysis

    OpenAIRE

    Huang, Shiyou; Huang, Jeff

    2017-01-01

    Stateless Model Checking (SMC) offers a powerful approach to verifying multithreaded programs but suffers from the state-space explosion problem caused by the huge thread interleaving space. The pioneering reduction technique Partial Order Reduction (POR) mitigates this problem by pruning equivalent interleavings from the state space. However, limited by the happens-before relation, POR still explores redundant executions. The recent advance, Maximal Causality Reduction (MCR), shows a promisi...

  1. The experience of lived space in persons with dementia: a systematic meta-synthesis.

    Science.gov (United States)

    Førsund, Linn Hege; Grov, Ellen Karine; Helvik, Anne-Sofie; Juvet, Lene Kristine; Skovdahl, Kirsti; Eriksen, Siren

    2018-02-01

    Identifying how persons with dementia experience lived space is important for enabling supportive living environments and creating communities that compensate for the fading capabilities of these persons. Several single studies have explored this topic; however, few studies have attempted to explicitly review and synthesize this research literature. The aim of this systematic meta-synthesis was therefore to interpret and synthesize knowledge regarding persons with dementia's experience of space. A systematic, computerized search of AgeLine, CINAHL Complete, Embase, Medline and PsycINFO was conducted using a search strategy that combined MeSH terms and text words for different types of dementia with different descriptions of experience. Studies with 1) a sample of persons with dementia, 2) qualitative interviews as a research method and 3) a description of experiences of lived space were included. The search resulted in 1386 articles, of which 136 were identified as eligible and were read and assessed using the CASP criteria. The analysis was inspired by qualitative content analyses. This interpretative qualitative meta-synthesis included 45 articles encompassing interviews with 672 persons with dementia. The analysis showed that living in one's own home and living in long-term care established different settings and posed diverse challenges for the experience of lived space in persons with dementia. The material revealed four main categories that described the experience of lived space: (1) belonging; (2) meaningfulness; (3) safety and security; and (4) autonomy. It showed how persons with dementia experienced a reduction in their lived space due to the progression of dementia. A comprehensive understanding of the categories led to the latent theme: "Living with dementia is like living in a space where the walls keep closing in". This meta-synthesis reveals a process whereby lived space gradually becomes smaller for persons with dementia. This underscores the

  2. An automated full-symmetry Patterson search method

    International Nuclear Information System (INIS)

    Rius, J.; Miravitlles, C.

    1987-01-01

    A full-symmetry Patterson search method is presented that performs a molecular coarse rotation search in vector space and orientation refinement using the σ function. The oriented molecule is positioned using the fast translation function τ_0, which is based on the automated interpretation of τ projections using the sum function. This strategy reduces the number of Patterson-function values to be stored in the rotation search, and the use of the τ_0 function minimizes the required time for the development of all probable rotation search solutions. The application of this method to five representative test examples is shown. (orig.)

  3. Reducing a Knowledge-Base Search Space When Data Are Missing

    Science.gov (United States)

    James, Mark

    2007-01-01

    This software addresses the problem of how to efficiently execute a knowledge base in the presence of missing data. Computationally, this is an exponentially expensive operation that without heuristics generates a search space of 1 + 2^n possible scenarios, where n is the number of rules in the knowledge base. Even for a knowledge base of the most modest size, say 16 rules, it would produce 65,537 possible scenarios. The purpose of this software is to reduce the complexity of this operation to a more manageable size. The problem that this system solves is to develop an automated approach that can reason in the presence of missing data. This is a meta-reasoning capability that repeatedly calls a diagnostic engine/model to provide prognoses and prognosis tracking. In the big picture, the scenario generator takes as its input the current state of a system, including probabilistic information from Data Forecasting. Using model-based reasoning techniques, it returns an ordered list of fault scenarios that could be generated from the current state, i.e., the plausible future failure modes of the system as it presently stands. The scenario generator models a Potential Fault Scenario (PFS) as a black box, the input of which is a set of states tagged with priorities and the output of which is one or more potential fault scenarios tagged by a confidence factor. The results from the system are used by a model-based diagnostician to predict the future health of the monitored system.
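
    The 1 + 2^n count above can be made concrete with a small sketch (the rule names and scenario representation here are illustrative only, not taken from the software itself): every rule whose antecedent data are missing may either fire or not fire, so without heuristics the engine faces one baseline scenario plus 2^n alternatives.

      # Minimal sketch of the scenario blow-up caused by missing data.  Each
      # rule with an unevaluable antecedent may fire or not fire, giving
      # 2**n combinations plus the baseline scenario: 1 + 2**n in total.
      from itertools import product

      def candidate_scenarios(unknown_rules):
          """Enumerate every fire/do-not-fire assignment for rules with missing data."""
          yield {}  # baseline: leave all unknowns unresolved
          for assignment in product((False, True), repeat=len(unknown_rules)):
              yield dict(zip(unknown_rules, assignment))

      rules = [f"rule_{i}" for i in range(16)]            # a "modest" 16-rule knowledge base
      print(sum(1 for _ in candidate_scenarios(rules)))   # 65537 = 1 + 2**16

    Heuristics of the kind described above (priority tags on states, confidence factors on scenarios) prune this enumeration rather than expanding it exhaustively.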

  4. Indirect and inclusive search for dark matter with AMS02 space spectrometer

    International Nuclear Information System (INIS)

    Brun, P.

    2007-06-01

    AMS02 is a particle physics detector designed for 3 years of data collecting aboard the International Space Station. Equipped with a superconducting magnet, it will allow the measurement of gamma-ray and cosmic-ray fluxes in the GeV to TeV region with high particle identification capabilities. Its performance is based on the redundancy of measurements in specific sub-detectors: a Time-Of-Flight counter, a Transition Radiation Detector, a Silicon Tracker, a Ring Imaging Cherenkov counter and an Electromagnetic calorimeter (Ecal). The Ecal is studied in detail, in particular with the qualification of a stand-alone trigger devoted to gamma-ray astronomy. This system increases the AMS02 sensitivity to photons and improves the reconstruction of electromagnetic events. The analog part of the trigger system has been tested on test benches and with a beam at CERN. The in-orbit calibration of the Ecal is studied; it may proceed in two steps. First, the Ecal cell responses have to be equalized with minimum-ionizing-particle data. Then an absolute calibration can be performed with cosmic electrons. For both the relative and the absolute calibration, possible procedures are defined and realistic calibration times are estimated. The second part deals with the indirect searches for dark matter and the study of the AMS02 sensitivity. Dark matter accounts for 84% of the mass of the Universe and could consist of new particles. Dark matter particles are expected to surround our Galaxy and annihilate in high-density regions. These annihilations could become observable as exotic primary cosmic-ray sources. Searches for anomalous excesses in (p-bar, e+, D-bar) and γ-ray fluxes will be performed by AMS02. A numerical tool allowing us to perform predictions for these exotic fluxes within supersymmetric or extra-dimensional models is developed and presented in detail. Phenomenological studies regarding possible enhancements of these signals by over-dense regions of the halo have also

  5. In search of the structure of human olfactory space

    Directory of Open Access Journals (Sweden)

    Alexei Koulakov

    2011-09-01

    Full Text Available We analyze the responses of human observers to an ensemble of monomolecular odorants. Each odorant is characterized by a set of 146 perceptual descriptors obtained from a database of odor character profiles. Each odorant is therefore represented by a point in a highly multidimensional sensory space. In this work we study the arrangement of odorants in this perceptual space. We argue that odorants densely sample a two-dimensional curved surface embedded in the multidimensional sensory space. This surface can account for more than half of the variance of the perceptual data. We also show that only 12% of experimental variance cannot be explained by curved surfaces of substantially small dimensionality (<10. We suggest that these curved manifolds represent the relevant spaces sampled by the human olfactory system, thereby providing surrogates for olfactory sensory space. For the case of 2D approximation, we relate the two parameters on the curved surface to the physico-chemical parameters of odorant molecules. We show that one of the dimensions is related to eigenvalues of molecules’ connectivity matrix, while the other is correlated with measures of molecules’ polarity. We discuss the behavioral significance of these findings.

  6. Improved Trajectory Search Capability for Multi-Rendezvous and Flyby Missions

    Data.gov (United States)

    National Aeronautics and Space Administration — This work made vital improvements to a primitive trajectory search algorithm known as NASA Exhaustive Lambert Lattice Search (NELLS). NELLS was created to work hand...

  7. Coupling gravity, electromagnetism and space-time for space propulsion breakthroughs

    Science.gov (United States)

    Millis, Marc G.

    1994-01-01

    Spaceflight would be revolutionized if it were possible to propel a spacecraft without rockets using the coupling between gravity, electromagnetism, and space-time (hence called 'space coupling propulsion'). New theories and observations about the properties of space are emerging which offer new approaches to consider this breakthrough possibility. To guide the search, evaluation, and application of these emerging possibilities, a variety of hypothetical space coupling propulsion mechanisms are presented to highlight the issues that would have to be satisfied to enable such breakthroughs. A brief introduction of the emerging opportunities is also presented.

  8. Search route decision of environmental monitoring at emergency time

    International Nuclear Information System (INIS)

    Aoyama, Isao

    1979-01-01

    The search route decision method is reviewed, in particular the adequate arrangement of monitors over time in the information-gathering activity of transferring monitors across the horizontal space after confirmation of an abnormal release of radioactive material. The developmental history of the theory of search is explained, namely the experience of naval anti-submarine operations in World War II, salvage activities and search problems at sea. The kinematics of search, the probability theory of detection and the optimum distribution of search effort are the most important parts of the theory of search applicable to environmental monitoring under emergency conditions. A search model combines the characteristics of the targets, the characteristics of the observers and the standard of optimality. The characteristics of the targets comprise the search space, the number of targets, the way targets appear and the motion of targets. The characteristics of the observers comprise the number of observers, the divisibility of search effort, the credibility of search information and the search process. The standard of optimality is divided into the maximum probability of detection, the minimum expected risk and others. Each item of the search model above is explained. Concerning the formulation of the search model, theoretical equations for the detection probability, the discovery potential and the instantaneous detection probability density are derived, evaluated and explained. The future plan is to advance the search technology so as to evaluate the detection potential for deciding the route of a monitoring car for a nuclear power plant under accident conditions. (Nakai, Y.)
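
    The derived equations are not reproduced in the abstract; as a hedged illustration of the kind of expression search theory provides (a classical random-search result, not necessarily the exact formulas of this report), the cumulative detection probability for an instantaneous detection rate γ(t) is

      P_{\mathrm{det}}(t) \;=\; 1 - \exp\!\left(-\int_{0}^{t}\gamma(\tau)\,\mathrm{d}\tau\right),
      \qquad
      \gamma \;\approx\; \frac{W\,v}{A},

    where W is an assumed effective sweep width of the monitor's detector, v its search speed and A the area to be covered, so that uniform random search gives P_det(t) = 1 - e^{-Wvt/A}.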

  9. Search-Order Independent State Caching

    DEFF Research Database (Denmark)

    Evangelista, Sami; Kristensen, Lars Michael

    2010-01-01

    State caching is a memory reduction technique used by model checkers to alleviate the state explosion problem. It has traditionally been coupled with a depth-first search to ensure termination. We propose and experimentally evaluate an extension of the state caching method for general state...

  10. Contested space in the pharmacy: public attitudes to pharmacy harm reduction services in the West of Scotland.

    Science.gov (United States)

    Gidman, Wendy; Coomber, Ross

    2014-01-01

    Internationally, community pharmacies have become increasingly involved in providing harm reduction services and health advice to people who use illicit drugs. This paper considers public opinion of community pharmacy services. It discusses attitudes to harm reduction services in the context of stigmatization of addiction and people who use drugs. This exploratory study involved twenty-six purposively sampled members of the public, from the West of Scotland, participating in one of 5 focus groups. The groups were composed to represent known groups of users and non-users of community pharmacy, none of whom were problem drug users. Three thematic categories were identified: methadone service users in community pharmacies; attitudes to harm reduction policies; contested space. Harm reduction service expansion has resulted in a high volume of drug users in and around some Scottish pharmacies. Even if harm reduction services are provided discreetly, users' behavior can differentiate them from other pharmacy users. Drug users' behavior in this setting is commonly perceived to be unacceptable and can deter other consumers from using pharmacy services. The results of this study suggest that negative public opinion is highly suggestive of stereotyping and stigmatization of people who use drugs. Participants considered that (1) community pharmacies were unsuitable environments for harm reduction service provision, as they are used by older people and those with children; (2) current drug policy is perceived as ineffective, as abstinence is seldom achieved and methadone was reported to be re-sold; (3) people who use drugs were avoided where possible in community pharmacies. Community pharmacy harm reduction services increasingly bring together the public and drug users. Study participants were reluctant to share pharmacy facilities with drug users. This paper concludes by suggesting mechanisms to minimize stigmatization. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Logistics Reduction and Repurposing Technology for Long Duration Space Missions

    Science.gov (United States)

    Broyan, James L.; Chu, Andrew; Ewert, Michael K.

    2014-01-01

    One of NASA's Advanced Exploration Systems (AES) projects is the Logistics Reduction and Repurposing (LRR) project, which has the goal of reducing logistics resupply items through direct and indirect means. Various technologies under development in the project will reduce the launch mass of consumables and their packaging, enable reuse and repurposing of items and make logistics tracking more efficient. Repurposing also reduces the trash burden onboard spacecraft and indirectly reduces launch mass by replacing some items on the manifest. Examples include reuse of trash as radiation shielding or propellant. This paper provides the status of the LRR technologies in their third year of development under AES. Advanced clothing systems (ACS) are being developed to enable clothing to be worn longer, directly reducing launch mass. ACS has completed a ground exercise clothing study in preparation for an International Space Station (ISS) technology demonstration in 2014. Development of launch packaging containers and other items that can be repurposed on-orbit as part of habitation outfitting has resulted in a logistics-to-living (L2L) concept. L2L has fabricated and evaluated several multi-purpose cargo transfer bags (MCTBs) for potential reuse on orbit. Autonomous logistics management (ALM) is using radio frequency identification (RFID) to track items and thus reduce crew requirements for logistics functions. An RFID dense reader prototype is under construction and plans for integrated testing are being made. Development of a heat melt compactor (HMC) second generation unit for processing trash into compact and stable tiles is nearing completion. The HMC prototype compaction chamber has been completed and system development testing is underway. Research has been conducted on the conversion of trash-to-gas (TtG) for high levels of volume reduction and for use in propulsion systems. A steam reformation system was selected for further system definition of the TtG technology

  12. Study of the X-Ray Diagnosis of Unstable Pelvic Fracture Displacements in Three-Dimensional Space and its Application in Closed Reduction.

    Science.gov (United States)

    Shi, Chengdi; Cai, Leyi; Hu, Wei; Sun, Junying

    2017-09-19

    Objective: To study the method of X-ray diagnosis of unstable pelvic fractures displaced in three-dimensional (3D) space and its clinical application in closed reduction. Five models of hemipelvic displacement were made in an adult pelvic specimen. Anteroposterior radiographs of the pelvis were analyzed in PACS. The method of X-ray diagnosis was applied in closed reductions. From February 2012 to June 2016, 23 patients (15 men, 8 women; mean age, 43.4 years) with unstable pelvic fractures were included. All patients were treated by closed reduction and percutaneous cannulated screw fixation of the pelvic ring. According to Tile's classification, the patients were classified into type B1 in 7 cases, B2 in 3, B3 in 3, C1 in 5, C2 in 3, and C3 in 2. The operation time and intraoperative blood loss were recorded. Postoperative images were evaluated by Matta radiographic standards. Five models of displacement were made successfully. The X-ray features of the models were analyzed. For clinical patients, the average operation time was 44.8 min (range, 20-90 min) and the average intraoperative blood loss was 35.7 (range, 20-100) mL. According to the Matta standards, 7 cases were excellent, 12 cases were good, and 4 were fair. The displacements in 3D space of unstable pelvic fractures can be diagnosed rapidly by X-ray analysis to guide closed reduction, with a satisfactory clinical outcome.

  13. MAP: an iterative experimental design methodology for the optimization of catalytic search space structure modeling.

    Science.gov (United States)

    Baumes, Laurent A

    2006-01-01

    One of the main problems in high-throughput research for materials is still the design of experiments. At early stages of discovery programs, purely exploratory methodologies coupled with fast screening tools should be employed. This should lead to opportunities to find unexpected catalytic results and identify the "groups" of catalyst outputs, providing well-defined boundaries for future optimizations. However, very few new papers deal with strategies that guide exploratory studies. Mostly, traditional designs, homogeneous covering, or simple random samplings are exploited. Typical catalytic output distributions exhibit unbalanced datasets for which an efficient learning is hardly carried out, and interesting but rare classes are usually unrecognized. Here is suggested a new iterative algorithm for the characterization of the search space structure, working independently of learning processes. It enhances recognition rates by transferring catalysts to be screened from "performance-stable" space zones to "unsteady" ones which necessitate more experiments to be well-modeled. The evaluation of new algorithm attempts through benchmarks is compulsory due to the lack of past proofs about their efficiency. The method is detailed and thoroughly tested with mathematical functions exhibiting different levels of complexity. The strategy is not only empirically evaluated, the effect or efficiency of sampling on future Machine Learning performances is also quantified. The minimum sample size required by the algorithm for being statistically discriminated from simple random sampling is investigated.

  14. Beyond spaces of counselling

    DEFF Research Database (Denmark)

    Bank, Mads; Nissen, Morten

    2017-01-01

    The article articulates experiments with spatial constructions in two Danish social work agencies, based on (a) a sketchy genealogical reconstruction of conceptualisations and uses of space in social work and counselling, (b) a search for theoretical resources to articulate new spaces, and (c... spaces are forms of spatialisations which might be taken as prototypical in attempts to develop social work and counselling...

  15. Similarity search processing. Paralelization and indexing technologies.

    Directory of Open Access Journals (Sweden)

    Eder Dos Santos

    2015-08-01

    This Scientific-Technical Report addresses similarity search and the implementation of metric structures in parallel environments. It also presents the state of the art related to similarity search on metric structures and parallelism technologies. Comparative analyses are also proposed, seeking to identify the behavior of a set of metric spaces and metric structures on multicore-based and GPU-based processing platforms.

  16. Document Clustering Approach for Meta Search Engine

    Science.gov (United States)

    Kumar, Naresh, Dr.

    2017-08-01

    The size of the WWW is growing exponentially with every change in technology. This results in a huge amount of information with long lists of URLs. Manually, it is not possible to visit each page individually. So, if page ranking algorithms are used properly, the user's search space can be restricted to a few pages of search results. However, the available literature shows that no single search system can provide quality results across all domains. This paper provides a solution to this problem by introducing a new meta search engine that determines the relevancy of a query with respect to a web page and clusters the results accordingly. The proposed approach reduces user effort and improves the quality of results and the performance of the meta search engine.
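
    As a hedged sketch of the general idea, grouping result snippets pooled from several engines so that the user browses a few clusters instead of a long URL list, the following assumes scikit-learn and a generic TF-IDF/k-means criterion; it is not the relevancy measure proposed in the paper.

      # Cluster snippets pooled from several engines into topic groups.
      # TF-IDF + k-means is a generic stand-in for the paper's relevancy measure.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.cluster import KMeans

      def cluster_results(snippets, n_clusters=3):
          vectors = TfidfVectorizer(stop_words="english").fit_transform(snippets)
          labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(vectors)
          groups = {}
          for snippet, label in zip(snippets, labels):
              groups.setdefault(label, []).append(snippet)
          return groups

      snippets = [
          "python tutorial for beginners", "learn python programming",
          "jaguar speed in the wild", "big cats of south america",
          "jaguar car dealership prices",
      ]
      for label, group in cluster_results(snippets).items():
          print(label, group)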

  17. Novelty Search for Soft Robotic Space Exploration

    NARCIS (Netherlands)

    Methenitis, G.; Hennes, D.; Izzo, D.; Visser, A.

    2015-01-01

    The use of soft robots in future space exploration is still a far-fetched idea, but an attractive one. Soft robots are inherently compliant mechanisms that are well suited for locomotion on rough terrain as often faced in extra-planetary environments. Depending on the particular application and

  18. Novelty search for soft robotic space exploration

    NARCIS (Netherlands)

    G. Methenitis (Georgios); D. Hennes; D. Izzo; A. Visser

    2015-01-01

    The use of soft robots in future space exploration is still a far-fetched idea, but an attractive one. Soft robots are inherently compliant mechanisms that are well suited for locomotion on rough terrain as often faced in extra-planetary environments. Depending on the particular

  19. Learning Latent Vector Spaces for Product Search

    NARCIS (Netherlands)

    Van Gysel, C.; de Rijke, M.; Kanoulas, E.

    2016-01-01

    We introduce a novel latent vector space model that jointly learns the latent representations of words, e-commerce products and a mapping between the two without the need for explicit annotations. The power of the model lies in its ability to directly model the discriminative relation between

  20. Confluence reduction for Markov automata

    NARCIS (Netherlands)

    Timmer, Mark; Katoen, Joost P.; van de Pol, Jaco; Stoelinga, Mariëlle Ida Antoinette

    2016-01-01

    Markov automata are a novel formalism for specifying systems exhibiting nondeterminism, probabilistic choices and Markovian rates. As expected, the state space explosion threatens the analysability of these models. We therefore introduce confluence reduction for Markov automata, a powerful reduction

  1. Complexity in Simplicity: Flexible Agent-based State Space Exploration

    DEFF Research Database (Denmark)

    Rasmussen, Jacob Illum; Larsen, Kim Guldstrand

    2007-01-01

    In this paper, we describe a new flexible framework for state space exploration based on cooperating agents. The idea is to let various agents with different search patterns explore the state space individually and communicate information about fruitful subpaths of the search tree to each other...

  2. Low-Cost Radon Reduction Pilot Study

    Energy Technology Data Exchange (ETDEWEB)

    Rose, William B. [Partnership for Advanced Residential Retrofit, Champaign, IL (United States); Francisco, Paul W. [Partnership for Advanced Residential Retrofit, Champaign, IL (United States); Merrin, Zachary [Partnership for Advanced Residential Retrofit, Champaign, IL (United States)

    2015-09-01

    The aim of the research was to conduct a primary scoping study on the impact of air sealing between the foundation and the living space on radon transport reduction across the foundation-living space floor assembly. Fifteen homes in the Champaign, Illinois area participated in the study. These homes were instrumented for hourly continuous radon measurements and simultaneous temperature and humidity measurements. With air sealing, the isolation between the foundation and the living space was improved. However, this improved isolation did not lead to significant reductions in radon concentration in the living space. Other factors such as outdoor temperature were shown to have an impact on radon concentration.

  3. In-Space Propulsion, Logistics Reduction, and Evaluation of Steam Reformer Kinetics: Problems and Prospects

    Science.gov (United States)

    Jaworske, D. A.; Palaszewski, B. A.; Kulis, M. J.; Gokoglu, S. A.

    2015-01-01

    Human space missions generate waste materials. A 70-kg crewmember creates a waste stream of 1 kg per day, and a four-person crew on a deep space habitat for a 400+ day mission would create over 1600 kg of waste. Converted into methane, the carbon could be used as a fuel for propulsion or power. The NASA Advanced Exploration Systems (AES) Logistics Reduction and Repurposing (LRR) project is investing in space resource utilization with an emphasis on repurposing logistics materials for useful purposes and has selected steam reforming among many different competitive processes as the preferred method for repurposing organic waste into methane. Already demonstrated at the relevant processing rate of 5.4 kg of waste per day, high temperature oxygenated steam consumes waste and produces carbon dioxide, carbon monoxide, and hydrogen which can then be converted into methane catalytically. However, the steam reforming process has not been studied in microgravity. Data are critically needed to understand the mechanisms that allow use of steam reforming in a reduced gravity environment. This paper reviews the relevant literature, identifies gravity-dependent mechanisms within the steam gasification process, and describes an innovative experiment to acquire the crucial kinetic information in a small-scale reactor specifically designed to operate within the requirements of a reduced gravity aircraft flight. The experiment will determine if the steam reformer process is mass-transport limited, and if so, what level of forced convection will be needed to obtain performance comparable to that in 1-g.

  4. Optimal Control of Sensor Threshold for Autonomous Wide Area Search Munitions

    National Research Council Canada - National Science Library

    Kish, Brian A; Jacques, David R; Pachter, Meir

    2005-01-01

    The optimal employment of autonomous wide area search munitions is addressed. The scenario considered involves an airborne munition searching a battle space for stationary targets in the presence of false targets...

  5. Trade-Space Analysis Tool for Constellations (TAT-C)

    Science.gov (United States)

    Le Moigne, Jacqueline; Dabney, Philip; de Weck, Olivier; Foreman, Veronica; Grogan, Paul; Holland, Matthew; Hughes, Steven; Nag, Sreeja

    2016-01-01

    Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment, including instrument and spacecraft miniaturization, scalable launchers, secondary launches as well as hosted payloads, there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a-priori Science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: How many spacecraft should be included in the constellation? Which design has the best cost/risk value? The main goals of TAT-C are to: handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships; explore the variables trade space for pre-defined science, cost and risk goals, and pre-defined metrics; and optimize cost and performance across multiple instruments and platforms vs. one at a time. This paper describes the overall architecture of TAT-C including: a User Interface (UI) interacting with multiple users - scientists, mission designers or program managers; and an Executive Driver gathering requirements from the UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator, first with inputs from the Knowledge Base, then, in collaboration with the Orbit Coverage, Reduction Metrics, and Cost Risk modules, generating multiple potential architectures and their associated characteristics. TAT-C leverages the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. The current version of TAT-C includes uniform Walker constellations as well as Ad-Hoc constellations, and its cost model represents an aggregate model consisting of

  6. The search for an alerted moving target; 2005BU2-OA

    NARCIS (Netherlands)

    Vermeulen, J.F.J.; Brink, M. van den

    2005-01-01

    We investigate a two-sided, multi-stage search problem where a continuous search effort is made by one or more search units to detect a moving target in a continuous target space, under noisy detection conditions. A specific example of this problem is hunting for an enemy submarine by naval forces.

  7. Towards improving searches for optimal phylogenies.

    Science.gov (United States)

    Ford, Eric; St John, Katherine; Wheeler, Ward C

    2015-01-01

    Finding the optimal evolutionary history for a set of taxa is a challenging computational problem, even when restricting possible solutions to be "tree-like" and focusing on the maximum-parsimony optimality criterion. This has led to much work on using heuristic tree searches to find approximate solutions. We present an approach for finding exact optimal solutions that employs and complements the current heuristic methods for finding optimal trees. Given a set of taxa and a set of aligned sequences of characters, there may be subsets of characters that are compatible, and for each such subset there is an associated (possibly partially resolved) phylogeny with edges corresponding to each character state change. These perfect phylogenies serve as anchor trees for our constrained search space. We show that, for sequences with compatible sites, the parsimony score of any tree T is at least the parsimony score of the anchor trees plus the number of inferred changes between T and the anchor trees. As the maximum-parsimony optimality score is additive, the sum of the lower bounds on compatible character partitions provides a lower bound on the complete alignment of characters. This yields a region in the space of trees within which the best tree is guaranteed to be found; limiting the search for the optimal tree to this region can significantly reduce the number of trees that must be examined in a search of the space of trees. We analyze this method empirically using four different biological data sets as well as surveying 400 data sets from the TreeBASE repository, demonstrating the effectiveness of our technique in reducing the number of steps in exact heuristic searches for trees under the maximum-parsimony optimality criterion. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
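
    A toy sketch of the additive lower-bound idea (illustration only, not the authors' anchor-tree bound): for any single character, every tree needs at least as many changes as the number of observed states minus one, and because parsimony scores add across characters, summing such per-character bounds bounds the whole alignment; search regions whose bound already exceeds the score of the best tree found so far can be discarded.

      # Toy additive parsimony lower bound (not the anchor-tree bound of the paper).
      def character_lower_bound(column):
          """Minimum state changes any tree must make for one character column."""
          return len(set(column)) - 1

      def alignment_lower_bound(alignment):
          """Sum of per-character bounds = lower bound for the whole alignment."""
          return sum(character_lower_bound(col) for col in zip(*alignment))

      taxa_sequences = ["ACCT", "ACGT", "GCGA", "GCCA"]   # hypothetical aligned sequences
      print(alignment_lower_bound(taxa_sequences))        # 1 + 0 + 1 + 1 = 3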

  8. Positions priming in briefly presented search arrays

    DEFF Research Database (Denmark)

    Asgeirsson, Arni Gunnar; Kristjánsson, Árni; Kyllingsbæk, Søren

    2011-01-01

    Repetition priming in visual search has been a topic of extensive research since Maljkovic & Nakayama [1994, Memory & Cognition, 22, 657-672] presented the first detailed studies of such effects. Their results showed large reductions in reaction times when target color was repeated on consecutive pop-out search trials. Such repetition effects have since been generalized to a multitude of target attributes. Priming has primarily been investigated using self-terminating visual search paradigms, comparing differences in response times. Response accuracy has predominantly served as a control... Here, the targets are oddly colored alphanumeric characters. The effects arise at very low exposure durations and benefit accuracy at all exposure durations towards the subjects' ceiling. We conclude that temporally constricted experimental conditions can add to our understanding of priming in visual search...

  9. Confluence reduction for probabilistic systems

    NARCIS (Netherlands)

    Timmer, Mark; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette

    In this presentation we introduce a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We proved that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To

  10. Composite Differential Search Algorithm

    Directory of Open Access Journals (Sweden)

    Bo Liu

    2014-01-01

    Full Text Available Differential search algorithm (DS is a relatively new evolutionary algorithm inspired by the Brownian-like random-walk movement which is used by an organism to migrate. It has been verified to be more effective than ABC, JDE, JADE, SADE, EPSDE, GSA, PSO2011, and CMA-ES. In this paper, we propose four improved solution search algorithms, namely “DS/rand/1,” “DS/rand/2,” “DS/current to rand/1,” and “DS/current to rand/2” to search the new space and enhance the convergence rate for the global optimization problem. In order to verify the performance of different solution search methods, 23 benchmark functions are employed. Experimental results indicate that the proposed algorithm performs better than, or at least comparable to, the original algorithm when considering the quality of the solution obtained. However, these schemes cannot still achieve the best solution for all functions. In order to further enhance the convergence rate and the diversity of the algorithm, a composite differential search algorithm (CDS is proposed in this paper. This new algorithm combines three new proposed search schemes including “DS/rand/1,” “DS/rand/2,” and “DS/current to rand/1” with three control parameters using a random method to generate the offspring. Experiment results show that CDS has a faster convergence rate and better search ability based on the 23 benchmark functions.
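
    As a hedged reading of the "rand/1" naming above (the exact operators, scale factor and crossover in the paper may differ), a rand/1-style trial generation combines three distinct, randomly chosen members of the current population:

      # Generic "rand/1"-style trial generation: r1 + scale * (r2 - r3).
      # Illustrative only; the paper's DS operators may differ in detail.
      import numpy as np

      rng = np.random.default_rng(0)

      def rand1_step(population, scale=0.8):
          n, _ = population.shape
          trial = np.empty_like(population)
          for i in range(n):
              r1, r2, r3 = rng.choice([j for j in range(n) if j != i], size=3, replace=False)
              trial[i] = population[r1] + scale * (population[r2] - population[r3])
          return trial

      pop = rng.uniform(-5, 5, size=(10, 2))   # 10 candidates in a 2-D search space
      print(rand1_step(pop)[:3])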

  11. Hybrid Projected Gradient-Evolutionary Search Algorithm for Mixed Integer Nonlinear Optimization Problems

    National Research Council Canada - National Science Library

    Homaifar, Abdollah; Esterline, Albert; Kimiaghalam, Bahram

    2005-01-01

    The Hybrid Projected Gradient-Evolutionary Search Algorithm (HPGES) algorithm uses a specially designed evolutionary-based global search strategy to efficiently create candidate solutions in the solution space...

  12. A Fast Exact k-Nearest Neighbors Algorithm for High Dimensional Search Using k-Means Clustering and Triangle Inequality.

    Science.gov (United States)

    Wang, Xueyi

    2012-02-08

    The k-nearest neighbors (k-NN) algorithm is a widely used machine learning method that finds nearest neighbors of a test object in a feature space. We present a new exact k-NN algorithm called kMkNN (k-Means for k-Nearest Neighbors) that uses k-means clustering and the triangle inequality to accelerate the search for nearest neighbors in a high dimensional space. The kMkNN algorithm has two stages. In the buildup stage, instead of using complex tree structures such as metric trees, kd-trees, or ball-trees, kMkNN uses a simple k-means clustering method to preprocess the training dataset. In the searching stage, given a query object, kMkNN finds nearest training objects starting from the nearest cluster to the query object and uses the triangle inequality to reduce the distance calculations. Experiments show that the performance of kMkNN is surprisingly good compared to the traditional k-NN algorithm and tree-based k-NN algorithms such as kd-trees and ball-trees. On a collection of 20 datasets with up to 10^6 records and 10^4 dimensions, kMkNN shows a 2- to 80-fold reduction of distance calculations and a 2- to 60-fold speedup over the traditional k-NN algorithm for 16 datasets. Furthermore, kMkNN performs significantly better than a kd-tree based k-NN algorithm for all datasets and performs better than a ball-tree based k-NN algorithm for most datasets. The results show that kMkNN is effective for searching nearest neighbors in high dimensional spaces.
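
    A condensed sketch of the two-stage idea, assuming NumPy (the bookkeeping of the published kMkNN implementation is more involved): cluster the training set, store each point's distance to its centroid, and at query time use the triangle inequality d(q, x) >= |d(q, c) - d(x, c)| to skip exact distance computations that cannot beat the best candidate found so far (shown here for the 1-nearest-neighbour case).

      import numpy as np

      def build(train, n_clusters=5, iters=20, seed=0):
          """Buildup stage: plain k-means plus per-point distances to centroids."""
          rng = np.random.default_rng(seed)
          centroids = train[rng.choice(len(train), n_clusters, replace=False)]
          for _ in range(iters):
              labels = np.argmin(np.linalg.norm(train[:, None] - centroids, axis=2), axis=1)
              centroids = np.array([train[labels == c].mean(axis=0) if np.any(labels == c)
                                    else centroids[c] for c in range(n_clusters)])
          dist_to_centroid = np.linalg.norm(train - centroids[labels], axis=1)
          return centroids, labels, dist_to_centroid

      def query_1nn(q, train, centroids, labels, dist_to_centroid):
          """Search stage: nearest cluster first, triangle-inequality pruning inside."""
          best_d, best_i = np.inf, -1
          for c in np.argsort(np.linalg.norm(centroids - q, axis=1)):
              dqc = np.linalg.norm(q - centroids[c])
              for i in np.where(labels == c)[0]:
                  if abs(dqc - dist_to_centroid[i]) >= best_d:   # cannot beat best: skip
                      continue
                  d = np.linalg.norm(q - train[i])
                  if d < best_d:
                      best_d, best_i = d, i
          return best_i, best_d

      train = np.random.default_rng(1).normal(size=(200, 10))
      print(query_1nn(train[0] + 0.01, train, *build(train)))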

  13. Indirect and inclusive search for dark matter with AMS02 space spectrometer

    International Nuclear Information System (INIS)

    Brun, Pierre

    2007-01-01

    AMS02 is a particle physics detector designed for 3 years of data taking aboard the International Space Station. Equipped with a superconducting magnet, it will allow the measurement of gamma-ray and cosmic-ray fluxes in the GeV to TeV region with high particle identification capabilities. Its performance is based on the redundancy of measurements in specific sub-detectors: a Time-Of-Flight counter, a Transition Radiation Detector, a Silicon Tracker, a Ring Imaging Cherenkov counter and an Electromagnetic calorimeter (Ecal). The Ecal is studied in detail, in particular with the qualification of a stand-alone trigger devoted to gamma-ray astronomy. This system makes it possible to increase the AMS02 sensitivity to photons and to improve the reconstruction of electromagnetic events. The analog part of the trigger system has been tested on test benches and in-beam at CERN. The in-orbit calibration of the Ecal is studied; it may proceed in two steps. First, the Ecal cell responses have to be equalized with minimum-ionizing-particle data. Then an absolute calibration can be performed with cosmic electrons. For both the relative and the absolute calibration, possible procedures are defined and realistic calibration times are estimated. The second part deals with the indirect searches for dark matter and the study of the AMS02 sensitivity. Dark matter accounts for 84% of the mass of the Universe and could consist of new particles. Dark matter particles are expected to surround our Galaxy and annihilate in high-density regions. These annihilations could become observable as exotic primary cosmic-ray sources. Searches for anomalous excesses in (p-bar, e+, D-bar) and γ-ray fluxes will be performed by AMS02. A numerical tool allowing predictions for these exotic fluxes within supersymmetric or extra-dimensional models is developed and presented in detail. Phenomenological studies regarding possible enhancements of these signals by over-dense regions of the halo have also been performed. The

  14. Visual search deficits in amblyopia.

    Science.gov (United States)

    Tsirlin, Inna; Colpa, Linda; Goltz, Herbert C; Wong, Agnes M F

    2018-04-01

    Amblyopia is a neurodevelopmental disorder defined as a reduction in visual acuity that cannot be corrected by optical means. It has been associated with low-level deficits. However, research has demonstrated a link between amblyopia and visual attention deficits in counting, tracking, and identifying objects. Visual search is a useful tool for assessing visual attention but has not been well studied in amblyopia. Here, we assessed the extent of visual search deficits in amblyopia using feature and conjunction search tasks. We compared the performance of participants with amblyopia (n = 10) to those of controls (n = 12) on both feature and conjunction search tasks using Gabor patch stimuli, varying spatial bandwidth and orientation. To account for the low-level deficits inherent in amblyopia, we measured individual contrast and crowding thresholds and monitored eye movements. The display elements were then presented at suprathreshold levels to ensure that visibility was equalized across groups. There was no performance difference between groups on feature search, indicating that our experimental design controlled successfully for low-level amblyopia deficits. In contrast, during conjunction search, median reaction times and reaction time slopes were significantly larger in participants with amblyopia compared with controls. Amblyopia differentially affects performance on conjunction visual search, a more difficult task that requires feature binding and possibly the involvement of higher-level attention processes. Deficits in visual search may affect day-to-day functioning in people with amblyopia.

  15. Space-related pharma-motifs for fast search of protein binding motifs and polypharmacological targets.

    Science.gov (United States)

    Chiu, Yi-Yuan; Lin, Chun-Yu; Lin, Chih-Ta; Hsu, Kai-Cheng; Chang, Li-Zen; Yang, Jinn-Moon

    2012-01-01

    Discovering a compound that inhibits multiple proteins (i.e. polypharmacological targets) is a new paradigm for complex diseases (e.g. cancers and diabetes). In general, polypharmacological proteins often share similar local binding environments and motifs. With the exponential growth of the number of protein structures, finding similar structural binding motifs (pharma-motifs) is an urgent task for drug discovery (e.g. side effects and new uses for old drugs) and for understanding protein functions. We have developed a Space-Related Pharmamotifs (called SRPmotif) method to recognize binding motifs by searching against a protein structure database. SRPmotif is able to recognize conserved binding environments containing spatially discontinuous pharma-motifs, which are often short conserved peptides with specific physico-chemical properties related to protein functions. Among 356 pharma-motifs, 56.5% of interacting residues are highly conserved. Experimental results indicate that 81.1% and 92.7% of the polypharmacological targets of each protein-ligand complex are annotated with the same biological process (BP) and molecular function (MF) terms, respectively, based on Gene Ontology (GO). Our experimental results show that the identified pharma-motifs often consist of key residues in functional (active) sites and play key roles in protein functions. SRPmotif is available at http://gemdock.life.nctu.edu.tw/SRP/. SRPmotif is able to identify similar pharma-interfaces and pharma-motifs sharing similar binding environments for polypharmacological targets by rapidly searching against the protein structure database. Pharma-motifs describe the conservation of binding environments for drug discovery and protein functions. Additionally, these pharma-motifs provide clues for discovering new sequence-based motifs to predict protein functions from protein sequence databases. We believe that SRPmotif is useful for elucidating protein functions and for drug discovery.

  16. Changing Perspective: Zooming in and out during Visual Search

    Science.gov (United States)

    Solman, Grayden J. F.; Cheyne, J. Allan; Smilek, Daniel

    2013-01-01

    Laboratory studies of visual search are generally conducted in contexts with a static observer vantage point, constrained by a fixation cross or a headrest. In contrast, in many naturalistic search settings, observers freely adjust their vantage point by physically moving through space. In two experiments, we evaluate behavior during free vantage…

  17. Flavour Independent $h^{0}A^{0}$ Search and Two Higgs Doublet Model Interpretation of Neutral Higgs Boson Searches at LEP

    CERN Document Server

    Abbiendi, G.; Akesson, P.F.; Alexander, G.; Allison, John; Amaral, P.; Anagnostou, G.; Anderson, K.J.; Asai, S.; Axen, D.; Bailey, I.; Barberio, E.; Barillari, T.; Barlow, R.J.; Batley, R.J.; Bechtle, P.; Behnke, T.; Bell, Kenneth Watson; Bell, P.J.; Bella, G.; Bellerive, A.; Benelli, G.; Bethke, S.; Biebel, O.; Boeriu, O.; Bock, P.; Boutemeur, M.; Braibant, S.; Brown, Robert M.; Burckhart, H.J.; Campana, S.; Capiluppi, P.; Carnegie, R.K.; Carter, A.A.; Carter, J.R.; Chang, C.Y.; Charlton, D.G.; Ciocca, C.; Csilling, A.; Cuffiani, M.; Dado, S.; De Roeck, A.; De Wolf, E.A.; Desch, K.; Dienes, B.; Donkers, M.; Dubbert, J.; Duchovni, E.; Duckeck, G.; Duerdoth, I.P.; Etzion, E.; Fabbri, F.; Ferrari, P.; Fiedler, F.; Fleck, I.; Ford, M.; Frey, A.; Gagnon, P.; Gary, John William; Geich-Gimbel, C.; Giacomelli, G.; Giacomelli, P.; Giunta, Marina; Goldberg, J.; Gross, E.; Grunhaus, J.; Gruwe, M.; Gunther, P.O.; Gupta, A.; Hajdu, C.; Hamann, M.; Hanson, G.G.; Harel, A.; Hauschild, M.; Hawkes, C.M.; Hawkings, R.; Hemingway, R.J.; Herten, G.; Heuer, R.D.; Hill, J.C.; Hoffman, Kara Dion; Horvath, D.; Igo-Kemenes, P.; Ishii, K.; Jeremie, H.; Jovanovic, P.; Junk, T.R.; Kanzaki, J.; Karlen, D.; Kawagoe, K.; Kawamoto, T.; Keeler, R.K.; Kellogg, R.G.; Kennedy, B.W.; Kluth, S.; Kobayashi, T.; Kobel, M.; Komamiya, S.; Kramer, T.; Krieger, P.; von Krogh, J.; Kuhl, T.; Kupper, M.; Lafferty, G.D.; Landsman, H.; Lanske, D.; Lellouch, D.; Lettso, J.; Levinson, L.; Lillich, J.; Lloyd, S.L.; Loebinger, F.K.; Lu, J.; Ludwig, A.; Ludwig, J.; Mader, W.; Marcellini, S.; Martin, A.J.; Masetti, G.; Mashimo, T.; Mattig, Peter; McKenna, J.; McPherson, R.A.; Meijers, F.; Menges, W.; Merritt, F.S.; Mes, H.; Meyer, Niels T.; Michelini, A.; Mihara, S.; Mikenberg, G.; Miller, D.J.; Mohr, W.; Mori, T.; Mutter, A.; Nagai, K.; Nakamura, I.; Nanjo, H.; Neal, H.A.; Nisius, R.; O'Neale, S.W.; Oh, A.; Oreglia, M.J.; Orito, S.; Pahl, C.; Pasztor, G.; Pater, J.R.; Pilcher, J.E.; Pinfold, J.; Plane, David E.; Pooth, O.; Przybycien, M.; Quadt, A.; Rabbertz, K.; Rembser, C.; Renkel, P.; Roney, J.M.; Rossi, A.M.; Rozen, Y.; Runge, K.; Sachs, K.; Saeki, T.; Sarkisyan, E.K.G.; Schaile, A.D.; Schaile, O.; Scharff-Hansen, P.; Schieck, J.; Schorner-Sadenius, T.; Schroder, Matthias; Schumacher, M.; Seuster, R.; Shears, T.G.; Shen, B.C.; Sherwood, P.; Skuja, A.; Smith, A.M.; Sobie, R.; Soldner-Rembold, S.; Spano, F.; Stahl, A.; Strom, David M.; Strohmer, R.; Tarem, S.; Tasevsky, M.; Teuscher, R.; Thomson, M.A.; Torrence, E.; Toya, D.; Tran, P.; Trigger, I.; Trocsanyi, Z.; Tsur, E.; Turner-Watson, M.F.; Ueda, I.; Ujvari, B.; Vollmer, C.F.; Vannerem, P.; Vertesi, R.; Verzocchi, M.; Voss, H.; Vossebeld, J.; Ward, C.P.; Ward, D.R.; Watkins, P.M.; Watson, A.T.; Watson, N.K.; Wells, P.S.; Wengler, T.; Wermes, N.; Wilson, G.W.; Wilson, J.A.; Wolf, G.; Wyatt, T.R.; Yamashita, S.; Zer-Zion, D.; Zivkovic, Lidija

    2005-01-01

    Upper limits on the cross-section of the pair-production process e+e- -> h0A0, assuming 100% decays into hadrons, are derived from a new search for the h0A0 -> hadrons topology, independent of the hadronic flavour of the decay products. Searches for the neutral Higgs bosons h0 and A0 are used to obtain constraints on the Type II Two Higgs Doublet Model (2HDM(II)) with no CP violation in the Higgs sector and no additional non-Standard-Model particles besides the five Higgs bosons. The analysis combines LEP1 and LEP2 data collected with the OPAL detector up to the highest available centre-of-mass energies. The searches are sensitive to the h0, A0 -> qq, gg, tau+tau- and h0 -> A0A0 decay modes of the Higgs bosons. The 2HDM(II) parameter space is explored in a detailed scan. Large regions of the 2HDM(II) parameter space are excluded at the 95% CL in the (mh, mA), (mh, tanb) and (mA, tanb) planes, using both direct neutral Higgs boson searches and indirect limits derived from Standard Model high precision measuremen...

  18. Semantic interpretation of search engine resultant

    Science.gov (United States)

    Nasution, M. K. M.

    2018-01-01

    In semantics, logical language can be interpreted in various forms, but the certainty of meaning is embedded in uncertainty, which always directly influences the role of technology. One result of this uncertainty applies to search engines as user interfaces to information spaces such as the Web. Therefore, the behaviour of search engine results should be interpreted with certainty through semantic formulation as interpretation. The behaviour formulation shows that there are various interpretations that can be made semantically, either temporary, by inclusion, or by repetition.

  19. The search for identity in Bessie Head's Maru | Egbung | Sophia: An ...

    African Journals Online (AJOL)

    The search for identity is an inward search that is propelled by a situation where the real being of a person is questioned. Identity determines self-perception, configuration of spaces, and the politics of interpersonal relationship. This paper uses the feminist theory to argue that the search for identity and self-actualization is ...

  20. Geometric differential evolution for combinatorial and programs spaces.

    Science.gov (United States)

    Moraglio, A; Togelius, J; Silva, S

    2013-01-01

    Geometric differential evolution (GDE) is a recently introduced formal generalization of traditional differential evolution (DE) that can be used to derive specific differential evolution algorithms for both continuous and combinatorial spaces retaining the same geometric interpretation of the dynamics of the DE search across representations. In this article, we first review the theory behind the GDE algorithm, then, we use this framework to formally derive specific GDE for search spaces associated with binary strings, permutations, vectors of permutations and genetic programs. The resulting algorithms are representation-specific differential evolution algorithms searching the target spaces by acting directly on their underlying representations. We present experimental results for each of the new algorithms on a number of well-known problems comprising NK-landscapes, TSP, and Sudoku, for binary strings, permutations, and vectors of permutations. We also present results for the regression, artificial ant, parity, and multiplexer problems within the genetic programming domain. Experiments show that overall the new DE algorithms are competitive with well-tuned standard search algorithms.

  1. Precipitation-Static-Reduction Research

    Science.gov (United States)

    1943-03-31

    A study of the effects of flame length, flame spacing, and burner spacing on the flame-conduction coefficient B shows that... Flame length: the visual length of the flame from the burner tip to the flame tip when examined in a darkened room against a black background... Positive and negative flames: the use of the second flame-conduction coefficient, B, facilitates considerably the study of the effect of flame length and spacing

  2. Investigation of the equality constraint effect on the reduction of the rotational ambiguity in three-component system using a novel grid search method.

    Science.gov (United States)

    Beyramysoltan, Samira; Rajkó, Róbert; Abdollahi, Hamid

    2013-08-12

    The results obtained by soft-modeling multivariate curve resolution methods are often not unique and are questionable because of rotational ambiguity. This means that a range of feasible solutions fits the experimental data equally well and fulfills the constraints. In the chemometric literature, a survey of useful constraints for the reduction of the rotational ambiguity is a big challenge for chemometricians. It is worth studying the effects of applying constraints on the reduction of rotational ambiguity, since this can help us choose the useful constraints to impose in multivariate curve resolution methods for analyzing data sets. In this work, we have investigated the effect of the equality constraint on decreasing the rotational ambiguity. For the calculation of all feasible solutions corresponding to a known spectrum, a novel systematic grid search method based on Species-based Particle Swarm Optimization is proposed for a three-component system. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Phylogenetic search through partial tree mixing

    Science.gov (United States)

    2012-01-01

    Background: Recent advances in sequencing technology have created large data sets upon which phylogenetic inference can be performed. Current research is limited by the prohibitive time necessary to perform tree search on a reasonable number of individuals. This research develops new phylogenetic algorithms that can operate on tens of thousands of species in a reasonable amount of time through several innovative search techniques. Results: When compared to popular phylogenetic search algorithms, better trees are found much more quickly for large data sets. These algorithms are incorporated in the PSODA application, available at http://dna.cs.byu.edu/psoda. Conclusions: The use of Partial Tree Mixing in a partition-based tree space allows the algorithm to quickly converge on near-optimal tree regions. These regions can then be searched in a methodical way to determine the overall optimal phylogenetic solution. PMID:23320449

  4. Recursions of Symmetry Orbits and Reduction without Reduction

    Directory of Open Access Journals (Sweden)

    Andrei A. Malykh

    2011-04-01

    Full Text Available We consider a four-dimensional PDE possessing partner symmetries mainly on the example of complex Monge-Ampère equation (CMA. We use simultaneously two pairs of symmetries related by a recursion relation, which are mutually complex conjugate for CMA. For both pairs of partner symmetries, using Lie equations, we introduce explicitly group parameters as additional variables, replacing symmetry characteristics and their complex conjugates by derivatives of the unknown with respect to group parameters. We study the resulting system of six equations in the eight-dimensional space, that includes CMA, four equations of the recursion between partner symmetries and one integrability condition of this system. We use point symmetries of this extended system for performing its symmetry reduction with respect to group parameters that facilitates solving the extended system. This procedure does not imply a reduction in the number of physical variables and hence we end up with orbits of non-invariant solutions of CMA, generated by one partner symmetry, not used in the reduction. These solutions are determined by six linear equations with constant coefficients in the five-dimensional space which are obtained by a three-dimensional Legendre transformation of the reduced extended system. We present algebraic and exponential examples of such solutions that govern Legendre-transformed Ricci-flat Kähler metrics with no Killing vectors. A similar procedure is briefly outlined for Husain equation.

  5. Reduction of respiratory ghosting motion artifacts in conventional two-dimensional multi-slice Cartesian turbo spin-echo: which k-space filling order is the best?

    Science.gov (United States)

    Inoue, Yuuji; Yoneyama, Masami; Nakamura, Masanobu; Takemura, Atsushi

    2018-06-01

    The two-dimensional Cartesian turbo spin-echo (TSE) sequence is widely used in routine clinical studies, but it is sensitive to respiratory motion. We investigated the k-space orders in Cartesian TSE that can effectively reduce motion artifacts. The purpose of this study was to demonstrate the relationship between k-space order and degree of motion artifacts using a moving phantom. We compared the degree of motion artifacts between linear and asymmetric k-space orders. The actual spacing of ghost artifacts in the asymmetric order was doubled compared with that in the linear order in the free-breathing situation. The asymmetric order clearly showed less sensitivity to incomplete breath-hold at the latter half of the imaging period. Because of the actual number of partitions of the k-space and the temporal filling order, the asymmetric k-space order of Cartesian TSE was superior to the linear k-space order for reduction of ghosting motion artifacts.

  6. Flexible Composites for Space

    Data.gov (United States)

    National Aeronautics and Space Administration — Payload mass reduction and packaging efficiency in launch vehicles are essential for deep space exploration.  Inflatable softgoods have been identified as attractive...

  7. The topography of the environment alters the optimal search strategy for active particles

    Science.gov (United States)

    Volpe, Giorgio; Volpe, Giovanni

    2017-10-01

    In environments with scarce resources, adopting the right search strategy can make the difference between succeeding and failing, even between life and death. At different scales, this applies to molecular encounters in the cell cytoplasm, to animals looking for food or mates in natural landscapes, to rescuers during search and rescue operations in disaster zones, and to genetic computer algorithms exploring parameter spaces. When looking for sparse targets in a homogeneous environment, a combination of ballistic and diffusive steps is considered optimal; in particular, more ballistic Lévy flights with exponent α≤1 are generally believed to optimize the search process. However, most search spaces present complex topographies. What is the best search strategy in these more realistic scenarios? Here, we show that the topography of the environment significantly alters the optimal search strategy toward less ballistic and more Brownian strategies. We consider an active particle performing a blind cruise search for nonregenerating sparse targets in a 2D space with steps drawn from a Lévy distribution with the exponent varying from α=1 to α=2 (Brownian). We show that, when boundaries, barriers, and obstacles are present, the optimal search strategy depends on the topography of the environment, with α assuming intermediate values in the whole range under consideration. We interpret these findings using simple scaling arguments and discuss their robustness to varying searcher's size. Our results are relevant for search problems at different length scales from animal and human foraging to microswimmers' taxis to biochemical rates of reaction.
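
    As a hedged illustration of the kind of searcher simulated in such studies (a Pareto power-law surrogate for the step lengths; the paper itself uses Lévy α-stable increments, and all names here are illustrative):

      # 2-D blind searcher with power-law step lengths p(l) ~ l**-(1+alpha):
      # smaller alpha gives heavier tails (more ballistic, Levy-like flights),
      # larger alpha gives lighter tails (more diffusive behaviour).
      import numpy as np

      def levy_like_walk(n_steps, alpha=1.5, l_min=1.0, seed=0):
          rng = np.random.default_rng(seed)
          u = 1.0 - rng.random(n_steps)                 # uniform in (0, 1]
          lengths = l_min * u ** (-1.0 / alpha)         # inverse-transform Pareto sampling
          angles = rng.uniform(0.0, 2.0 * np.pi, n_steps)
          steps = np.column_stack((lengths * np.cos(angles), lengths * np.sin(angles)))
          return np.cumsum(steps, axis=0)               # searcher trajectory

      for a in (1.0, 1.5, 2.0):
          path = levy_like_walk(10_000, alpha=a)
          print(f"alpha={a}: net displacement {np.linalg.norm(path[-1]):.1f}")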

  8. Co/N–C nanotubes with increased coupling sites by space-confined pyrolysis for high electrocatalytic activity

    Directory of Open Access Journals (Sweden)

    Jun Yang

    2017-01-01

    Full Text Available Searching for low-cost, non-precious-metal catalysts for the high-performance oxygen reduction reaction is highly desirable. Herein, Co nanoparticles embedded in nitrogen-doped carbon (Co/N–C) nanotubes with internal void space are successfully synthesized by space-confined pyrolysis, which effectively improves the cobalt loading content and restricts the encapsulated particles to the nanometer scale. Different from the typical conformal carbon encapsulation, the resulting Co/N–C nanotubes possess more cobalt nanoparticles embedded in the nanotubes, which can provide more coupling sites and active sites in the oxygen reduction reaction (ORR). Moreover, the one-dimensional and porous structure provides a high surface area and a fast electron transfer pathway for the ORR. The Co/N–C electrode presents excellent electrocatalytic ORR activity in terms of low onset potential (30 mV lower than that of Pt/C), small Tafel slope (45.5 mV dec−1) and good durability (88.5% retention after 10,000 s). Keywords: Co nanoparticles, Nitrogen-doped carbon nanotubes, Oxygen reduction reaction

  9. Recent results on the search for continuous sources with LIGO and GEO 600

    International Nuclear Information System (INIS)

    Sintes, Alicia M

    2006-01-01

    An overview of the searches for continuous gravitational-wave signals performed with LIGO and GEO 600 data from recent science runs is presented, together with their results. This includes both searches for gravitational waves from known pulsars and blind searches over a wide parameter space.

  10. Search-based model identification of smart-structure damage

    Science.gov (United States)

    Glass, B. J.; Macalou, A.

    1991-01-01

    This paper describes the use of a combined model and parameter identification approach, based on modal analysis and artificial intelligence (AI) techniques, for identifying damage or flaws in a rotating truss structure incorporating embedded piezoceramic sensors. This smart structure example is representative of a class of structures commonly found in aerospace systems and next generation space structures. Artificial intelligence techniques of classification, heuristic search, and an object-oriented knowledge base are used in an AI-based model identification approach. A finite model space is classified into a search tree, over which a variant of best-first search is used to identify the model whose stored response most closely matches that of the input. Newly-encountered models can be incorporated into the model space. This adaptiveness demonstrates the potential for learning control. Following this output-error model identification, numerical parameter identification is used to further refine the identified model. Given the rotating truss example in this paper, noisy data corresponding to various damage configurations are input to both this approach and a conventional parameter identification method. The combination of the AI-based model identification with parameter identification is shown to lead to smaller parameter corrections than required by the use of parameter identification alone.
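
    As a rough illustration of the output-error model identification step described above, the Python sketch below runs a best-first search over a tree of candidate models, always expanding the model whose stored response is closest to the measured one and stopping once the mismatch falls below a tolerance. The children, mismatch and tol arguments stand in for the knowledge-base expansion rule and the modal-response comparison of the paper; this is a generic sketch, not the authors' implementation.

        import heapq

        def best_first_model_id(root, children, mismatch, tol=1e-3):
            """Best-first search over a model tree.
            children(node) -> iterable of candidate refinements of `node`;
            mismatch(node) -> discrepancy between the node's stored response and the
            measured response (lower is better). Stops early at a good-enough match."""
            counter = 0                                    # tie-breaker for the heap
            frontier = [(mismatch(root), counter, root)]
            best_node, best_score = root, frontier[0][0]
            seen = set()
            while frontier:
                score, _, node = heapq.heappop(frontier)
                if id(node) in seen:
                    continue
                seen.add(id(node))
                if score < best_score:
                    best_node, best_score = node, score
                if score <= tol:
                    return node, score                     # acceptable model found
                for child in children(node):
                    counter += 1
                    heapq.heappush(frontier, (mismatch(child), counter, child))
            return best_node, best_score                   # closest model encountered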

  11. A solution to energy and environmental problems of electric power system using hybrid harmony search-random search optimization algorithm

    Directory of Open Access Journals (Sweden)

    Vikram Kumar Kamboj

    2016-04-01

    Full Text Available In recent years, global warming and carbon dioxide (CO2) emission reduction have become important issues in India. As CO2 emission levels continue to rise with the increasing volume of Indian national energy consumption, it is crucial for the Indian government to impose effective policies to promote CO2 emission reduction. The challenge of supplying the nation with high-quality and reliable electrical energy at a reasonable cost has pushed government policy toward deregulation and a restructured environment. This research paper presents an effective solution for the energy and environmental problems of electric power using an efficient and powerful hybrid optimization algorithm: the hybrid harmony search-random search algorithm. The proposed algorithm is tested on the standard IEEE 14-bus, 30-bus and 56-bus systems. The effectiveness of the proposed hybrid algorithm is compared with other well-known evolutionary, heuristic and meta-heuristic search algorithms. For multi-objective unit commitment, it is found that, because of the conflicting relationship between cost and emission, improving performance on the cost criterion causes performance on emissions to deteriorate.
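
    For readers unfamiliar with the harmony search half of the hybrid, the Python sketch below shows a minimal continuous harmony search minimizer with the usual memory-consideration (HMCR), pitch-adjustment (PAR) and random-selection steps. In the unit-commitment setting the objective f would encode fuel cost plus an emission penalty; the parameter values and interface here are illustrative assumptions, and the random-search hybridization of the paper is not reproduced.

        import numpy as np

        def harmony_search(f, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=2000, rng=None):
            """Minimal continuous harmony search (minimization).
            f: objective; bounds: sequence of (low, high) per decision variable."""
            rng = np.random.default_rng() if rng is None else rng
            lo, hi = np.array(bounds, dtype=float).T
            dim = len(bounds)
            hm = rng.uniform(lo, hi, size=(hms, dim))            # harmony memory
            cost = np.apply_along_axis(f, 1, hm)
            for _ in range(iters):
                new = np.empty(dim)
                for j in range(dim):
                    if rng.random() < hmcr:                      # memory consideration
                        new[j] = hm[rng.integers(hms), j]
                        if rng.random() < par:                   # pitch adjustment
                            new[j] += bw * (hi[j] - lo[j]) * rng.uniform(-1.0, 1.0)
                    else:                                        # random selection
                        new[j] = rng.uniform(lo[j], hi[j])
                new = np.clip(new, lo, hi)
                c = f(new)
                worst = np.argmax(cost)
                if c < cost[worst]:                              # replace worst harmony
                    hm[worst], cost[worst] = new, c
            best = np.argmin(cost)
            return hm[best], cost[best]

        # Example: a toy quadratic stands in for the cost-plus-emission objective
        x_best, c_best = harmony_search(lambda x: float(np.sum(x ** 2)), bounds=[(-5, 5)] * 3)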

  12. A comparative study of the A* heuristic search algorithm used to solve efficiently a puzzle game

    Science.gov (United States)

    Iordan, A. E.

    2018-01-01

    The puzzle game presented in this paper consists of polyhedra (prisms, pyramids or pyramidal frustums) which can be moved using the available free spaces. The problem requires finding the minimum number of moves needed for the game to reach a goal configuration starting from an initial configuration. Because the problem is sufficiently complex, the principal difficulty in solving it lies in the dimension of the search space, which leads to the necessity of a heuristic search. Improving the search method consists of determining a strong estimate through the heuristic function, which guides the search process toward the most promising side of the search tree. The comparative study is carried out between the Manhattan heuristic and the Hamming heuristic using the A* search algorithm implemented in Java. This paper also presents the necessary stages in the object-oriented development of software used to solve this puzzle game efficiently. The modelling of the software is achieved through specific UML diagrams representing the phases of analysis, design and implementation, the system thus being described in a clear and practical manner. To confirm the theoretical result that the Manhattan heuristic is more efficient, a space-complexity criterion was used. The space complexity was measured by the number of nodes generated in the search tree, by the number of expanded nodes and by the effective branching factor. The experimental results show that the Manhattan heuristic improves the space complexity of the A* algorithm compared with the Hamming heuristic.
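
    The Python sketch below reproduces the core of this comparison on the classic 3x3 sliding-tile puzzle, which serves here only as a stand-in for the polyhedra puzzle of the paper (the paper's own implementation is in Java and its state space differs). It counts expanded nodes under A* so that the Manhattan and Hamming heuristics can be compared on the space-complexity criterion described above.

        import heapq

        GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)                       # 0 marks the free space

        def manhattan(state):
            """Sum of Manhattan distances of the tiles from their goal cells."""
            return sum(abs(i // 3 - (v - 1) // 3) + abs(i % 3 - (v - 1) % 3)
                       for i, v in enumerate(state) if v)

        def hamming(state):
            """Number of misplaced tiles (the blank is not counted)."""
            return sum(1 for i, v in enumerate(state) if v and v != i + 1)

        def neighbours(state):
            i = state.index(0)
            r, c = divmod(i, 3)
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < 3 and 0 <= nc < 3:
                    j = nr * 3 + nc
                    s = list(state)
                    s[i], s[j] = s[j], s[i]
                    yield tuple(s)

        def astar(start, h):
            """Returns (number of moves to the goal, number of expanded nodes)."""
            frontier = [(h(start), 0, start)]
            g_cost = {start: 0}
            expanded = 0
            while frontier:
                f_est, g, state = heapq.heappop(frontier)
                if state == GOAL:
                    return g, expanded
                if g > g_cost.get(state, g):
                    continue                                     # stale queue entry
                expanded += 1
                for nxt in neighbours(state):
                    ng = g + 1
                    if ng < g_cost.get(nxt, float("inf")):
                        g_cost[nxt] = ng
                        heapq.heappush(frontier, (ng + h(nxt), ng, nxt))
            return None, expanded

        # The Manhattan heuristic typically expands far fewer nodes than Hamming:
        start = (1, 2, 3, 4, 5, 6, 0, 7, 8)
        print(astar(start, manhattan), astar(start, hamming))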

  13. An extended dual search space model of scientific discovery learning

    NARCIS (Netherlands)

    van Joolingen, Wouter; de Jong, Anthonius J.M.

    1997-01-01

    This article describes a theory of scientific discovery learning which is an extension of Klahr and Dunbar's Scientific Discovery as Dual Search (SDDS) model. We present a model capable of describing and understanding scientific discovery learning in complex domains in terms of the SDDS

  14. Optimum Design of Braced Steel Space Frames including Soil-Structure Interaction via Teaching-Learning-Based Optimization and Harmony Search Algorithms

    Directory of Open Access Journals (Sweden)

    Ayse T. Daloglu

    2018-01-01

    Full Text Available Optimum design of braced steel space frames including soil-structure interaction is studied by using harmony search (HS) and teaching-learning-based optimization (TLBO) algorithms. A three-parameter elastic foundation model is used to incorporate the soil-structure interaction effect. A 10-storey braced steel space frame example taken from the literature is investigated for four different bracing types, for the cases with and without soil-structure interaction. X, V, Z, and eccentric V-shaped bracing types are considered in the study. Optimum solutions of the examples are obtained by a computer program coded in MATLAB interacting with SAP2000-OAPI for two-way data exchange. The stress constraints according to AISC-ASD (American Institute of Steel Construction-Allowable Stress Design), maximum lateral displacement constraints, interstorey drift constraints, and beam-to-column connection constraints are taken into consideration in the optimum design process. The parameters of the foundation model are calculated from the soil surface displacements using an iterative approach. The results obtained in the study show that bracing types and soil-structure interaction play very important roles in the optimum design of steel space frames. Finally, the techniques used in the optimum design seem to be quite suitable for practical applications.
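
    To complement the harmony search sketch given earlier, the following Python fragment shows a bare-bones teaching-learning-based optimization loop (teacher phase followed by learner phase). In the frame-design setting, f would return the structure's weight plus penalties for violated AISC-ASD stress, displacement and drift constraints evaluated by the structural analysis; that coupling (SAP2000-OAPI, member grouping, discrete section lists) is not shown, and the parameter values are illustrative.

        import numpy as np

        def tlbo(f, bounds, pop_size=30, iters=200, rng=None):
            """Minimal teaching-learning-based optimization (minimization).
            bounds: sequence of (low, high) per design variable; f: penalized objective."""
            rng = np.random.default_rng() if rng is None else rng
            lo, hi = np.array(bounds, dtype=float).T
            dim = len(bounds)
            X = rng.uniform(lo, hi, size=(pop_size, dim))
            F = np.apply_along_axis(f, 1, X)
            for _ in range(iters):
                # teacher phase: move the class toward the current best solution
                teacher = X[np.argmin(F)]
                TF = rng.integers(1, 3)                          # teaching factor in {1, 2}
                for i in range(pop_size):
                    trial = np.clip(X[i] + rng.random(dim) * (teacher - TF * X.mean(axis=0)), lo, hi)
                    ft = f(trial)
                    if ft < F[i]:
                        X[i], F[i] = trial, ft
                # learner phase: learn pairwise from a random classmate
                for i in range(pop_size):
                    j = rng.integers(pop_size)
                    while j == i:
                        j = rng.integers(pop_size)
                    direction = X[i] - X[j] if F[i] < F[j] else X[j] - X[i]
                    trial = np.clip(X[i] + rng.random(dim) * direction, lo, hi)
                    ft = f(trial)
                    if ft < F[i]:
                        X[i], F[i] = trial, ft
            best = np.argmin(F)
            return X[best], F[best]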

  15. Performance of genetic algorithms in search for water splitting perovskites

    DEFF Research Database (Denmark)

    Jain, A.; Castelli, Ivano Eligio; Hautier, G.

    2013-01-01

    We examine the performance of genetic algorithms (GAs) in uncovering solar water light splitters over a space of almost 19,000 perovskite materials. The entire search space was previously calculated using density functional theory to determine solutions that fulfill constraints on stability, band...
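
    As a schematic illustration of how a genetic algorithm explores a discrete materials space, the Python sketch below evolves candidate compositions encoded as one gene per site (e.g., A-cation, B-cation, anion). The fitness function is a placeholder for the DFT-derived screening criteria (stability, band gap, band-edge positions) used in the study; the encoding, operators and parameter values are illustrative assumptions, not the authors' workflow.

        import random

        def genetic_search(axes, fitness, pop_size=40, generations=50, p_mut=0.2, rng=None):
            """Toy GA over a discrete composition space.
            axes: list (length >= 2) of allowed values per gene, e.g. [A_sites, B_sites, anions];
            fitness(individual) -> float, higher is better (supplied by the screening model)."""
            rng = rng or random.Random()
            def random_individual():
                return tuple(rng.choice(axis) for axis in axes)
            pop = [random_individual() for _ in range(pop_size)]
            for _ in range(generations):
                scored = sorted(pop, key=fitness, reverse=True)
                parents = scored[: pop_size // 2]                 # truncation selection
                children = []
                while len(children) < pop_size - len(parents):
                    p1, p2 = rng.sample(parents, 2)
                    cut = rng.randrange(1, len(axes))
                    child = list(p1[:cut] + p2[cut:])             # one-point crossover
                    if rng.random() < p_mut:                      # point mutation
                        g = rng.randrange(len(axes))
                        child[g] = rng.choice(axes[g])
                    children.append(tuple(child))
                pop = parents + children
            return max(pop, key=fitness)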

  16. Robots for hazardous duties: Military, space, and nuclear facility applications. (Latest citations from the NTIS bibliographic database). Published Search

    International Nuclear Information System (INIS)

    1993-09-01

    The bibliography contains citations concerning the design and application of robots used in place of humans where the environment could be hazardous. Military applications include autonomous land vehicles, robotic howitzers, and battlefield support operations. Space operations include docking, maintenance, mission support, and intra-vehicular and extra-vehicular activities. Nuclear applications include operations within the containment vessel, radioactive waste operations, fueling operations, and plant security. Many of the articles reference control techniques and the use of expert systems in robotic operations. Applications involving industrial manufacturing, walking robots, and robot welding are cited in other published searches in this series. (Contains a minimum of 183 citations and includes a subject term index and title list.)

  17. Collection of Medical Original Data with Search Engine for Decision Support.

    Science.gov (United States)

    Orthuber, Wolfgang

    2016-01-01

    Medicine is becoming more and more complex, and humans can capture total medical knowledge only partially. For specific access, a high-resolution search engine is demonstrated which, besides conventional text search, also allows searching precise quantitative data on medical findings, therapies and results. Users can define metric spaces ("Domain Spaces", DSs) containing all searchable quantitative data ("Domain Vectors", DVs). An implementation of the search engine is online at http://numericsearch.com. In future medicine the doctor could first make a rough diagnosis and check which fine diagnostics (quantitative data) colleagues had collected in such a case. The doctor then decides on fine diagnostics, and the results are sent (semi-automatically) to the search engine, which filters the group of patients that best fits these data. Within this specific group, the various therapies can be examined together with their associated therapeutic results, like an individual scientific study for the current patient. The statistical (anonymous) results could be used for specific decision support. Conversely, the therapeutic decision (ideally together with later results) could be used to enhance the collection of precise pseudonymous original medical data, which in turn yields better and better statistical (anonymous) search results.
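
    The quantitative search described above amounts to nearest-neighbour retrieval in a user-defined metric space of domain vectors. The Python sketch below illustrates the idea with a weighted Euclidean metric; the field layout, weights and brute-force scan are illustrative assumptions rather than the engine's actual implementation (which would need indexing to scale).

        import numpy as np

        def nearest_cases(query, cases, weights=None, k=5):
            """Return indices and distances of the k stored domain vectors closest to
            `query` under a weighted Euclidean metric. `cases` is an (n, d) array of
            quantitative findings; `weights` expresses the relevance of each field."""
            cases = np.asarray(cases, dtype=float)
            query = np.asarray(query, dtype=float)
            w = np.ones(cases.shape[1]) if weights is None else np.asarray(weights, dtype=float)
            d = np.sqrt((((cases - query) ** 2) * w).sum(axis=1))
            order = np.argsort(d)[:k]
            return order, d[order]

        # Example with three hypothetical findings per patient (values are made up)
        cohort = [[5.1, 120.0, 0.8], [4.9, 135.0, 1.1], [6.3, 118.0, 0.7]]
        idx, dist = nearest_cases([5.0, 122.0, 0.9], cohort, weights=[1.0, 0.01, 10.0], k=2)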

  18. Data Transformation Functions for Expanded Search Spaces in Geographic Sample Supervised Segment Generation

    Directory of Open Access Journals (Sweden)

    Christoff Fourie

    2014-04-01

    Full Text Available Sample supervised image analysis, in particular sample supervised segment generation, shows promise as a methodological avenue applicable within Geographic Object-Based Image Analysis (GEOBIA. Segmentation is acknowledged as a constituent component within typically expansive image analysis processes. A general extension to the basic formulation of an empirical discrepancy measure directed segmentation algorithm parameter tuning approach is proposed. An expanded search landscape is defined, consisting not only of the segmentation algorithm parameters, but also of low-level, parameterized image processing functions. Such higher dimensional search landscapes potentially allow for achieving better segmentation accuracies. The proposed method is tested with a range of low-level image transformation functions and two segmentation algorithms. The general effectiveness of such an approach is demonstrated compared to a variant only optimising segmentation algorithm parameters. Further, it is shown that the resultant search landscapes obtained from combining mid- and low-level image processing parameter domains, in our problem contexts, are sufficiently complex to warrant the use of population based stochastic search methods. Interdependencies of these two parameter domains are also demonstrated, necessitating simultaneous optimization.

  19. Collective search by ants in microgravity

    Directory of Open Access Journals (Sweden)

    Stefanie M. Countryman

    2015-03-01

    Full Text Available The problem of collective search is a tradeoff between searching thoroughly and covering as much area as possible. This tradeoff depends on the density of searchers. Solutions to the problem of collective search are currently of much interest in robotics and in the study of distributed algorithms, for example to design ways for robots, without central control, to use local information to perform search and rescue operations. Ant colonies operate without central control. Because they can perceive only local, mostly chemical and tactile cues, they must search collectively to find resources and to monitor the colony's environment. Examining how ants in diverse environments solve the problem of collective search can elucidate how evolution has led to diverse forms of collective behavior. An experiment on the International Space Station in January 2014 examined how ants (Tetramorium caespitum) perform collective search in microgravity. In the ISS experiment, the ants explored a small arena in which a barrier was lowered to increase the area and thus lower ant density. In microgravity, relative to ground controls, ants explored the area less thoroughly and took more convoluted paths. It appears that the difficulty of holding on to the surface interfered with the ants' ability to search collectively. Ants frequently lost contact with the surface, but showed a remarkable ability to regain contact with the surface.

  20. Searches for Higgs bosons and supersymmetry at LEP

    CERN Document Server

    van Vulpen, I B

    2004-01-01

    This note presents an overview of the main results from searches for Higgs bosons and supersymmetry at LEP. Most of the results presented here are combined results from the four LEP experiments (ALEPH, DELPHI, L3 and OPAL). No signal is observed and the (negative) search results are interpreted in a wide class of models allowing parameter space to be excluded. All limits are set at 95% CL.

  1. Search for intermediate vector bosons

    International Nuclear Information System (INIS)

    Klajn, D.B.; Rubbia, K.; Meer, S.

    1983-01-01

    The problem of detecting and searching for intermediate vector bosons is discussed. According to weak-current theory there are three intermediate vector bosons, with electric charges +1 (W+), -1 (W-) and zero (Z0). The investigation of these particles with proton-antiproton beams was proposed in 1976 by Cline, Rubbia and McIntyre. The major difficulties of the experiment are the need to produce a sufficient number of antiparticles and the method of "cooling" the antiproton beam, i.e., reducing its random motion. The stochastic method was suggested by van der Meer in 1968 as one possible cooling method. Several large detectors were designed for the search for intermediate vector bosons.

  2. An algebraic method for system reduction of stationary Gaussian systems

    NARCIS (Netherlands)

    D. Jibetean; J.H. van Schuppen (Jan)

    2003-01-01

    System identification for a particular approach reduces to system reduction, determining for a system with a high state-space dimension a system of low state-space dimension. For Gaussian systems the problem of system reduction is considered with the divergence rate criterion. The

  3. The modern trends in space electromagnetic instrumentation

    Science.gov (United States)

    Korepanov, V. E.

    Future trends in experimental plasma physics in outer space demand ever more exact and sophisticated scientific instrumentation. Moreover, the situation is complicated by the constant reduction of financial support for scientific research, even in leading countries. This has resulted in the development of mini-, micro- and nanosatellites with low price and short preparation time. Consequently, it has provoked the creation of a new generation of scientific instruments with reduced weight and power consumption but improved metrological parameters. The recent state of the development of electromagnetic (EM) sensors for microsatellites is reported. For flux-gate magnetometers (FGM), the reduction of weight as well as power consumption was achieved not only through the use of new electronic components but also through the development of a new operation mode. Scientific and technological study allowed the FGM noise to be decreased; the typical noise figure is now about 10 picotesla rms at 1 Hz, and the record is below 1 picotesla. A super-light version of the search-coil magnetometer (SCM) was created as the result of intensive research. These new SCMs can cover about six decades of operational frequency band with an upper limit of about 1 MHz and a noise level of a few femtotesla, with a total weight of about 75 grams including electronics. A new instrument, the wave probe (WP), which combines three independent sensors (an SCM, a split Langmuir probe and an electric potential sensor) in one body, was created. The developed theory confirms that the WP can directly measure the wave vector components in space plasmas.

  4. Cost reduction improvement for power generation system integrating WECS using harmony search algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Ngonkham, S. [Khonkaen Univ., Amphur Muang (Thailand). Dept. of Electrical Engineering; Buasri, P. [Khonkaen Univ., Amphur Muang (Thailand). Embed System Research Group

    2009-03-11

    A harmony search (HS) algorithm was used to optimize economic dispatch (ED) in a wind energy conversion system (WECS) for power system integration. The HS algorithm was based on a stochastic random search method. System costs for the WECS system were estimated in relation to average wind speeds. The HS algorithm was implemented to optimize the ED with a simple programming procedure. The study showed that the initial parameters must be carefully selected to ensure the accuracy of the HS algorithm. The algorithm demonstrated that total costs of the WECS system were higher than costs associated with energy efficiency procedures that reduced the same amount of greenhouse gas (GHG) emissions. 7 refs., 10 tabs., 16 figs.

  5. HUBBLE SPACE TELESCOPE SNAPSHOT SEARCH FOR PLANETARY NEBULAE IN GLOBULAR CLUSTERS OF THE LOCAL GROUP

    Energy Technology Data Exchange (ETDEWEB)

    Bond, Howard E., E-mail: heb11@psu.edu [Department of Astronomy and Astrophysics, Pennsylvania State University, University Park, PA 16802 (United States)

    2015-04-15

    Single stars in ancient globular clusters (GCs) are believed incapable of producing planetary nebulae (PNs), because their post-asymptotic-giant-branch evolutionary timescales are slower than the dissipation timescales for PNs. Nevertheless, four PNs are known in Galactic GCs. Their existence likely requires more exotic evolutionary channels, including stellar mergers and common-envelope binary interactions. I carried out a snapshot imaging search with the Hubble Space Telescope (HST) for PNs in bright Local Group GCs outside the Milky Way. I used a filter covering the 5007 Å nebular emission line of [O iii], and another one in the nearby continuum, to image 66 GCs. Inclusion of archival HST frames brought the total number of extragalactic GCs imaged at 5007 Å to 75, whose total luminosity slightly exceeds that of the entire Galactic GC system. I found no convincing PNs in these clusters, aside from one PN in a young M31 cluster misclassified as a GC, and two PNs at such large angular separations from an M31 GC that membership is doubtful. In a ground-based spectroscopic survey of 274 old GCs in M31, Jacoby et al. found three candidate PNs. My HST images of one of them suggest that the [O iii] emission actually arises from ambient interstellar medium rather than a PN; for the other two candidates, there are broadband archival UV HST images that show bright, blue point sources that are probably the PNs. In a literature search, I also identified five further PN candidates lying near old GCs in M31, for which follow-up observations are necessary to confirm their membership. The rates of incidence of PNs are similar, and small but nonzero, throughout the GCs of the Local Group.

  6. USING PRECEDENTS FOR REDUCTION OF DECISION TREE BY GRAPH SEARCH

    Directory of Open Access Journals (Sweden)

    I. A. Bessmertny

    2015-01-01

    Full Text Available The paper considers the problem of organizing mutual payments between business entities by means of clearing, which is solved by searching for graph paths. To reduce the complexity of the decision tree, a method of precedents is proposed that consists of saving intermediate solutions while moving along the decision tree. An algorithm and an example are presented, demonstrating that the solution complexity approaches linear. Tests carried out in a civil aviation settlement system demonstrate a reduction of approximately 30 percent in the real money transferred. The proposed algorithm is also planned to be implemented in other clearing organizations of the Russian Federation.

  7. The impact of reduction of doublet well spacing on the Net Present Value and the life time of fluvial Hot Sedimentary Aquifer doublets

    NARCIS (Netherlands)

    Willems, C.J.L.; Maghami Nick, Hamidreza M.; Bruhn, D.F.

    This paper evaluates the impact of reduction of doublet well spacing, below the current West Netherlands Basin standard of 1000 to 1500 m, on the Net Present Value (NPV) and the life time of fluvial Hot Sedimentary Aquifer (HSA) doublets. First, a sensitivity analysis is used to show the possible

  8. The Role of Domain Knowledge in Cognitive Modeling of Information Search

    NARCIS (Netherlands)

    Karanam, S.; Jorge-Botana, Guillermo; Olmos, Ricardo; van Oostendorp, H.

    2017-01-01

    Computational cognitive models developed so far do not incorporate individual differences in domain knowledge in predicting user clicks on search result pages. We address this problem using a cognitive model of information search which enables us to use two semantic spaces having a low (non-expert

  9. Sampling optimization for printer characterization by direct search.

    Science.gov (United States)

    Bianco, Simone; Schettini, Raimondo

    2012-12-01

    Printer characterization usually requires many printer inputs and corresponding color measurements of the printed outputs. In this brief, a sampling optimization for printer characterization on the basis of direct search is proposed to maintain high color accuracy with a reduction in the number of characterization samples required. The proposed method is able to match a given level of color accuracy requiring, on average, a characterization set cardinality which is almost one-fourth of that required by the uniform sampling, while the best method in the state of the art needs almost one-third. The number of characterization samples required can be further reduced if the proposed algorithm is coupled with a sequential optimization method that refines the sample values in the device-independent color space. The proposed sampling optimization method is extended to deal with multiple substrates simultaneously, giving statistically better colorimetric accuracy (at the α = 0.05 significance level) than sampling optimization techniques in the state of the art optimized for each individual substrate, thus allowing use of a single set of characterization samples for multiple substrates.
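
    "Direct search" here refers to derivative-free optimization of the sample set. The Python sketch below shows a generic compass (pattern) search of the kind commonly used for such problems; the colorimetric objective, the encoding of the characterization samples and the stopping rule of the brief are not reproduced, so treat this as a structural illustration only.

        import numpy as np

        def compass_search(f, x0, step=0.1, tol=1e-4, max_iter=1000):
            """Derivative-free compass (pattern) search: poll +/- steps along each
            coordinate, move to any improving point, otherwise shrink the step."""
            x = np.asarray(x0, dtype=float)
            fx = f(x)
            it = 0
            while step > tol and it < max_iter:
                improved = False
                for i in range(len(x)):
                    for sign in (+1.0, -1.0):
                        trial = x.copy()
                        trial[i] += sign * step
                        ft = f(trial)
                        if ft < fx:
                            x, fx, improved = trial, ft, True
                            break
                    if improved:
                        break
                if not improved:
                    step *= 0.5                      # contract the pattern
                it += 1
            return x, fx

        # Example: a toy 2-D objective stands in for the mean colour-error criterion
        x_opt, f_opt = compass_search(lambda x: (x[0] - 0.3) ** 2 + (x[1] + 0.1) ** 2, [0.0, 0.0])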

  10. Three extensions to subtractive crosstalk reduction

    NARCIS (Netherlands)

    Smit, F.A.; Liere, van R.; Fröhlich, B.; Fröhlich, B.; Blach, R.; Liere, van R.

    2007-01-01

    Stereo displays suffer from crosstalk, an effect that reduces or even inhibits the viewer¿s ability to correctly fuse stereoscopic images. In this paper, three extensions for improved software crosstalk reduction are introduced. First, we propose a reduction method operating in CIELAB color space to

  11. Search for dark-matter particles

    International Nuclear Information System (INIS)

    Cowsik, R.

    1991-01-01

    Experiments performed over the last two years have been very successful in drastically reducing the number of viable elementary particles that could possibly constitute the dark matter that dominates the large-scale gravitational dynamics of astronomical systems. The candidates that survive are the light neutrinos, the axion, and a supersymmetric particle with carefully chosen parameters called the neutralino. Baryonic dark matter, which might contribute not insignificantly over small scales, is perhaps present in the form of brown dwarfs, and a search for these is under way. In this article, the astrophysical studies which bear on the density and the phase-space structure of the dark-matter particles are reviewed and the implications of the various direct and indirect searches for these particles are discussed and, finally, alternative suggestions for the candidates and directions for further searches are pointed out. (author). 35 refs., 29 figs

  12. Active3 noise reduction

    International Nuclear Information System (INIS)

    Holzfuss, J.

    1996-01-01

    Noise reduction is a problem being encountered in a variety of applications, such as environmental noise cancellation, signal recovery and separation. Passive noise reduction is done with the help of absorbers. Active noise reduction includes the transmission of phase inverted signals for the cancellation. This paper is about a threefold active approach to noise reduction. It includes the separation of a combined source, which consists of both a noise and a signal part. With the help of interaction with the source by scanning it and recording its response, modeling as a nonlinear dynamical system is achieved. The analysis includes phase space analysis and global radial basis functions as tools for the prediction used in a subsequent cancellation procedure. Examples are given which include noise reduction of speech. copyright 1996 American Institute of Physics

  13. Search for extraterrestrial life: recent developments. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Papagiannis, M D [ed.

    1985-01-01

    Seventy experts from 20 different countries discuss the many interrelated aspects of the search for extraterrestrial life, including the search for other planetary systems where life may originate and evolve, the widespread presence of complex prebiotic molecules in our Solar System and in interstellar space which could be precursors of life, and the universal aspects of the biological evolution on Earth. They also discuss the nearly 50 radio searches that were undertaken in the last 25 years, the technological progress that has occurred in this period, and the plans for the future including the comprehensive SETI search program that NASA is now preparing for the 1990's. Extensive introductions by the Editor to each of the 8 sections, make this volume friendly even to the non-specialist who has a genuine interest for this new field. 549 refs.; 84 figs.; 21 tabs.

  14. Determining frequentist confidence limits using a directed parameter space search

    International Nuclear Information System (INIS)

    Daniel, Scott F.; Connolly, Andrew J.; Schneider, Jeff

    2014-01-01

    We consider the problem of inferring constraints on a high-dimensional parameter space with a computationally expensive likelihood function. We propose a machine learning algorithm that maps out the Frequentist confidence limit on parameter space by intelligently targeting likelihood evaluations so as to quickly and accurately characterize the likelihood surface in both low- and high-likelihood regions. We compare our algorithm to Bayesian credible limits derived by the well-tested Markov Chain Monte Carlo (MCMC) algorithm using both multi-modal toy likelihood functions and the seven yr Wilkinson Microwave Anisotropy Probe cosmic microwave background likelihood function. We find that our algorithm correctly identifies the location, general size, and general shape of high-likelihood regions in parameter space while being more robust against multi-modality than MCMC.

  15. First Run 2 Searches for Exotica at CMS

    CERN Document Server

    Başeğmez du Pree, S

    2016-01-01

    An overview of the first results of the experimental searches for exotica at the CMS experiment with 13 TeV collision data is presented. The results cover various models with different topologies such as searches for new heavy resonances, extra space dimensions, black holes and dark matter. The analysis results with 13 TeV data are emphasized, corresponding to an integrated luminosity in the range of 2.1–2.8 fb−1.

  16. Quality and efficiency in high dimensional Nearest neighbor search

    KAUST Repository

    Tao, Yufei; Yi, Ke; Sheng, Cheng; Kalnis, Panos

    2009-01-01

    Nearest neighbor (NN) search in high dimensional space is an important problem in many applications. Ideally, a practical solution (i) should be implementable in a relational database, and (ii) its query cost should grow sub-linearly with the dataset size, regardless of the data and query distributions. Despite the bulk of NN literature, no solution fulfills both requirements, except locality sensitive hashing (LSH). The existing LSH implementations are either rigorous or adhoc. Rigorous-LSH ensures good quality of query results, but requires expensive space and query cost. Although adhoc-LSH is more efficient, it abandons quality control, i.e., the neighbor it outputs can be arbitrarily bad. As a result, currently no method is able to ensure both quality and efficiency simultaneously in practice. Motivated by this, we propose a new access method called the locality sensitive B-tree (LSB-tree) that enables fast high-dimensional NN search with excellent quality. The combination of several LSB-trees leads to a structure called the LSB-forest that ensures the same result quality as rigorous-LSH, but reduces its space and query cost dramatically. The LSB-forest also outperforms adhoc-LSH, even though the latter has no quality guarantee. Besides its appealing theoretical properties, the LSB-tree itself also serves as an effective index that consumes linear space, and supports efficient updates. Our extensive experiments confirm that the LSB-tree is faster than (i) the state of the art of exact NN search by two orders of magnitude, and (ii) the best (linear-space) method of approximate retrieval by an order of magnitude, and at the same time, returns neighbors with much better quality. © 2009 ACM.
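
    For readers who want the basic building block behind the LSB-tree, the Python sketch below implements textbook p-stable (Gaussian) locality sensitive hashing for Euclidean NN search: each table hashes a point through several random projections h(v) = floor((a.v + b)/w), and a query is compared only against points that collide with it in some table. The LSB-tree's Z-order packing, quality guarantees and update machinery are not reproduced; parameter values are illustrative.

        import numpy as np

        class PStableLSH:
            """Textbook p-stable LSH for Euclidean nearest-neighbour search.
            This is the building block the LSB-tree refines, not the LSB-tree itself."""
            def __init__(self, dim, n_tables=8, n_bits=10, w=4.0, rng=None):
                rng = np.random.default_rng() if rng is None else rng
                self.a = rng.normal(size=(n_tables, n_bits, dim))   # random projections
                self.b = rng.uniform(0.0, w, size=(n_tables, n_bits))
                self.w = w
                self.tables = [dict() for _ in range(n_tables)]
                self.data = []

            def _keys(self, v):
                h = np.floor((self.a @ v + self.b) / self.w).astype(int)
                return [tuple(row) for row in h]

            def index(self, points):
                for v in points:
                    v = np.asarray(v, dtype=float)
                    idx = len(self.data)
                    self.data.append(v)
                    for t, key in enumerate(self._keys(v)):
                        self.tables[t].setdefault(key, []).append(idx)

            def query(self, q, k=1):
                q = np.asarray(q, dtype=float)
                candidates = set()
                for t, key in enumerate(self._keys(q)):
                    candidates.update(self.tables[t].get(key, []))
                if not candidates:
                    return []
                cand = sorted(candidates, key=lambda i: np.linalg.norm(self.data[i] - q))
                return cand[:k]

        # Example usage with random data
        rng = np.random.default_rng(0)
        pts = rng.normal(size=(1000, 16))
        lsh = PStableLSH(dim=16)
        lsh.index(pts)
        print(lsh.query(pts[0] + 0.01, k=3))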

  17. Quantum computers in phase space

    International Nuclear Information System (INIS)

    Miquel, Cesar; Paz, Juan Pablo; Saraceno, Marcos

    2002-01-01

    We represent both the states and the evolution of a quantum computer in phase space using the discrete Wigner function. We study properties of the phase space representation of quantum algorithms: apart from analyzing important examples, such as the Fourier transform and Grover's search, we examine the conditions for the existence of a direct correspondence between quantum and classical evolutions in phase space. Finally, we describe how to measure directly the Wigner function in a given phase-space point by means of a tomographic method that, itself, can be interpreted as a simple quantum algorithm

  18. Reduced order methods for modeling and computational reduction

    CERN Document Server

    Rozza, Gianluigi

    2014-01-01

    This monograph addresses the state of the art of reduced order methods for modeling and computational reduction of complex parametrized systems, governed by ordinary and/or partial differential equations, with a special emphasis on real time computing techniques and applications in computational mechanics, bioengineering and computer graphics.  Several topics are covered, including: design, optimization, and control theory in real-time with applications in engineering; data assimilation, geometry registration, and parameter estimation with special attention to real-time computing in biomedical engineering and computational physics; real-time visualization of physics-based simulations in computer science; the treatment of high-dimensional problems in state space, physical space, or parameter space; the interactions between different model reduction and dimensionality reduction approaches; the development of general error estimation frameworks which take into account both model and discretization effects. This...

  19. Where to search top-K biomedical ontologies?

    Science.gov (United States)

    Oliveira, Daniela; Butt, Anila Sahar; Haller, Armin; Rebholz-Schuhmann, Dietrich; Sahay, Ratnesh

    2018-03-20

    Searching for precise terms and terminological definitions in the biomedical data space is problematic, as researchers find overlapping, closely related and even equivalent concepts in a single or multiple ontologies. Search engines that retrieve ontological resources often suggest an extensive list of search results for a given input term, which leads to the tedious task of selecting the best-fit ontological resource (class or property) for the input term and reduces user confidence in the retrieval engines. A systematic evaluation of these search engines is necessary to understand their strengths and weaknesses in different search requirements. We have implemented seven comparable Information Retrieval ranking algorithms to search through ontologies and compared them against four search engines for ontologies. Free-text queries have been performed, the outcomes have been judged by experts and the ranking algorithms and search engines have been evaluated against the expert-based ground truth (GT). In addition, we propose a probabilistic GT that is developed automatically to provide deeper insights and confidence to the expert-based GT as well as evaluating a broader range of search queries. The main outcome of this work is the identification of key search factors for biomedical ontologies together with search requirements and a set of recommendations that will help biomedical experts and ontology engineers to select the best-suited retrieval mechanism in their search scenarios. We expect that this evaluation will allow researchers and practitioners to apply the current search techniques more reliably and that it will help them to select the right solution for their daily work. The source code (of seven ranking algorithms), ground truths and experimental results are available at https://github.com/danielapoliveira/bioont-search-benchmark.

  20. Crowded visual search in children with normal vision and children with visual impairment.

    Science.gov (United States)

    Huurneman, Bianca; Cox, Ralf F A; Vlaskamp, Björn N S; Boonstra, F Nienke

    2014-03-01

    This study investigates the influence of oculomotor control, crowding, and attentional factors on visual search in children with normal vision ([NV], n=11), children with visual impairment without nystagmus ([VI-nys], n=11), and children with VI with accompanying nystagmus ([VI+nys], n=26). Exclusion criteria for children with VI were: multiple impairments and visual acuity poorer than 20/400 or better than 20/50. Three search conditions were presented: a row with homogeneous distractors, a matrix with homogeneous distractors, and a matrix with heterogeneous distractors. Element spacing was manipulated in 5 steps from 2 to 32 minutes of arc. Symbols were sized 2 times the threshold acuity to guarantee visibility for the VI groups. During simple row and matrix search with homogeneous distractors children in the VI+nys group were less accurate than children with NV at smaller spacings. Group differences were even more pronounced during matrix search with heterogeneous distractors. Search times were longer in children with VI compared to children with NV. The more extended impairments during serial search reveal greater dependence on oculomotor control during serial compared to parallel search. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Selective data reduction in gas chromatography/infrared spectrometry

    International Nuclear Information System (INIS)

    Pyo, Dong Jin; Shin, Hyun Du

    2001-01-01

    As gas chromatography/infrared spectrometry (GC/IR) becomes routinely available, methods must be developed to deal with the large amount of data produced. We demonstrate computer methods that quickly search through a large data file, locating those spectra that display a spectral feature of interest. Based on a modified library search routine, these selective data reduction methods retrieve all or nearly all of the compounds of interest, while rejecting the vast majority of unrelated compounds. To overcome the shifting problem of IR spectra, a search method that moves the average pattern was designed. In this moving-pattern search, the average pattern of a particular functional group was not held stationary, but was allowed to shift slightly to the right and left.
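
    The moving-pattern idea can be pictured as sliding the averaged functional-group pattern across the spectral region of interest and keeping the best normalized correlation, so that a band shifted by a few points is still recognized. The Python sketch below is a generic version of that idea, under the stated assumption that the segment extends a few points beyond the nominal band position on both sides; it is not the authors' GC/IR code.

        import numpy as np

        def shifted_match(segment, pattern):
            """Best normalized correlation between `pattern` (averaged functional-group
            band) and `segment` (spectral window padded by the allowed shift on each side).
            Returns a score in [-1, 1]; taking the maximum over shifts absorbs band shifts."""
            p = (pattern - pattern.mean()) / (pattern.std() + 1e-12)
            n = len(pattern)
            best = -np.inf
            for s in range(len(segment) - n + 1):
                w = segment[s:s + n]
                w = (w - w.mean()) / (w.std() + 1e-12)
                best = max(best, float(np.dot(w, p)) / n)
            return best

        # Spectra scoring above a threshold would be retained for the analyst, e.g.
        # (hypothetical names): keep = [i for i, s in enumerate(library)
        #                               if shifted_match(s[lo:hi], carbonyl_avg) > 0.8]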

  2. Visual scan-path analysis with feature space transient fixation moments

    Science.gov (United States)

    Dempere-Marco, Laura; Hu, Xiao-Peng; Yang, Guang-Zhong

    2003-05-01

    The study of eye movements provides useful insight into the cognitive processes underlying visual search tasks. The analysis of the dynamics of eye movements has often been approached from a purely spatial perspective. In many cases, however, it may not be possible to define meaningful or consistent dynamics without considering the features underlying the scan paths. In this paper, the definition of the feature space has been attempted through the concept of visual similarity and non-linear low dimensional embedding, which defines a mapping from the image space into a low dimensional feature manifold that preserves the intrinsic similarity of image patterns. This has enabled the definition of perceptually meaningful features without the use of domain specific knowledge. Based on this, this paper introduces a new concept called Feature Space Transient Fixation Moments (TFM). The approach presented tackles the problem of feature space representation of visual search through the use of TFM. We demonstrate the practical values of this concept for characterizing the dynamics of eye movements in goal directed visual search tasks. We also illustrate how this model can be used to elucidate the fundamental steps involved in skilled search tasks through the evolution of transient fixation moments.

  3. Logistics Reduction: Heat Melt Compactor

    Data.gov (United States)

    National Aeronautics and Space Administration — The Advanced Exploration Systems (AES) Logistics Reduction (LR) project Heat Melt Compactor (HMC) technology is a waste management technology. Currently, there are...

  4. A Semidefinite Programming Based Search Strategy for Feature Selection with Mutual Information Measure.

    Science.gov (United States)

    Naghibi, Tofigh; Hoffmann, Sarah; Pfister, Beat

    2015-08-01

    Feature subset selection, as a special case of the general subset selection problem, has been the topic of a considerable number of studies due to the growing importance of data-mining applications. In the feature subset selection problem there are two main issues that need to be addressed: (i) Finding an appropriate measure function that can be fairly fast and robustly computed for high-dimensional data. (ii) A search strategy to optimize the measure over the subset space in a reasonable amount of time. In this article mutual information between features and class labels is considered to be the measure function. Two series expansions for mutual information are proposed, and it is shown that most heuristic criteria suggested in the literature are truncated approximations of these expansions. It is well-known that searching the whole subset space is an NP-hard problem. Here, instead of the conventional sequential search algorithms, we suggest a parallel search strategy based on semidefinite programming (SDP) that can search through the subset space in polynomial time. By exploiting the similarities between the proposed algorithm and an instance of the maximum-cut problem in graph theory, the approximation ratio of this algorithm is derived and is compared with the approximation ratio of the backward elimination method. The experiments show that it can be misleading to judge the quality of a measure solely based on the classification accuracy, without taking the effect of the non-optimum search strategy into account.
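
    The abstract contrasts an SDP-based parallel search with the usual sequential heuristics. For concreteness, the Python sketch below shows the sequential baseline: a greedy forward selection that scores each candidate feature by its mutual information with the class label minus its average mutual information with already-selected features (an mRMR-style truncated criterion). It is the kind of truncated approximation the article refers to, not the proposed SDP relaxation; the discrete-feature MI estimator is a simple contingency-table version.

        import numpy as np

        def mutual_information(x, y):
            """MI (in nats) between two discrete 1-D arrays, via the joint contingency table."""
            xs, xi = np.unique(x, return_inverse=True)
            ys, yi = np.unique(y, return_inverse=True)
            joint = np.zeros((len(xs), len(ys)))
            np.add.at(joint, (xi, yi), 1)
            joint /= joint.sum()
            px = joint.sum(axis=1, keepdims=True)
            py = joint.sum(axis=0, keepdims=True)
            nz = joint > 0
            return float(np.sum(joint[nz] * np.log(joint[nz] / (px @ py)[nz])))

        def greedy_mi_selection(X, y, k):
            """Greedy forward selection with a relevance-minus-redundancy score."""
            chosen = []
            remaining = list(range(X.shape[1]))
            while len(chosen) < k and remaining:
                def score(j):
                    rel = mutual_information(X[:, j], y)
                    red = (np.mean([mutual_information(X[:, j], X[:, s]) for s in chosen])
                           if chosen else 0.0)
                    return rel - red
                best = max(remaining, key=score)
                chosen.append(best)
                remaining.remove(best)
            return chosen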

  5. Less accurate but more efficient family of search templates for detection of gravitational waves from inspiraling compact binaries

    International Nuclear Information System (INIS)

    Chronopoulos, Andreas E.; Apostolatos, Theocharis A.

    2001-01-01

    The network of interferometric detectors that is under construction at various locations on Earth is expected to start searching for gravitational waves in a few years. The number of search templates that need to be cross-correlated with the noisy output of the detectors is a major issue since computing power capabilities are restricted. By choosing higher and higher post-Newtonian order expansions for the family of search templates we make sure that our filters are more accurate copies of the real waves that hit our detectors. However, this is not the only criterion for choosing a family of search templates. To make the process of detection as efficient as possible, one needs a family of templates with a relatively small number of members that manages to pick up any detectable signal with only a tiny reduction in signal-to-noise ratio. Evidently, one family is better than another if it accomplishes its goal with a smaller number of templates. Following the geometric language of Owen, we have studied the performance of the post-1.5-Newtonian family of templates on detecting post-2-Newtonian signals for binaries. Several technical issues arise from the fact that the two types of waveforms cannot be made to coincide by a suitable choice of parameters. In general, the parameter space of the signals is not identical with the parameter space of the templates, although in our case they are of the same dimension, and one has to take into account all such peculiarities before drawing any conclusion. An interesting result we have obtained is that the post-1.5-Newtonian family of templates happens to be more economical for detecting post-2-Newtonian signals than the perfectly accurate post-2-Newtonian family of templates itself. The number of templates is reduced by 20-30%, depending on the acceptable level of reduction in signal-to-noise ratio due to discretization of the family of templates. This makes the post-1.5-Newtonian family of templates more favorable.

  6. Confluence Reduction for Probabilistic Systems (extended version)

    NARCIS (Netherlands)

    Timmer, Mark; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis

    2010-01-01

    This paper presents a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We prove that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To support the

  7. Nearest Neighbor Search in the Metric Space of a Complex Network for Community Detection

    Directory of Open Access Journals (Sweden)

    Suman Saha

    2016-03-01

    Full Text Available The objective of this article is to bridge the gap between two important research directions: (1) nearest neighbor search, which is a fundamental computational tool for large data analysis; and (2) complex network analysis, which deals with large real graphs but is generally studied via graph-theoretic or spectral analysis. In this article, we have studied the nearest neighbor search problem in a complex network by developing a suitable notion of nearness. The computation of efficient nearest neighbor search among the nodes of a complex network using the metric tree and locality sensitive hashing (LSH) is also studied and tested experimentally. To evaluate the proposed nearest neighbor search in a complex network, we applied it to a network community detection problem. Experiments are performed to verify the usefulness of nearness measures for complex networks, the role of the metric tree and LSH in computing fast and approximate node nearness, and the efficiency of community detection using nearest neighbor search. We observed that nearest neighbor search between network nodes is a very efficient tool for better exploring the community structure of real networks. Several efficient approximation schemes are very useful for large networks; they hardly degrade the results while saving a lot of computational time, and the nearest-neighbor-based community detection approach is very competitive in terms of efficiency and time.

  8. Inter-proximal enamel reduction in contemporary orthodontics.

    Science.gov (United States)

    Pindoria, J; Fleming, P S; Sharma, P K

    2016-12-16

    Inter-proximal enamel reduction has gained increasing prominence in recent years being advocated to provide space for orthodontic alignment, to refine contact points and to potentially improve long-term stability. An array of techniques and products are available ranging from hand-held abrasive strips to handpiece mounted burs and discs. The indications for inter-proximal enamel reduction and the importance of formal space analysis, together with the various techniques and armamentarium which may be used to perform it safely in both the labial and buccal segments are outlined.

  9. Classical optics and curved spaces

    International Nuclear Information System (INIS)

    Bailyn, M.; Ragusa, S.

    1976-01-01

    In the eikonal approximation of classical optics, the unit polarization 3-vector of light satisfies an equation that depends only on the index of refraction, n. It is known that if the original 3-space line element is dσ², then this polarization direction propagates parallelly in the fictitious space n²dσ². Since the equation depends only on n, it is possible to invent a fictitious curved 4-space in which the light follows a null geodesic, and the polarization 3-vector behaves as the 'shadow' of a parallelly propagated 4-vector. The inverse, namely the reduction of Maxwell's equations on a curved (dielectric-free) space to a classical space with dielectric constant n = (-g00)^(-1/2), is well known, but in the latter the dielectric constant ε and the permeability μ must also equal (-g00)^(-1/2). The rotation of polarization as light bends around the Sun is calculated by utilizing the reduction to the classical space. This (non-)rotation may then be interpreted as parallel transport in the 3-space n²dσ².

  10. How does external technology search become balanced? A three-dimensional approach

    DEFF Research Database (Denmark)

    Li-Ying, Jason; Wang, Yuandi

    2015-01-01

    Firms need to search for external knowledge in a balanced way, as over-search entails too much risk and uncertainty and local search does not promise novel opportunities, as the literature has suggested. We conceptually position firms' search behavior within a three-dimensional knowledge search space, including cognitive, temporal, and geographic dimensions. We suggest that the balance is no longer a matter of finding an optimal search distance along a single dimension. Instead, it becomes an art to maintain balance in a dynamic manner across three dimensions. Using empirical evidence from ... Chinese licensee firms, we show that such a three-dimensional balance does exist in firms' practice. The findings in this respect provide promising opportunities for future research, which will significantly contribute to our understanding of how firms search for external knowledge and the implications...

  11. THE QUASIPERIODIC AUTOMATED TRANSIT SEARCH ALGORITHM

    International Nuclear Information System (INIS)

    Carter, Joshua A.; Agol, Eric

    2013-01-01

    We present a new algorithm for detecting transiting extrasolar planets in time-series photometry. The Quasiperiodic Automated Transit Search (QATS) algorithm relaxes the usual assumption of strictly periodic transits by permitting a variable, but bounded, interval between successive transits. We show that this method is capable of detecting transiting planets with significant transit timing variations without any loss of significance ("smearing") as would be incurred with traditional algorithms; however, this is at the cost of a slightly increased stochastic background. The approximate times of transit are standard products of the QATS search. Despite the increased flexibility, we show that QATS has a run-time complexity that is comparable to traditional search codes and is comparably easy to implement. QATS is applicable to data having a nearly uninterrupted, uniform cadence and is therefore well suited to the modern class of space-based transit searches (e.g., Kepler, CoRoT). Applications of QATS include transiting planets in dynamically active multi-planet systems and transiting planets in stellar binary systems.

  12. Searching for better plasmonic materials

    DEFF Research Database (Denmark)

    West, P.; Ishii, S.; Naik, G.

    2010-01-01

    Plasmonics is a research area merging the fields of optics and nanoelectronics by confining light with relatively large free-space wavelength to the nanometer scale - thereby enabling a family of novel devices. Current plasmonic devices at telecommunication and optical frequencies face significan...... for realizing optimal plasmonic material properties for specific frequencies and applications, thereby providing a reference for those searching for better plasmonic materials....

  13. Searches for Electroweak Signatures of Supersymmetry at ATLAS and CMS

    CERN Document Server

    Khoo, Teng Jian; The ATLAS collaboration

    2018-01-01

    Searches for strongly-produced superparticles at the Large Hadron Collider have excluded gluinos and squarks of all generations up to the TeV scale. While limited by statistics, electroweak signatures remain less thoroughly explored, and in particular the Higgsino sector has proven challenging. Conventional searches for leptons associated with missing transverse momentum do not fully cover the phase space, requiring new approaches to extend experimental sensitivity. Dedicated reconstruction techniques address the challenge posed by mass-degenerate spectra. By looking beyond the assumption of leptonic signatures, searches for gauge-mediated supersymmetry have broken new ground.

  14. Computational Search for Improved Ammonia Storage Materials

    DEFF Research Database (Denmark)

    Jensen, Peter Bjerre; Lysgaard, Steen; Vegge, Tejs

    Metal halide ammines, e.g. Mg(NH3)6Cl2 and Sr(NH3)8Cl2, can reversibly store ammonia, with high volumetric hydrogen storage capacities. The storage in the halide ammines is very safe, and the salts are therefore highly relevant as a carbon-free energy carrier in future transportation infrastructure...... selection. The GA is evolving from an initial (random) population and selecting those with highest fitness, a function based on e.g. stability, release temperature, storage capacity and the price of the elements. The search space includes all alkaline earth, 3d and 4d metals in combination with chloride......, bromide or iodide, and mixtures thereof. In total the search space consists of thousands of combinations, which makes a GA ideal, to reduce the number of necessary calculations. We are screening for a one step release from either a hexa or octa ammine, and we have found promising candidates, which...

  15. Switching Reinforcement Learning for Continuous Action Space

    Science.gov (United States)

    Nagayoshi, Masato; Murao, Hajime; Tamaki, Hisashi

    Reinforcement Learning (RL) attracts much attention as a technique for realizing computational intelligence such as adaptive and autonomous decentralized systems. In general, however, it is not easy to put RL into practical use. This difficulty includes the problem of designing a suitable action space for an agent, i.e., satisfying two requirements that are in trade-off: (i) to keep the characteristics (or structure) of the original search space as much as possible in order to seek strategies that lie close to the optimal, and (ii) to reduce the search space as much as possible in order to expedite the learning process. In order to design a suitable action space adaptively, we propose a switching RL model that mimics the process of an infant's motor development, in which gross motor skills develop before fine motor skills. Then, a method for switching controllers is constructed by introducing and referring to the "entropy". Further, through computational experiments using robot navigation problems with one- and two-dimensional continuous action spaces, the validity of the proposed method has been confirmed.

  16. Searching for Lorentz violation

    International Nuclear Information System (INIS)

    Allen, Roland E.; Yokoo, Seiichirou

    2004-01-01

    Astrophysical, terrestrial, and space-based searches for Lorentz violation are very briefly reviewed. Such searches are motivated by the fact that all superunified theories (and other theories that attempt to include quantum gravity) have some potential for observable violations of Lorentz invariance. Another motivation is the exquisite sensitivity of certain well-designed experiments and observations to particular forms of Lorentz violation. We also review some new predictions of a specific Lorentz-violating theory: If a fundamental energy m̄c² in this theory lies below the usual GZK cutoff E_GZK, the cutoff is shifted to infinite energy; i.e., it no longer exists. On the other hand, if m̄c² lies above E_GZK, there is a high-energy branch of the fermion dispersion relation which provides an alternative mechanism for super-GZK cosmic-ray protons.

  17. On the Interpretation of Top Partners Searches

    CERN Document Server

    Matsedonskyi, Oleksii; Wulzer, Andrea

    2014-01-01

    Relatively light Top Partners are unmistakable signatures of reasonably Natural Composite Higgs models and as such they are worth searching for at the LHC. Their phenomenology is characterized by a certain amount of model-dependence, which makes the interpretation of Top Partner experimental searches not completely straightforward especially if one is willing to take also single production into account. We describe a model-independent strategy by which the interpretation is provided on the parameter space of a Simplified Model that captures the relevant features of all the explicit constructions. The Simplified Model limits are easy to interpret within explicit models, in a way that requires no recasting and no knowledge of the experimental details of the analyses. We illustrate the method by concrete examples, among which the searches for a charge 5/3 Partner in same-sign dileptons and the searches for a charge 2/3 singlet. In each case we perform a theory recasting of the available 8 TeV Run-1 results and a...

  18. In search of empty space: the Cluster mission

    International Nuclear Information System (INIS)

    Johnstone, Alan

    1990-01-01

    Using four spacecraft orbiting the Earth in the formation of a regular tetrahedron, European scientists will study the auroras around the planet caused by variations in the Sun's magnetic field. These Cluster satellites will also study supernovae from their interplanetary position, as well as the plasma of space surrounding the Earth. It is hoped that a growing understanding of plasma dynamics will assist the study of nuclear fusion. (UK)

  19. Episodic retrieval and feature facilitation in intertrial priming of visual search

    DEFF Research Database (Denmark)

    Asgeirsson, Arni Gunnar; Kristjánsson, Árni

    2011-01-01

    Huang, Holcombe, and Pashler (Memory & Cognition, 32, 12–20, 2004) found that priming from repetition of different features of a target in a visual search task resulted in significant response time (RT) reductions when both target brightness and size were repeated. But when only one feature was repeated and the other changed, RTs were longer than when neither feature was repeated. From this, they argued that priming in visual search reflected episodic retrieval of memory traces, rather than facilitation of repeated features. We tested different variations of the search task

  20. Observations of the Hubble Deep Field with the Infrared Space Observatory .1. Data reduction, maps and sky coverage

    DEFF Research Database (Denmark)

    Serjeant, S.B.G.; Eaton, N.; Oliver, S.J.

    1997-01-01

    We present deep imaging at 6.7 and 15 μm from the CAM instrument on the Infrared Space Observatory (ISO), centred on the Hubble Deep Field (HDF). These are the deepest integrations published to date at these wavelengths in any region of sky. We discuss the observational strategy and the data...... reduction. The observed source density appears to approach the CAM confusion limit at 15 μm, and fluctuations in the 6.7-μm sky background may be identifiable with similar spatial fluctuations in the HDF galaxy counts. ISO appears to be detecting comparable field galaxy populations to the HDF, and our...

  1. Choosing colors for map display icons using models of visual search.

    Science.gov (United States)

    Shive, Joshua; Francis, Gregory

    2013-04-01

    We show how to choose colors for icons on maps to minimize search time using predictions of a model of visual search. The model analyzes digital images of a search target (an icon on a map) and a search display (the map containing the icon) and predicts search time as a function of target-distractor color distinctiveness and target eccentricity. We parameterized the model using data from a visual search task and performed a series of optimization tasks to test the model's ability to choose colors for icons to minimize search time across icons. Map display designs made by this procedure were tested experimentally. In a follow-up experiment, we examined the model's flexibility to assign colors in novel search situations. The model fits human performance, performs well on the optimization tasks, and can choose colors for icons on maps with novel stimuli to minimize search time without requiring additional model parameter fitting. Models of visual search can suggest color choices that produce search time reductions for display icons. Designers should consider constructing visual search models as a low-cost method of evaluating color assignments.

  2. Modernizing quantum annealing using local searches

    International Nuclear Information System (INIS)

    Chancellor, Nicholas

    2017-01-01

    I describe how real quantum annealers may be used to perform local (in state space) searches around specified states, rather than the global searches traditionally implemented in the quantum annealing algorithm (QAA). Such protocols will have numerous advantages over simple quantum annealing. By using such searches the effect of problem mis-specification can be reduced, as only energy differences between the searched states will be relevant. The QAA is an analogue of simulated annealing, a classical numerical technique which has now been superseded. Hence, I explore two strategies to use an annealer in a way which takes advantage of modern classical optimization algorithms. Specifically, I show how sequential calls to quantum annealers can be used to construct analogues of population annealing and parallel tempering which use quantum searches as subroutines. The techniques given here can be applied not only to optimization, but also to sampling. I examine the feasibility of these protocols on real devices and note that implementing such protocols should require minimal if any change to the current design of the flux qubit-based annealers by D-Wave Systems Inc. I further provide proof-of-principle numerical experiments based on quantum Monte Carlo that demonstrate simple examples of the discussed techniques. (paper)

  3. A swarm intelligence framework for reconstructing gene networks: searching for biologically plausible architectures.

    Science.gov (United States)

    Kentzoglanakis, Kyriakos; Poole, Matthew

    2012-01-01

    In this paper, we investigate the problem of reverse engineering the topology of gene regulatory networks from temporal gene expression data. We adopt a computational intelligence approach comprising swarm intelligence techniques, namely particle swarm optimization (PSO) and ant colony optimization (ACO). In addition, the recurrent neural network (RNN) formalism is employed for modeling the dynamical behavior of gene regulatory systems. More specifically, ACO is used for searching the discrete space of network architectures and PSO for searching the corresponding continuous space of RNN model parameters. We propose a novel solution construction process in the context of ACO for generating biologically plausible candidate architectures. The objective is to concentrate the search effort into areas of the structure space that contain architectures which are feasible in terms of their topological resemblance to real-world networks. The proposed framework is initially applied to the reconstruction of a small artificial network that has previously been studied in the context of gene network reverse engineering. Subsequently, we consider an artificial data set with added noise for reconstructing a subnetwork of the genetic interaction network of S. cerevisiae (yeast). Finally, the framework is applied to a real-world data set for reverse engineering the SOS response system of the bacterium Escherichia coli. Results demonstrate the relative advantage of utilizing problem-specific knowledge regarding biologically plausible structural properties of gene networks over conducting a problem-agnostic search in the vast space of network architectures.
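
    As an illustration of the continuous half of the two-level search described above (PSO over RNN model parameters), the following is a minimal, generic particle swarm optimization sketch. The fitness function, dimensionality, and hyperparameters are hypothetical placeholders, not the authors' setup, and the discrete ACO layer is omitted.

```python
import numpy as np

def pso(fitness, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, bounds=(-1.0, 1.0)):
    """Minimise `fitness` over a continuous space with basic particle swarm optimisation."""
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))   # candidate parameter vectors (e.g. RNN weights)
    v = np.zeros_like(x)                               # particle velocities
    pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([fitness(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, pbest_f.min()

# Hypothetical usage: the fitness would be the error between RNN-predicted and observed expression.
best_w, best_err = pso(lambda w: float(np.sum((w - 0.3) ** 2)), dim=5)
print(best_w, best_err)
```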

  4. Reduction in wick drain effectiveness with spacing for Utah silts and clays.

    Science.gov (United States)

    2012-04-01

    Although decreasing the spacing of vertical drains usually decreases the time for consolidation, previous field tests have shown that there is a critical drain spacing for which tighter spacing does not decrease the time for consolidation. This...

  5. Making Temporal Search More Central in Spatial Data Infrastructures

    Science.gov (United States)

    Corti, P.; Lewis, B.

    2017-10-01

    A temporally enabled Spatial Data Infrastructure (SDI) is a framework of geospatial data, metadata, users, and tools intended to provide an efficient and flexible way to use spatial information which includes the historical dimension. One of the key software components of an SDI is the catalogue service which is needed to discover, query, and manage the metadata. A search engine is a software system capable of supporting fast and reliable search, which may use any means necessary to get users to the resources they need quickly and efficiently. These techniques may include features such as full text search, natural language processing, weighted results, temporal search based on enrichment, visualization of patterns in distributions of results in time and space using temporal and spatial faceting, and many others. In this paper we will focus on the temporal aspects of search which include temporal enrichment using a time miner - a software engine able to search for date components within a larger block of text, the storage of time ranges in the search engine, handling historical dates, and the use of temporal histograms in the user interface to display the temporal distribution of search results.

  6. Efficient search by optimized intermittent random walks

    International Nuclear Information System (INIS)

    Oshanin, Gleb; Lindenberg, Katja; Wio, Horacio S; Burlatsky, Sergei

    2009-01-01

    We study the kinetics for the search of an immobile target by randomly moving searchers that detect it only upon encounter. The searchers perform intermittent random walks on a one-dimensional lattice. Each searcher can step on a nearest neighbor site with probability α or go off lattice with probability 1 - α to move in a random direction until it lands back on the lattice at a fixed distance L away from the departure point. Considering α and L as optimization parameters, we seek to enhance the chances of successful detection by minimizing the probability P_N that the target remains undetected up to the maximal search time N. We show that even in this simple model, a number of very efficient search strategies can lead to a decrease of P_N by orders of magnitude upon appropriate choices of α and L. We demonstrate that, in general, such optimal intermittent strategies are much more efficient than Brownian searches and are as efficient as search algorithms based on random walks with heavy-tailed Cauchy jump-length distributions. In addition, such intermittent strategies appear to be more advantageous than Lévy-based ones in that they lead to more thorough exploration of visited regions in space and thus lend themselves to parallelization of the search processes.
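
    A minimal Monte Carlo sketch of the model described above: it estimates P_N (the probability that the target is still undetected after N steps) for a single searcher on a periodic 1D lattice, for a few (α, L) pairs. The lattice size, trial count, and parameter values are illustrative assumptions, not those of the paper.

```python
import random

def undetected_prob(alpha, L, N, lattice=1000, target=0, trials=2000, seed=1):
    """Monte Carlo estimate of P_N: probability a single intermittent searcher
    starting at a random site has not landed on `target` within N steps."""
    rng = random.Random(seed)
    misses = 0
    for _ in range(trials):
        x = rng.randrange(lattice)
        found = False
        for _ in range(N):
            if rng.random() < alpha:                  # on-lattice nearest-neighbour step
                x = (x + rng.choice((-1, 1))) % lattice
            else:                                     # off-lattice relocation by +/- L
                x = (x + rng.choice((-L, L))) % lattice
            if x == target:
                found = True
                break
        misses += not found
    return misses / trials

# Scan a few hypothetical (alpha, L) pairs to compare intermittent strategies.
for alpha, L in [(1.0, 1), (0.9, 20), (0.5, 50)]:
    print(alpha, L, undetected_prob(alpha, L, N=1000))
```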

  7. Selecting the Mercury Seven The Search for America's First Astronauts

    CERN Document Server

    Burgess, Colin

    2011-01-01

    In January 1959, after an exhaustive search through military service records, a number of America's elite test pilots received orders to attend a series of top-secret briefings in Washington, D.C. These briefings were designed to assist in selecting a group of astronauts for the newly formed National Aeronautics and Space Administration (NASA) and its man-in-space program, Project Mercury. Following in-depth medical and psychological screening, 32 finalists were chosen. They would be subjected to the most rigorous, exploratory, and even degrading medical and psychological stress tests ever imposed on the nation's service personnel. NASA wanted the best of the best in its quest for the nation's first astronauts, and this is the story of that search for a group of near-supermen who were destined to become trailblazing pioneers of American space flight. For the very first time, after extensive research and numerous interviews, the names and amazing stories of those 32 finalists are finally revealed in this book. ...

  8. Fermion masses from dimensional reduction

    International Nuclear Information System (INIS)

    Kapetanakis, D.; Zoupanos, G.

    1990-01-01

    We consider the fermion masses in gauge theories obtained from ten dimensions through dimensional reduction on coset spaces. We calculate the general fermion mass matrix and we apply the mass formula in illustrative examples. (orig.)

  9. Fermion masses from dimensional reduction

    Energy Technology Data Exchange (ETDEWEB)

    Kapetanakis, D. (National Research Centre for the Physical Sciences Democritos, Athens (Greece)); Zoupanos, G. (European Organization for Nuclear Research, Geneva (Switzerland))

    1990-10-11

    We consider the fermion masses in gauge theories obtained from ten dimensions through dimensional reduction on coset spaces. We calculate the general fermion mass matrix and we apply the mass formula in illustrative examples. (orig.).

  10. Space Environmental Effects (SEE) Testing Capability: NASA/Marshall Space Flight Center

    Science.gov (United States)

    DeWittBurns, H.; Crave, Paul; Finckenor, Miria; Finchum, Charles; Nehls, Mary; Schneider, Todd; Vaughn, Jason

    2012-01-01

    Understanding the effects of the space environment on materials and systems is fundamental and essential for mission success. If not properly understood and designed for, the space environment can lead to materials degradation, reduction of functional lifetime, and system failure. Ground based testing is critical in predicting performance. NASA/MSFC's expertise and capabilities make up the most complete SEE testing capability available.

  11. End-to-End Trade-space Analysis for Designing Constellation Missions

    Science.gov (United States)

    LeMoigne, J.; Dabney, P.; Foreman, V.; Grogan, P.; Hache, S.; Holland, M. P.; Hughes, S. P.; Nag, S.; Siddiqi, A.

    2017-12-01

    Multipoint measurement missions can provide a significant advancement in science return, and this science interest coupled with many recent technological advances is driving a growing trend in exploring distributed architectures for future NASA missions. Distributed Spacecraft Missions (DSMs) leverage multiple spacecraft to achieve one or more common goals. In particular, a constellation is the most general form of DSM with two or more spacecraft placed into specific orbit(s) for the purpose of serving a common objective (e.g., CYGNSS). Because a DSM architectural trade-space includes both monolithic and distributed design variables, DSM optimization is a large and complex problem with multiple conflicting objectives. Over the last two years, our team has been developing a Trade-space Analysis Tool for Constellations (TAT-C), implemented in common programming languages for pre-Phase A constellation mission analysis. By evaluating alternative mission architectures, TAT-C seeks to minimize cost and maximize performance for pre-defined science goals. This presentation will describe the overall architecture of TAT-C including: a User Interface (UI) at several levels of detail and user expertise; Trade-space Search Requests that are created from the Science requirements gathered by the UI and validated by a Knowledge Base; a Knowledge Base to compare the current requests to prior mission concepts to potentially prune the trade-space; and a Trade-space Search Iterator which, with inputs from the Knowledge Base and in collaboration with the Orbit & Coverage, Reduction & Metrics, and Cost & Risk modules, generates multiple potential architectures and their associated characteristics. TAT-C leverages the use of the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, modeling orbits to balance accuracy and performance. The current version includes uniform and non-uniform Walker constellations as well as Ad-Hoc and precessing constellations, and its

  12. Searches for Charginos and Neutralinos with the D0 Detector

    International Nuclear Information System (INIS)

    Adams, T.

    2009-01-01

    Within the framework of supersymmetry, charginos and/or neutralinos are often the preferred targets of experimental searches. This is due to the fact that in much of the parameter space they are the lightest supersymmetric partners and that they offer unique final states to separate from standard model backgrounds. The D0 experiment has performed several recent searches including the traditional trilepton final state and a decay chain involving dark photons

  13. Spontaneous compactification to homogeneous spaces

    International Nuclear Information System (INIS)

    Mourao, J.M.

    1988-01-01

    The spontaneous compactification of extra dimensions to compact homogeneous spaces is studied. The methods developed within the framework of the coset space dimensional reduction scheme and the most general form of invariant metrics are used to find solutions of the spontaneous compactification equations

  14. Investigations on search methods for speech recognition using weighted finite state transducers

    OpenAIRE

    Rybach, David

    2014-01-01

    The search problem in the statistical approach to speech recognition is to find the most likely word sequence for an observed speech signal using a combination of knowledge sources, i.e. the language model, the pronunciation model, and the acoustic models of phones. The resulting search space is enormous. Therefore, an efficient search strategy is required to compute the result with a feasible amount of time and memory. The structured statistical models as well as their combination, the searc...

  15. Search in spatial scale-free networks

    International Nuclear Information System (INIS)

    Thadakamalla, H P; Albert, R; Kumara, S R T

    2007-01-01

    We study the decentralized search problem in a family of parameterized spatial network models that are heterogeneous in node degree. We investigate several algorithms and illustrate that some of these algorithms exploit the heterogeneity in the network to find short paths by using only local information. In addition, we demonstrate that the spatial network model belongs to a class of searchable networks for a wide range of parameter space. Further, we test these algorithms on the US airline network which belongs to this class of networks and demonstrate that searchability is a generic property of the US airline network. These results provide insights on designing the structure of distributed networks that need effective decentralized search algorithms

  16. Multilevel Thresholding Segmentation Based on Harmony Search Optimization

    Directory of Open Access Journals (Sweden)

    Diego Oliva

    2013-01-01

    In this paper, a multilevel thresholding (MT) algorithm based on the harmony search algorithm (HSA) is introduced. HSA is an evolutionary method inspired by musicians improvising new harmonies while playing. Unlike other evolutionary algorithms, HSA exhibits interesting search capabilities while keeping a low computational overhead. The proposed algorithm encodes random samples from a feasible search space inside the image histogram as candidate solutions, whereas their quality is evaluated considering the objective functions that are employed by Otsu's or Kapur's methods. Guided by these objective values, the set of candidate solutions is evolved through the HSA operators until an optimal solution is found. Experimental results demonstrate the high performance of the proposed method for the segmentation of digital images.
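
    A minimal sketch of the idea, assuming Otsu's between-class variance as the objective: a basic harmony search improvises threshold vectors from a harmony memory and keeps the best ones. The memory size, HMCR/PAR rates, iteration count, and synthetic histogram below are hypothetical assumptions, not the paper's settings.

```python
import numpy as np

def otsu_variance(hist, thresholds):
    """Between-class variance of Otsu's criterion for a threshold vector (higher is better)."""
    p = hist / hist.sum()
    levels = np.arange(len(hist))
    edges = [0] + sorted(int(t) for t in thresholds) + [len(hist)]
    mu_total = (p * levels).sum()
    var = 0.0
    for a, b in zip(edges[:-1], edges[1:]):
        w = p[a:b].sum()
        if w > 0:
            mu = (p[a:b] * levels[a:b]).sum() / w
            var += w * (mu - mu_total) ** 2
    return var

def harmony_search(hist, k=2, hms=20, iters=2000, hmcr=0.9, par=0.3, seed=0):
    """Minimal harmony search: evolve a memory of candidate threshold vectors."""
    rng = np.random.default_rng(seed)
    L = len(hist)
    memory = rng.integers(1, L - 1, size=(hms, k))
    fitness = np.array([otsu_variance(hist, m) for m in memory])
    for _ in range(iters):
        new = np.empty(k, dtype=int)
        for j in range(k):
            if rng.random() < hmcr:                    # pick from harmony memory
                new[j] = memory[rng.integers(hms), j]
                if rng.random() < par:                 # pitch adjustment
                    new[j] = np.clip(new[j] + rng.integers(-3, 4), 1, L - 2)
            else:                                      # random improvisation
                new[j] = rng.integers(1, L - 1)
        f = otsu_variance(hist, new)
        worst = fitness.argmin()
        if f > fitness[worst]:                         # replace worst harmony if improved
            memory[worst], fitness[worst] = new, f
    return sorted(memory[fitness.argmax()].tolist())

# Hypothetical usage with a synthetic 256-bin image histogram (three grey-level modes).
samples = np.concatenate([np.random.default_rng(1).normal(m, 10, 5000).astype(int).clip(0, 255)
                          for m in (60, 130, 200)])
hist = np.bincount(samples, minlength=256)
print(harmony_search(hist, k=2))
```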

  17. SUSY and BSM Higgs boson searches with ATLAS and CMS

    CERN Document Server

    Dasu, S

    2012-01-01

    Results of searches for supersymmetric and other beyond-the-Standard-Model Higgs bosons from the ATLAS and CMS experiments at the LHC are presented. Some Standard Model (SM) Higgs searches are reinterpreted in an SM with four quark generations and in fermiophobic models. Stringent limits, covering a large portion of the allowed parameter space in the (MA, tan β) plane, are set for MSSM neutral Higgs bosons decaying to τ-lepton pairs and for charged Higgs bosons decaying to τν. Limits are also set on a light NMSSM neutral Higgs boson and on the doubly charged Higgs bosons predicted in some models.

  18. Update on Risk Reduction Activities for a Liquid Advanced Booster for NASA's Space Launch System

    Science.gov (United States)

    Crocker, Andrew M.; Greene, William D.

    2017-01-01

    The stated goals of NASA's Research Announcement for the Space Launch System (SLS) Advanced Booster Engineering Demonstration and/or Risk Reduction (ABEDRR) are to reduce risks leading to an affordable Advanced Booster that meets the evolved capabilities of SLS and enable competition by mitigating targeted Advanced Booster risks to enhance SLS affordability. Dynetics, Inc. and Aerojet Rocketdyne (AR) formed a team to offer a wide-ranging set of risk reduction activities and full-scale, system-level demonstrations that support NASA's ABEDRR goals. During the ABEDRR effort, the Dynetics Team has modified flight-proven Apollo-Saturn F-1 engine components and subsystems to improve affordability and reliability (e.g., reduce parts counts, touch labor, or use lower cost manufacturing processes and materials). The team has built hardware to validate production costs and completed tests to demonstrate it can meet performance requirements. State-of-the-art manufacturing and processing techniques have been applied to the heritage F-1, resulting in a low recurring cost engine while retaining the benefits of Apollo-era experience. NASA test facilities have been used to perform low-cost risk-reduction engine testing. In early 2014, NASA and the Dynetics Team agreed to move additional large liquid oxygen/kerosene engine work under Dynetics' ABEDRR contract. Also led by AR, the objectives of this work are to demonstrate combustion stability and measure performance of a 500,000 lbf class Oxidizer-Rich Staged Combustion (ORSC) cycle main injector. A trade study was completed to investigate the feasibility, cost effectiveness, and technical maturity of a domestically-produced engine that could potentially both replace the RD-180 on Atlas V and satisfy NASA SLS payload-to-orbit requirements via an advanced booster application. Engine physical dimensions and performance parameters resulting from this study provide the system level requirements for the ORSC risk reduction test article

  19. Improving 3d Spatial Queries Search: Newfangled Technique of Space Filling Curves in 3d City Modeling

    Science.gov (United States)

    Uznir, U.; Anton, F.; Suhaibah, A.; Rahman, A. A.; Mioc, D.

    2013-09-01

    The advantages of three dimensional (3D) city models can be seen in various applications including photogrammetry, urban and regional planning, computer games, etc. They expand the visualization and analysis capabilities of Geographic Information Systems on cities, and they can be developed using web standards. However, these 3D city models consume much more storage compared to two dimensional (2D) spatial data. They involve extra geometrical and topological information together with semantic data. Without a proper spatial data clustering method and its corresponding spatial data access method, retrieving portions of, and especially searching, these 3D city models will not be done optimally. Even though current developments are based on an open data model allotted by the Open Geospatial Consortium (OGC) called CityGML, its XML-based structure makes it challenging to cluster the 3D urban objects. In this research, we propose an opponent data constellation technique of space-filling curves (3D Hilbert curves) for 3D city model data representation. Unlike previous methods that try to project 3D or n-dimensional data down to 2D or 3D using Principal Component Analysis (PCA) or Hilbert mappings, in this research, we extend the Hilbert space-filling curve to one higher dimension for 3D city model data implementations. The query performance was tested using a CityGML dataset of 1,000 building blocks and the results are presented in this paper. The advantages of implementing space-filling curves in 3D city modeling will improve data retrieval time by means of optimized 3D adjacency, nearest neighbor information and 3D indexing. The Hilbert mapping, which maps a subinterval of the [0, 1] interval to the corresponding portion of the d-dimensional Hilbert's curve, preserves the Lebesgue measure and is Lipschitz continuous. Depending on the applications, several alternatives are possible in order to cluster spatial data together in the third dimension compared to its
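
    To illustrate how a space-filling-curve key clusters nearby 3D objects, here is a sketch that uses a Morton (Z-order) key as a simpler stand-in for the 3D Hilbert curve proposed in this record; a Hilbert key would be used the same way but preserves locality somewhat better. The grid resolution and building coordinates are hypothetical.

```python
def morton3d(x, y, z, bits=10):
    """Interleave the bits of quantised x, y, z into a single Z-order (Morton) key."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (3 * i)
        key |= ((y >> i) & 1) << (3 * i + 1)
        key |= ((z >> i) & 1) << (3 * i + 2)
    return key

# Hypothetical usage: sort building centroids (quantised to a 1024^3 grid) by their curve key,
# so objects close in 3D space tend to sit close together on disk / in the index.
buildings = [(512, 10, 3), (513, 11, 3), (20, 900, 40)]
print(sorted(buildings, key=lambda b: morton3d(*b)))
```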

  20. Rubin's CMS reduction method for general state-space models

    NARCIS (Netherlands)

    Kraker, de A.; Campen, van D.H.

    1996-01-01

    In this paper the Rubin CMS procedure for the reduction and successive coupling of undamped structural subsystems with symmetric system matrices will be modified for the case of general damping. The final coordinate transformation is based on the use of complex (residual) flexibility modes,

  1. A matter of timing: identifying significant multi-dose radiotherapy improvements by numerical simulation and genetic algorithm search.

    Directory of Open Access Journals (Sweden)

    Simon D Angus

    Multi-dose radiotherapy protocols (fraction dose and timing) currently used in the clinic are the product of human selection based on habit, received wisdom, physician experience and intra-day patient timetabling. However, due to combinatorial considerations, the potential treatment protocol space for a given total dose or treatment length is enormous, even for relatively coarse search; well beyond the capacity of traditional in-vitro methods. In contrast, high fidelity numerical simulation of tumor development is well suited to the challenge. Building on our previous single-dose numerical simulation model of EMT6/Ro spheroids, a multi-dose irradiation response module is added and calibrated to the effective dose arising from 18 independent multi-dose treatment programs available in the experimental literature. With the developed model a constrained, non-linear search for better performing candidate protocols is conducted within the vicinity of two benchmarks by genetic algorithm (GA) techniques. After evaluating less than 0.01% of the potential benchmark protocol space, candidate protocols were identified by the GA which conferred an average of 9.4% (max benefit 16.5%) and 7.1% (13.3%) improvement (reduction) on tumour cell count compared to the two benchmarks, respectively. Noticing that a convergent phenomenon of the top performing protocols was their temporal synchronicity, a further series of numerical experiments was conducted with periodic time-gap protocols (10 h to 23 h), leading to the discovery that the performance of the GA search candidates could be replicated by 17-18 h periodic candidates. Further dynamic irradiation-response cell-phase analysis revealed that such periodicity cohered with latent EMT6/Ro cell-phase temporal patterning. Taken together, this study provides powerful evidence towards the hypothesis that even simple inter-fraction timing variations for a given fractional dose program may present a facile, and highly cost

  2. A matter of timing: identifying significant multi-dose radiotherapy improvements by numerical simulation and genetic algorithm search.

    Science.gov (United States)

    Angus, Simon D; Piotrowska, Monika Joanna

    2014-01-01

    Multi-dose radiotherapy protocols (fraction dose and timing) currently used in the clinic are the product of human selection based on habit, received wisdom, physician experience and intra-day patient timetabling. However, due to combinatorial considerations, the potential treatment protocol space for a given total dose or treatment length is enormous, even for relatively coarse search; well beyond the capacity of traditional in-vitro methods. In contrast, high fidelity numerical simulation of tumor development is well suited to the challenge. Building on our previous single-dose numerical simulation model of EMT6/Ro spheroids, a multi-dose irradiation response module is added and calibrated to the effective dose arising from 18 independent multi-dose treatment programs available in the experimental literature. With the developed model a constrained, non-linear search for better performing candidate protocols is conducted within the vicinity of two benchmarks by genetic algorithm (GA) techniques. After evaluating less than 0.01% of the potential benchmark protocol space, candidate protocols were identified by the GA which conferred an average of 9.4% (max benefit 16.5%) and 7.1% (13.3%) improvement (reduction) on tumour cell count compared to the two benchmarks, respectively. Noticing that a convergent phenomenon of the top performing protocols was their temporal synchronicity, a further series of numerical experiments was conducted with periodic time-gap protocols (10 h to 23 h), leading to the discovery that the performance of the GA search candidates could be replicated by 17-18 h periodic candidates. Further dynamic irradiation-response cell-phase analysis revealed that such periodicity cohered with latent EMT6/Ro cell-phase temporal patterning. Taken together, this study provides powerful evidence towards the hypothesis that even simple inter-fraction timing variations for a given fractional dose program may present a facile, and highly cost-effective means
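
    A minimal GA sketch over inter-fraction time gaps, in the spirit of the study described above. The fitness below is a toy placeholder (rewarding gaps near 17.5 h, loosely echoing the reported periodicity) rather than the calibrated EMT6/Ro tumour simulation, and all population sizes, rates, and bounds are illustrative assumptions.

```python
import random

N_FRACTIONS = 10                                # hypothetical number of fractions

def toy_fitness(gaps):
    """Placeholder for the tumour-simulation objective; lower is better."""
    return sum((g - 17.5) ** 2 for g in gaps)

def random_protocol(rng):
    return [rng.uniform(6, 30) for _ in range(N_FRACTIONS - 1)]   # inter-fraction gaps in hours

def ga_search(pop_size=40, generations=100, seed=0):
    rng = random.Random(seed)
    pop = [random_protocol(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=toy_fitness)                                 # best protocols first
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, N_FRACTIONS - 1)
            child = a[:cut] + b[cut:]                             # one-point crossover
            if rng.random() < 0.3:                                # mutation: jitter one gap
                i = rng.randrange(len(child))
                child[i] = min(30, max(6, child[i] + rng.gauss(0, 2)))
            children.append(child)
        pop = survivors + children
    return min(pop, key=toy_fitness)

print([round(g, 1) for g in ga_search()])
```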

  3. Empty space-times with separable Hamilton-Jacobi equation

    International Nuclear Information System (INIS)

    Collinson, C.D.; Fugere, J.

    1977-01-01

    All empty space-times admitting a one-parameter group of motions and in which the Hamilton-Jacobi equation is (partially) separable are obtained. Several different cases of such empty space-times exist and the Riemann tensor is found to be either type D or N. The results presented here complete the search for empty space-times with separable Hamilton-Jacobi equation. (author)

  4. Deep Space Detectives: Searching for Planets Suitable for Life

    Science.gov (United States)

    Pallant, Amy; Damelin, Daniel; Pryputniewicz, Sarah

    2013-01-01

    This article describes the High-Adventure Science curriculum unit "Is There Life in Space?" This free online investigation, developed by The Concord Consortium, helps students see how scientists use modern tools to locate planets around distant stars and explore the probability of finding extraterrestrial life. This innovative curriculum…

  5. Searches for supersymmetry with the CMS detector at 13 TeV

    CERN Document Server

    Bainbridge, Robert

    2016-01-01

    Several searches are performed for R-parity-conserving supersymmetry in final states containing either zero, one, or two leptons, one or more jets, and an imbalance in transverse momentum in pp collisions at 13 TeV. The data are recorded with the CMS detector at the CERN LHC and correspond to an integrated luminosity of 2.3/fb. The results of the searches are interpreted in the mass parameter space of several simplified models of supersymmetry that assume the pair production of gluinos and squarks. Constraints on the natural parameter space are reported, with gluino masses excluded up to ~1.6 TeV, an increase of ~300 GeV with respect to the strongest exclusions obtained during Run 1.

  6. Local beam angle optimization with linear programming and gradient search

    International Nuclear Information System (INIS)

    Craft, David

    2007-01-01

    The optimization of beam angles in IMRT planning is still an open problem, with literature focusing on heuristic strategies and exhaustive searches on discrete angle grids. We show how a beam angle set can be locally refined in a continuous manner using gradient-based optimization in the beam angle space. The gradient is derived using linear programming duality theory. Applying this local search to 100 random initial angle sets of a phantom pancreatic case demonstrates the method, and highlights the many-local-minima aspect of the BAO problem. Due to this function structure, we recommend a search strategy of a thorough global search followed by local refinement at promising beam angle sets. Extensions to nonlinear IMRT formulations are discussed. (note)

  7. Searching for confining hidden valleys at LHCb, ATLAS, and CMS

    Science.gov (United States)

    Pierce, Aaron; Shakya, Bibhushan; Tsai, Yuhsin; Zhao, Yue

    2018-05-01

    We explore strategies for probing hidden valley scenarios exhibiting confinement. Such scenarios lead to a moderate multiplicity of light hidden hadrons for generic showering and hadronization similar to QCD. Their decays are typically soft and displaced, making them challenging to probe with traditional LHC searches. We show that the low trigger requirements and excellent track and vertex reconstruction at LHCb provide a favorable environment to search for such signals. We propose novel search strategies in both muonic and hadronic channels. We also study existing ATLAS and CMS searches and compare them with our proposals at LHCb. We find that the reach at LHCb is generically better in the parameter space we consider here, even with optimistic background estimations for ATLAS and CMS searches. We discuss potential modifications at ATLAS and CMS that might make these experiments competitive with the LHCb reach. Our proposed searches can be applied to general hidden valley models as well as exotic Higgs boson decays, such as in twin Higgs models.

  8. Astronomical Observations Astronomy and the Study of Deep Space

    CERN Document Server

    2010-01-01

    Our search for knowledge about the universe has been remarkable, heartbreaking, fantastical, and inspiring, and this search is just beginning. Astronomical Observations is part of a 7-book series that takes readers through a virtual time warp of our discovery. From the nascent space programs of the 1960s to today's space tourism and the promise of distant planet colonization, readers will be transfixed. Throughout this journey of the mind, Earth-bound explorers gain keen insight into the celestial phenomena that have fascinated humans for centuries. Thrilling narratives about indefatigable sc

  9. 10 CFR 9.41 - Requests for waiver or reduction of fees.

    Science.gov (United States)

    2010-01-01

    ... publication fee; and (8) Describe any commercial or private interest the requester or any other party has in... Title 10 (Energy), Section 9.41: Requests for waiver or reduction of fees. (a)(1) The NRC will collect fees for searching for, reviewing...

  10. Proposal to Search for Heavy Neutral Leptons at the SPS

    CERN Document Server

    Bonivento, W.; Dijkstra, H.; Egede, U.; Ferro-Luzzi, M.; Goddard, B.; Golutvin, A.; Gorbunov, D.; Jacobsson, R.; Panman, J.; Patel, M.; Ruchayskiy, O.; Ruf, T.; Serra, N.; Shaposhnikov, M.; Treille, D.; CERN. Geneva. SPS and PS Experiments Committee; SPSC

    2013-01-01

    A new fixed-target experiment at the CERN SPS accelerator is proposed that will use decays of charm mesons to search for Heavy Neutral Leptons (HNLs), which are right-handed partners of the Standard Model neutrinos. The existence of such particles is strongly motivated by theory, as they can simultaneously explain the baryon asymmetry of the Universe, account for the pattern of neutrino masses and oscillations and provide a Dark Matter candidate. Cosmological constraints on the properties of HNLs now indicate that the majority of the interesting parameter space for such particles was beyond the reach of the previous searches at the PS191, BEBC, CHARM, CCFR and NuTeV experiments. For HNLs with mass below 2 GeV, the proposed experiment will improve on the sensitivity of previous searches by four orders of magnitude and will cover a major fraction of the parameter space favoured by theoretical models. The experiment requires a 400 GeV proton beam from the SPS with a total of 2x10^20 protons on target, achie...

  11. Stochastic search in structural optimization - Genetic algorithms and simulated annealing

    Science.gov (United States)

    Hajela, Prabhat

    1993-01-01

    An account is given of illustrative applications of genetic algorithms and simulated annealing methods in structural optimization. The advantages of such stochastic search methods over traditional mathematical programming strategies are emphasized; it is noted that these methods offer a significantly higher probability of locating the global optimum in a multimodal design space. Both genetic-search and simulated annealing can be effectively used in problems with a mix of continuous, discrete, and integer design variables.
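
    A compact simulated-annealing sketch for a toy mixed continuous/discrete design vector, illustrating the stochastic acceptance rule that lets such methods escape local optima in a multimodal design space. The objective, neighbourhood move, and cooling schedule are hypothetical, not taken from the source.

```python
import math
import random

def simulated_annealing(objective, neighbour, x0, t0=1.0, cooling=0.995, steps=5000, seed=0):
    """Generic simulated annealing: accept worse designs with probability exp(-delta/T)."""
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbour(x, rng)
        fy = objective(y)
        if fy < fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling                                   # geometric cooling schedule
    return best, fbest

# Toy mixed design: a continuous thickness and a discrete section index (both hypothetical).
def objective(d):
    thickness, section = d
    return (thickness - 2.5) ** 2 + abs(section - 3)

def neighbour(d, rng):
    thickness, section = d
    if rng.random() < 0.5:
        return (thickness + rng.gauss(0, 0.1), section)          # perturb continuous variable
    return (thickness, max(0, min(7, section + rng.choice((-1, 1)))))  # step discrete variable

print(simulated_annealing(objective, neighbour, (1.0, 0)))
```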

  12. Large Scale Reduction of Graphite Oxide

    Data.gov (United States)

    National Aeronautics and Space Administration — This project seeks to develop an optical method to reduce graphite oxide into graphene efficiently and in larger formats than currently available. Current reduction...

  13. Detection from space of a reduction in anthropogenic emissions of nitrogen oxides during the Chinese economic downturn

    Science.gov (United States)

    Lin, J.-T.; McElroy, M. B.

    2011-08-01

    Rapid economic and industrial development in China and relatively weak emission controls have resulted in significant increases in emissions of nitrogen oxides (NOx) in recent years, with the exception of late 2008 to mid 2009 when the economic downturn led to emission reductions detectable from space. Here vertical column densities (VCDs) of tropospheric NO2 retrieved from satellite observations by SCIAMACHY, GOME-2 and OMI (both by KNMI and by NASA) are used to evaluate changes in emissions of NOx from October 2004 to February 2010 identifying impacts of the economic downturn. Data over polluted regions of Northern East China suggest an increase of 27-33 % in 12-month mean VCD of NO2 prior to the downturn, consistent with an increase of 49 % in thermal power generation (TPG) reflecting the economic growth. More detailed analysis is used to quantify changes in emissions of NOx in January over the period 2005-2010 when the effect of the downturn was most evident. The GEOS-Chem model is employed to evaluate the effect of changes in chemistry and meteorology on VCD of NO2. This analysis indicates that emissions decreased by 20 % from January 2008 to January 2009, close to the reduction of 18 % in TPG that occurred over the same interval. A combination of three independent approaches indicates that the economic downturn was responsible for a reduction in emissions by 9-11 % in January 2009 with an additional decrease of 10 % attributed to the slow-down in industrial activity associated with the coincident celebration of the Chinese New Year; errors in the estimate are most likely less than 3.4 %.

  14. Search for charged Higgs Bosons with the ATLAS detector

    CERN Document Server

    Casado, Maria Pilar; The ATLAS collaboration

    2017-01-01

    Current searches for charged Higgs bosons in ATLAS are presented. Data taken at 13 TeV during 2015 and 2016 are used to obtain exclusion limits over a parameter space with charged Higgs masses between 200 and 2000 GeV.

  15. Optimal neighborhood indexing for protein similarity search.

    Science.gov (United States)

    Peterlongo, Pierre; Noé, Laurent; Lavenier, Dominique; Nguyen, Van Hoa; Kucherov, Gregory; Giraud, Mathieu

    2008-12-16

    Similarity inference, one of the main bioinformatics tasks, has to face an exponential growth of the biological data. A classical approach used to cope with this data flow involves heuristics with large seed indexes. In order to speed up this technique, the index can be enhanced by storing additional information to limit the number of random memory accesses. However, this improvement leads to a larger index that may become a bottleneck. In the case of protein similarity search, we propose to decrease the index size by reducing the amino acid alphabet. The paper presents two main contributions. First, we show that an optimal neighborhood indexing combining an alphabet reduction and a longer neighborhood leads to a reduction of 35% of memory involved into the process, without sacrificing the quality of results nor the computational time. Second, our approach led us to develop a new kind of substitution score matrices and their associated e-value parameters. In contrast to usual matrices, these matrices are rectangular since they compare amino acid groups from different alphabets. We describe the method used for computing those matrices and we provide some typical examples that can be used in such comparisons. Supplementary data can be found on the website http://bioinfo.lifl.fr/reblosum. We propose a practical index size reduction of the neighborhood data, that does not negatively affect the performance of large-scale search in protein sequences. Such an index can be used in any study involving large protein data. Moreover, rectangular substitution score matrices and their associated statistical parameters can have applications in any study involving an alphabet reduction.
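
    A minimal sketch of the alphabet-reduction idea behind the index: residues are mapped to a reduced group alphabet and reduced k-mers are used as seeds whose occurrences are stored for later re-scoring. The particular 20-to-8 grouping, the value of k, and the sequences are illustrative assumptions; this does not reproduce the paper's indexing scheme or its REBLOSUM matrices.

```python
from collections import defaultdict

# Hypothetical 20 -> 8 group reduction (one of many possible groupings, not the paper's).
GROUPS = {c: g for g, cs in enumerate(
    ["AG", "ILVM", "FWY", "KRH", "DE", "NQ", "ST", "CP"]) for c in cs}

def reduce_seq(seq):
    """Project a protein sequence onto the reduced alphabet (unknown residues -> group 0)."""
    return tuple(GROUPS.get(c, 0) for c in seq)

def build_index(sequences, k=4):
    """Map each reduced k-mer to its (sequence id, position) occurrences."""
    index = defaultdict(list)
    for sid, seq in enumerate(sequences):
        red = reduce_seq(seq)
        for i in range(len(red) - k + 1):
            index[red[i:i + k]].append((sid, i))
    return index

def query(index, probe, k=4):
    """Return candidate hits; these would then be verified against the full-alphabet sequences."""
    red = reduce_seq(probe)
    hits = []
    for i in range(len(red) - k + 1):
        hits.extend(index.get(red[i:i + k], []))
    return hits

db = ["MKTLLILAVITAIA", "GAVLIMKRHDENQ"]
idx = build_index(db)
print(query(idx, "LLILAV"))
```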

  16. Optimal neighborhood indexing for protein similarity search

    Directory of Open Access Journals (Sweden)

    Nguyen Van

    2008-12-01

    Background: Similarity inference, one of the main bioinformatics tasks, has to face an exponential growth of the biological data. A classical approach used to cope with this data flow involves heuristics with large seed indexes. In order to speed up this technique, the index can be enhanced by storing additional information to limit the number of random memory accesses. However, this improvement leads to a larger index that may become a bottleneck. In the case of protein similarity search, we propose to decrease the index size by reducing the amino acid alphabet. Results: The paper presents two main contributions. First, we show that an optimal neighborhood indexing combining an alphabet reduction and a longer neighborhood leads to a reduction of 35% of memory involved into the process, without sacrificing the quality of results nor the computational time. Second, our approach led us to develop a new kind of substitution score matrices and their associated e-value parameters. In contrast to usual matrices, these matrices are rectangular since they compare amino acid groups from different alphabets. We describe the method used for computing those matrices and we provide some typical examples that can be used in such comparisons. Supplementary data can be found on the website http://bioinfo.lifl.fr/reblosum. Conclusion: We propose a practical index size reduction of the neighborhood data, that does not negatively affect the performance of large-scale search in protein sequences. Such an index can be used in any study involving large protein data. Moreover, rectangular substitution score matrices and their associated statistical parameters can have applications in any study involving an alphabet reduction.

  17. The impact of reduction of doublet well spacing on the Net Present Value and the life time of fluvial Hot Sedimentary Aquifer doublets

    DEFF Research Database (Denmark)

    Willems, C. J. L.; Nick, H. M.; Goense, T.

    2017-01-01

    This paper evaluates the impact of reduction of doublet well spacing, below the current West Netherlands Basin standard of 1000 - 1500 m, on the Net Present Value (NPV) and the life time of fluvial Hot Sedimentary Aquifer (HSA) doublets. First, a sensitivity analysis is used to show the possible ...... the potential and risks of HSA doublets. This factor significantly affects doublet life time and net energy production of the doublet....

  18. Multipass Target Search in Natural Environments.

    Science.gov (United States)

    Kuhlman, Michael J; Otte, Michael W; Sofge, Donald; Gupta, Satyandra K

    2017-11-02

    Consider a disaster scenario where search and rescue workers must search difficult-to-access buildings during an earthquake or flood. Often, finding survivors a few hours sooner results in a dramatic increase in saved lives, suggesting the use of drones for expedient rescue operations. Entropy can be used to quantify the generation and resolution of uncertainty. When searching for targets, maximizing mutual information of future sensor observations will minimize expected target location uncertainty by minimizing the entropy of the future estimate. Motion planning for multi-target autonomous search requires planning over an area with an imperfect sensor and may require multiple passes, which is hindered by the submodularity property of mutual information. Further, mission duration constraints must be handled accordingly, requiring consideration of the vehicle's dynamics to generate feasible trajectories and plan trajectories spanning the entire mission duration, something which most information gathering algorithms are incapable of doing. If unanticipated changes occur in an uncertain environment, new plans must be generated quickly. In addition, planning multipass trajectories requires evaluating path-dependent rewards, requiring planning in the space of all previously selected actions, compounding the problem. We present an anytime algorithm for autonomous multipass target search in natural environments. The algorithm is capable of generating long duration dynamically feasible multipass coverage plans that maximize mutual information using a variety of techniques such as ϵ-admissible heuristics to speed up the search. To the authors' knowledge, this is the first attempt at efficiently solving multipass target search problems of such long duration. The proposed algorithm is based on best-first branch and bound and is benchmarked against state-of-the-art algorithms adapted to the problem in natural Simplex environments, gathering the most information in the
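
    A minimal, single-look sketch of the information-gathering criterion mentioned above: for a Bernoulli occupancy grid and a hypothetical imperfect sensor, pick the cell whose observation gives the largest expected entropy reduction (equivalently, the largest mutual information). The paper's planner searches over full multipass trajectories with branch and bound, which this greedy one-step sketch does not attempt; detection and false-alarm probabilities are illustrative.

```python
import math

def entropy(p):
    return 0.0 if p in (0.0, 1.0) else -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def expected_posterior_entropy(p, pd=0.8, pfa=0.1):
    """Expected entropy of a cell's occupancy belief after one look with a sensor that
    detects a present target with prob pd and false-alarms with prob pfa (hypothetical values)."""
    p_det = pd * p + pfa * (1 - p)                    # probability the sensor reports 'detect'
    post_det = pd * p / p_det if p_det > 0 else 0.0
    p_miss = 1 - p_det
    post_miss = (1 - pd) * p / p_miss if p_miss > 0 else 0.0
    return p_det * entropy(post_det) + p_miss * entropy(post_miss)

def best_cell(beliefs):
    """Greedy choice: cell whose observation yields the largest expected entropy reduction."""
    gains = {c: entropy(p) - expected_posterior_entropy(p) for c, p in beliefs.items()}
    return max(gains, key=gains.get)

beliefs = {(0, 0): 0.5, (0, 1): 0.9, (1, 0): 0.05}    # hypothetical occupancy probabilities
print(best_cell(beliefs))
```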

  19. Model independent search for new particles in two-dimensional mass space using events with missing energy, two jets and two leptons with the CMS detector

    CERN Document Server

    AUTHOR|(CDS)2080070; Hebbeker, Thomas

    2017-07-07

    The discovery of a new particle consistent with the standard model Higgs boson at the Large Hadron Collider in 2012 completed the standard model of particle physics (SM). Despite its remarkable success many questions remain unexplained. Numerous theoretical models, predicting the existence of new heavy particles, provide answers to these unresolved questions and are tested at high energy experiments such as the Compact Muon Solenoid (CMS) detector at the Large Hadron Collider (LHC). In this thesis a model independent search method for new particles in two-dimensional mass space in events with missing transverse energy is presented using 19.7 fb⁻¹ of proton-proton collision data recorded by the CMS detector at a centre-of-mass energy √s = 8 TeV at the LHC. The analysis searches for signatures of pair-produced new heavy particles T′ which decay further into unknown heavy particles W′ and SM quarks q (T′T̄′ → ...

  20. Reduction of space charge breakdown in e-beam irradiated nano/polymethyl methacrylate composites

    International Nuclear Information System (INIS)

    Zheng Feihu; Zhang Yewen; An Zhenlian; Dong Jianxing; Lei Qingquan

    2013-01-01

    Fast discharge of numerous space charges in dielectric materials can cause space charge breakdown. This letter reports the role of nanoparticles in affecting space charge breakdown of nano/polymethyl methacrylate composites. Space charge distributions in the composites, implanted by electron beam irradiation, were measured by pressure wave propagation method. The results show that the nanoparticles have significant effects on the isothermal charge decay and space charge breakdown in the nanocomposites. The resistance to space charge breakdown in the nanocomposites is attributed to the combined action of the introduction of deep trapping states and the scattering effect by the added nanoparticles.

  1. Logistics Reduction: Advanced Clothing System (ACS)

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the Advanced Exploration System (AES) Logistics Reduction (LR) project's Advanced Clothing System (ACS) is to use advanced commercial off-the-shelf...

  2. Awareness-based game-theoretic space resource management

    Science.gov (United States)

    Chen, Genshe; Chen, Huimin; Pham, Khanh; Blasch, Erik; Cruz, Jose B., Jr.

    2009-05-01

    Over recent decades, the space environment has become more complex with a significant increase in space debris and a greater density of spacecraft, which poses great difficulties for efficient and reliable space operations. In this paper we present a Hierarchical Sensor Management (HSM) method for space operations by (a) accommodating awareness modeling and updating and (b) collaboratively searching for and tracking space objects. The basic approach is described as follows. First, partition the relevant region of interest into district cells. Second, initialize and model the dynamics of each cell with awareness and object covariance according to prior information. Third, explicitly assign sensing resources to objects with user specified requirements. Note that when an object has intelligent response to the sensing event, the sensor assigned to observe an intelligent object may switch from time-to-time between a strong, active signal mode and a passive mode to maximize the total amount of information to be obtained over a multi-step time horizon and avoid risks. Fourth, if all explicitly specified requirements are satisfied and there are still more sensing resources available, we assign the additional sensing resources to objects without explicitly specified requirements via an information based approach. Finally, sensor scheduling is applied to each sensor-object or sensor-cell pair according to the object type. We demonstrate our method with a realistic space resources management scenario using NASA's General Mission Analysis Tool (GMAT) for space object search and track with multiple space borne observers.

  3. Exploration Opportunity Search of Near-earth Objects Based on Analytical Gradients

    Science.gov (United States)

    Ren, Yuan; Cui, Ping-Yuan; Luan, En-Jie

    2008-07-01

    The problem of searching for exploration opportunities for near-earth minor objects is investigated. For rendezvous missions, the analytical gradients of the performance index with respect to the free parameters are derived using the variational calculus and the theory of state-transition matrix. After generating randomly some initial guesses in the search space, the performance index is optimized, guided by the analytical gradients, leading to the local minimum points representing the potential launch opportunities. This method not only keeps the global-search property of the traditional method, but also avoids the blindness of the latter, thereby greatly increasing the computing speed. Furthermore, with this method, the searching precision could be controlled effectively.
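
    A minimal sketch of the overall strategy (random initial guesses followed by gradient-guided local minimization whose distinct minima are candidate opportunities), using a toy multimodal objective with a hand-derived gradient and SciPy's BFGS. It does not reproduce the state-transition-matrix gradients derived in the paper; the objective, bounds, and dimensionality are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    """Toy multimodal stand-in for the mission performance index (e.g. total delta-v)."""
    return float(np.sum(x ** 2) + 3.0 * np.sum(np.sin(3.0 * x) ** 2))

def gradient(x):
    """Analytical gradient of the toy objective (plays the role of the derived gradients)."""
    return 2.0 * x + 9.0 * np.sin(6.0 * x)

def multistart(n_starts=50, dim=2, bounds=(-3.0, 3.0), seed=0):
    rng = np.random.default_rng(seed)
    minima = []
    for _ in range(n_starts):
        x0 = rng.uniform(*bounds, size=dim)               # random guess in the search space
        res = minimize(objective, x0, jac=gradient, method="BFGS")
        minima.append((res.fun, tuple(np.round(res.x, 3))))
    # Distinct local minima = candidate opportunities, ranked by performance index.
    return sorted(set(minima))[:5]

print(multistart())
```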

  4. NASA's GeneLab Phase II: Federated Search and Data Discovery

    Science.gov (United States)

    Berrios, Daniel C.; Costes, Sylvain V.; Tran, Peter B.

    2017-01-01

    GeneLab is currently being developed by NASA to accelerate 'open science' biomedical research in support of the human exploration of space and the improvement of life on earth. Phase I of the four-phase GeneLab Data Systems (GLDS) project emphasized capabilities for submission, curation, search, and retrieval of genomics, transcriptomics and proteomics ('omics') data from biomedical research of space environments. The focus of development of the GLDS for Phase II has been federated data search for and retrieval of these kinds of data across other open-access systems, so that users are able to conduct biological meta-investigations using data from a variety of sources. Such meta-investigations are key to corroborating findings from many kinds of assays and translating them into systems biology knowledge and, eventually, therapeutics.

  5. NASA's GeneLab Phase II: Federated Search and Data Discovery

    Science.gov (United States)

    Berrios, Daniel C.; Costes, Sylvain; Tran, Peter

    2017-01-01

    GeneLab is currently being developed by NASA to accelerate open science biomedical research in support of the human exploration of space and the improvement of life on earth. Phase I of the four-phase GeneLab Data Systems (GLDS) project emphasized capabilities for submission, curation, search, and retrieval of genomics, transcriptomics and proteomics (omics) data from biomedical research of space environments. The focus of development of the GLDS for Phase II has been federated data search for and retrieval of these kinds of data across other open-access systems, so that users are able to conduct biological meta-investigations using data from a variety of sources. Such meta-investigations are key to corroborating findings from many kinds of assays and translating them into systems biology knowledge and, eventually, therapeutics.

  6. Cache-Oblivious Planar Orthogonal Range Searching and Counting

    DEFF Research Database (Denmark)

    Arge, Lars; Brodal, Gerth Stølting; Fagerberg, Rolf

    2005-01-01

    present the first cache-oblivious data structure for planar orthogonal range counting, and improve on previous results for cache-oblivious planar orthogonal range searching. Our range counting structure uses O(N log_2 N) space and answers queries using O(log_B N) memory transfers, where B is the block...... size of any memory level in a multilevel memory hierarchy. Using bit manipulation techniques, the space can be further reduced to O(N). The structure can also be modified to support more general semigroup range sum queries in O(log_B N) memory transfers, using O(N log_2 N) space for three-sided queries...... and O(N log_2^2 N / log_2 log_2 N) space for four-sided queries. Based on the O(N log N) space range counting structure, we develop a data structure that uses O(N log_2 N) space and answers three-sided range queries in O(log_B N + T/B) memory transfers, where T is the number of reported points. Based...

  7. Singular reduction of Nambu-Poisson manifolds

    Science.gov (United States)

    Das, Apurba

    The version of the Marsden-Ratiu Poisson reduction theorem for Nambu-Poisson manifolds by a regular foliation has been studied by Ibáñez et al. In this paper, we show that this reduction procedure can be extended to the singular case. Under a suitable notion of Hamiltonian flow on the reduced space, we show that a set of Hamiltonians on a Nambu-Poisson manifold can also be reduced.

  8. Neutrino astronomy and search for WIMPs with MACRO

    CERN Document Server

    Bernardini, P

    2000-01-01

    Upward-going muons, induced primarily by atmospheric neutrinos, are used to search for neutrinos of astrophysical origin. No evidence has been found looking at the event direction and flux limits are obtained on candidate sources. A space-time correlation between gamma ray bursts and upward-going muons has been also investigated. Furthermore the search for a neutrino signal from the Earth and the Sun induced by weakly interacting massive particles (WIMP) has been updated. The number of events from the Sun and from the Earth is compatible with the background from atmospheric neutrinos. Therefore flux limits for different search cones have been estimated. Here we concentrate on neutralinos as WIMP candidates and limits depending on the neutralino mass are given and compared with the prediction of supersymmetric models. (11 refs).

  9. Effects of space velocity and gas velocity on DeNOx catalyst with HC reductant; HC tenka NOx kangen shokubai no kukan sokudo oyobi gas ryusoku no eikyo

    Energy Technology Data Exchange (ETDEWEB)

    Niimura, K.; Tsujimura, K.

    1995-04-20

    Discussions were given on the hydrocarbon-added reduction catalyst method to reduce NOx in diesel engine exhaust gas. An experiment was carried out with actual exhaust gas from a diesel engine by using a copper ion-exchanged zeolite catalyst coated on a honeycomb-type substrate, and using propylene as the reductant. When the catalyst volume was changed with the exhaust gas space velocity kept constant, the NOx conversion ratio decreased as the catalyst length was decreased, and the activity shifted to the lower temperature side. The NOx reduction efficiency increased as the gas flow velocity increased. On the other hand, if the gas flow velocity is slow, the NOx reduction can be carried out with a relatively small amount of reductant. When the catalyst volume was changed with the passing gas amount kept constant, the NOx conversion ratio decreased largely if the catalyst length was decreased. Further, the NOx reduction characteristics shifted to the higher temperature side. In the catalyst length direction, the NOx reduction activity shows a relatively uniform action. However, a detailed observation reveals that the reaction heat in the catalyst is transmitted downstream, improving the activity there; hence, the further downstream, the higher the NOx conversion ratio. 5 refs., 5 figs., 3 tabs.

  10. Origin of Disagreements in Tandem Mass Spectra Interpretation by Search Engines.

    Science.gov (United States)

    Tessier, Dominique; Lollier, Virginie; Larré, Colette; Rogniaux, Hélène

    2016-10-07

    Several proteomic database search engines that interpret LC-MS/MS data do not identify the same set of peptides. These disagreements occur even when the scores of the peptide-to-spectrum matches suggest good confidence in the interpretation. Our study shows that these disagreements observed for the interpretations of a given spectrum are almost exclusively due to the variation of what we call the "peptide space", i.e., the set of peptides that are actually compared to the experimental spectra. We discuss the potential difficulties of precisely defining the "peptide space." Indeed, although several parameters that are generally reported in publications can easily be set to the same values, many additional parameters, with much less straightforward user access, might impact the "peptide space" used by each program. Moreover, in a configuration where each search engine identifies the same candidates for each spectrum, the inference of the proteins may remain quite different depending on the false discovery rate selected.

  11. Planning for a space infrastructure for disposal of nuclear space power systems

    International Nuclear Information System (INIS)

    Angelo, J. Jr.; Albert, T.E.; Lee, J.

    1989-01-01

    The development of safe, reliable, and compact power systems is vital to humanity's exploration, development, and, ultimately, civilization of space. Nuclear power systems appear to offer the only practical option for compact high-power systems. From the very beginning of US space nuclear power activities, safety has been a paramount requirement. Assurance of nuclear safety has included prelaunch ground handling operations, launch, and space operations of nuclear power sources, and more recently serious attention has been given to post-operational disposal of spent or errant nuclear reactor systems. The purpose of this paper is to describe the progress of a project to utilize the capabilities of an evolving space infrastructure for planning the disposal of space nuclear systems. Project SIREN (Search, Intercept, Retrieve, Expulsion - Nuclear) was initiated to consider post-operational disposal options for nuclear space power systems. The key finding of Project SIREN was that although no system currently exists to effect the disposal of a nuclear space power system, the requisite technologies for such a system either exist or are planned as part of the evolving space infrastructure.

  12. Metaheuristics-Assisted Combinatorial Screening of Eu2+-Doped Ca-Sr-Ba-Li-Mg-Al-Si-Ge-N Compositional Space in Search of a Narrow-Band Green Emitting Phosphor and Density Functional Theory Calculations.

    Science.gov (United States)

    Lee, Jin-Woong; Singh, Satendra Pal; Kim, Minseuk; Hong, Sung Un; Park, Woon Bae; Sohn, Kee-Sun

    2017-08-21

    A metaheuristics-based design would be of great help in relieving the enormous experimental burdens faced during the combinatorial screening of a huge, multidimensional search space, while providing the same effect as total enumeration. In order to tackle the high-throughput powder processing complications and to secure practical phosphors, a metaheuristic, the elitism-reinforced nondominated sorting genetic algorithm (NSGA-II), was employed in this study. The NSGA-II iteration targeted two objective functions. The first was to search for a higher emission efficacy. The second was to search for narrow-band green color emissions. The NSGA-II iteration finally converged on BaLi2Al2Si2N6:Eu2+ phosphors in the Eu2+-doped Ca-Sr-Ba-Li-Mg-Al-Si-Ge-N compositional search space. The BaLi2Al2Si2N6:Eu2+ phosphor, which was synthesized with no human intervention via the assistance of NSGA-II, was a clear single phase and gave acceptable luminescence. The BaLi2Al2Si2N6:Eu2+ phosphor, as well as all other phosphors that appeared during the NSGA-II iterations, was examined in detail by employing powder X-ray diffraction-based Rietveld refinement, X-ray absorption near edge structure, density functional theory calculation, and time-resolved photoluminescence. The thermodynamic stability and the band structure plausibility were confirmed, and more importantly a novel approach to the energy transfer analysis was also introduced for BaLi2Al2Si2N6:Eu2+ phosphors.
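
    As a rough illustration of the non-dominated sorting step at the heart of NSGA-II (not the authors' code), the sketch below extracts the rank-0 front for the two objectives used here: emission efficacy to be maximised and emission band width to be minimised. The candidate objective values are invented placeholders.

      def dominates(a, b):
          """a dominates b if it is no worse in both objectives and strictly better
          in at least one; objectives are (efficacy to maximise, FWHM to minimise)."""
          return (a[0] >= b[0] and a[1] <= b[1]) and (a[0] > b[0] or a[1] < b[1])

      def first_front(objectives):
          """Indices of non-dominated (rank-0) individuals, simple O(n^2) version."""
          return [i for i, p in enumerate(objectives)
                  if not any(dominates(q, p) for j, q in enumerate(objectives) if j != i)]

      # (efficacy in arbitrary units, emission FWHM in nm) -- illustrative values only
      candidates = [(0.62, 55.0), (0.48, 48.0), (0.70, 62.0), (0.50, 60.0)]
      print(first_front(candidates))   # the non-dominated compositions seed the next generation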

  13. Modular Orbital Demonstration of an Evolvable Space Telescope (MODEST)

    Science.gov (United States)

    Baldauf, Brian; Conti, Alberto

    2016-01-01

    The "Search for Life" via imaging of exoplanets is a mission that requires extremely stable telescopes with apertures in the 10 m to 20 m range. The High Definition Space Telescope (HDST) envisioned for this mission would have an aperture >10 m, which is a larger payload than what can be delivered to space using a single launch vehicle. Building and assembling the mirror segments enabling large telescopes will likely require multiple launches and assembly in space. Space-based telescopes with large apertures will require major changes to system architectures.The Optical Telescope Assembly (OTA) for HDST is a primary mission cost driver. Enabling and affordable solutions for this next generation of large aperture space-based telescope are needed.This paper reports on the concept for the Modular Orbital Demonstration of an Evolvable Space Telescope (MODEST), which demonstrates on-orbit robotic and/or astronaut assembly of a precision optical telescope in space. It will also facilitate demonstration of active correction of phase and mirror shape. MODEST is proposed to be delivered to the ISS using standard Express Logistics Carriers (ELCs) and can mounted to one of a variety of ISS pallets. Post-assembly value includes space, ground, and environmental studies, and a testbed for new instruments. This demonstration program for next generation mirror technology provides significant risk reduction and demonstrates the technology in a six-mirror phased telescope. Other key features of the demonstration include the use of an active primary optical surface with wavefront feedback control that allows on-orbit optimization and demonstration of precise surface control to meet optical system wavefront and stability requirements.MODEST will also be used to evaluate advances in lightweight mirror and metering structure materials such as SiC or Carbon Fiber Reinforced Polymer that have excellent mechanical and thermal properties, e.g. high stiffness, high modulus, high thermal

  14. Recent advances in intelligent image search and video retrieval

    CERN Document Server

    2017-01-01

    This book initially reviews the major feature representation and extraction methods and effective learning and recognition approaches, which have broad applications in the context of intelligent image search and video retrieval. It subsequently presents novel methods, such as improved soft assignment coding, Inheritable Color Space (InCS) and the Generalized InCS framework, the sparse kernel manifold learner method, the efficient Support Vector Machine (eSVM), and the Scale-Invariant Feature Transform (SIFT) features in multiple color spaces. Lastly, the book presents clothing analysis for subject identification and retrieval, and performance evaluation methods of video analytics for traffic monitoring. Digital images and videos are proliferating at an amazing speed in the fields of science, engineering and technology, media and entertainment. With the huge accumulation of such data, keyword searches and manual annotation schemes may no longer be able to meet the practical demand for retrieving relevant conte...

  15. Literature in Focus: "Axions: Theory, Cosmology, and Experimental Searches"

    CERN Document Server

    2009-01-01

    Axions are peculiar hypothetical particles that could both solve the CP problem of quantum chromodynamics and at the same time account for the dark matter of the universe. Based on a series of lectures by world experts in this field held at CERN, this volume provides a pedagogical introduction to the theory, cosmology and astrophysics of these fascinating particles and gives an up-to-date account of the status and prospect of ongoing and planned experimental searches. Learners and practitioners of astroparticle physics will find in this book both a concise introduction and a current reference work to a showcase topic that connects the "inner space" of the elementary particle world with the "outer space" of the universe at large. The book will be presented by Markus Kuster. "Axions: Theory, Cosmology, and Experimental Searches", edited by M. Kuster (Technische Universität Darmstadt), G. Raffelt (Max-Planck-Institu...

  16. Preliminary comparison of different reduction methods of graphene ...

    Indian Academy of Sciences (India)

    diverse applications and developing a simple, green, and efficient method for the mass production of ... properties of graphene have driven the search to find methods ... Chemical reduction of GO sheets has been performed with ... efficient method for the mass production of graphene. 2. ... temperature was raised to 35.

  17. From people to entities new semantic search paradigms for the web

    CERN Document Server

    Demartini, G

    2014-01-01

    The exponential growth of digital information available in companies and on the Web creates the need for search tools that can respond to the most sophisticated information needs. Many user tasks would be simplified if Search Engines would support typed search, and return entities instead of just Web documents. For example, an executive who tries to solve a problem needs to find people in the company who are knowledgeable about a certain topic.In the first part of the book, we propose a model for expert finding based on the well-consolidated vector space model for Information Retrieval and inv

  18. Stop searches in flavourful supersymmetry

    CERN Document Server

    Crivellin, Andreas; Tunstall, Lewis C.

    2016-01-01

    Natural realisations of supersymmetry require light stops ${\tilde t}_1$, making them a prime target of LHC searches for physics beyond the Standard Model. Depending on the kinematic region, the main search channels are ${\tilde t_1}\to t \tilde \chi^0_1$, ${\tilde t_1}\to W b \tilde \chi^0_1$ and ${\tilde t_1}\to c \tilde \chi^0_1$. We first examine the interplay of these decay modes with ${\tilde c_1}\to c \tilde \chi^0_1$ in a model-independent fashion, revealing the existence of large regions in parameter space which are excluded for any ${\tilde t_1}\to c \tilde \chi^0_1$ branching ratio. This effect is then illustrated for scenarios with stop-scharm mixing in the right-handed sector, where it has previously been observed that the stop mass limits can be significantly weakened for large mixing. Our analysis shows that once the LHC bounds from ${\tilde c_1}\to c \tilde \chi^0_1$ searches are taken into account, non-zero stop-scharm mixing leads only to a modest increase in the allowed regions of parameter...

  19. Sweeping the State Space

    DEFF Research Database (Denmark)

    Mailund, Thomas

    The thesis describes the sweep-line method, a newly developed reduction method for alleviating the state explosion problem inherent in explicit-state state space exploration. The basic idea underlying the sweep-line method is, when calculating the state space, to recognise and delete states that are not reachable from the currently unprocessed states. Intuitively we drag a sweep-line through the state space with the invariant that all states behind the sweep-line have been processed and are unreachable from the states in front of the sweep-line. When calculating the state space of a system we iteratively...
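
    A minimal sketch of the sweep-line idea described above, assuming a user-supplied progress measure that never decreases along transitions; states lying strictly behind the least progress of any unprocessed state are deleted from memory. The per-iteration filtering is kept naive for readability.

      import heapq
      import itertools

      def sweep_line_explore(initial_states, successors, progress):
          """initial_states: hashable states; successors(s) -> iterable of states;
          progress(s) -> number assumed monotone (never decreasing) along transitions."""
          tie = itertools.count()                    # tiebreaker so states are never compared
          frontier = [(progress(s), next(tie), s) for s in initial_states]
          heapq.heapify(frontier)
          stored = {s for _, _, s in frontier}       # states currently held in memory
          processed = 0
          while frontier:
              _, _, state = heapq.heappop(frontier)  # always process a least-progress state
              processed += 1
              for nxt in successors(state):
                  if nxt not in stored:
                      stored.add(nxt)
                      heapq.heappush(frontier, (progress(nxt), next(tie), nxt))
              if frontier:
                  sweep_line = frontier[0][0]
                  # states strictly behind the sweep-line cannot be reached again
                  stored = {s for s in stored if progress(s) >= sweep_line}
          return processed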

  20. Enhancements to Constrained Novelty Search: Two-Population Novelty Search for Generating Game Content

    DEFF Research Database (Denmark)

    Liapis, Antonios; Yannakakis, Georgios N.; Togelius, Julian

    2013-01-01

    pop genetic algorithm. These algorithms are applied to the problem of creating diverse and feasible game levels, representative of a large class of important problems in procedural content generation for games. Results show that the new algorithms under certain conditions can produce larger and more diverse sets of feasible strategy game maps than existing algorithms. However, the best algorithm is contingent on the particularities of the search space and the genetic operators used. It is also shown that the proposed enhancement of offspring boosting increases performance in all cases.

  1. Searching for galactic white-dwarf binaries in mock LISA data using an F-statistic template bank

    International Nuclear Information System (INIS)

    Whelan, John T; Prix, Reinhard; Khurana, Deepak

    2010-01-01

    We describe an F-statistic search for continuous gravitational waves from galactic white-dwarf binaries in simulated LISA data. Our search method employs a hierarchical template-grid-based exploration of the parameter space. In the first stage, candidate sources are identified in searches using different simulated laser signal combinations (known as TDI variables). Since each source generates a primary maximum near its true 'Doppler parameters' (intrinsic frequency and sky position) as well as numerous secondary maxima of the F-statistic in Doppler parameter space, a search for multiple sources needs to distinguish between true signals and secondary maxima associated with other 'louder' signals. Our method does this by applying a coincidence test to reject candidates which are not found at nearby parameter space positions in searches using each of the three TDI variables. For signals surviving the coincidence test, we perform a fully coherent search over a refined parameter grid to provide an accurate parameter estimation for the final candidates. Suitably tuned, the pipeline is able to extract 1989 true signals with only 5 false alarms. The use of the rigid adiabatic approximation allows recovery of signal parameters with errors comparable to statistical expectations, although there is still some systematic excess with respect to statistical errors expected from Gaussian noise. An experimental iterative pipeline with seven rounds of signal subtraction and reanalysis of the residuals allows us to increase the number of signals recovered to a total of 3419 with 29 false alarms.
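
    The coincidence step can be pictured with the toy filter below: a candidate from one TDI search survives only if both other searches produced a candidate at nearby Doppler parameters. The candidate tuples and tolerances are placeholders, not the pipeline's tuned values.

      def near(c1, c2, df_max, dsky_max):
          """c = (frequency, sky coordinate 1, sky coordinate 2)."""
          return (abs(c1[0] - c2[0]) <= df_max
                  and abs(c1[1] - c2[1]) <= dsky_max
                  and abs(c1[2] - c2[2]) <= dsky_max)

      def coincidence_filter(cands_a, cands_e, cands_t, df_max=1e-6, dsky_max=0.01):
          """Keep candidates from the first TDI search confirmed by the other two."""
          return [c for c in cands_a
                  if any(near(c, other, df_max, dsky_max) for other in cands_e)
                  and any(near(c, other, df_max, dsky_max) for other in cands_t)]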

  2. Searching for galactic white-dwarf binaries in mock LISA data using an F-statistic template bank

    Energy Technology Data Exchange (ETDEWEB)

    Whelan, John T [Center for Computational Relativity and Gravitation, School of Mathematical Sciences, Rochester Institute of Technology, 85 Lomb Memorial Drive, Rochester, NY 14623 (United States); Prix, Reinhard [Max-Planck-Institut fuer Gravitationsphysik (Albert-Einstein-Institut), D-30167 Hannover (Germany); Khurana, Deepak, E-mail: john.whelan@astro.rit.ed, E-mail: reinhard.prix@aei.mpg.d [Indian Institute of Technology, Kharagpur, West Bengal 721302 (India)

    2010-03-07

    We describe an F-statistic search for continuous gravitational waves from galactic white-dwarf binaries in simulated LISA data. Our search method employs a hierarchical template-grid-based exploration of the parameter space. In the first stage, candidate sources are identified in searches using different simulated laser signal combinations (known as TDI variables). Since each source generates a primary maximum near its true 'Doppler parameters' (intrinsic frequency and sky position) as well as numerous secondary maxima of the F-statistic in Doppler parameter space, a search for multiple sources needs to distinguish between true signals and secondary maxima associated with other 'louder' signals. Our method does this by applying a coincidence test to reject candidates which are not found at nearby parameter space positions in searches using each of the three TDI variables. For signals surviving the coincidence test, we perform a fully coherent search over a refined parameter grid to provide an accurate parameter estimation for the final candidates. Suitably tuned, the pipeline is able to extract 1989 true signals with only 5 false alarms. The use of the rigid adiabatic approximation allows recovery of signal parameters with errors comparable to statistical expectations, although there is still some systematic excess with respect to statistical errors expected from Gaussian noise. An experimental iterative pipeline with seven rounds of signal subtraction and reanalysis of the residuals allows us to increase the number of signals recovered to a total of 3419 with 29 false alarms.

  3. Basis reduction for layered lattices

    NARCIS (Netherlands)

    E.L. Torreão Dassen (Erwin)

    2011-01-01

    We develop the theory of layered Euclidean spaces and layered lattices. With this new theory certain problems that usually are solved by using classical lattices with a "weighting" gain a new, more natural form. Using the layered lattice basis reduction algorithms introduced here these

  4. Heuristic space diversity control for improved meta-hyper-heuristic performance

    CSIR Research Space (South Africa)

    Grobler, J

    2015-04-01

    Full Text Available This paper expands on the concept of heuristic space diversity and investigates various strategies for the management of heuristic space diversity within the context of a meta-hyper-heuristic algorithm in search of greater performance benefits...

  5. Search for SUSY in the AMSB scenario with the DELPHI detector

    CERN Document Server

    Abdallah, J.; Adam, W.; Adzic, P.; Albrecht, T.; Alderweireld, T.; Alemany-Fernandez, R.; Allmendinger, T.; Allport, P.P.; Amaldi, U.; Amapane, N.; Amato, S.; Anashkin, E.; Andreazza, A.; Andringa, S.; Anjos, N.; Antilogus, P.; Apel, W.D.; Arnoud, Y.; Ask, S.; Asman, B.; Augustin, J.E.; Augustinus, A.; Baillon, P.; Ballestrero, A.; Bambade, P.; Barbier, R.; Bardin, D.; Barker, G.; Baroncelli, A.; Battaglia, M.; Baubillier, M.; Becks, K.H.; Begalli, M.; Behrmann, A.; Ben-Haim, E.; Benekos, N.; Benvenuti, A.; Berat, C.; Berggren, M.; Berntzon, L.; Bertrand, D.; Besancon, M.; Besson, N.; Bloch, D.; Blom, M.; Bluj, M.; Bonesini, M.; Boonekamp, M.; Booth, P.S.L.; Borisov, G.; Botner, O.; Bouquet, B.; Bowcock, T.J.V.; Boyko, I.; Bracko, M.; Brenner, R.; Brodet, E.; Bruckman, P.; Brunet, J.M.; Bugge, L.; Buschmann, P.; Calvi, M.; Camporesi, T.; Canale, V.; Carena, F.; Castro, Nuno Filipe; Cavallo, F.; Chapkin, M.; Charpentier, Ph.; Checchia, P.; Chierici, R.; Chliapnikov, P.; Chudoba, J.; Chung, S.U.; Cieslik, K.; Collins, P.; Contri, R.; Cosme, G.; Cossutti, F.; Costa, M.J.; Crennell, D.; Cuevas, J.; D'Hondt, J.; Dalmau, J.; da Silva, T.; Da Silva, W.; Della Ricca, G.; De Angelis, A.; De Boer, W.; De Clercq, C.; De Lotto, B.; De Maria, N.; De Min, A.; de Paula, L.; Di Ciaccio, L.; Di Simone, A.; Doroba, K.; Drees, J.; Dris, M.; Eigen, G.; Ekelof, T.; Ellert, M.; Elsing, M.; Espirito Santo, M.C.; Fanourakis, G.; Fassouliotis, D.; Feindt, M.; Fernandez, J.; Ferrer, A.; Ferro, F.; Flagmeyer, U.; Foeth, H.; Fokitis, E.; Fulda-Quenzer, F.; Fuster, J.; Gandelman, M.; Garcia, C.; Gavillet, Ph.; Gazis, Evangelos; Gokieli, R.; Golob, B.; Gomez-Ceballos, G.; Goncalves, P.; Graziani, E.; Grosdidier, G.; Grzelak, K.; Guy, J.; Haag, C.; Hallgren, A.; Hamacher, K.; Hamilton, K.; Haug, S.; Hauler, F.; Hedberg, V.; Hennecke, M.; Herr, H.; Hoffman, J.; Holmgren, S.O.; Holt, P.J.; Houlden, M.A.; Hultqvist, K.; Jackson, John Neil; Jarlskog, G.; Jarry, P.; Jeans, D.; Johansson, Erik Karl; Johansson, P.D.; Jonsson, P.; Joram, C.; Jungermann, L.; Kapusta, Frederic; Katsanevas, S.; Katsoufis, E.; Kernel, G.; Kersevan, B.P.; Kerzel, U.; Kiiskinen, A.; King, B.T.; Kjaer, N.J.; Kluit, P.; Kokkinias, P.; Kourkoumelis, C.; Kouznetsov, O.; Krumstein, Z.; Kucharczyk, M.; Lamsa, J.; Leder, G.; Ledroit, Fabienne; Leinonen, L.; Leitner, R.; Lemonne, J.; Lepeltier, V.; Lesiak, T.; Liebig, W.; Liko, D.; Lipniacka, A.; Lopes, J.H.; Lopez, J.M.; Loukas, D.; Lutz, P.; Lyons, L.; MacNaughton, J.; Malek, A.; Maltezos, S.; Mandl, F.; Marco, J.; Marco, R.; Marechal, B.; Margoni, M.; Marin, J.C.; Mariotti, C.; Markou, A.; Martinez-Rivero, C.; Masik, J.; Mastroyiannopoulos, N.; Matorras, F.; Matteuzzi, C.; Mazzucato, F.; Mazzucato, M.; McNulty, R.; Meroni, C.; Migliore, E.; Mitaroff, W.; Mjoernmark, U.; Moa, T.; Moch, M.; Monig, Klaus; Monge, R.; Montenegro, J.; Moraes, D.; Moreno, S.; Morettini, P.; Mueller, U.; Muenich, K.; Mulders, M.; Mundim, L.; Murray, W.; Muryn, B.; Myatt, G.; Myklebust, T.; Nassiakou, M.; Navarria, F.; Nawrocki, K.; Nicolaidou, R.; Nikolenko, M.; Oblakowska-Mucha, A.; Obraztsov, V.; Olshevski, A.; Onofre, A.; Orava, R.; Osterberg, K.; Ouraou, A.; Oyanguren, A.; Paganoni, M.; Paiano, S.; Palacios, J.P.; Palka, H.; Papadopoulou, Th.D.; Pape, L.; Parkes, C.; Parodi, F.; Parzefall, U.; Passeri, A.; Passon, O.; Peralta, L.; Perepelitsa, V.; Perrotta, A.; Petrolini, A.; Piedra, J.; Pieri, L.; Pierre, F.; Pimenta, M.; Piotto, E.; Podobnik, T.; Poireau, V.; Pol, M.E.; Polok, G.; Poropat, P.; Pozdniakov, V.; Pukhaeva, 
N.; Pullia, A.; Rames, J.; Ramler, L.; Read, Alexander L.; Rebecchi, P.; Rehn, J.; Reid, D.; Reinhardt, R.; Renton, P.; Richard, F.; Ridky, J.; Rivero, M.; Rodriguez, D.; Romero, A.; Ronchese, P.; Roudeau, P.; Rovelli, T.; Ruhlmann-Kleider, V.; Ryabtchikov, D.; Sadovsky, A.; Salmi, L.; Salt, J.; Savoy-Navarro, A.; Schwickerath, U.; Segar, A.; Sekulin, R.; Siebel, M.; Sisakian, A.; Smadja, G.; Smirnova, O.; Sokolov, A.; Sopczak, A.; Sosnowski, R.; Spassov, T.; Stanitzki, M.; Stocchi, A.; Strauss, J.; Stugu, B.; Szczekowski, M.; Szeptycka, M.; Szumlak, T.; Tabarelli, T.; Taffard, A.C.; Tegenfeldt, F.; Timmermans, Jan; Tkatchev, L.; Tobin, M.; Todorovova, S.; Tome, B.; Tonazzo, A.; Tortosa, P.; Travnicek, P.; Treille, D.; Tristram, G.; Trochimczuk, M.; Troncon, C.; Turluer, M.L.; Tyapkin, I.A.; Tyapkin, P.; Tzamarias, S.; Uvarov, V.; Valenti, G.; Van Dam, Piet; Van Eldik, J.; Van Lysebetten, A.; van Remortel, N.; Van Vulpen, I.; Vegni, G.; Veloso, F.; Venus, W.; Verdier, P.; Verzi, V.; Vilanova, D.; Vitale, L.; Vrba, V.; Wahlen, H.; Washbrook, A.J.; Weiser, C.; Wicke, D.; Wickens, J.; Wilkinson, G.; Winter, M.; Witek, M.; Yushchenko, O.; Zalewska, A.; Zalewski, P.; Zavrtanik, D.; Zhuravlov, V.; Zimine, N.I.; Zintchenko, A.; Zupan, M.

    2004-01-01

    The DELPHI experiment at the LEP e+e- collider collected almost 700 pb^-1 at centre-of-mass energies above the Z0 mass pole and up to 208 GeV. Those data were used to search for SUSY in the Anomaly Mediated SUSY Breaking (AMSB) scenario with a flavour independent common sfermion mass parameter. The searches covered several possible signatures experimentally accessible at LEP, with either the neutralino, the sneutrino or the stau being the Lightest Supersymmetric Particle (LSP). They included: the search for nearly mass-degenerate chargino and neutralino, which is a typical feature of AMSB; the search for Standard-Model-like or invisibly decaying Higgs boson; the search for stable staus; the search for cascade decays of SUSY particles resulting in the LSP and a low multiplicity final state containing neutrinos. No evidence of a signal was found, and thus constraints were set in the space of the parameters of the model.

  6. Search for Radions at LEP2

    CERN Document Server

    Abbiendi, G.; Akesson, P.F.; Alexander, G.; Allison, John; Amaral, P.; Anagnostou, G.; Anderson, K.J.; Asai, S.; Axen, D.; Bailey, I.; Barberio, E.; Barillari, T.; Barlow, R.J.; Batley, R.J.; Bechtle, P.; Behnke, T.; Bell, Kenneth Watson; Bell, P.J.; Bella, G.; Bellerive, A.; Benelli, G.; Bethke, S.; Biebel, O.; Boeriu, O.; Bock, P.; Boutemeur, M.; Braibant, S.; Brown, Robert M.; Burckhart, H.J.; Campana, S.; Capiluppi, P.; Carnegie, R.K.; Carter, A.A.; Carter, J.R.; Chang, C.Y.; Charlton, D.G.; Ciocca, C.; Csilling, A.; Cuffiani, M.; Dado, S.; De Roeck, A.; De Wolf, E.A.; Desch, K.; Dienes, B.; Donkers, M.; Dubbert, J.; Duchovni, E.; Duckeck, G.; Duerdoth, I.P.; Etzion, E.; Fabbri, F.; Ferrari, P.; Fiedler, F.; Fleck, I.; Ford, M.; Frey, A.; Gagnon, P.; Gary, John William; Geich-Gimbel, C.; Giacomelli, G.; Giacomelli, P.; Giunta, Marina; Goldberg, J.; Gross, E.; Grunhaus, J.; Gruwe, M.; Gunther, P.O.; Gupta, A.; Hajdu, C.; Hamann, M.; Hanson, G.G.; Harel, A.; Hauschild, M.; Hawkes, C.M.; Hawkings, R.; Hemingway, R.J.; Herten, G.; Heuer, R.D.; Hill, J.C.; Hoffman, Kara Dion; Horvath, D.; Igo-Kemenes, P.; Ishii, K.; Jeremie, H.; Jovanovic, P.; Junk, T.R.; Kanzaki, J.; Karlen, D.; Kawagoe, K.; Kawamoto, T.; Keeler, R.K.; Kellogg, R.G.; Kennedy, B.W.; Kluth, S.; Kobayashi, T.; Kobel, M.; Komamiya, S.; Kramer, T.; Krieger, P.; von Krogh, J.; Kuhl, T.; Kupper, M.; Lafferty, G.D.; Landsman, H.; Lanske, D.; Lellouch, D.; Lettso, J.; Levinson, L.; Lillich, J.; Lloyd, S.L.; Loebinger, F.K.; Lu, J.; Ludwig, A.; Ludwig, J.; Mader, W.; Marcellini, S.; Martin, A.J.; Masetti, G.; Mashimo, T.; Mattig, Peter; McKenna, J.; McPherson, R.A.; Meijers, F.; Menges, W.; Merritt, F.S.; Mes, H.; Meyer, Niels T.; Michelini, A.; Mihara, S.; Mikenberg, G.; Miller, D.J.; Mohr, W.; Mori, T.; Mutter, A.; Nagai, K.; Nakamura, I.; Nanjo, H.; Neal, H.A.; Nisius, R.; O'Neale, S.W.; Oh, A.; Oreglia, M.J.; Orito, S.; Pahl, C.; Pasztor, G.; Pater, J.R.; Pilcher, J.E.; Pinfold, J.; Plane, David E.; Pooth, O.; Przybycien, M.; Quadt, A.; Rabbertz, K.; Rembser, C.; Renkel, P.; Roney, J.M.; Rossi, A.M.; Rozen, Y.; Runge, K.; Sachs, K.; Saeki, T.; Sarkisyan, E.K.G.; Schaile, A.D.; Schaile, O.; Scharff-Hansen, P.; Schieck, J.; Schorner-Sadenius, T.; Schroder, Matthias; Schumacher, M.; Seuster, R.; Shears, T.G.; Shen, B.C.; Sherwood, P.; Skuja, A.; Smith, A.M.; Sobie, R.; Soldner-Rembold, S.; Spano, F.; Stahl, A.; Strom, David M.; Strohmer, R.; Tarem, S.; Tasevsky, M.; Teuscher, R.; Thomson, M.A.; Torrence, E.; Toya, D.; Tran, P.; Trigger, I.; Trocsanyi, Z.; Tsur, E.; Turner-Watson, M.F.; Ueda, I.; Ujvari, B.; Vollmer, C.F.; Vannerem, P.; Vertesi, R.; Verzocchi, M.; Voss, H.; Vossebeld, J.; Ward, C.P.; Ward, D.R.; Watkins, P.M.; Watson, A.T.; Watson, N.K.; Wells, P.S.; Wengler, T.; Wermes, N.; Wilson, G.W.; Wilson, J.A.; Wolf, G.; Wyatt, T.R.; Yamashita, S.; Zer-Zion, D.; Zivkovic, Lidija

    2005-01-01

    A new scalar resonance, called the radion, with couplings to fermions and bosons similar to those of the Higgs boson, is predicted in the framework of Randall-Sundrum models, proposed solutions to the hierarchy problem with one extra dimension. An important distinction between the radion and the Higgs boson is that the radion would couple directly to gluon pairs, and in particular its decay products would include a significant fraction of gluon jets. The radion has the same quantum numbers as the Standard Model (SM) Higgs boson, and therefore the two can mix, with the resulting mass eigenstates having properties different from those of the SM Higgs boson. Existing searches for the Higgs boson are sensitive to the possible production and decay of radions and Higgs bosons in these models. For the first time, searches for the SM Higgs boson and flavour-independent and decay-mode-independent searches for a neutral Higgs boson are used in combination to explore the parameter space of the Randall-Sundrum model. In the...

  7. Searches for Prompt R-Parity-Violating Supersymmetry at the LHC

    International Nuclear Information System (INIS)

    Redelbach, Andreas

    2015-01-01

    Searches for supersymmetry (SUSY) at the LHC frequently assume the conservation of R-parity in their design, optimization, and interpretation. In the case that R-parity is not conserved, constraints on SUSY particle masses tend to be weakened with respect to R-parity-conserving models. We review the current status of searches for R-parity-violating (RPV) supersymmetry models at the ATLAS and CMS experiments, limited to 8 TeV search results published or submitted for publication as of the end of March 2015. All forms of renormalisable RPV terms leading to prompt signatures have been considered in the set of analyses under review. The discussion of results from searches for prompt R-parity-violating SUSY signatures summarizes the main constraints on various RPV models from LHC Run I and also defines the basis for promising signal regions to be optimized for Run II. In addition to identifying highly constrained regions from existing searches, gaps in the coverage of the RPV SUSY parameter space are also outlined.

  8. Central subspace dimensionality reduction using covariance operators.

    Science.gov (United States)

    Kim, Minyoung; Pavlovic, Vladimir

    2011-04-01

    We consider the task of dimensionality reduction informed by real-valued multivariate labels. The problem is often treated as Dimensionality Reduction for Regression (DRR), whose goal is to find a low-dimensional representation, the central subspace, of the input data that preserves the statistical correlation with the targets. A class of DRR methods exploits the notion of inverse regression (IR) to discover central subspaces. Whereas most existing IR techniques rely on explicit output space slicing, we propose a novel method called the Covariance Operator Inverse Regression (COIR) that generalizes IR to nonlinear input/output spaces without explicit target slicing. COIR's unique properties make DRR applicable to problem domains with high-dimensional output data corrupted by potentially significant amounts of noise. Unlike recent kernel dimensionality reduction methods that employ iterative nonconvex optimization, COIR yields a closed-form solution. We also establish the link between COIR, other DRR techniques, and popular supervised dimensionality reduction methods, including canonical correlation analysis and linear discriminant analysis. We then extend COIR to semi-supervised settings where many of the input points lack their labels. We demonstrate the benefits of COIR on several important regression problems in both fully supervised and semi-supervised settings.
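
    For contrast with COIR, the sketch below shows the "explicit output slicing" style of inverse regression (essentially sliced inverse regression) that the paper generalizes away from; COIR itself replaces the slicing with covariance operators and is not reproduced here. This is a hedged illustration, not the paper's algorithm.

      import numpy as np

      def sir_directions(X, y, n_slices=10, n_components=2):
          """Estimate central-subspace directions by slicing the scalar output y."""
          n, d = X.shape
          Xc = X - X.mean(axis=0)
          sigma = np.cov(Xc, rowvar=False) + 1e-8 * np.eye(d)
          whiten = np.linalg.inv(np.linalg.cholesky(sigma))   # Z = whiten @ x is whitened
          Z = Xc @ whiten.T
          M = np.zeros((d, d))
          for sl in np.array_split(np.argsort(y), n_slices):  # slice by sorted response
              m = Z[sl].mean(axis=0)
              M += (len(sl) / n) * np.outer(m, m)             # covariance of slice means
          vals, vecs = np.linalg.eigh(M)
          top = vecs[:, np.argsort(vals)[::-1][:n_components]]
          return whiten.T @ top                               # directions in input space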

  9. Reduction in secondary dendrite arm spacing in cast eutectic Al-Si piston alloys by cerium addition

    Science.gov (United States)

    Ahmad, R.; Asmael, M. B. A.; Shahizan, N. R.; Gandouz, S.

    2017-01-01

    The effects of Ce on the secondary dendrite arm spacing (SDAS) and mechanical behavior of Al-Si-Cu-Mg alloys were investigated. The reduction of SDAS at different Ce concentrations was evaluated in a directional solidification experiment via computer-aided cooling curve thermal analysis (CA‒CCTA). The results showed that 0.1wt%-1.0wt% Ce addition resulted in a rapid solidification time, Δts, and a low solidification temperature, ΔTS, whereas 0.1wt% Ce resulted in a fast solidification time, Δtα-Al, of the α-Al phase. Furthermore, Ce addition refined the SDAS, which was reduced to approximately 36%. The mechanical properties of the alloys with and without Ce were investigated using tensile and hardness tests. The quality index (Q) and ultimate tensile strength (UTS) of Al-Si-Cu-Mg alloys significantly improved with the addition of 0.1wt% Ce. Moreover, the base alloy hardness improved with increasing Ce concentration.

  10. Application of artificial intelligence to search ground-state geometry of clusters

    International Nuclear Information System (INIS)

    Lemes, Mauricio Ruv; Marim, L.R.; Dal Pino, A. Jr.

    2002-01-01

    We introduce a global optimization procedure, the neural-assisted genetic algorithm (NAGA). It combines the power of an artificial neural network (ANN) with the versatility of the genetic algorithm. This method is suitable to solve optimization problems that depend on some kind of heuristics to limit the search space. If a reasonable amount of data is available, the ANN can 'understand' the problem and provide the genetic algorithm with a selected population of elements that will speed up the search for the optimum solution. We tested the method in a search for the ground-state geometry of silicon clusters. We trained the ANN with information about the geometry and energetics of small silicon clusters. Next, the ANN learned how to restrict the configurational space for larger silicon clusters. For Si10 and Si20, we noticed that the NAGA is at least three times faster than the 'pure' genetic algorithm. As the size of the cluster increases, it is expected that the gain in terms of time will increase as well
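
    One way to read the neural-assisted idea is sketched below (assumed interfaces, not the authors' implementation): a surrogate network trained on known small-cluster energetics pre-screens random geometries, so the genetic algorithm starts from a promising, restricted region of configuration space. `random_geometry`, `predict_energy` (the ANN) and `true_energy` (the expensive evaluation) are hypothetical callables.

      import random

      def seed_population(random_geometry, predict_energy, pop_size, n_candidates):
          """Generate many random geometries and keep the ANN-ranked best ones."""
          candidates = [random_geometry() for _ in range(n_candidates)]
          candidates.sort(key=predict_energy)        # lower predicted energy first
          return candidates[:pop_size]

      def genetic_search(population, crossover, mutate, true_energy, generations):
          """Plain generational GA operating on the ANN-selected starting population."""
          for _ in range(generations):
              parents = sorted(population, key=true_energy)[:len(population) // 2]
              children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                          for _ in range(len(population) - len(parents))]
              population = parents + children
          return min(population, key=true_energy)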

  11. δ-Similar Elimination to Enhance Search Performance of Multiobjective Evolutionary Algorithms

    Science.gov (United States)

    Aguirre, Hernán; Sato, Masahiko; Tanaka, Kiyoshi

    In this paper, we propose δ-similar elimination to improve the search performance of multiobjective evolutionary algorithms in combinatorial optimization problems. This method eliminates similar individuals in objective space to fairly distribute selection among the different regions of the instantaneous Pareto front. We investigate four elimination methods, analyzing their effects using NSGA-II. In addition, we compare the search performance of NSGA-II enhanced by our method and NSGA-II enhanced by controlled elitism.
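
    A greedy pass is one straightforward way to realise δ-similar elimination in objective space; the sketch below keeps an individual only if its objective vector is farther than δ (Chebyshev distance) from every individual kept so far. It is an illustration of the idea, not necessarily the exact variant studied in the paper.

      def delta_similar_elimination(population, objectives, delta):
          """population: list of individuals; objectives(ind) -> tuple of objective values."""
          kept, kept_objs = [], []
          for ind in population:
              f = objectives(ind)
              if all(max(abs(a - b) for a, b in zip(f, g)) > delta for g in kept_objs):
                  kept.append(ind)        # far enough from every retained individual
                  kept_objs.append(f)
          return kept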

  12. Space, politics, and the political

    OpenAIRE

    dikec , mustafa

    1987-01-01

    "Geography and politics", Gottmann wrote in 1980, "have long been in search of each other" (page 11). Debates in the literature suggest not only that they have found each other, but also that the encounter has instigated, notably in the last decade or so, a body of literature seeking to think space politically, and to think politics spatially. This is not to suggest that previous work on space was apolitical, nor to suggest that previous work on politics...

  13. A Functional Programming Approach to AI Search Algorithms

    Science.gov (United States)

    Panovics, Janos

    2012-01-01

    The theory and practice of search algorithms related to state-space represented problems form the major part of the introductory course of Artificial Intelligence at most of the universities and colleges offering a degree in the area of computer science. Students usually meet these algorithms only in some imperative or object-oriented language…

  14. Optimizing the search for transiting planets in long time series

    Science.gov (United States)

    Ofir, Aviv

    2014-01-01

    Context. Transit surveys, both ground- and space-based, have already accumulated a large number of light curves that span several years. Aims: The search for transiting planets in these long time series is computationally intensive. We wish to optimize the search for both detection and computational efficiencies. Methods: We assume that the searched systems can be described well by Keplerian orbits. We then propagate the effects of different system parameters to the detection parameters. Results: We show that the frequency information content of the light curve is primarily determined by the duty cycle of the transit signal, and thus the optimal frequency sampling is found to be cubic and not linear. Further optimization is achieved by considering duty-cycle dependent binning of the phased light curve. By using the (standard) BLS, one is either fairly insensitive to long-period planets or less sensitive to short-period planets and computationally slower by a significant factor of ~330 (for a 3 yr long dataset). We also show how the physical system parameters, such as the host star's size and mass, directly affect transit detection. This understanding can then be used to optimize the search for every star individually. Conclusions: By considering Keplerian dynamics explicitly rather than implicitly one can optimally search the BLS parameter space. The presented Optimal BLS enhances the detectability of both very short and very long period planets, while allowing such searches to be done with much reduced resources and time. The Matlab/Octave source code for Optimal BLS is made available. The MATLAB code is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/561/A138
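
    The "cubic rather than linear" sampling can be realised by spacing trial frequencies uniformly in the cube root of frequency, which makes the local frequency step scale as f^(2/3), matching the way the transit duty cycle shrinks toward long periods. The step size below is a free parameter; the paper derives its optimal value from the data span and stellar parameters, which this sketch does not attempt.

      import numpy as np

      def cubic_frequency_grid(f_min, f_max, step_cbrt):
          """Trial frequencies spaced uniformly in f**(1/3) (same units as f_min, f_max)."""
          cbrt = np.arange(f_min ** (1.0 / 3.0), f_max ** (1.0 / 3.0), step_cbrt)
          return cbrt ** 3

      # Illustrative bounds: periods from 0.5 to 1000 days, with an assumed step.
      grid = cubic_frequency_grid(1.0 / 1000.0, 1.0 / 0.5, 1e-3)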

  15. OGC Geographic Information Service Deductive Semantic Reasoning Based on Description Vocabularies Reduction

    Directory of Open Access Journals (Sweden)

    MIAO Lizhi

    2015-09-01

    Full Text Available As geographic information interoperability and sharing develop, more and more interoperable OGC (Open Geospatial Consortium) Web services (OWS) are generated and published through the internet. These services can facilitate the integration of different scientific applications by searching, finding, and utilizing the large number of scientific data sets and Web services. However, these services are widely dispersed and hard to find and utilize through effective semantic retrieval. This is especially true when considering the weak semantic description of geographic information service data. Focusing on semantic retrieval and reasoning over distributed OWS resources, a deductive semantic reasoning method is proposed to describe and search relevant OWS resources. Specifically, ① description words are extracted from OWS metadata files to generate a GISe ontology-database and instance-database based on geographic ontology, according to basic geographic element categories; ② a description-word reduction model is put forward to implement knowledge reduction on the GISe instance-database based on rough set theory and to generate an optimized instance database; ③ the GISe ontology-database and the optimized instance-database are used to implement semantic inference and reasoning over geographic search objects, which serves as an example to demonstrate the efficiency, feasibility and recall ratio of the proposed description-word-based reduction model.

  16. OAST Space Theme Workshop. Volume 2: Theme summary. 3: Search for extraterrestrial intelligence (no. 9). A: Theme statement. B. 26 April 1976 presentation. C. Summary. D. Newer initiatives (form 4). E. Initiative actions (form 5)

    Science.gov (United States)

    1976-01-01

    Preliminary (1977-1983), intermediate (1982-1988), and long term (1989+) phases of the search for extraterrestrial intelligence (SETI) program are examined as well as the benefits to be derived in radioastronomy and the problems to be surmounted in radio frequency interference. The priorities, intrinsic value, criteria, and strategy for the search are discussed for both terrestrial and lunar-based CYCLOPS and for a space SETI system located at lunar liberation point L4. New initiatives related to antenna independent technology, multichannel analyzers, and radio frequency interference shielding are listed. Projected SETI program costs are included.

  17. Identification of Predictive Cis-Regulatory Elements Using a Discriminative Objective Function and a Dynamic Search Space.

    Directory of Open Access Journals (Sweden)

    Rahul Karnik

    Full Text Available The generation of genomic binding or accessibility data from massively parallel sequencing technologies such as ChIP-seq and DNase-seq continues to accelerate. Yet state-of-the-art computational approaches for the identification of DNA binding motifs often yield motifs of weak predictive power. Here we present a novel computational algorithm called MotifSpec, designed to find predictive motifs, in contrast to over-represented sequence elements. The key distinguishing feature of this algorithm is that it uses a dynamic search space and a learned threshold to find discriminative motifs in combination with the modeling of motifs using a full PWM (position weight matrix) rather than k-mer words or regular expressions. We demonstrate that our approach finds motifs corresponding to known binding specificities in several mammalian ChIP-seq datasets, and that our PWMs classify the ChIP-seq signals with accuracy comparable to, or marginally better than, motifs from the best existing algorithms. In other datasets, our algorithm identifies novel motifs where other methods fail. Finally, we apply this algorithm to detect motifs from expression datasets in C. elegans using a dynamic expression similarity metric rather than fixed expression clusters, and find novel predictive motifs.
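
    The scoring side of any PWM-plus-threshold motif model can be sketched as below: every window of a sequence is scored against a log-odds position weight matrix, and a sequence is called positive when its best window clears a learned threshold. The PWM values and threshold are placeholders, not MotifSpec output; sequences are assumed to contain only A, C, G, T.

      import numpy as np

      BASE_INDEX = {"A": 0, "C": 1, "G": 2, "T": 3}

      def best_window_score(seq, pwm):
          """pwm: array of shape (motif_length, 4) holding log-odds scores."""
          w = pwm.shape[0]
          return max(sum(pwm[i, BASE_INDEX[seq[j + i]]] for i in range(w))
                     for j in range(len(seq) - w + 1))

      def classify(sequences, pwm, threshold):
          """True where the best-scoring window exceeds the learned threshold."""
          return [best_window_score(s, pwm) >= threshold for s in sequences]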

  18. SPACE 365: Upgraded App for Aviation and Space-Related Information and Program Planning

    Science.gov (United States)

    Williams, S.; Maples, J. E.; Castle, C. E.

    2014-12-01

    Foreknowledge of upcoming events and anniversary dates can be extraordinarily valuable in the planning and preparation of a variety of aviation and Space-related educational programming. Alignment of programming with items "newsworthy" enough to attract media attention on their own can result in effective program promotion at low/no cost. Similarly, awareness and avoidance of dates upon which media and public attention will likely be elsewhere can keep programs from being lost in the noise. NASA has created a useful and entertaining app called "SPACE 365" to help supply that foreknowledge. The app contains an extensive database of historical aviation and Space exploration-related events, along with other events and birthdays to provide socio-historical context, as well as an extensive file of present and future space missions, complete with images and videos. The user can search by entry topic category, date, and key words. Upcoming Events allows the user to plan, participate, and engage in significant "don't miss" happenings. The historical database was originally developed for use at the National Air and Space Museum, then expanded significantly to include more NASA-related information. The CIMA team at NASA MSFC, sponsored by the Planetary Science Division, added NASA current events and NASA educational programming information, and is continually adding new information and improving the functionality and features of the app. Features of SPACE 365 now include: NASA Image of the Day, Upcoming NASA Events, Event Save, Do Not Miss, and Ask Dr. Steve functions, and the CIMA team recently added a new start page and improved search and navigation capabilities. App users can now socialize the Images of the Day via Twitter, Pinterest, Facebook, and other social media outlets. SPACE 365 is available at no cost from both the Apple appstore and GooglePlay, and has helped NASA, NASM, and other educators plan and schedule programming events. It could help you, too!

  19. Comparison of Algorithms for the Optimal Location of Control Valves for Leakage Reduction in WDNs

    Directory of Open Access Journals (Sweden)

    Enrico Creaco

    2018-04-01

    Full Text Available The paper presents the comparison of two different algorithms for the optimal location of control valves for leakage reduction in water distribution networks (WDNs). The former is based on the sequential addition (SA) of control valves. At the generic step Nval of SA, the search for the optimal combination of Nval valves is carried out while retaining the optimal combination of Nval − 1 valves found at the previous step. Therefore, only one new valve location is searched for at each step of SA, among all the remaining available locations. The latter algorithm consists of a multi-objective genetic algorithm (GA), in which valve locations are encoded inside individual genes. For the sake of consistency, the same embedded algorithm, based on iterated linear programming (LP), was used inside SA and GA to search for the optimal valve settings at the various time slots in the day. The results of applications to two WDNs show that SA and GA yield identical results for small values of Nval. When this number grows, the limitations of SA, related to its reduced exploration of the search space, emerge. In fact, for higher values of Nval, SA tends to produce less beneficial valve locations in terms of leakage abatement. However, the smaller computation time of SA may make this algorithm preferable in the case of large WDNs, for which the application of GA would be overly burdensome.
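
    The sequential addition strategy reduces, in pseudocode form, to a greedy loop: previously chosen valve locations are frozen and every remaining location is tried in turn. In the sketch below, `evaluate_leakage` stands in for the hydraulic simulation plus the embedded iterated-LP setting optimisation; it is an assumed callable, not part of the paper's code.

      def sequential_addition(all_locations, n_valves, evaluate_leakage):
          """Greedy SA: add one valve location per step, keeping earlier choices fixed."""
          chosen = []
          for _ in range(n_valves):
              remaining = [loc for loc in all_locations if loc not in chosen]
              best = min(remaining, key=lambda loc: evaluate_leakage(chosen + [loc]))
              chosen.append(best)      # freeze this location for all later steps
          return chosen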

  20. Microblowing Technique for Drag Reduction, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA seeks to develop technologies for aircraft drag reduction which contribute to improved aerodynamic efficiency in support of national goals for reducing fuel...

  1. Characteristic sounds facilitate visual search.

    Science.gov (United States)

    Iordanescu, Lucica; Guzman-Martinez, Emmanuel; Grabowecky, Marcia; Suzuki, Satoru

    2008-06-01

    In a natural environment, objects that we look for often make characteristic sounds. A hiding cat may meow, or the keys in the cluttered drawer may jingle when moved. Using a visual search paradigm, we demonstrated that characteristic sounds facilitated visual localization of objects, even when the sounds carried no location information. For example, finding a cat was faster when participants heard a meow sound. In contrast, sounds had no effect when participants searched for names rather than pictures of objects. For example, hearing "meow" did not facilitate localization of the word cat. These results suggest that characteristic sounds cross-modally enhance visual (rather than conceptual) processing of the corresponding objects. Our behavioral demonstration of object-based cross-modal enhancement complements the extensive literature on space-based cross-modal interactions. When looking for your keys next time, you might want to play jingling sounds.

  2. Search paths of swans foraging on spatially autocorrelated tubers

    NARCIS (Netherlands)

    Nolet, B.A.; Mooij, W.M.

    2002-01-01

    1. Tundra swans forage on below-ground pondweed tubers that are heterogeneously distributed in space. The swans have no visual cues to delineate patches. It was tested whether swans employ an area-restricted search tactic. Theory predicts that swans should alternate between an intensive (low-speed,

  3. The millimeter wave spectrum of methyl cyanate: a laboratory study and astronomical search in space ⋆,⋆⋆

    Science.gov (United States)

    Kolesniková, L.; Alonso, J. L.; Bermúdez, C.; Alonso, E. R.; Tercero, B.; Cernicharo, J.; Guillemin, J.-C.

    2016-01-01

    Aims. The recent discovery of methyl isocyanate (CH3NCO) in Sgr B2(N) and Orion KL makes methyl cyanate (CH3OCN) a potential molecule in the interstellar medium. The aim of this work is to fulfill the first requirement for its unequivocal identification in space, i.e. the availability of transition frequencies with high accuracy. Methods. The room-temperature rotational spectrum of methyl cyanate was recorded in the millimeter wave domain from 130 to 350 GHz. All rotational transitions revealed A-E splitting owing to methyl internal rotation and were globally analyzed using the ERHAM program. Results. The data set for the ground torsional state of methyl cyanate exceeds 700 transitions within J″ = 10-35 and Ka″ = 0-13, and the newly derived spectroscopic constants reproduce the spectrum close to the experimental uncertainty. Spectral features of methyl cyanate were then searched for in the Orion KL, Sgr B2(N), B1-b, and TMC-1 molecular clouds. Upper limits to the column density of methyl cyanate are provided.

  4. Searching for the right word: Hybrid visual and memory search for words.

    Science.gov (United States)

    Boettcher, Sage E P; Wolfe, Jeremy M

    2015-05-01

    In "hybrid search" (Wolfe Psychological Science, 23(7), 698-703, 2012), observers search through visual space for any of multiple targets held in memory. With photorealistic objects as the stimuli, response times (RTs) increase linearly with the visual set size and logarithmically with the memory set size, even when over 100 items are committed to memory. It is well-established that pictures of objects are particularly easy to memorize (Brady, Konkle, Alvarez, & Oliva Proceedings of the National Academy of Sciences, 105, 14325-14329, 2008). Would hybrid-search performance be similar if the targets were words or phrases, in which word order can be important, so that the processes of memorization might be different? In Experiment 1, observers memorized 2, 4, 8, or 16 words in four different blocks. After passing a memory test, confirming their memorization of the list, the observers searched for these words in visual displays containing two to 16 words. Replicating Wolfe (Psychological Science, 23(7), 698-703, 2012), the RTs increased linearly with the visual set size and logarithmically with the length of the word list. The word lists of Experiment 1 were random. In Experiment 2, words were drawn from phrases that observers reported knowing by heart (e.g., "London Bridge is falling down"). Observers were asked to provide four phrases, ranging in length from two words to no less than 20 words (range 21-86). All words longer than two characters from the phrase, constituted the target list. Distractor words were matched for length and frequency. Even with these strongly ordered lists, the results again replicated the curvilinear function of memory set size seen in hybrid search. One might expect to find serial position effects, perhaps reducing the RTs for the first (primacy) and/or the last (recency) members of a list (Atkinson & Shiffrin, 1968; Murdock Journal of Experimental Psychology, 64, 482-488, 1962). Surprisingly, we showed no reliable effects of word order

  5. van Manen's method and reduction in a phenomenological hermeneutic study.

    Science.gov (United States)

    Heinonen, Kristiina

    2015-03-01

    To describe van Manen's method and concept of reduction in a study that used a phenomenological hermeneutic approach. Nurse researchers have used van Manen's method in different ways. Participants' lifeworlds are described in depth, but descriptions of reduction have been brief. A literature and knowledge review and a manual search of research articles were conducted; the databases Web of Science, PubMed, CINAHL and PsycINFO were searched, without applying a time period, to identify uses of van Manen's method. This paper shows how van Manen's method has been used in nursing research and gives some examples of van Manen's reduction. Reduction enables us to conduct in-depth phenomenological hermeneutic research and understand people's lifeworlds. As there are many variations in adapting reduction, it is complex and confusing. This paper contributes to the discussion of phenomenology, hermeneutic study and reduction. It opens up reduction as a method for researchers to exploit.

  6. An autonomous organic reaction search engine for chemical reactivity

    Science.gov (United States)

    Dragone, Vincenza; Sans, Victor; Henson, Alon B.; Granda, Jaroslaw M.; Cronin, Leroy

    2017-06-01

    The exploration of chemical space for new reactivity, reactions and molecules is limited by the need for separate work-up-separation steps searching for molecules rather than reactivity. Herein we present a system that can autonomously evaluate chemical reactivity within a network of 64 possible reaction combinations and aims for new reactivity, rather than a predefined set of targets. The robotic system combines chemical handling, in-line spectroscopy and real-time feedback and analysis with an algorithm that is able to distinguish and select the most reactive pathways, generating a reaction selection index (RSI) without need for separate work-up or purification steps. This allows the automatic navigation of a chemical network, leading to previously unreported molecules while needing only to do a fraction of the total possible reactions without any prior knowledge of the chemistry. We show the RSI correlates with reactivity and is able to search chemical space using the most reactive pathways.

  7. A search for time variability and its possible regularities in linear polarization of Be stars

    International Nuclear Information System (INIS)

    Huang, L.; Guo, Z.H.; Hsu, J.C.; Huang, L.

    1989-01-01

    Linear polarization measurements are presented for 14 Be stars obtained at McDonald Observatory during four observing runs from June to November of 1983. Methods of observation and data reduction are described. Seven of the eight program stars which were observed on six or more nights exhibited obvious polarimetric variations on time-scales of days or months. The incidence is estimated as 50% and may be as high as 93%. No connection can be found between polarimetric variability and rapid periodic light or spectroscopic variability for our stars. Ultra-rapid variability on time-scales of minutes was searched for with negative results. In all cases the position angles also show variations, indicating that the axis of symmetry of the circumstellar envelope changes its orientation in space. For the Be binary CX Dra the variations in polarization seem to have a period which is just half of the orbital period.

  8. Searching for millisecond pulsars: surveys, techniques and prospects

    International Nuclear Information System (INIS)

    Stovall, K; Lorimer, D R; Lynch, R S

    2013-01-01

    Searches for millisecond pulsars (which we here loosely define as those with periods < 20 ms) in the galactic field have undergone a renaissance in the past five years. New or recently refurbished radio telescopes utilizing cooled receivers and state-of-the-art digital data acquisition systems are carrying out surveys of the entire sky at a variety of radio frequencies. Targeted searches for millisecond pulsars in point sources identified by the Fermi Gamma-ray Space Telescope have proved phenomenally successful, with over 50 discoveries in the past five years. The current sample of millisecond pulsars now numbers almost 200 and, for the first time in 25 years, outnumbers their counterparts in galactic globular clusters. While many of these searches are motivated to find pulsars which form part of pulsar timing arrays, a wide variety of interesting systems are now being found. Following a brief overview of the millisecond pulsar phenomenon, we describe these searches and present some of the highlights of the new discoveries in the past decade. We conclude with predictions and prospects for ongoing and future surveys. (paper)

  9. Highlights on searches for supersymmetry and exotic models

    International Nuclear Information System (INIS)

    Clerbaux, B.

    2015-01-01

    In this review, we present highlight results from the first three years of LHC running on searches for new physics beyond the Standard Model (BSM). The excellent performance of the LHC machine and detectors has provided a large, high-quality dataset, mainly proton-proton interactions at a centre-of-mass energy of 7 TeV (collected in 2010 and 2011) and 8 TeV (collected in 2012). This allowed the experiments to test the Standard Model (SM) at the highest available energy and to search for new phenomena in a considerably enlarged phase space compared to previous colliders. The present review is organised as follows. Section 2 gives motivations to search for new physics beyond the SM, and a brief description of the main classes of BSM theory candidates is reported in Section 3. Section 4 summarises the characteristics of the 3-year LHC dataset, called in the following the Run 1 dataset. Precise tests of the SM are reported in Section 5. The sections that follow are the core of the review and present a selection of results from the ATLAS and CMS experiments on BSM searches, gathered in four parts: the search for new physics in the scalar sector in Section 6, the search for supersymmetric particles in Section 7, the search for dark matter candidates in Section 8, and a non-exhaustive list of other exotic BSM searches (heavy resonances, excited fermions, leptoquarks and vector-like quarks) in Section 9. Future plans for LHC running are reported in Section 10.

  10. QCD processes and search for supersymmetry at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Schum, Torben

    2012-07-15

    In this thesis, a data-driven method to estimate the number of QCD background events in a multijet search for supersymmetry at the LHC was developed. The method makes use of two models which predict the correlation of two key search variables, the missing transverse momentum and an angular variable, in order to extrapolate from a QCD-dominated control region to the signal region. A good performance of the method was demonstrated by its application to 36 pb^-1 of data, taken by the CMS experiment in 2010, and by the comparison with an alternative method. Comparing the number of data events to a combined background expectation of QCD and data-driven estimates of the electroweak and top background, no statistically significant excess was observed for three pre-defined search regions. Limits were calculated for the (m_0, m_1/2) parameter space of the cMSSM, exceeding previous measurements. The expected sensitivity for further refined search regions was investigated.

  11. QCD processes and search for supersymmetry at the LHC

    International Nuclear Information System (INIS)

    Schum, Torben

    2012-07-01

    In this thesis, a data-driven method to estimate the number of QCD background events in a multijet search for supersymmetry at the LHC was developed. The method makes use of two models which predict the correlation of two key search variables, the missing transverse momentum and an angular variable, in order to extrapolate from a QCD-dominated control region to the signal region. A good performance of the method was demonstrated by its application to 36 pb^-1 of data, taken by the CMS experiment in 2010, and by the comparison with an alternative method. Comparing the number of data events to a combined background expectation of QCD and data-driven estimates of the electroweak and top background, no statistically significant excess was observed for three pre-defined search regions. Limits were calculated for the (m_0, m_1/2) parameter space of the cMSSM, exceeding previous measurements. The expected sensitivity for further refined search regions was investigated.

  12. The space elevator: a new tool for space studies.

    Science.gov (United States)

    Edwards, Bradley C

    2003-06-01

    The objective has been to develop a viable scenario for the construction, deployment and operation of a space elevator using current or near future technology. This effort has been primarily a paper study with several experimental tests of specific systems. Computer simulations, engineering designs, literature studies and inclusion of existing programs have been utilized to produce a design for the first space elevator. The results from this effort illustrate a viable design using current and near-term technology for the construction of the first space elevator. The timeline for possible construction is within the coming decades and estimated costs are less than $10 B. The initial elevator would have a 5 ton/day capacity and operating costs near $100/lb for payloads going to any Earth orbit or traveling to the Moon, Mars, Venus or the asteroids. An operational space elevator would allow for larger and much longer-term biological space studies at selectable gravity levels. The high-capacity and low operational cost of this system would also allow for inexpensive searches for life throughout our solar system and the first tests of environmental engineering. This work is supported by a grant from the NASA Institute for Advanced Concepts (NIAC).

  13. Gamma ray astronomy and search for antimatter in the universe

    International Nuclear Information System (INIS)

    Schoenfelder, V.

    1989-01-01

    Gamma ray astronomy provides a powerful tool for searching for antimatter in the universe; it probably provides the only means to determine whether the universe has baryon symmetry. Presently existing gamma-ray observations can be interpreted without postulating the existence of antimatter. However, the measurements are not precise enough to definitely exclude the possibility of its existence. The search for antimatter belongs to one of the main scientific objectives of the Gamma Ray Observatory GRO of NASA, which will be launched in 1990 by the Space Shuttle. (orig.)

  14. A Suboptimal PTS Algorithm Based on Particle Swarm Optimization Technique for PAPR Reduction in OFDM Systems

    Directory of Open Access Journals (Sweden)

    Ho-Lung Hung

    2008-08-01

    Full Text Available A suboptimal partial transmit sequence (PTS) technique based on a particle swarm optimization (PSO) algorithm is presented for achieving low computational complexity together with a reduction of the peak-to-average power ratio (PAPR) of an orthogonal frequency division multiplexing (OFDM) system. In general, the PTS technique can improve the PAPR statistics of an OFDM system. However, it requires an exhaustive search over all combinations of allowed phase weighting factors, and the search complexity increases exponentially with the number of subblocks. In this paper, we work around this potential computational intractability: the proposed PSO scheme exploits heuristics to search for the optimal combination of phase factors with low complexity. Simulation results show that the new technique can effectively reduce the computational complexity while still achieving a substantial PAPR reduction.

  15. A Suboptimal PTS Algorithm Based on Particle Swarm Optimization Technique for PAPR Reduction in OFDM Systems

    Directory of Open Access Journals (Sweden)

    Lee Shu-Hong

    2008-01-01

    Full Text Available A suboptimal partial transmit sequence (PTS) technique based on a particle swarm optimization (PSO) algorithm is presented for achieving low computational complexity together with a reduction of the peak-to-average power ratio (PAPR) of an orthogonal frequency division multiplexing (OFDM) system. In general, the PTS technique can improve the PAPR statistics of an OFDM system. However, it requires an exhaustive search over all combinations of allowed phase weighting factors, and the search complexity increases exponentially with the number of subblocks. In this paper, we work around this potential computational intractability: the proposed PSO scheme exploits heuristics to search for the optimal combination of phase factors with low complexity. Simulation results show that the new technique can effectively reduce the computational complexity while still achieving a substantial PAPR reduction.
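
    Since both records above describe the same PSO-driven phase-factor search, a minimal sketch may help fix the idea. The toy below runs a standard PSO over continuous phase factors on an adjacently partitioned QPSK/OFDM symbol; the block sizes, PSO constants and the use of continuous (rather than a discrete alphabet of) phase factors are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def pts_signal(X_blocks, phases):
    """Combine frequency-domain subblocks with phase factors, return the time-domain signal."""
    weights = np.exp(1j * phases)                      # one phase factor per subblock
    X = np.sum(weights[:, None] * X_blocks, axis=0)    # weighted sum in the frequency domain
    return np.fft.ifft(X)                              # no oversampling, for brevity

def pso_pts(X_blocks, n_particles=20, n_iter=50, seed=0):
    """Search phase factors that minimise the PAPR with a basic PSO."""
    rng = np.random.default_rng(seed)
    V = X_blocks.shape[0]
    pos = rng.uniform(0, 2 * np.pi, (n_particles, V))  # particle positions = phase vectors
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([papr_db(pts_signal(X_blocks, p)) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    gbest_val = pbest_val.min()
    w, c1, c2 = 0.7, 1.5, 1.5                          # inertia and acceleration coefficients
    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = (pos + vel) % (2 * np.pi)
        vals = np.array([papr_db(pts_signal(X_blocks, p)) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        if vals.min() < gbest_val:
            gbest, gbest_val = pos[vals.argmin()].copy(), vals.min()
    return gbest, gbest_val

# Toy example: a 256-subcarrier QPSK symbol split into 4 adjacent subblocks.
rng = np.random.default_rng(1)
N, V = 256, 4
X = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
X_blocks = np.zeros((V, N), dtype=complex)
for v in range(V):
    X_blocks[v, v * N // V:(v + 1) * N // V] = X[v * N // V:(v + 1) * N // V]

print("original PAPR [dB]:", papr_db(np.fft.ifft(X)))
best_phases, best_papr = pso_pts(X_blocks)
print("PSO-PTS PAPR  [dB]:", best_papr)
```

    Even this crude search typically shaves a couple of dB off the PAPR of the toy symbol relative to the unweighted transmit signal, at far lower cost than enumerating every phase combination.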

  16. The Search Performance Evaluation and Prediction in Exploratory Search

    OpenAIRE

    LIU, FEI

    2016-01-01

    The exploratory search for complex search tasks requires an effective search behavior model to evaluate and predict user search performance. Few studies have investigated the relationship between user search behavior and search performance in exploratory search. This research adopts a mixed approach combining search system development, user search experiment, search query log analysis, and multivariate regression analysis to resolve the knowledge gap. Through this study, it is shown that expl...

  17. Modified Parameters of Harmony Search Algorithm for Better Searching

    Science.gov (United States)

    Farraliza Mansor, Nur; Abal Abas, Zuraida; Samad Shibghatullah, Abdul; Rahman, Ahmad Fadzli Nizam Abdul

    2017-08-01

    The scheduling and rostering problems are treated as integrated because they depend on each other: the output of the scheduling problem is the input to the rostering problem. In this research, the integrated scheduling and rostering problem for bus drivers is defined as maximising the balance of the assignment of tasks in terms of the distribution of shifts and routes. Achieving fairness among drivers is essential because it raises the drivers' level of satisfaction. Existing approaches are still unable to address this fairness problem, so this research proposes a strategy that adopts an amended harmony search algorithm in order to address the fairness issue and thereby raise the level of fairness. The harmony search algorithm is classified as a meta-heuristic algorithm capable of solving hard combinatorial or discrete optimisation problems. In this respect, the three main operators in HS, namely the Harmony Memory Consideration Rate (HMCR), Pitch Adjustment Rate (PAR) and Bandwidth (BW), play a vital role in balancing local exploitation and global exploration. These parameters influence the overall performance of the HS algorithm, and it is therefore crucial to fine-tune them. The contributions of this research are an HMCR parameter based on a step function, and a BW parameter based on the fret-spacing concept of guitars expressed through mathematical formulae. A constant step function model is introduced for the alteration of the HMCR parameter. The experimental results revealed that the proposed approach is superior to the parameter-adaptive harmony search algorithm. In conclusion, the proposed approach managed to generate a fairer roster and was thus capable of maximising the balanced distribution of shifts and routes among drivers, which contributed to lowering illness, incidents, absenteeism and accidents.
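
    For readers unfamiliar with the operators being tuned in this record, below is a minimal sketch of plain harmony search with fixed HMCR, PAR and BW on a continuous test function. The step-function HMCR and fret-spacing-based BW proposed in the record are not reproduced, and all parameter values are illustrative assumptions.

```python
import numpy as np

def harmony_search(f, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=2000, seed=0):
    """Basic harmony search for continuous minimisation with constant HMCR/PAR/BW."""
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    dim = len(bounds)
    hm = rng.uniform(lo, hi, (hms, dim))          # harmony memory
    cost = np.array([f(x) for x in hm])
    for _ in range(iters):
        new = np.empty(dim)
        for j in range(dim):
            if rng.random() < hmcr:               # memory consideration
                new[j] = hm[rng.integers(hms), j]
                if rng.random() < par:            # pitch adjustment within bandwidth BW
                    new[j] += bw * (hi[j] - lo[j]) * rng.uniform(-1, 1)
            else:                                  # random selection
                new[j] = rng.uniform(lo[j], hi[j])
        new = np.clip(new, lo, hi)
        c = f(new)
        worst = cost.argmax()
        if c < cost[worst]:                        # replace the worst harmony
            hm[worst], cost[worst] = new, c
    best = cost.argmin()
    return hm[best], cost[best]

# Example: minimise the sphere function in 5 dimensions.
x, fx = harmony_search(lambda v: float(np.sum(v ** 2)), [(-5, 5)] * 5)
print(x, fx)
```

    The modifications described in the record would replace the constant hmcr and bw above with iteration-dependent schedules (a step function and a fret-spacing-inspired formula, respectively).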

  18. Satisfaction of search experiments in advanced imaging

    Science.gov (United States)

    Berbaum, Kevin S.

    2012-03-01

    The objective of our research is to understand the perception of multiple abnormalities in an imaging examination and to develop strategies for improved diagnosis. We are one of the few laboratories in the world pursuing the goal of reducing detection errors through a better understanding of the underlying perceptual processes involved. Failure to detect an abnormality is the most common class of error in diagnostic imaging and is generally considered the most serious by the medical community. Many of these errors have been attributed to "satisfaction of search," which occurs when a lesion is not reported because discovery of another abnormality has "satisfied" the goal of the search. We have gained some understanding of the mechanisms of satisfaction of search (SOS) in traditional radiographic modalities. Currently, there are few interventions to remedy SOS error. For example, a patient history that prompts specific abnormalities protects the radiologist from missing them even when other abnormalities are present. The knowledge gained from this programmatic research will lead to a reduction of observer error.

  19. SHiP: a new facility to search for heavy neutrinos and study $\

    CERN Document Server

    De Serio, Marilisa

    2016-01-01

    SHiP (Search for Hidden Particles) is a newly designed fixed target facility, proposed at the CERN SPS accelerator, with the aim of complementing searches for New Physics at LHC by searching for light long-lived exotic particles with masses below a few GeV/c². The sensitivity to Heavy Neutrinos will allow for the first time probing a region of the parameter space where Baryogenesis and active neutrino masses and oscillation could also be explained. A dedicated detector, based on OPERA-like bricks, will provide the first observation of the tau anti-neutrino. Moreover, $\

  20. Searches for phenomena beyond the Standard Model at the Large ...

    Indian Academy of Sciences (India)

    A summary of the searches conducted by the ATLAS and CMS experiments based on about 1 fb⁻¹ of data, including the technicolour parameter space mρTC < mπTC + mW [35] and expected dilepton and diphoton events at high invariant mass through virtual graviton exchange.

  1. Logistics Reduction: RFID Enabled Autonomous Logistics Management (REALM)

    Data.gov (United States)

    National Aeronautics and Space Administration — The Advanced Exploration Systems (AES) Logistics Reduction (LR) project Radio-frequency identification (RFID) Enabled Autonomous Logistics Management (REALM) task...

  2. SOLAR-LIKE OSCILLATIONS IN A METAL-POOR GLOBULAR CLUSTER WITH THE HUBBLE SPACE TELESCOPE

    International Nuclear Information System (INIS)

    Stello, Dennis; Gilliland, Ronald L.

    2009-01-01

    We present analyses of variability in the red giant stars in the metal-poor globular cluster NGC 6397, based on data obtained with the Hubble Space Telescope. We use a nonstandard data reduction approach to turn a 23 day observing run, originally aimed at imaging the white dwarf population, into time-series photometry of the cluster's highly saturated red giant stars. With this technique we obtain noise levels in the final power spectra down to 50 parts per million, which allows us to search for low-amplitude solar-like oscillations. We compare the observed excess power seen in the power spectra with estimates of the typical frequency range, frequency spacing, and amplitude from scaling the solar oscillations. We see evidence that the detected variability is consistent with solar-like oscillations in at least one and perhaps up to four stars. With metallicities 2 orders of magnitude lower than those of the Sun, these stars so far present the best evidence of solar-like oscillations in such a low-metallicity environment.

  3. A BLE-Based Pedestrian Navigation System for Car Searching in Indoor Parking Garages.

    Science.gov (United States)

    Wang, Sheng-Shih

    2018-05-05

    The continuous global increase in the number of cars has led to an increase in parking issues, particularly with respect to the search for available parking spaces and finding cars. In this paper, we propose a navigation system for car owners to find their cars in indoor parking garages. The proposed system comprises a car-searching mobile app and a positioning-assisting subsystem. The app guides car owners to their cars based on a “turn-by-turn” navigation strategy, and has the ability to correct the user’s heading orientation. The subsystem uses beacon technology for indoor positioning, supporting self-guidance of the car-searching mobile app. This study also designed a local coordinate system to support the identification of the locations of parking spaces and beacon devices. We used Android as the platform to implement the proposed car-searching mobile app, and used Bytereal HiBeacon devices to implement the proposed positioning-assisting subsystem. We also deployed the system in a parking lot in our campus for testing. The experimental results verified that the proposed system not only works well, but also provides the car owner with the correct route guidance information.
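
    The abstract does not spell out how the beacon signals are turned into a position in the local coordinate system, so the sketch below shows one common approach under assumed parameters: a log-distance path-loss model to convert RSSI to range, followed by least-squares trilateration. The calibration constants, beacon layout and coordinates are hypothetical and are not taken from the paper.

```python
import numpy as np

def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    """Log-distance path-loss model: distance in metres from a BLE RSSI reading.
    tx_power is the calibrated RSSI at 1 m and n the path-loss exponent (both assumed)."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def trilaterate(beacons, distances):
    """Least-squares position estimate from beacon coordinates and ranged distances."""
    (x1, y1), d1 = beacons[0], distances[0]
    a_rows, b_rows = [], []
    for (xi, yi), di in zip(beacons[1:], distances[1:]):
        # Subtracting the first circle equation from each other one gives a linear system.
        a_rows.append([2 * (xi - x1), 2 * (yi - y1)])
        b_rows.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    pos, *_ = np.linalg.lstsq(np.array(a_rows), np.array(b_rows), rcond=None)
    return pos

# Beacons at known parking-garage coordinates (local frame, metres) and simulated RSSI.
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 12.0), (10.0, 12.0)]
true_pos = np.array([4.0, 7.0])
rssi = [-59 - 20 * np.log10(np.linalg.norm(true_pos - np.array(b))) for b in beacons]
dist = [rssi_to_distance(r) for r in rssi]
print("estimated position:", trilaterate(beacons, dist))
```

    A deployed system would additionally smooth the noisy RSSI readings (for example with a moving average or a Kalman filter) before ranging, and would map the estimated coordinates onto the garage's parking-space layout.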

  4. NASA Taxonomies for Searching Problem Reports and FMEAs

    Science.gov (United States)

    Malin, Jane T.; Throop, David R.

    2006-01-01

    Many types of hazard and risk analyses are used during the life cycle of complex systems, including Failure Modes and Effects Analysis (FMEA), Hazard Analysis, Fault Tree and Event Tree Analysis, Probabilistic Risk Assessment, Reliability Analysis and analysis of Problem Reporting and Corrective Action (PRACA) databases. The success of these methods depends on the availability of input data and the analysts' knowledge. Standard nomenclature can increase the reusability of hazard, risk and problem data. When nomenclature in the source texts is not standard, taxonomies with mapping words (sets of rough synonyms) can be combined with semantic search to identify items and tag them with metadata based on a rich standard nomenclature. Semantic search uses word meanings in the context of parsed phrases to find matches. The NASA taxonomies provide the word meanings. Spacecraft taxonomies and ontologies (generalization hierarchies with attributes and relationships, based on the terms' meanings) are being developed for types of subsystems, functions, entities, hazards and failures. The ontologies are broad and general, covering hardware, software and human systems. Semantic search of Space Station texts was used to validate and extend the taxonomies. The taxonomies have also been used to extract system connectivity (interaction) models and functions from requirements text. Now the Reconciler semantic search tool and the taxonomies are being applied to improve search in the Space Shuttle PRACA database, to discover recurring patterns of failure. Usual methods of string search and keyword search fall short because the entries are terse and have numerous shortcuts (irregular abbreviations, nonstandard acronyms, cryptic codes) and modifier words cannot be used in sentence context to refine the search. The limited and fixed FMEA categories associated with the entries do not make the fine distinctions needed in the search. The approach assigns PRACA report titles to problem classes in

  5. Urban green spaces and cancer: a protocol for a scoping review

    Science.gov (United States)

    Lejeune, Mathilde; Gaudel, Marion; Pommier, Jeanine; Faure, Emmanuelle; Heritage, Zoé; Rican, Stéphane; Simos, Jean; Cantoreggi, Nicola Luca; Roué Le Gall, Anne; Cambon, Linda; Regnaux, Jean-Philippe

    2018-01-01

    Introduction Green space in the built environment is an important topic on the health agenda today. Studies have shown that access to green spaces is associated with better mental and physical health, yet green spaces can also be detrimental to health if they are not managed appropriately. Despite the increasing interest in urban green spaces, little research has so far been conducted into the links between green spaces and cancer. Objective The purpose of this scoping review is therefore to map the literature available on the types of relationship between urban green spaces and cancer. Method and analysis We followed the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols 2015 guideline to report the protocol. To conduct this scoping review, we will use a structured search strategy based on controlled vocabulary and relevant key terms related to green space, urban space and cancer. We will search MEDLINE (PubMed), GreenFILE (EBSCOhost), Cumulative Index to Nursing and Allied Health Literature (EBSCOhost) and ScienceDirect as electronic database as well as hand-search publications for grey literature. This review will therefore provide evidence on this current topic, one which could have practical implications for policy-makers involved in choices which are more conducive to healthy living. Ethics and dissemination No primary data will be collected since all data that will be presented in this review are based on published articles and publicly available documents, and therefore ethics committee approval is not a requirement. The findings of this review will be presented at workshops and conferences, and will be submitted for publication in a peer-reviewed journal. PMID:29453298

  6. Enhancing search efficiency by means of a search filter for finding all studies on animal experimentation in PubMed.

    Science.gov (United States)

    Hooijmans, Carlijn R; Tillema, Alice; Leenaars, Marlies; Ritskes-Hoitinga, Merel

    2010-07-01

    Collecting and analysing all available literature before starting an animal experiment is important and it is indispensable when writing a systematic review (SR) of animal research. Writing such review prevents unnecessary duplication of animal studies and thus unnecessary animal use (Reduction). One of the factors currently impeding the production of 'high-quality' SRs in laboratory animal science is the fact that searching for all available literature concerning animal experimentation is rather difficult. In order to diminish these difficulties, we developed a search filter for PubMed to detect all publications concerning animal studies. This filter was compared with the method most frequently used, the PubMed Limit: Animals, and validated further by performing two PubMed topic searches. Our filter performs much better than the PubMed limit: it retrieves, on average, 7% more records. Other important advantages of our filter are that it also finds the most recent records and that it is easy to use. All in all, by using our search filter in PubMed, all available literature concerning animal studies on a specific topic can easily be found and assessed, which will help in increasing the scientific quality and thereby the ethical validity of animal experiments.

  7. Potential of LOFT telescope for the search of dark matter

    CERN Document Server

    Neronov, A; Iakubovskyi, D.; Ruchayskiy, O.

    2014-01-01

    Large Observatory For X-ray Timing (LOFT) is a next generation X-ray telescope selected by the European Space Agency as one of the space mission concepts within the "Cosmic Vision" programme. The Large Area Detector on board LOFT will be a collimator-type telescope with an unprecedentedly large collecting area of about 10 square meters in the energy band between 2 and 100 keV. We demonstrate that LOFT will be a powerful dark matter detector, suitable for the search of the X-ray line emission expected from decays of light dark matter particles in galactic halos. We show that LOFT will have sensitivity for dark matter line search more than an order of magnitude higher than that of all existing X-ray telescopes. In this way, LOFT will be able to provide a new insight into the fundamental problem of the nature of dark matter.

  8. Non-Cartesian MRI scan time reduction through sparse sampling

    NARCIS (Netherlands)

    Wajer, F.T.A.W.

    2001-01-01

    Non-Cartesian MRI Scan-Time Reduction through Sparse Sampling Magnetic resonance imaging (MRI) signals are measured in the Fourier domain, also called k-space. Samples of the MRI signal can not be taken at will, but lie along k-space trajectories determined by the magnetic field gradients. MRI

  9. Discrete Routh reduction

    International Nuclear Information System (INIS)

    Jalnapurkar, Sameer M; Leok, Melvin; Marsden, Jerrold E; West, Matthew

    2006-01-01

    This paper develops the theory of Abelian Routh reduction for discrete mechanical systems and applies it to the variational integration of mechanical systems with Abelian symmetry. The reduction of variational Runge-Kutta discretizations is considered, as well as the extent to which symmetry reduction and discretization commute. These reduced methods allow the direct simulation of dynamical features such as relative equilibria and relative periodic orbits that can be obscured or difficult to identify in the unreduced dynamics. The methods are demonstrated for the dynamics of an Earth orbiting satellite with a non-spherical J₂ correction, as well as the double spherical pendulum. The J₂ problem is interesting because in the unreduced picture, geometric phases inherent in the model and those due to numerical discretization can be hard to distinguish, but this issue does not appear in the reduced algorithm, where one can directly observe interesting dynamical structures in the reduced phase space (the cotangent bundle of shape space), in which the geometric phases have been removed. The main feature of the double spherical pendulum example is that it has a non-trivial magnetic term in its reduced symplectic form. Our method is still efficient as it can directly handle the essential non-canonical nature of the symplectic structure. In contrast, a traditional symplectic method for canonical systems could require repeated coordinate changes if one is invoking Darboux's theorem to transform the symplectic structure into canonical form, thereby incurring additional computational cost. Our method allows one to design reduced symplectic integrators in a natural way, despite the non-canonical nature of the symplectic structure.

  10. Benchmark models, planes, lines and points for future SUSY searches at the LHC

    International Nuclear Information System (INIS)

    AbdusSalam, S.S.; Allanach, B.C.; Dreiner, H.K.

    2012-03-01

    We define benchmark models for SUSY searches at the LHC, including the CMSSM, NUHM, mGMSB, mAMSB, MM-AMSB and p19MSSM, as well as models with R-parity violation and the NMSSM. Within the parameter spaces of these models, we propose benchmark subspaces, including planes, lines and points along them. The planes may be useful for presenting results of the experimental searches in different SUSY scenarios, while the specific benchmark points may serve for more detailed detector performance tests and comparisons. We also describe algorithms for defining suitable benchmark points along the proposed lines in the parameter spaces, and we define a few benchmark points motivated by recent fits to existing experimental data.

  11. Benchmark models, planes, lines and points for future SUSY searches at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    AbdusSalam, S.S. [The Abdus Salam International Centre for Theoretical Physics, Trieste (Italy); Allanach, B.C. [Cambridge Univ. (United Kingdom). Dept. of Applied Mathematics and Theoretical Physics; Dreiner, H.K. [Bonn Univ. (DE). Bethe Center for Theoretical Physics and Physikalisches Inst.] (and others)

    2012-03-15

    We define benchmark models for SUSY searches at the LHC, including the CMSSM, NUHM, mGMSB, mAMSB, MM-AMSB and p19MSSM, as well as models with R-parity violation and the NMSSM. Within the parameter spaces of these models, we propose benchmark subspaces, including planes, lines and points along them. The planes may be useful for presenting results of the experimental searches in different SUSY scenarios, while the specific benchmark points may serve for more detailed detector performance tests and comparisons. We also describe algorithms for defining suitable benchmark points along the proposed lines in the parameter spaces, and we define a few benchmark points motivated by recent fits to existing experimental data.

  12. Benchmark Models, Planes, Lines and Points for Future SUSY Searches at the LHC

    CERN Document Server

    AbdusSalam, S S; Dreiner, H K; Ellis, J; Ellwanger, U; Gunion, J; Heinemeyer, S; Krämer, M; Mangano, M L; Olive, K A; Rogerson, S; Roszkowski, L; Schlaffer, M; Weiglein, G

    2011-01-01

    We define benchmark models for SUSY searches at the LHC, including the CMSSM, NUHM, mGMSB, mAMSB, MM-AMSB and p19MSSM, as well as models with R-parity violation and the NMSSM. Within the parameter spaces of these models, we propose benchmark subspaces, including planes, lines and points along them. The planes may be useful for presenting results of the experimental searches in different SUSY scenarios, while the specific benchmark points may serve for more detailed detector performance tests and comparisons. We also describe algorithms for defining suitable benchmark points along the proposed lines in the parameter spaces, and we define a few benchmark points motivated by recent fits to existing experimental data.

  13. Update on Risk Reduction Activities for a Liquid Advanced Booster for NASA's Space Launch System

    Science.gov (United States)

    Crocker, Andrew M.; Doering, Kimberly B; Meadows, Robert G.; Lariviere, Brian W.; Graham, Jerry B.

    2015-01-01

    The stated goals of NASA's Research Announcement for the Space Launch System (SLS) Advanced Booster Engineering Demonstration and/or Risk Reduction (ABEDRR) are to reduce risks leading to an affordable Advanced Booster that meets the evolved capabilities of SLS; and enable competition by mitigating targeted Advanced Booster risks to enhance SLS affordability. Dynetics, Inc. and Aerojet Rocketdyne (AR) formed a team to offer a wide-ranging set of risk reduction activities and full-scale, system-level demonstrations that support NASA's ABEDRR goals. For NASA's SLS ABEDRR procurement, Dynetics and AR formed a team to offer a series of full-scale risk mitigation hardware demonstrations for an affordable booster approach that meets the evolved capabilities of the SLS. To establish a basis for the risk reduction activities, the Dynetics Team developed a booster design that takes advantage of the flight-proven Apollo-Saturn F-1. Using NASA's vehicle assumptions for the SLS Block 2, a two-engine, F-1-based booster design delivers 150 mT (331 klbm) payload to LEO, 20 mT (44 klbm) above NASA's requirements. This enables a low-cost, robust approach to structural design. During the ABEDRR effort, the Dynetics Team has modified proven Apollo-Saturn components and subsystems to improve affordability and reliability (e.g., reduce parts counts, touch labor, or use lower cost manufacturing processes and materials). The team has built hardware to validate production costs and completed tests to demonstrate it can meet performance requirements. State-of-the-art manufacturing and processing techniques have been applied to the heritage F-1, resulting in a low recurring cost engine while retaining the benefits of Apollo-era experience. NASA test facilities have been used to perform low-cost risk-reduction engine testing. In early 2014, NASA and the Dynetics Team agreed to move additional large liquid oxygen/kerosene engine work under Dynetics' ABEDRR contract. Also led by AR, the

  14. Constraining Dark Matter Interactions with Pseudoscalar and Scalar Mediators Using Collider Searches for Multijets plus Missing Transverse Energy.

    Science.gov (United States)

    Buchmueller, Oliver; Malik, Sarah A; McCabe, Christopher; Penning, Bjoern

    2015-10-30

    The monojet search, looking for events involving missing transverse energy (E_{T}) plus one or two jets, is the most prominent collider dark matter search. We show that multijet searches, which look for E_{T} plus two or more jets, are significantly more sensitive than the monojet search for pseudoscalar- and scalar-mediated interactions. We demonstrate this in the context of a simplified model with a pseudoscalar interaction that explains the excess in GeV energy gamma rays observed by the Fermi Large Area Telescope. We show that multijet searches already constrain a pseudoscalar interpretation of the excess in much of the parameter space where the mass of the mediator M_{A} is more than twice the dark matter mass m_{DM}. With the forthcoming run of the Large Hadron Collider at higher energies, the remaining regions of the parameter space where M_{A}>2m_{DM} will be fully explored. Furthermore, we highlight the importance of complementing the monojet final state with multijet final states to maximize the sensitivity of the search for the production of dark matter at colliders.

  15. The influence of cognitive load on spatial search performance.

    Science.gov (United States)

    Longstaffe, Kate A; Hood, Bruce M; Gilchrist, Iain D

    2014-01-01

    During search, executive function enables individuals to direct attention to potential targets, remember locations visited, and inhibit distracting information. In the present study, we investigated these executive processes in large-scale search. In our tasks, participants searched a room containing an array of illuminated locations embedded in the floor. The participants' task was to press the switches at the illuminated locations on the floor so as to locate a target that changed color when pressed. The perceptual salience of the search locations was manipulated by having some locations flashing and some static. Participants were more likely to search at flashing locations, even when they were explicitly informed that the target was equally likely to be at any location. In large-scale search, attention was captured by the perceptual salience of the flashing lights, leading to a bias to explore these targets. Despite this failure of inhibition, participants were able to restrict returns to previously visited locations, a measure of spatial memory performance. Participants were more able to inhibit exploration to flashing locations when they were not required to remember which locations had previously been visited. A concurrent digit-span memory task further disrupted inhibition during search, as did a concurrent auditory attention task. These experiments extend a load theory of attention to large-scale search, which relies on egocentric representations of space. High cognitive load on working memory leads to increased distractor interference, providing evidence for distinct roles for the executive subprocesses of memory and inhibition during large-scale search.

  16. Stardust@home: An Interactive Internet-based Search for Interstellar Dust

    Science.gov (United States)

    Mendez, B. J.; Westphal, A. J.; Butterworth, A. L.; Craig, N.

    2006-12-01

    On January 15, 2006, NASA's Stardust mission returned to Earth after nearly seven years in interplanetary space. During its journey, Stardust encountered comet Wild 2, collecting dust particles from it in a special material called aerogel. At two other times in the mission, aerogel collectors were also opened to collect interstellar dust. The Stardust Interstellar Dust Collector is being scanned by an automated microscope at the Johnson Space Center. There are approximately 700,000 fields of view needed to cover the entire collector, but we expect only a few dozen total grains of interstellar dust were captured within it. Finding these particles is a daunting task. We have recruited many thousands of volunteers from the public to aid in the search for these precious pieces of space dust trapped in the collectors. We call the project Stardust@home. Through Stardust@home, volunteers from the public search fields of view from the Stardust aerogel collector using a web-based Virtual Microscope. Volunteers who discover interstellar dust particles have the privilege of naming them. The interest and response to this project has been extraordinary. Many people from all walks of life are very excited about space science and eager to volunteer their time to contribute to a real research project such as this. We will discuss the progress of the project and the education and outreach activities being carried out for it.

  17. Return and profitability of space programs. Information - the main product of flights in space

    Science.gov (United States)

    Nikolova, Irena

    The basic branch providing global information as a product on the market is astronautics, and in particular aero and space flight. Nowadays, economic categories such as profitability, return and self-financing are applied to space information. Activity in the space-information services market niche offers an opportunity to realize high economic efficiency and profitability. The present report aims at examining the possibilities for return and profitability of space programs. Specialists in economics from different countries strive to define the economic effect of implementing space technologies in technical branches on Earth. Still, the priorities here remain with governments, and insufficient market organization and orientation is apparent. Attracting private investors and searching for new financing mechanisms are the factors for increasing the economic efficiency and the return of capital invested in this sphere. Recovering the invested funds is an economically justified goal and a motive for expanding the efforts and directions for implementing the achievements of astronautics in the branches of the economy on Earth.

  18. Short-term economic environmental hydrothermal scheduling using improved multi-objective gravitational search algorithm

    International Nuclear Information System (INIS)

    Li, Chunlong; Zhou, Jianzhong; Lu, Peng; Wang, Chao

    2015-01-01

    Highlights: • Improved multi-objective gravitational search algorithm. • An elite archive set is proposed to guide evolutionary process. • Neighborhood searching mechanism to improve local search ability. • Adopt chaotic mutation for avoiding premature convergence. • Propose feasible space method to handle hydro plant constraints. - Abstract: With growing concerns about energy and environment, short-term economic environmental hydrothermal scheduling (SEEHS) plays a more and more important role in power systems. Because of the two objectives and various constraints, SEEHS is a complex multi-objective optimization problem (MOOP). In order to solve the problem, we propose an improved multi-objective gravitational search algorithm (IMOGSA) in this paper. In IMOGSA, the mass of the agent is redefined by multiple objectives to make it suitable for MOOP. An elite archive set is proposed to keep Pareto optimal solutions and guide evolutionary process. For balancing exploration and exploitation, a neighborhood searching mechanism is presented to cooperate with chaotic mutation. Moreover, a novel method based on feasible space is proposed to handle hydro plant constraints during SEEHS, and a violation adjustment method is adopted to handle the power balance constraint. For verifying its effectiveness, the proposed IMOGSA is applied to a hydrothermal system in two different case studies. The simulation results show that IMOGSA has a competitive performance in SEEHS when compared with other established algorithms.
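
    As a reference point for the modifications listed in the highlights, here is a minimal sketch of the baseline, single-objective gravitational search algorithm. The elite archive, neighborhood search, chaotic mutation and constraint-handling extensions of IMOGSA are not included, and all constants are illustrative assumptions.

```python
import numpy as np

def gsa(f, bounds, n_agents=30, iters=300, g0=100.0, alpha=20.0, seed=0):
    """Basic (single-objective) gravitational search algorithm for minimisation."""
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    dim = len(bounds)
    x = rng.uniform(lo, hi, (n_agents, dim))
    v = np.zeros_like(x)
    best_x, best_f = None, np.inf
    for t in range(iters):
        fit = np.array([f(xi) for xi in x])
        if fit.min() < best_f:
            best_f, best_x = fit.min(), x[fit.argmin()].copy()
        # Masses from fitness: the best agent gets the largest mass.
        worst, best = fit.max(), fit.min()
        m = (worst - fit) / (worst - best + 1e-12)
        M = m / (m.sum() + 1e-12)
        G = g0 * np.exp(-alpha * t / iters)              # gravitational "constant" decays over time
        kbest = max(1, int(n_agents * (1 - t / iters)))  # only the heaviest agents attract
        heavy = np.argsort(M)[::-1][:kbest]
        acc = np.zeros_like(x)
        for i in range(n_agents):
            for j in heavy:
                if j == i:
                    continue
                diff = x[j] - x[i]
                dist = np.linalg.norm(diff) + 1e-12
                acc[i] += rng.random() * G * M[j] * diff / dist
        v = rng.random(x.shape) * v + acc
        x = np.clip(x + v, lo, hi)
    return best_x, best_f

# Example: minimise the sphere function in 5 dimensions.
x, fx = gsa(lambda v: float(np.sum(v ** 2)), [(-10, 10)] * 5)
print(x, fx)
```

    IMOGSA builds on this skeleton by redefining the agent masses from the multiple objectives and steering the population with a Pareto-optimal elite archive, as stated in the abstract.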

  19. Finding Chemical Structures Corresponding to a Set of Coordinates in Chemical Descriptor Space.

    Science.gov (United States)

    Miyao, Tomoyuki; Funatsu, Kimito

    2017-08-01

    When chemical structures are searched based on descriptor values, or descriptors are interpreted based on values, it is important that corresponding chemical structures actually exist. In order to consider the existence of chemical structures located in a specific region in the chemical space, we propose to search for them inside training data domains (TDDs), which are dense areas of a training dataset in the chemical space. We investigated TDDs' features using diverse and local datasets, assuming that GDB11 is the chemical universe. These two analyses showed that considering TDDs gives a higher chance of finding chemical structures than a random search-based method, and that novel chemical structures actually exist inside TDDs. In addition to those findings, we tested the hypothesis that chemical structures were distributed in limited areas of chemical space. This hypothesis was confirmed by the fact that distances among chemical structures in several descriptor spaces were much shorter than those among randomly generated coordinates in the training data range. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. SearchResultFinder: federated search made easy

    NARCIS (Netherlands)

    Trieschnigg, Rudolf Berend; Tjin-Kam-Jet, Kien; Hiemstra, Djoerd

    Building a federated search engine based on a large number of existing web search engines is a challenge: implementing the programming interface (API) for each search engine is an exacting and time-consuming job. In this demonstration we present SearchResultFinder, a browser plugin which speeds up

  1. High-throughput computational search for strengthening precipitates in alloys

    International Nuclear Information System (INIS)

    Kirklin, S.; Saal, James E.; Hegde, Vinay I.; Wolverton, C.

    2016-01-01

    The search for high-strength alloys and precipitation hardened systems has largely been accomplished through Edisonian trial and error experimentation. Here, we present a novel strategy using high-throughput computational approaches to search for promising precipitate/alloy systems. We perform density functional theory (DFT) calculations of an extremely large space of ∼200,000 potential compounds in search of effective strengthening precipitates for a variety of different alloy matrices, e.g., Fe, Al, Mg, Ni, Co, and Ti. Our search strategy involves screening phases that are likely to produce coherent precipitates (based on small lattice mismatch) and are composed of relatively common alloying elements. When combined with the Open Quantum Materials Database (OQMD), we can computationally screen for precipitates that either have a stable two-phase equilibrium with the host matrix, or are likely to precipitate as metastable phases. Our search produces (for the structure types considered) nearly all currently known high-strength precipitates in a variety of fcc, bcc, and hcp matrices, thus giving us confidence in the strategy. In addition, we predict a number of new, currently-unknown precipitate systems that should be explored experimentally as promising high-strength alloy chemistries.
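
    The coherency screen described above can be illustrated in a few lines of code. The sketch below checks a handful of hypothetical candidate phases against fcc Ni and Al hosts using approximate literature lattice parameters; it stands in for only the first filter of the published workflow (the stability screen against the OQMD is not shown), and the numbers are illustrative, not the study's data.

```python
# Coherency screen: keep candidate phases whose lattice mismatch with the host
# matrix is below a threshold (here ~5%), as a proxy for coherent precipitation.
candidates = {
    "Ni3Al (L1_2)": 3.57,   # approximate lattice parameters in angstroms
    "NbC (B1)":     4.47,
    "Al3Sc (L1_2)": 4.10,
    "TiB2 (C32)":   3.03,   # a-axis only; a real screen would treat non-cubic cells properly
}
hosts = {"Ni (fcc)": 3.52, "Al (fcc)": 4.05}

threshold = 0.05
for matrix, a_host in hosts.items():
    print(f"host {matrix}:")
    for name, a in candidates.items():
        mismatch = abs(a - a_host) / a_host
        flag = "keep" if mismatch < threshold else "drop"
        print(f"  {name:14s} mismatch = {mismatch:6.3f}  -> {flag}")
```

    In the actual high-throughput study this kind of geometric filter is applied across roughly 200,000 DFT-computed compounds before the thermodynamic screening step.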

  2. Fast Multi-blind Modification Search through Tandem Mass Spectrometry*

    Science.gov (United States)

    Na, Seungjin; Bandeira, Nuno; Paek, Eunok

    2012-01-01

    With great biological interest in post-translational modifications (PTMs), various approaches have been introduced to identify PTMs using MS/MS. Recent developments for PTM identification have focused on an unrestrictive approach that searches MS/MS spectra for all known and possibly even unknown types of PTMs at once. However, the resulting expanded search space requires much longer search time and also increases the number of false positives (incorrect identifications) and false negatives (missed true identifications), thus creating a bottleneck in high throughput analysis. Here we introduce MODa, a novel “multi-blind” spectral alignment algorithm that allows for fast unrestrictive PTM searches with no limitation on the number of modifications per peptide while featuring over an order of magnitude speedup in relation to existing approaches. We demonstrate the sensitivity of MODa on human shotgun proteomics data where it reveals multiple mutations, a wide range of modifications (including glycosylation), and evidence for several putative novel modifications. Based on the reported findings, we argue that the efficiency and sensitivity of MODa make it the first unrestrictive search tool with the potential to fully replace conventional restrictive identification of proteomics mass spectrometry data. PMID:22186716

  3. Optimized blind gamma-ray pulsar searches at fixed computing budget

    International Nuclear Information System (INIS)

    Pletsch, Holger J.; Clark, Colin J.

    2014-01-01

    The sensitivity of blind gamma-ray pulsar searches in multiple years' worth of photon data, as from the Fermi LAT, is primarily limited by the finite computational resources available. Addressing this 'needle in a haystack' problem, here we present methods for optimizing blind searches to achieve the highest sensitivity at fixed computing cost. For both coherent and semicoherent methods, we consider their statistical properties and study their search sensitivity under computational constraints. The results validate a multistage strategy, where the first stage scans the entire parameter space using an efficient semicoherent method and promising candidates are then refined through a fully coherent analysis. We also find that for the first stage of a blind search, incoherent harmonic summing of powers is not worthwhile at fixed computing cost for typical gamma-ray pulsars. Further enhancing sensitivity, we present efficiency-improved interpolation techniques for the semicoherent search stage. Via realistic simulations we demonstrate that overall these optimizations can significantly lower the minimum detectable pulsed fraction by almost 50% at the same computational expense.

  4. Combining of Direct Search and Signal-to-Noise Ratio for economic dispatch optimization

    International Nuclear Information System (INIS)

    Lin, Whei-Min; Gow, Hong-Jey; Tsai, Ming-Tang

    2011-01-01

    This paper integrated the ideas of Direct Search and Signal-to-Noise Ratio (SNR) to develop a Novel Direct Search (NDS) method for solving the non-convex economic dispatch problems. NDS consists of three stages: Direct Search (DS), Global SNR (GSNR) and Marginal Compensation (MC) stages. DS provides a basic solution. GSNR searches the point with optimization strategy. MC fulfills the power balance requirement. With NDS, the infinite solution space becomes finite. Furthermore, a same optimum solution can be repeatedly reached. Effectiveness of NDS is demonstrated with three examples and the solutions were compared with previously published results. Test results show that the proposed method is simple, robust, and more effective than many other previously developed algorithms.

  5. Search for Sterile Neutrinos with the MINOS Long-Baseline Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Timmons, Ashley Michael [Univ. of Manchester (United Kingdom)

    2016-01-01

    This thesis will present a search for sterile neutrinos using data taken with the MINOS experiment between 2005 and 2012. MINOS is a two-detector on-axis experiment based at Fermilab. The NuMI neutrino beam encounters the MINOS Near Detector 1 km downstream of the neutrino-production target before traveling a further 734 km through the Earth's crust, to reach the Far Detector located at the Soudan Underground Laboratory in Northern Minnesota. By searching for oscillations driven by a large mass splitting, MINOS is sensitive to the existence of sterile neutrinos through looking for any energy-dependent perturbations using a charged-current sample, as well as looking at any relative deficit in neutral current events between the Far and Near Detectors. This thesis will discuss the novel analysis that enabled a search for sterile neutrinos covering five orders of magnitude in the mass splitting and setting a limit in previously unexplored regions of the parameter space $\left\{\Delta m^{2}_{41},\sin^2\theta_{24}\right\}$, where a 3+1-flavour phenomenological model was used to extract parameter limits. The results presented in this thesis are sensitive to the sterile neutrino parameter space suggested by the LSND and MiniBooNE experiments.

  6. Effect of the antimicrobial photodynamic therapy on microorganism reduction in deep caries lesions: a systematic review and meta-analysis

    Science.gov (United States)

    Ornellas, Pâmela Oliveira; Antunes, Leonardo Santos; Fontes, Karla Bianca Fernandes da Costa; Póvoa, Helvécio Cardoso Corrêa; Küchler, Erika Calvano; Iorio, Natalia Lopes Pontes; Antunes, Lívia Azeredo Alves

    2016-09-01

    This study aimed to perform a systematic review to assess the effectiveness of antimicrobial photodynamic therapy (aPDT) in the reduction of microorganisms in deep carious lesions. An electronic search was conducted in Pubmed, Web of Science, Scopus, Lilacs, and Cochrane Library, followed by a manual search. The MeSH terms, MeSH synonyms, related terms, and free terms were used in the search. As eligibility criteria, only clinical studies were included. Initially, 227 articles were identified in the electronic search, and 152 studies remained after analysis and exclusion of the duplicated studies; 6 remained after application of the eligibility criteria; and 3 additional studies were found in the manual search. After access to the full articles, three were excluded, leaving six for evaluation by the criteria of the Cochrane Collaboration's tool for assessing risk of bias. Of these, five had some risk of punctuated bias. All results from the selected studies showed a significant reduction of microorganisms in deep carious lesions for both primary and permanent teeth. The meta-analysis demonstrated a significant reduction in microorganism counts in all analyses (p<0.00001). Based on these findings, there is scientific evidence emphasizing the effectiveness of aPDT in reducing microorganisms in deep carious lesions.

  7. Other Earths: Search for Life and the Constant Curvature

    Directory of Open Access Journals (Sweden)

    Khoshyaran M. M.

    2015-07-01

    Full Text Available The objective of this paper is to propose a search methodology for finding other exactly similar earth like planets (or sister earths. The theory is based on space consisting of Riemann curves or highways. A mathematical model based on constant curvature, a moving frame bundle, and gravitational dynamics is introduced.

  8. A search for spectral lines in gamma-ray bursts using TGRS

    International Nuclear Information System (INIS)

    Kurczynski, P.; Palmer, D.; Seifert, H.; Teegarden, B. J.; Gehrels, N.; Cline, T. L.; Ramaty, R.; Hurley, K.; Madden, N. W.; Pehl, R. H.

    1998-01-01

    We present the results of an ongoing search for narrow spectral lines in gamma-ray burst data. TGRS, the Transient Gamma-Ray Spectrometer aboard the Wind satellite, is a high energy-resolution Ge device. Thus it is uniquely situated among the array of space-based, burst-sensitive instruments to look for line features in gamma-ray burst spectra. Our search strategy adopts a two-tiered approach. An automated 'quick look' scan searches spectra for statistically significant deviations from the continuum. We analyzed all possible time accumulations of spectra as well as individual spectra for each burst. Follow-up analysis of potential line candidates uses model fitting with F-test and χ² tests for statistical significance.

  9. LETTER TO THE EDITOR: Exhaustive search for low-autocorrelation binary sequences

    Science.gov (United States)

    Mertens, S.

    1996-09-01

    Binary sequences with low autocorrelations are important in communication engineering and in statistical mechanics as ground states of the Bernasconi model. Computer searches are the main tool in the construction of such sequences. Owing to the exponential size 2^N of the configuration space, exhaustive searches are limited to short sequences. We discuss an exhaustive search algorithm and apply it to compile a table of exact ground states of the Bernasconi model up to N = 48. The data suggest F > 9 for the optimal merit factor in the limit N → ∞.
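
    To make the objects in this letter concrete, the sketch below brute-forces the 2^N configuration space for small N, computing the merit factor F = N² / (2 Σ_k C_k²) directly. It is exhaustive but naive, and does not implement the faster search strategy the letter describes.

```python
from itertools import product

def merit_factor(seq):
    """Merit factor F = N^2 / (2 * sum_k C_k^2) of a +/-1 sequence."""
    n = len(seq)
    e = sum(sum(seq[i] * seq[i + k] for i in range(n - k)) ** 2 for k in range(1, n))
    return n * n / (2 * e)

def best_sequence(n):
    """Brute-force search over all 2^n binary sequences (feasible only for small n)."""
    best, best_f = None, 0.0
    for bits in product((-1, 1), repeat=n):
        f = merit_factor(bits)
        if f > best_f:
            best, best_f = bits, f
    return best, best_f

for n in range(3, 14):
    seq, f = best_sequence(n)
    print(n, round(f, 3), seq)
```

    At N = 13 this recovers the known optimum F ≈ 14.08 (the Barker sequence); far beyond that, only smarter exhaustive strategies such as the one discussed in the letter remain feasible.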

  10. Efficiently enclosing the compact binary parameter space by singular-value decomposition

    International Nuclear Information System (INIS)

    Cannon, Kipp; Hanna, Chad; Keppel, Drew

    2011-01-01

    Gravitational-wave searches for the merger of compact binaries use matched filtering as the method of detecting signals and estimating parameters. Such searches construct a fine mesh of filters covering a signal parameter space at high density. Previously it has been shown that singular-value decomposition can reduce the effective number of filters required to search the data. Here we study how the basis provided by the singular-value decomposition changes dimension as a function of template-bank density. We will demonstrate that it is sufficient to use the basis provided by the singular-value decomposition of a low-density bank to accurately reconstruct arbitrary points within the boundaries of the template bank. Since this technique is purely numerical, it may have applications to interpolating the space of numerical relativity waveforms.
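
    A toy numerical example may help illustrate the reconstruction property described above. The sketch below builds a deliberately simple "template bank" of damped sinusoids (standing in for the compact-binary waveforms), takes its SVD, and checks how well templates at parameter values between the grid points are reproduced by the truncated basis. The waveform family, grid and truncation threshold are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Toy "template bank": normalised damped sinusoids on a coarse frequency grid.
t = np.linspace(0, 1, 512)
def template(freq):
    h = np.sin(2 * np.pi * freq * t) * np.exp(-2 * t)
    return h / np.linalg.norm(h)

coarse_freqs = np.linspace(20, 40, 60)                 # low-density bank
bank = np.array([template(f) for f in coarse_freqs])

# SVD of the bank; keep enough basis vectors to capture almost all the "energy".
U, s, Vt = np.linalg.svd(bank, full_matrices=False)
k = np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.99999) + 1
basis = Vt[:k]
print(f"{len(coarse_freqs)} templates reduced to {k} basis vectors")

# Reconstruct templates at frequencies *between* the grid points and check the match.
for f in [23.17, 31.62, 38.9]:
    h = template(f)
    h_rec = basis.T @ (basis @ h)                      # projection onto the reduced basis
    match = np.dot(h, h_rec) / np.linalg.norm(h_rec)
    print(f"f = {f:5.2f}: match = {match:.6f}")
```

    The closer the match is to one, the less signal-to-noise ratio is lost by filtering with the reduced basis instead of the full, densely spaced bank.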

  11. Electroweak SUSY production searches at ATLAS and CMS

    CERN Document Server

    Flowerdew, M; The ATLAS collaboration

    2014-01-01

    The discovery of weak-scale supersymmetric (SUSY) particles is one of the primary goals of the Large Hadron Collider experiments. Depending on the mechanism of SUSY breaking, it could be that strongly interacting squarks and gluinos are too massive to produce at the LHC. In this case, the primary SUSY production mode is of charginos, neutralinos and sleptons, mediated by electroweak interactions. However, the experimental signatures for discovery vary widely, depending on the mass hierarchy, SUSY particle mixing parameters and conservation/violation of R-parity, necessitating a large and complex suite of experimental search strategies. These strategies include searching for events with multiple charged leptons, photons, reconstructed higgs bosons or new long-lived particles. In this presentation, the latest ATLAS and CMS search results in these channels are presented, based mainly on $20~$fb$^{-1}$ of $pp$ collisions at $\\sqrt{s} = 8~$TeV collected in 2012. The resulting constraints on the parameter spaces of...

  12. EW SUSY production searches at ATLAS and CMS

    CERN Document Server

    Flowerdew, MJ; The ATLAS collaboration

    2014-01-01

    The discovery of weak-scale supersymmetric (SUSY) particles is one of the primary goals of the Large Hadron Collider experiments. Depending on the mechanism of SUSY breaking, it could be that strongly interacting squarks and gluinos are too massive to produce at the LHC. In this case, the primary SUSY production mode is of charginos, neutralinos and sleptons, mediated by electroweak interactions. However, the experimental signatures for discovery vary widely, depending on the mass hierarchy, SUSY particle mixing parameters and conservation/violation of R-parity, necessitating a large and complex suite of experimental search strategies. These strategies include searching for events with multiple charged leptons, photons, reconstructed higgs bosons or new long-lived particles. In this presentation, the latest ATLAS and CMS search results in these channels are presented, based mainly on 20 fb$^{-1}$ of pp collisions at $\\sqrt{s} = 8$ TeV collected in 2012. The resulting constraints on the parameter spaces of var...

  13. Simplified Models for LHC New Physics Searches

    International Nuclear Information System (INIS)

    Alves, Daniele; Arkani-Hamed, Nima; Arora, Sanjay; Bai, Yang; Baumgart, Matthew; Berger, Joshua; Butler, Bart; Chang, Spencer; Cheng, Hsin-Chia; Cheung, Clifford; Chivukula, R. Sekhar; Cho, Won Sang; Cotta, Randy; D'Alfonso, Mariarosaria; El Hedri, Sonia; Essig, Rouven; Fitzpatrick, Liam; Fox, Patrick; Franceschini, Roberto

    2012-01-01

    This document proposes a collection of simplified models relevant to the design of new-physics searches at the LHC and the characterization of their results. Both ATLAS and CMS have already presented some results in terms of simplified models, and we encourage them to continue and expand this effort, which supplements both signature-based results and benchmark model interpretations. A simplified model is defined by an effective Lagrangian describing the interactions of a small number of new particles. Simplified models can equally well be described by a small number of masses and cross-sections. These parameters are directly related to collider physics observables, making simplified models a particularly effective framework for evaluating searches and a useful starting point for characterizing positive signals of new physics. This document serves as an official summary of the results from the 'Topologies for Early LHC Searches' workshop, held at SLAC in September of 2010, the purpose of which was to develop a set of representative models that can be used to cover all relevant phase space in experimental searches. Particular emphasis is placed on searches relevant for the first ∼50-500 pb⁻¹ of data and those motivated by supersymmetric models. This note largely summarizes material posted at http://lhcnewphysics.org/, which includes simplified model definitions, Monte Carlo material, and supporting contacts within the theory community. We also comment on future developments that may be useful as more data is gathered and analyzed by the experiments.

  14. Simplified Models for LHC New Physics Searches

    Energy Technology Data Exchange (ETDEWEB)

    Alves, Daniele; /SLAC; Arkani-Hamed, Nima; /Princeton, Inst. Advanced Study; Arora, Sanjay; /Rutgers U., Piscataway; Bai, Yang; /SLAC; Baumgart, Matthew; /Johns Hopkins U.; Berger, Joshua; /Cornell U., Phys. Dept.; Buckley, Matthew; /Fermilab; Butler, Bart; /SLAC; Chang, Spencer; /Oregon U. /UC, Davis; Cheng, Hsin-Chia; /UC, Davis; Cheung, Clifford; /UC, Berkeley; Chivukula, R.Sekhar; /Michigan State U.; Cho, Won Sang; /Tokyo U.; Cotta, Randy; /SLAC; D' Alfonso, Mariarosaria; /UC, Santa Barbara; El Hedri, Sonia; /SLAC; Essig, Rouven, (ed.); /SLAC; Evans, Jared A.; /UC, Davis; Fitzpatrick, Liam; /Boston U.; Fox, Patrick; /Fermilab; Franceschini, Roberto; /LPHE, Lausanne /Pittsburgh U. /Argonne /Northwestern U. /Rutgers U., Piscataway /Rutgers U., Piscataway /Carleton U. /CERN /UC, Davis /Wisconsin U., Madison /SLAC /SLAC /SLAC /Rutgers U., Piscataway /Syracuse U. /SLAC /SLAC /Boston U. /Rutgers U., Piscataway /Seoul Natl. U. /Tohoku U. /UC, Santa Barbara /Korea Inst. Advanced Study, Seoul /Harvard U., Phys. Dept. /Michigan U. /Wisconsin U., Madison /Princeton U. /UC, Santa Barbara /Wisconsin U., Madison /Michigan U. /UC, Davis /SUNY, Stony Brook /TRIUMF; /more authors..

    2012-06-01

    This document proposes a collection of simplified models relevant to the design of new-physics searches at the LHC and the characterization of their results. Both ATLAS and CMS have already presented some results in terms of simplified models, and we encourage them to continue and expand this effort, which supplements both signature-based results and benchmark model interpretations. A simplified model is defined by an effective Lagrangian describing the interactions of a small number of new particles. Simplified models can equally well be described by a small number of masses and cross-sections. These parameters are directly related to collider physics observables, making simplified models a particularly effective framework for evaluating searches and a useful starting point for characterizing positive signals of new physics. This document serves as an official summary of the results from the 'Topologies for Early LHC Searches' workshop, held at SLAC in September of 2010, the purpose of which was to develop a set of representative models that can be used to cover all relevant phase space in experimental searches. Particular emphasis is placed on searches relevant for the first ∼50-500 pb⁻¹ of data and those motivated by supersymmetric models. This note largely summarizes material posted at http://lhcnewphysics.org/, which includes simplified model definitions, Monte Carlo material, and supporting contacts within the theory community. We also comment on future developments that may be useful as more data is gathered and analyzed by the experiments.

  15. Multispecies Coevolution Particle Swarm Optimization Based on Previous Search History

    Directory of Open Access Journals (Sweden)

    Danping Wang

    2017-01-01

    Full Text Available A hybrid coevolution particle swarm optimization algorithm with a dynamic multispecies strategy based on K-means clustering and a nonrevisit strategy based on a Binary Space Partitioning fitness tree (called MCPSO-PSH) is proposed. The previous search history, memorized in the Binary Space Partitioning fitness tree, can effectively restrain the individuals’ revisit phenomenon. The whole population is partitioned into several subspecies and cooperative coevolution is realized by an information communication mechanism between subspecies, which can enhance the global search ability of particles and avoid premature convergence to a local optimum. To demonstrate the power of the method, comparisons between the proposed algorithm and state-of-the-art algorithms are carried out on 10 basic benchmark functions (10-dimensional and 30-dimensional), 10 CEC2005 benchmark functions (30-dimensional), and a real-world problem (multilevel image segmentation). Experimental results show that MCPSO-PSH displays a competitive performance compared to the other swarm-based or evolutionary algorithms in terms of solution accuracy and statistical tests.

  16. Space-based visual attention: a marker of immature selective attention in toddlers?

    Science.gov (United States)

    Rivière, James; Brisson, Julie

    2014-11-01

    Various studies suggested that attentional difficulties cause toddlers' failure in some spatial search tasks. However, attention is not a unitary construct and this study investigated two attentional mechanisms: location selection (space-based attention) and object selection (object-based attention). We investigated how toddlers' attention is distributed in the visual field during a manual search task for objects moving out of sight, namely the moving boxes task. Results show that 2.5-year-olds who failed this task allocated more attention to the location of the relevant object than to the object itself. These findings suggest that in some manual search tasks the primacy of space-based attention over object-based attention could be a marker of immature selective attention in toddlers. © 2014 Wiley Periodicals, Inc.

  17. 'Sciencenet'--towards a global search and share engine for all scientific knowledge.

    Science.gov (United States)

    Lütjohann, Dominic S; Shah, Asmi H; Christen, Michael P; Richter, Florian; Knese, Karsten; Liebel, Urban

    2011-06-15

    Modern biological experiments create vast amounts of data which are geographically distributed. These datasets consist of petabytes of raw data and billions of documents. Yet to the best of our knowledge, a search engine technology that searches and cross-links all different data types in life sciences does not exist. We have developed a prototype distributed scientific search engine technology, 'Sciencenet', which facilitates rapid searching over this large data space. By 'bringing the search engine to the data', we do not require server farms. This platform also allows users to contribute to the search index and publish their large-scale data to support e-Science. Furthermore, a community-driven method guarantees that only scientific content is crawled and presented. Our peer-to-peer approach is sufficiently scalable for the science web without performance or capacity tradeoff. The free to use search portal web page and the downloadable client are accessible at: http://sciencenet.kit.edu. The web portal for index administration is implemented in ASP.NET, the 'AskMe' experiment publisher is written in Python 2.7, and the backend 'YaCy' search engine is based on Java 1.6.

  18. Urban green spaces and cancer: a protocol for a scoping review.

    Science.gov (United States)

    Porcherie, Marion; Lejeune, Mathilde; Gaudel, Marion; Pommier, Jeanine; Faure, Emmanuelle; Heritage, Zoé; Rican, Stéphane; Simos, Jean; Cantoreggi, Nicola Luca; Roué Le Gall, Anne; Cambon, Linda; Regnaux, Jean-Philippe

    2018-02-16

    Green space in the built environment is an important topic on the health agenda today. Studies have shown that access to green spaces is associated with better mental and physical health, yet green spaces can also be detrimental to health if they are not managed appropriately. Despite the increasing interest in urban green spaces, little research has so far been conducted into the links between green spaces and cancer. The purpose of this scoping review is therefore to map the literature available on the types of relationship between urban green spaces and cancer. We followed the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols 2015 guideline to report the protocol. To conduct this scoping review, we will use a structured search strategy based on controlled vocabulary and relevant key terms related to green space, urban space and cancer. We will search MEDLINE (PubMed), GreenFILE (EBSCOhost), Cumulative Index to Nursing and Allied Health Literature (EBSCOhost) and ScienceDirect as electronic database as well as hand-search publications for grey literature. This review will therefore provide evidence on this current topic, one which could have practical implications for policy-makers involved in choices which are more conducive to healthy living. No primary data will be collected since all data that will be presented in this review are based on published articles and publicly available documents, and therefore ethics committee approval is not a requirement. The findings of this review will be presented at workshops and conferences, and will be submitted for publication in a peer-reviewed journal. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  19. Supporting inter-topic entity search for biomedical Linked Data based on heterogeneous relationships.

    Science.gov (United States)

    Zong, Nansu; Lee, Sungin; Ahn, Jinhyun; Kim, Hong-Gee

    2017-08-01

    The keyword-based entity search restricts search space based on the preference of search. When given keywords and preferences are not related to the same biomedical topic, existing biomedical Linked Data search engines fail to deliver satisfactory results. This research aims to tackle this issue by supporting an inter-topic search, that is, improving search when the inputs (keywords and preferences) fall under different topics. This study developed an effective algorithm in which the relations between biomedical entities were used in tandem with a keyword-based entity search, Siren. The algorithm, PERank, which is an adaptation of Personalized PageRank (PPR), uses a pair of inputs: (1) search preferences, and (2) entities from a keyword-based entity search with a keyword query, to formalize the search results on-the-fly based on the index of the precomputed Individual Personalized PageRank Vectors (IPPVs). Our experiments were performed over ten linked life datasets for two query sets, one with keyword-preference topic correspondence (intra-topic search), and the other without (inter-topic search). The experiments showed that the proposed method achieved better search results than the baseline keyword-based search engine, for example a 14% increase in precision for the inter-topic search. The proposed method improved the keyword-based biomedical entity search by supporting the inter-topic search without affecting the intra-topic search based on the relations between different entities. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Genetic Local Search for Optimum Multiuser Detection Problem in DS-CDMA Systems

    Science.gov (United States)

    Wang, Shaowei; Ji, Xiaoyong

    Optimum multiuser detection (OMD) in direct-sequence code-division multiple access (DS-CDMA) systems is an NP-complete problem. In this paper, we present a genetic local search (GLS) algorithm, which consists of an evolution strategy framework and a local improvement procedure. The evolution strategy searches only the space of feasible, locally optimal solutions. A fast iterated local search algorithm, which exploits the specific characteristics of the OMD problem, produces local optima with great efficiency. Computer simulations show that the bit error rate (BER) performance of the GLS surpasses that of other multiuser detectors in all cases discussed, and the computation time grows only polynomially with the number of users.
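
    The following sketch illustrates the genetic-local-search idea on one standard formulation of the OMD objective (maximize 2 b^T y - b^T R b over bit vectors b in {-1, +1}^K, with y the matched-filter outputs and R the correlation matrix). It is not the paper's algorithm: the fast iterated local search exploiting problem structure is replaced by naive single-bit-flip improvement, and the instance generation is purely illustrative.

        import numpy as np

        rng = np.random.default_rng(1)

        def omd_metric(b, y, R):
            # One standard OMD log-likelihood metric: 2 b^T y - b^T R b (to be maximized).
            return 2.0 * b @ y - b @ R @ b

        def local_search(b, y, R):
            # Flip single bits while any flip improves the metric (reach a local optimum).
            b = b.copy()
            best = omd_metric(b, y, R)
            improved = True
            while improved:
                improved = False
                for k in range(len(b)):
                    b[k] = -b[k]
                    val = omd_metric(b, y, R)
                    if val > best:
                        best, improved = val, True
                    else:
                        b[k] = -b[k]          # revert the flip
            return b, best

        def genetic_local_search(y, R, pop_size=20, generations=50, mut_rate=0.1):
            K = len(y)
            pop = [local_search(rng.choice([-1.0, 1.0], K), y, R) for _ in range(pop_size)]
            for _ in range(generations):
                parent, _ = max(pop, key=lambda t: t[1])
                child = parent.copy()
                flips = rng.random(K) < mut_rate           # mutation: random bit flips
                child[flips] = -child[flips]
                pop.append(local_search(child, y, R))      # re-optimize locally
                pop.sort(key=lambda t: t[1], reverse=True)
                pop = pop[:pop_size]                       # keep only locally optimal solutions
            return pop[0]

        # Illustrative 8-user instance with a random correlation matrix (not real CDMA data).
        K = 8
        A = rng.normal(size=(K, K))
        R = A @ A.T + K * np.eye(K)
        true_b = rng.choice([-1.0, 1.0], K)
        y = R @ true_b + 0.1 * rng.normal(size=K)
        b_hat, score = genetic_local_search(y, R)
        print("recovered bits:", b_hat.astype(int))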

  1. A naïve ontology for concepts of time and space for searching and learning

    Directory of Open Access Journals (Sweden)

    M. Miwa

    2007-01-01

    Introduction. In this paper, we propose a new approach for developing a naïve ontology as the basis for optimal information access interfaces for multimedia digital documents intended for novice users. Method. We try to elicit the knowledge structure of domain novices and the patterns of its modification in their searching and learning processes by using an eye-tracker and showing eye-movements in the post-search interviews. Analysis. Recorded interview data were fully transcribed and coded using Atlas.ti and analysed following a bottom-up strategy of the constant-comparative technique. Results. We developed a taxonomy of knowledge modification which includes (1) adding, (2) correcting, (3) limiting, (4) relating, (5) specifying and (6) transforming. Conclusion. The taxonomy may be expanded and elaborated as the project progresses, and the findings are expected to be incorporated into the design of the naïve ontology. The study results provide theoretical implications for knowledge building, methodological implications for data collection using an eye-tracker and showing eye-movements in post-search interviews, and useful information on the design of information access interfaces for novice users.

  2. Open meta-search with OpenSearch: a case study

    OpenAIRE

    O'Riordan, Adrian P.

    2007-01-01

    The goal of this project was to demonstrate the possibilities of open source search engine and aggregation technology in a Web environment by building a meta-search engine which employs free open search engines and open protocols. In contrast many meta-search engines on the Internet use proprietary search systems. The search engines employed in this case study are all based on the OpenSearch protocol. OpenSearch-compliant systems support XML technologies such as RSS and Atom for aggregation a...

  3. A measurement concept for hot-spot BRDFs from space

    Energy Technology Data Exchange (ETDEWEB)

    Gerstl, S.A.W.

    1996-09-01

    Several concepts for canopy hot-spot measurements from space have been investigated. The most promising involves active illumination and bistatic detection that would allow hot-spot angular distribution (BRDF) measurements from space in a search-light mode. The concept includes a pointable illumination source, such as a laser operating at an atmospheric window wavelength, coupled with a number of high spatial-resolution detectors that are clustered around the illumination source in space, receiving photons nearly coaxial with the retro-reflection direction. Microwave control and command among the satellite cluster would allow orienting the direction of the laser beam as well as the focusing detectors simultaneously so that the coupled system can function like a search light with almost unlimited pointing capabilities. The concept is called the Hot-Spot Search-Light (HSSL) satellite. A nominal satellite altitude of 600 km will allow hot-spot BRDF measurements out to about 18 degrees phase angle. The distributed detectors take radiometric measurements of the intensity wings of the hot-spot angular distribution without the need for complex imaging detectors. The system can be operated at night for increased signal-to-noise ratio. This way the hot-spot angular signatures can be quantified and parameterized in sufficient detail to extract the biophysical information content of plant architectures.

  4. A measurement concept for hot-spot BRDFs from space

    Science.gov (United States)

    Gerstl, S.A.W.

    1996-01-01

    Several concepts for canopy hot-spot measurements from space have been investigated. The most promising involves active illumination and bistatic detection that would allow hot-spot angular distribution (BRDF) measurements from space in a search-light mode. The concept includes a pointable illumination source, such as a laser operating at an atmospheric window wavelength, coupled with a number of high spatial-resolution detectors that are clustered around the illumination source in space, receiving photons nearly coaxial with the retro-reflection direction. Microwave control and command among the satellite cluster would allow orienting the direction of the laser beam as well as the focusing detectors simultaneously so that the coupled system can function like a search light with almost unlimited pointing capabilities. The concept is called the Hot-Spot Search-Light (HSSL) satellite. A nominal satellite altitude of 600 km will allow hot-spot BRDF measurements out to about 18 degrees phase angle. The distributed detectors take radiometric measurements of the intensity wings of the hot-spot angular distribution without the need for complex imaging detectors. The system can be operated at night for increased signal-to-noise ratio. This way the hot-spot angular signatures can be quantified and parameterized in sufficient detail to extract the biophysical information content of plant architectures.

  5. Search for weakly interacting massive particles with the Cryogenic Dark Matter Search experiment

    Energy Technology Data Exchange (ETDEWEB)

    Saab, Tarek [Stanford U.

    2002-01-01

    From individual galaxies, to clusters of galaxies, to in between the cushions of your sofa, Dark Matter appears to be pervasive on every scale. With increasing accuracy, recent astrophysical measurements, from a variety of experiments, are arriving at the following cosmological model: a flat cosmology (Ωk = 0) with matter and energy densities contributing roughly 1/3 and 2/3 (Ωm = 0.35, ΩΛ = 0.65). Of the matter contribution, it appears that only ~ 10% (Ωb ~ 0.04) is attributable to baryons. Astrophysical measurements constrain the remaining matter to be non-relativistic, interacting primarily gravitationally. Various theoretical models for such Dark Matter exist. A leading candidate for the non-baryonic matter is Weakly Interacting Massive Particles (dubbed WIMPs). These particles, and their relic density, may be naturally explained within the framework of Super-Symmetry theories. SuperSymmetry also offers predictions as to the scattering rates of WIMPs with baryonic matter, allowing for the design and tailoring of experiments that search specifically for the WIMPs. The Cryogenic Dark Matter Search experiment is searching for evidence of WIMP interactions in crystals of Ge and Si. Using cryogenic detector technology to measure both the phonon and ionization response to a particle recoil, the CDMS detectors are able to discriminate between electron and nuclear recoils, thus reducing the large rates of electron recoil backgrounds to levels with which a Dark Matter search is not only feasible, but far-reaching. This thesis will describe in some detail the physical principles behind the CDMS detector technology, highlighting the final step in the evolution of the detector design and characterization techniques. In addition, data from a 100 day long exposure of the current run at the Stanford Underground Facility will be presented, with focus given to detector performance as well as to the implications on allowable WIMP mass - cross-section parameter space.

  6. GLAST, the Gamma-ray Large Area Space Telescope

    CERN Document Server

    De Angelis, A

    2001-01-01

    GLAST, a detector for cosmic gamma rays in the range from 20 MeV to 300 GeV, will be launched in space in 2005. Breakthroughs are expected in particular in the study of particle acceleration mechanisms in space and of gamma ray bursts, and maybe on the search for cold dark matter; but of course the most exciting discoveries could come from the unexpected.

  7. Fault-Tolerant NDE Data Reduction Framework, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — A distributed fault tolerant nondestructive evaluation (NDE) data reduction framework is proposed in which large NDE datasets are mapped to thousands to millions of...

  8. Drag Reduction through Pulsed Plasma Actuators, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Drag reduction is a fundamental necessity in all aerodynamic designs, as it directly affects aircraft fuel efficiency which in turn affects endurance, range, and...

  9. NASA space station automation: AI-based technology review

    Science.gov (United States)

    Firschein, O.; Georgeff, M. P.; Park, W.; Neumann, P.; Kautz, W. H.; Levitt, K. N.; Rom, R. J.; Poggio, A. A.

    1985-01-01

    Research and Development projects in automation for the Space Station are discussed. Artificial Intelligence (AI) based automation technologies are planned to enhance crew safety through reduced need for EVA, increase crew productivity through the reduction of routine operations, increase space station autonomy, and augment space station capability through the use of teleoperation and robotics. AI technology will also be developed for the servicing of satellites at the Space Station, system monitoring and diagnosis, space manufacturing, and the assembly of large space structures.

  10. Challenges for future space power systems

    International Nuclear Information System (INIS)

    Brandhorst, H.W. Jr.

    1989-01-01

    Forecasts of space power needs are presented. The needs fall into three broad categories: survival, self-sufficiency, and industrialization. The costs of delivering payloads to orbital locations and from Low Earth Orbit (LEO) to Mars are determined. Future launch cost reductions are predicted. From these projections the performances necessary for future solar and nuclear space power options are identified. The availability of plentiful cost effective electric power and of low cost access to space are identified as crucial factors in the future extension of human presence in space

  11. Adaptive Sampling for Nonlinear Dimensionality Reduction Based on Manifold Learning

    DEFF Research Database (Denmark)

    Franz, Thomas; Zimmermann, Ralf; Goertz, Stefan

    2017-01-01

    We make use of the non-intrusive dimensionality reduction method Isomap in order to emulate nonlinear parametric flow problems that are governed by the Reynolds-averaged Navier-Stokes equations. Isomap is a manifold learning approach that provides a low-dimensional embedding space that is approxi… to detect and fill up gaps in the sampling in the embedding space. The performance of the proposed manifold filling method will be illustrated by numerical experiments, where we consider nonlinear parameter-dependent steady-state Navier-Stokes flows in the transonic regime.
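
    For readers unfamiliar with Isomap, the snippet below shows the basic embedding step with scikit-learn on a synthetic swiss-roll dataset standing in for the flow-snapshot matrix; it is a generic illustration, not the authors' emulation or gap-filling code, and the neighbour count is an arbitrary choice.

        import numpy as np
        from sklearn.datasets import make_swiss_roll
        from sklearn.manifold import Isomap

        # Synthetic stand-in for a matrix of flow-solution snapshots (one row per sample).
        X, _ = make_swiss_roll(n_samples=800, noise=0.05, random_state=0)

        # Low-dimensional embedding via Isomap (geodesic-distance-preserving manifold learning).
        embedding = Isomap(n_neighbors=12, n_components=2).fit_transform(X)
        print(embedding.shape)   # (800, 2)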

  12. Augmentation of Virtual Space Physics Observatory Services to Expand Data Access Capabilities, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Aquilent, Inc. proposes to support the effort of Virtual Space Physics Observatory (VSPO) by developing services to expand the VSPO search capabilities, developing...

  13. CLUSTER STAFF search coils magnetometer calibration – comparisons with FGM

    Czech Academy of Sciences Publication Activity Database

    Robert, P.; Cornilleau-Wehrlin, N.; Piberne, R.; De Conchy, Y.; Lacombe, C.; Bouzid, V.; Grison, Benjamin; Alison, D.; Canu, P.

    2013-01-01

    Vol. 3, No. 2 (2013), pp. 679-751 ISSN 2193-0872 Institutional support: RVO:68378289 Keywords: instrumentation * search coils * space physics * calibration Subject RIV: BL - Plasma and Gas Discharge Physics http://www.geosci-instrum-method-data-syst-discuss.net/3/679/2013/gid-3-679-2013.pdf

  14. Scintillation Reduction using Conjugate-Plane Imaging (Abstract)

    Science.gov (United States)

    Vander Haagen, G. A.

    2017-12-01

    (Abstract only) All observatories are plagued by atmospheric turbulence exhibited as star scintillation or "twinkle," whether at a high-altitude adaptive optics research facility or a 30-cm amateur telescope. It is well known that these disturbances are caused by wind and temperature-driven refractive gradients in the atmosphere and limit the ultimate photometric resolution of land-based facilities. One approach identified by Fuchs (1998) for scintillation noise reduction was to create a conjugate image space at the telescope and focus on the dominant conjugate turbulent layer within that space. When focused on the turbulent layer, little or no scintillation exists. This technique is described whereby noise reductions of 6/1 to 11/1 have been achieved in mathematical and optical bench simulations. Discussed is a proof-of-principle conjugate optical train design for an 80-mm, f/7 telescope.

  15. State Space Reduction of Linear Processes using Control Flow Reconstruction

    NARCIS (Netherlands)

    van de Pol, Jan Cornelis; Timmer, Mark

    2009-01-01

    We present a new method for fighting the state space explosion of process algebraic specifications, by performing static analysis on an intermediate format: linear process equations (LPEs). Our method consists of two steps: (1) we reconstruct the LPE's control flow, detecting control flow parameters

  16. State Space Reduction of Linear Processes Using Control Flow Reconstruction

    NARCIS (Netherlands)

    van de Pol, Jan Cornelis; Timmer, Mark; Liu, Zhiming; Ravn, Anders P.

    2009-01-01

    We present a new method for fighting the state space explosion of process algebraic specifications, by performing static analysis on an intermediate format: linear process equations (LPEs). Our method consists of two steps: (1) we reconstruct the LPE's control flow, detecting control flow parameters

  17. Comparison of multiobjective harmony search, cuckoo search and bat-inspired algorithms for renewable distributed generation placement

    Directory of Open Access Journals (Sweden)

    John E. Candelo-Becerra

    2015-07-01

    Electric power losses have a significant impact on the total costs of distribution networks. The use of renewable energy sources is a major alternative for reducing power losses and costs, while other important aspects, such as voltage magnitudes and network congestion, are also improved. However, determining the best location and size of renewable energy generators can sometimes be a challenging task due to the large number of possible combinations in the search space. Furthermore, the multiobjective functions increase the complexity of the problem and metaheuristics are preferred to find solutions in a relatively short time. This paper evaluates the performance of the cuckoo search (CS), harmony search (HS), and bat-inspired (BA) algorithms for the location and size of renewable distributed generation (RDG) in radial distribution networks using a multiobjective function defined as minimizing the energy losses and the RDG costs. The metaheuristic algorithms were programmed in Matlab and tested using the 33-node radial distribution network. The three algorithms obtained similar results for the two objectives evaluated, finding points close to the best solutions in the Pareto front. Comparisons showed that the CS obtained the minimum results for most points evaluated, but the BA and the HS were close to the best solution.
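
    A small generic helper clarifies what "points close to the best solutions in the Pareto front" means for the two objectives above (energy losses and RDG cost, both minimized): a candidate is kept only if no other candidate is at least as good in both objectives. This is a textbook non-dominated filter, not the authors' Matlab implementation, and the numeric candidates are made up.

        from typing import List, Tuple

        def pareto_front(points: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
            """Return the non-dominated points for two objectives to be minimized
            (e.g. energy losses and distributed-generation cost); weak dominance is used."""
            front = []
            for p in points:
                dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)
                if not dominated:
                    front.append(p)
            return front

        candidates = [(120.0, 35.0), (110.0, 40.0), (130.0, 30.0), (125.0, 45.0)]
        print(pareto_front(candidates))   # (125.0, 45.0) is dominated by (120.0, 35.0)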

  18. Reductions of NO2 detected from space during the 2008 Beijing Olympic Games

    Science.gov (United States)

    Mijling, B.; van der A, R. J.; Boersma, K. F.; Van Roozendael, M.; De Smedt, I.; Kelder, H. M.

    2009-07-01

    During the 2008 Olympic and Paralympic Games in Beijing (from 8 August to 17 September), local authorities enforced strong measures to reduce air pollution during the events. To evaluate the direct effect of these measures, we use the tropospheric NO2 column observations from the satellite instruments GOME-2 and OMI. We interpret these data against simulations from the regional chemistry transport model CHIMERE, based on a 2006 emission inventory, and find a reduction of NO2 concentrations of approximately 60% above Beijing during the Olympic period. The air quality measures were especially effective in the Beijing area, but also noticeable in surrounding cities of Tianjin (30% reduction) and Shijiazhuang (20% reduction).

  19. Search for radions at LEP2

    International Nuclear Information System (INIS)

    Abbiendi, G.; Ainsley, C.; Akesson, P.F.

    2005-01-01

    A new scalar resonance, called the radion, with couplings to fermions and bosons similar to those of the Higgs boson, is predicted in the framework of Randall-Sundrum models, proposed solutions to the hierarchy problem with one extra dimension. An important distinction between the radion and the Higgs boson is that the radion would couple directly to gluon pairs, and in particular its decay products would include a significant fraction of gluon jets. The radion has the same quantum numbers as the Standard Model (SM) Higgs boson, and therefore they can mix, with the resulting mass eigenstates having properties different from those of the SM Higgs boson. Existing searches for the Higgs bosons are sensitive to the possible production and decay of radions and Higgs bosons in these models. For the first time, searches for the SM Higgs boson and flavour-independent and decay-mode independent searches for a neutral Higgs boson are used in combination to explore the parameter space of the Randall-Sundrum model. In the dataset recorded by the OPAL experiment at LEP, no evidence for radion or Higgs particle production was observed in any of those searches at centre-of-mass energies up to 209 GeV. The results are used to set limits on the radion and Higgs boson masses. For all parameters of the Randall-Sundrum model, the data exclude masses below 58 GeV for the mass eigenstate which becomes the Higgs boson in the no-mixing limit

  20. System network planning expansion using mathematical programming, genetic algorithms and tabu search

    International Nuclear Information System (INIS)

    Sadegheih, A.; Drake, P.R.

    2008-01-01

    In this paper, system network planning expansion is formulated for mixed integer programming, a genetic algorithm (GA) and tabu search (TS). Compared with other optimization methods, GAs are suitable for traversing large search spaces, since they can do this relatively rapidly and because the use of mutation diverts the method away from local minima, which will tend to become more common as the search space increases in size. GAs give an excellent trade-off between solution quality and computing time, and flexibility for taking into account specific constraints in real situations. TS has emerged as a new, highly efficient, search paradigm for finding quality solutions to combinatorial problems. It is characterized by gathering knowledge during the search and subsequently profiting from this knowledge. The attractiveness of the technique comes from its ability to escape local optimality. The cost function of this problem consists of the capital investment cost in discrete form, the cost of transmission losses and the power generation costs. The DC load flow equations for the network are embedded in the constraints of the mathematical model to avoid sub-optimal solutions that can arise if the enforcement of such constraints is done in an indirect way. The solution of the model gives the best line additions and also provides information regarding the optimal generation at each generation point. This method of solution is demonstrated on the expansion of a 10 bus bar system to 18 bus bars. Finally, a steady-state genetic algorithm is employed rather than generational replacement, and uniform crossover is used
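
    The closing remark about a steady-state GA with uniform crossover can be illustrated with the generic sketch below, in which each new child replaces the worst member of the population only if it improves on it. The binary "line addition" encoding and the toy cost function are assumptions; the real objective (investment, losses and generation cost under DC load-flow constraints) is not reproduced.

        import numpy as np

        rng = np.random.default_rng(42)
        N_CANDIDATE_LINES = 20          # hypothetical number of candidate line additions

        def cost(plan):
            # Placeholder for the real objective; here a toy penalty favouring ~7 additions.
            target = 7
            return abs(plan.sum() - target) + 0.1 * plan @ np.arange(len(plan))

        def uniform_crossover(a, b):
            # Each gene is taken from either parent with probability 0.5.
            mask = rng.random(len(a)) < 0.5
            return np.where(mask, a, b)

        def steady_state_ga(pop_size=30, evaluations=2000, mut_rate=0.05):
            pop = rng.integers(0, 2, (pop_size, N_CANDIDATE_LINES))
            fit = np.array([cost(p) for p in pop])
            for _ in range(evaluations):
                i, j = rng.choice(pop_size, 2, replace=False)      # parent selection
                child = uniform_crossover(pop[i], pop[j])
                flips = rng.random(N_CANDIDATE_LINES) < mut_rate
                child = np.where(flips, 1 - child, child)          # bit-flip mutation
                c = cost(child)
                worst = np.argmax(fit)
                if c < fit[worst]:                                 # steady-state replacement
                    pop[worst], fit[worst] = child, c
            best = np.argmin(fit)
            return pop[best], fit[best]

        plan, value = steady_state_ga()
        print("selected line additions:", plan, "cost:", value)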

  1. Short Term Gain, Long Term Pain:Informal Job Search Methods and Post-Displacement Outcomes

    OpenAIRE

    Green, Colin

    2012-01-01

    This paper examines the role of informal job search methods on the labour market outcomes of displaced workers. Informal job search methods could alleviate short-term labour market difficulties of displaced workers by providing information on job opportunities, allowing them to signal their productivity and may mitigate wage losses through better post-displacement job matching. However if displacement results from reductions in demand for specific sectors/skills, the use of informal job searc...

  2. Spanning the Home/Work Creative Space

    DEFF Research Database (Denmark)

    Davis, Lee N.; Davis, Jerome; Hoisl, Karin

    the employee brings to work. Based on Woodman et al.’s (1993) “interactionist perspective” on organizational creativity, supplemented by literature on search and knowledge re/combination, we explore whether and how leisure time activities can span the creative space between the employee’s home and workplace...

  3. Space Launch System Accelerated Booster Development Cycle

    Science.gov (United States)

    Arockiam, Nicole; Whittecar, William; Edwards, Stephen

    2012-01-01

    With the retirement of the Space Shuttle, NASA is seeking to reinvigorate the national space program and recapture the public's interest in human space exploration by developing missions to the Moon, near-earth asteroids, Lagrange points, Mars, and beyond. The would-be successor to the Space Shuttle, NASA's Constellation Program, planned to take humans back to the Moon by 2020, but due to budgetary constraints was cancelled in 2010 in search of a more "affordable, sustainable, and realistic" concept. Following a number of studies, the much anticipated Space Launch System (SLS) was unveiled in September of 2011. The SLS core architecture consists of a cryogenic first stage with five Space Shuttle Main Engines (SSMEs), and a cryogenic second stage using a new J-2X engine. The baseline configuration employs two 5-segment solid rocket boosters to achieve a 70 metric ton payload capability, but a new, more capable booster system will be required to attain the goal of 130 metric tons to orbit. To this end, NASA's Marshall Space Flight Center recently released a NASA Research Announcement (NRA) entitled "Space Launch System (SLS) Advanced Booster Engineering Demonstration and/or Risk Reduction." The increased emphasis on affordability is evident in the language used in the NRA, which is focused on risk reduction "leading to an affordable Advanced Booster that meets the evolved capabilities of SLS" and "enabling competition" to "enhance SLS affordability." The purpose of the work presented in this paper is to perform an independent assessment of the elements that make up an affordable and realistic path forward for the SLS booster system, utilizing advanced design methods and technology evaluation techniques. The goal is to identify elements that will enable a more sustainable development program by exploring the trade space of heavy lift booster systems and focusing on affordability, operability, and reliability at the system and subsystem levels. For this study

  4. Playing Multi-Action Adversarial Games: Online Evolutionary Planning versus Tree Search

    DEFF Research Database (Denmark)

    Justesen, Niels; Mahlmann, Tobias; Risi, Sebastian

    2017-01-01

    We address the problem of playing turn-based multi-action adversarial games, which include many strategy games with extremely high branching factors as players take multiple actions each turn. This leads to the breakdown of standard tree search methods, including Monte Carlo Tree Search (MCTS), as they become unable to reach a sufficient depth in the game tree. In this paper, we introduce Online Evolutionary Planning (OEP) to address this challenge, which searches for combinations of actions to perform during a single turn guided by a fitness function that evaluates the quality of a particular state. We compare OEP to different MCTS variations that constrain the exploration to deal with the high branching factor in the turn-based multi-action game Hero Academy. While the constrained MCTS variations outperform the vanilla MCTS implementation by a large margin, OEP is able to search the space…
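
    A minimal sketch of the Online Evolutionary Planning idea follows: a population of single-turn action sequences is evolved under a fitness that scores the state reached after applying the sequence. The game interface functions (legal_actions, apply_actions, evaluate_state) are hypothetical placeholders supplied by the caller, and the operators shown are generic, not the exact ones used for Hero Academy.

        import random

        def online_evolutionary_planning(state, legal_actions, apply_actions, evaluate_state,
                                         actions_per_turn=5, pop_size=50, generations=100,
                                         mut_rate=0.3):
            """Evolve one turn's action sequence for the given game interface (hypothetical)."""
            def random_plan():
                return [random.choice(legal_actions(state)) for _ in range(actions_per_turn)]

            def fitness(plan):
                # Heuristic value of the state reached after applying the whole plan.
                return evaluate_state(apply_actions(state, plan))

            population = [random_plan() for _ in range(pop_size)]
            for _ in range(generations):
                population.sort(key=fitness, reverse=True)
                survivors = population[: pop_size // 2]
                children = []
                while len(survivors) + len(children) < pop_size:
                    a, b = random.sample(survivors, 2)
                    cut = random.randrange(1, actions_per_turn)       # one-point crossover
                    child = a[:cut] + b[cut:]
                    child = [random.choice(legal_actions(state))
                             if random.random() < mut_rate else act   # mutation: resample actions
                             for act in child]
                    children.append(child)
                population = survivors + children
            return max(population, key=fitness)

        # Toy usage: actions are integer increments; the "game" just sums them toward a target.
        plan = online_evolutionary_planning(
            state=0,
            legal_actions=lambda s: [-1, 0, 1, 2],
            apply_actions=lambda s, p: s + sum(p),
            evaluate_state=lambda s: -abs(s - 7),
        )
        print(plan)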

  5. Searching for light dark matter with the SLAC millicharge experiment.

    Science.gov (United States)

    Diamond, M; Schuster, P

    2013-11-27

    New sub-GeV gauge forces ("dark photons") that kinetically mix with the photon provide a promising scenario for MeV-GeV dark matter and are the subject of a program of searches at fixed-target and collider facilities around the world. In such models, dark photons produced in collisions may decay invisibly into dark-matter states, thereby evading current searches. We reexamine results of the SLAC mQ electron beam dump experiment designed to search for millicharged particles and find that it was strongly sensitive to any secondary beam of dark matter produced by electron-nucleus collisions in the target. The constraints are competitive for dark photon masses in the ~1-30 MeV range, covering part of the parameter space that can reconcile the apparent (g-2)_μ anomaly. Simple adjustments to the original SLAC search for millicharges may extend sensitivity to cover a sizable portion of the remaining (g-2)_μ anomaly-motivated region. The mQ sensitivity is therefore complementary to ongoing searches for visible decays of dark photons. Compared to existing direct-detection searches, mQ sensitivity to electron-dark-matter scattering cross sections is more than an order of magnitude better for a significant range of masses and couplings in simple models.

  6. Lung volume reduction for emphysema.

    Science.gov (United States)

    Shah, Pallav L; Herth, Felix J; van Geffen, Wouter H; Deslee, Gaetan; Slebos, Dirk-Jan

    2017-02-01

    Advanced emphysema is a lung disease in which alveolar capillary units are destroyed and supporting tissue is lost. The combined effect of reduced gas exchange and changes in airway dynamics impairs expiratory airflow and leads to progressive air trapping. Pharmacological therapies have limited effects. Surgical resection of the most destroyed sections of the lung can improve pulmonary function and exercise capacity but its benefit is tempered by significant morbidity. This issue stimulated a search for novel approaches to lung volume reduction. Alternative minimally invasive approaches using bronchoscopic techniques including valves, coils, vapour thermal ablation, and sclerosant agents have been at the forefront of these developments. Insertion of endobronchial valves in selected patients could have benefits that are comparable with lung volume reduction surgery. Endobronchial coils might have a role in the treatment of patients with emphysema with severe hyperinflation and less parenchymal destruction. Use of vapour thermal energy or a sclerosant might allow focal treatment but the unpredictability of the inflammatory response limits their current use. In this Review, we aim to summarise clinical trial evidence on lung volume reduction and provide guidance on patient selection for available therapies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Ozone response to emission reductions in the southeastern United States

    Science.gov (United States)

    Blanchard, Charles L.; Hidy, George M.

    2018-06-01

    Ozone (O3) formation in the southeastern US is studied in relation to nitrogen oxide (NOx) emissions using long-term (1990s-2015) surface measurements of the Southeastern Aerosol Research and Characterization (SEARCH) network, U.S. Environmental Protection Agency (EPA) O3 measurements, and EPA Clean Air Status and Trends Network (CASTNET) nitrate deposition data. Annual fourth-highest daily peak 8 h O3 mixing ratios at EPA monitoring sites in Georgia, Alabama, and Mississippi exhibit statistically significant downward trends, and total oxidized nitrogen (NOy) mixing ratios at SEARCH sites declined in proportion to NOx emission reductions. CASTNET data show declining wet and dry nitrate deposition since the late 1990s, with total (wet plus dry) nitrate deposition fluxes decreasing linearly in proportion to reductions of NOx emissions by ~60% in Alabama and Georgia. Annual nitrate deposition rates at Georgia and Alabama CASTNET sites correspond to 30% of Georgia emission rates and 36% of Alabama emission rates, respectively. The fraction of NOx emissions lost to deposition has not changed. SEARCH and CASTNET sites exhibit downward trends in mean annual nitric acid (HNO3) concentrations. Observed relationships of O3 to NOz (NOy-NOx) support past model predictions of increases in cycling of NO and increasing responsiveness of O3 to NOx. The study data provide a long-term record that can be used to examine the accuracy of process relationships embedded in modeling efforts. Quantifying observed O3 trends and relating them to reductions in ambient NOy species concentrations offers key insights into processes of general relevance to air quality management and provides important information supporting strategies for reducing O3 mixing ratios.

  8. Spatial Search Techniques for Mobile 3D Queries in Sensor Web Environments

    Directory of Open Access Journals (Sweden)

    James D. Carswell

    2013-03-01

    Developing mobile geo-information systems for sensor web applications involves technologies that can access linked geographical and semantically related Internet information. Additionally, in tomorrow’s Web 4.0 world, it is envisioned that trillions of inexpensive micro-sensors placed throughout the environment will also become available for discovery based on their unique geo-referenced IP address. Exploring these enormous volumes of disparate heterogeneous data on today’s location and orientation aware smartphones requires context-aware smart applications and services that can deal with “information overload”. 3DQ (Three Dimensional Query) is our novel mobile spatial interaction (MSI) prototype that acts as a next-generation base for human interaction within such geospatial sensor web environments/urban landscapes. It filters information using “Hidden Query Removal” functionality that intelligently refines the search space by calculating the geometry of a three dimensional visibility shape (Vista space) at a user’s current location. This 3D shape then becomes the query “window” in a spatial database for retrieving information on only those objects visible within a user’s actual 3D field-of-view. 3DQ reduces information overload and serves to heighten situation awareness on constrained commercial off-the-shelf devices by providing visibility space searching as a mobile web service. The effects of variations in mobile spatial search techniques in terms of query speed vs. accuracy are evaluated and presented in this paper.

  9. Dimensionality reduction of collective motion by principal manifolds

    Science.gov (United States)

    Gajamannage, Kelum; Butail, Sachit; Porfiri, Maurizio; Bollt, Erik M.

    2015-01-01

    While the existence of low-dimensional embedding manifolds has been shown in patterns of collective motion, the current battery of nonlinear dimensionality reduction methods is not amenable to the analysis of such manifolds. This is mainly due to the necessary spectral decomposition step, which limits control over the mapping from the original high-dimensional space to the embedding space. Here, we propose an alternative approach that demands a two-dimensional embedding which topologically summarizes the high-dimensional data. In this sense, our approach is closely related to the construction of one-dimensional principal curves that minimize orthogonal error to data points subject to smoothness constraints. Specifically, we construct a two-dimensional principal manifold directly in the high-dimensional space using cubic smoothing splines, and define the embedding coordinates in terms of geodesic distances. Thus, the mapping from the high-dimensional data to the manifold is defined in terms of local coordinates. Through representative examples, we show that compared to existing nonlinear dimensionality reduction methods, the principal manifold retains the original structure even in noisy and sparse datasets. The principal manifold finding algorithm is applied to configurations obtained from a dynamical system of multiple agents simulating a complex maneuver called predator mobbing, and the resulting two-dimensional embedding is compared with that of a well-established nonlinear dimensionality reduction method.
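
    As a rough, low-dimensional illustration of the principal-curve ingredient mentioned above (not the authors' two-dimensional manifold construction), the sketch below orders noisy 3-D points along their leading principal component, fits one cubic smoothing spline per coordinate, and uses cumulative arc length as a geodesic embedding coordinate. All data and smoothing parameters are synthetic assumptions.

        import numpy as np
        from scipy.interpolate import UnivariateSpline

        rng = np.random.default_rng(0)

        # Noisy samples along an arc in 3-D, standing in for high-dimensional agent configurations.
        t = np.sort(rng.uniform(0, np.pi, 300))
        X = np.c_[np.cos(t), np.sin(t), 0.5 * t] + 0.03 * rng.normal(size=(300, 3))

        # 1) Order points by projection onto the leading principal component.
        Xc = X - X.mean(axis=0)
        u = np.linalg.svd(Xc, full_matrices=False)[2][0]     # first right singular vector
        s = Xc @ u
        order = np.argsort(s)
        s_sorted = s[order]

        # 2) Fit a cubic smoothing spline per coordinate -> a smooth principal curve.
        splines = [UnivariateSpline(s_sorted, X[order, d], k=3, s=len(X) * 0.01) for d in range(3)]
        curve = np.column_stack([sp(s_sorted) for sp in splines])

        # 3) Embedding coordinate: geodesic (cumulative arc-length) distance along the curve.
        geodesic = np.concatenate([[0.0], np.cumsum(np.linalg.norm(np.diff(curve, axis=0), axis=1))])
        print(geodesic[-1])   # total arc length of the fitted curve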

  10. Isomorphism Theorem on Vector Spaces over a Ring

    Directory of Open Access Journals (Sweden)

    Futa Yuichi

    2017-10-01

    In this article, we formalize in the Mizar system [1, 4] some properties of vector spaces over a ring. We formally prove the first isomorphism theorem of vector spaces over a ring. We also formalize the product space of vector spaces. ℤ-modules are useful for lattice problems such as the LLL (Lenstra, Lenstra and Lovász) [5] base reduction algorithm and cryptographic systems [6, 2].
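
    For reference, the theorem formalized in the record above is the standard first isomorphism theorem for modules over a ring; in textbook notation (a generic paraphrase, not a quotation from the article):

        \[
          f : V \longrightarrow W \ \text{an $R$-module homomorphism}
          \quad\Longrightarrow\quad
          V/\ker f \;\cong\; \operatorname{im} f .
        \]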

  11. On the use of cartographic projections in visualizing phylo-genetic tree space

    Directory of Open Access Journals (Sweden)

    Clement Mark

    2010-06-01

    Phylogenetic analysis is becoming an increasingly important tool for biological research. Applications include epidemiological studies, drug development, and evolutionary analysis. Phylogenetic search is a known NP-Hard problem. The size of the data sets which can be analyzed is limited by the exponential growth in the number of trees that must be considered as the problem size increases. A better understanding of the problem space could lead to better methods, which in turn could lead to the feasible analysis of more data sets. We present a definition of phylogenetic tree space and a visualization of this space that shows significant exploitable structure. This structure can be used to develop search methods capable of handling much larger data sets.

  12. Food Reduction in Avicenna's View and Related Principles in Classical Medicine.

    Science.gov (United States)

    Nozad, Aisan; Naseri, Mohsen; Safari, Mir Bahram; Abd Al Ahadi, Azam; Ghaffari, Farzaneh

    2016-06-01

    Traditional Iranian medicine (TIM) is a rich and valuable school of thought that believes medications are not the only effective approach for the treatment of diseases but that nutrition is also important. Our study includes two parts; the first is a book review of the Canon of Medicine by Avicenna (10th and 11th centuries), in which we focus on finding and understanding Avicenna's point of view. In the second part, we searched for "food reduction" as a key word from 2000 to 2015 in databases such as Google Scholar, PubMed, Copernicus, DOAJ, EBSCO-CINAHL, and the Iranian search database Iranmedex for principles of food reduction in classical medicine. The main methods of treatment in traditional medicine include changes in lifestyle, especially diet, the use of medications, and the use of manipulation methods. For diet, the individual may be prohibited from eating or food amounts may be decreased or increased. Centuries ago, Avicenna was making use of methods of food reduction as an important therapeutic approach in the treatment of diseases. According to him, food reduction, to the extent that it does not cause energy loss helps to cure disease. Avicenna has proposed food reduction as an aid to treating a variety of ailments such as headaches and reflux. Today, a variety of basic and clinical research has shown that food reduction or calorie restriction to a standard level can effectively prevent and treat a variety of diseases such as neoplasms, diabetes, and kidney disease. Practical principles explained by traditional Iranian medicine, in particular Avicenna, could open important and quite uncomplicated strategies for the prevention and treatment of diseases.

  13. A hybrid metaheuristic for the time-dependent vehicle routing problem with hard time windows

    Directory of Open Access Journals (Sweden)

    N. Rincon-Garcia

    2017-01-01

    This paper presents a hybrid metaheuristic algorithm to solve the time-dependent vehicle routing problem with hard time windows. Time-dependent travel times are influenced by different congestion levels experienced throughout the day. Vehicle scheduling without consideration of congestion might lead to underestimation of travel times and consequently missed deliveries. The algorithm presented in this paper makes use of Large Neighbourhood Search approaches and Variable Neighbourhood Search techniques to guide the search. A first stage is specifically designed to reduce the number of vehicles required in a search space by the reduction of penalties generated by time-window violations with Large Neighbourhood Search procedures. A second stage minimises the travel distance and travel time in an ‘always feasible’ search space. Comparison of results with available test instances shows that the proposed algorithm is capable of obtaining a reduction in the number of vehicles (4.15%), travel distance (10.88%) and travel time (12.00%) compared to previous implementations in reasonable time.
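
    The Large Neighbourhood Search component can be pictured with the generic destroy-and-repair loop below, applied to a single-route visiting order with a hypothetical cost function. Time windows, time-dependent travel times and the Variable Neighbourhood Search stage are deliberately omitted; this is a sketch of the move structure, not the paper's algorithm.

        import random

        def large_neighbourhood_search(customers, cost, iterations=1000, destroy_frac=0.2, seed=0):
            """Generic LNS: remove a fraction of customers, reinsert them greedily,
            and accept the new solution if it improves the (hypothetical) cost."""
            rng = random.Random(seed)
            current = customers[:]                      # a single-route solution as a visiting order
            best, best_cost = current[:], cost(current)
            k = max(1, int(destroy_frac * len(customers)))
            for _ in range(iterations):
                # Destroy: remove k random customers from the route.
                removed = rng.sample(current, k)
                partial = [c for c in current if c not in removed]
                # Repair: greedy cheapest insertion of each removed customer.
                for c in removed:
                    positions = range(len(partial) + 1)
                    i = min(positions, key=lambda p: cost(partial[:p] + [c] + partial[p:]))
                    partial.insert(i, c)
                if cost(partial) < best_cost:
                    best, best_cost = partial[:], cost(partial)
                    current = partial
            return best, best_cost

        # Toy usage: "customers" are points on a line, depot at coordinate 0.
        pts = {i: random.uniform(0, 100) for i in range(1, 11)}

        def tour_cost(order):
            coords = [0.0] + [pts[c] for c in order] + [0.0]
            return sum(abs(a - b) for a, b in zip(coords[:-1], coords[1:]))

        route, dist = large_neighbourhood_search(list(pts), tour_cost, iterations=500)
        print("route:", route, "cost:", round(dist, 1))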

  14. Footprints: A Visual Search Tool that Supports Discovery and Coverage Tracking.

    Science.gov (United States)

    Isaacs, Ellen; Domico, Kelly; Ahern, Shane; Bart, Eugene; Singhal, Mudita

    2014-12-01

    Searching a large document collection to learn about a broad subject involves the iterative process of figuring out what to ask, filtering the results, identifying useful documents, and deciding when one has covered enough material to stop searching. We are calling this activity "discoverage," discovery of relevant material and tracking coverage of that material. We built a visual analytic tool called Footprints that uses multiple coordinated visualizations to help users navigate through the discoverage process. To support discovery, Footprints displays topics extracted from documents that provide an overview of the search space and are used to construct searches visuospatially. Footprints allows users to triage their search results by assigning a status to each document (To Read, Read, Useful), and those status markings are shown on interactive histograms depicting the user's coverage through the documents across dates, sources, and topics. Coverage histograms help users notice biases in their search and fill any gaps in their analytic process. To create Footprints, we used a highly iterative, user-centered approach in which we conducted many evaluations during both the design and implementation stages and continually modified the design in response to feedback.

  15. Testing of indoor radon reduction techniques in central Ohio houses: Phase 2 (Winter 1988-1989). Final report, September 1988-May 1989

    International Nuclear Information System (INIS)

    Findlay, W.O.; Robertson, A.; Scott, A.G.

    1990-05-01

    The report gives results of tests of developmental indoor radon reduction techniques in nine slab-on-grade and four crawl-space houses near Dayton, Ohio. The slab-on-grade tests indicated that, when there is a good layer of aggregate under the slab, the sub-slab ventilation (SSV) mitigation technique, with only one or two suction pipes, can generally reduce indoor concentrations below 2 pCi/L (86 to 99% reduction). These reductions can be achieved even when: there are forced-air supply ducts under the slab; the slab is large (up to 2600 sq ft); and the foundation walls are hollow block. Operating the SSV system in suction always gave greater reductions than did operating in pressure. The crawl-space tests demonstrated that depressurizing under a plastic liner over the crawl-space floor was able to reduce living-area radon concentrations below 2 pCi/L (81 to 96% reduction). The performance of such sub-liner depressurization gave better reductions than did crawl-space ventilation (blowing air into, or out of, the crawl space). Completely covering the crawl-space floor with plastic sheeting was not always necessary to get adequate performance

  16. PWR loading pattern optimization using Harmony Search algorithm

    International Nuclear Information System (INIS)

    Poursalehi, N.; Zolfaghari, A.; Minuchehr, A.

    2013-01-01

    Highlights: ► Numerical results reveal that the HS method is reliable. ► The great advantage of HS is significant gain in computational cost. ► On the average, the final band width of search fitness values is narrow. ► Our experiments show that the search approaches the optimal value fast. Abstract: In this paper a core reloading technique using Harmony Search, HS, is presented in the context of finding an optimal configuration of fuel assemblies, FA, in pressurized water reactors. To implement and evaluate the proposed technique, a Harmony Search along Nodal Expansion Code for 2-D geometry, HSNEC2D, is developed to obtain a nearly optimal arrangement of fuel assemblies in PWR cores. This code consists of two sections, a Harmony Search algorithm and Nodal Expansion modules using a fourth-degree flux expansion, which solve two-dimensional multi-group diffusion equations with one node per fuel assembly. Two optimization test problems are investigated to demonstrate the HS algorithm's capability of converging to a near-optimal loading pattern in the fuel management field and other subjects. Results, convergence rate and reliability of the method are quite promising and show that the HS algorithm performs very well and is comparable to other competitive algorithms such as Genetic Algorithm and Particle Swarm Intelligence. Furthermore, implementation of the nodal expansion technique along with HS considerably reduces the computational time needed to process and analyse optimization in core fuel management problems
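
    The basic Harmony Search loop referenced above follows a simple improvise/replace pattern; the sketch below shows it for a continuous toy objective. In HSNEC2D the objective would come from the nodal diffusion solver and the variables would encode a loading pattern, neither of which is reproduced here; all parameter values are illustrative.

        import numpy as np

        def harmony_search(f, dim=6, lo=-5.0, hi=5.0, hms=20, hmcr=0.9, par=0.3,
                           bandwidth=0.2, iterations=2000, seed=0):
            """Basic Harmony Search: HMS = harmony memory size, HMCR = harmony memory
            consideration rate, PAR = pitch adjustment rate."""
            rng = np.random.default_rng(seed)
            memory = rng.uniform(lo, hi, (hms, dim))
            values = np.array([f(h) for h in memory])
            for _ in range(iterations):
                new = np.empty(dim)
                for d in range(dim):
                    if rng.random() < hmcr:                        # pick a value from memory
                        new[d] = memory[rng.integers(hms), d]
                        if rng.random() < par:                     # pitch adjustment
                            new[d] += bandwidth * rng.uniform(-1, 1)
                    else:                                          # random improvisation
                        new[d] = rng.uniform(lo, hi)
                new = np.clip(new, lo, hi)
                val = f(new)
                worst = np.argmax(values)
                if val < values[worst]:                            # replace the worst harmony
                    memory[worst], values[worst] = new, val
            best = np.argmin(values)
            return memory[best], values[best]

        # Toy objective standing in for the core-physics figure of merit.
        best, value = harmony_search(lambda x: float(np.sum((x - 1.0) ** 2)))
        print("best value:", value)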

  17. Systematic Search for Chemical Reactions in Gas Phase Contributing to Methanol Formation in Interstellar Space.

    Science.gov (United States)

    Gamez-Garcia, Victoria G; Galano, Annia

    2017-10-05

    A massive search for chemical routes leading to methanol formation in gas phase has been conducted using computational chemistry, at the CBS-QB3 level of theory. The calculations were performed at five different temperatures (100, 80, 50, 20, and 10 K) and at three pressures (0.1, 0.01, and 0.001 atm) for each temperature. The search was focused on identifying reactions with the necessary features to be viable in the interstellar medium (ISM). A searching strategy was applied to that purpose, which allowed to reduce an initial set of 678 possible reactions to a subset of 11 chemical routes that are recommended, for the first time, as potential candidates for contributing to methanol formation in the gas phase of the ISM. They are all barrier-less, and thus they are expected to take place at collision rates. Hopefully, including these reactions in the currently available models, for the gas-phase methanol formation in the ISM, would help improving the predicted fractional abundance of this molecule in dark clouds. Further investigations, especially those dealing with grain chemistry and electronic excited states, would be crucial to get a complete picture of the methanol formation in the ISM.

  18. Search Patterns

    CERN Document Server

    Morville, Peter

    2010-01-01

    What people are saying about Search Patterns "Search Patterns is a delight to read -- very thoughtful and thought provoking. It's the most comprehensive survey of designing effective search experiences I've seen." --Irene Au, Director of User Experience, Google "I love this book! Thanks to Peter and Jeffery, I now know that search (yes, boring old yucky who cares search) is one of the coolest ways around of looking at the world." --Dan Roam, author, The Back of the Napkin (Portfolio Hardcover) "Search Patterns is a playful guide to the practical concerns of search interface design. It cont

  19. Green Space, Violence, and Crime: A Systematic Review.

    Science.gov (United States)

    Bogar, Sandra; Beyer, Kirsten M

    2016-04-01

    To determine the state of evidence on relationships among urban green space, violence, and crime in the United States. Major bibliographic databases were searched for studies meeting inclusion criteria. Additional studies were culled from study references and authors' personal collections. Comparison among studies was limited by variations in study design and measurement and results were mixed. However, more evidence supports the positive impact of green space on violence and crime, indicating great potential for green space to shape health-promoting environments. Numerous factors influence the relationships among green space, crime, and violence. Additional research and standardization among research studies are needed to better understand these relationships. © The Author(s) 2015.

  20. Hierarchical heuristic search using a Gaussian mixture model for UAV coverage planning.

    Science.gov (United States)

    Lin, Lanny; Goodrich, Michael A

    2014-12-01

    During unmanned aerial vehicle (UAV) search missions, efficient use of UAV flight time requires flight paths that maximize the probability of finding the desired subject. The probability of detecting the desired subject based on UAV sensor information can vary in different search areas due to environment elements like varying vegetation density or lighting conditions, making it likely that the UAV can only partially detect the subject. This adds another dimension of complexity to the already difficult (NP-Hard) problem of finding an optimal search path. We present a new class of algorithms that account for partial detection in the form of a task difficulty map and produce paths that approximate the payoff of optimal solutions. The algorithms use the mode goodness ratio heuristic that uses a Gaussian mixture model to prioritize search subregions. The algorithms search for effective paths through the parameter space at different levels of resolution. We compare the performance of the new algorithms against two published algorithms (Bourgault's algorithm and LHC-GW-CONV algorithm) in simulated searches with three real search and rescue scenarios, and show that the new algorithms outperform existing algorithms significantly and can yield efficient paths that yield payoffs near the optimal.
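
    One ingredient described above, prioritizing search subregions with a Gaussian mixture model, can be sketched as follows with scikit-learn: fit a GMM to samples from the probability-of-detection map and rank the modes. The priority score used here (mixture weight divided by component spread) is an illustrative stand-in, not the paper's mode goodness ratio, and the path-planning stage is omitted.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(3)

        # Synthetic stand-in for a probability map: samples around two likely subject locations.
        samples = np.vstack([rng.normal([20, 30], 3, (400, 2)),
                             rng.normal([70, 60], 6, (200, 2))])

        gmm = GaussianMixture(n_components=2, random_state=0).fit(samples)

        # Illustrative priority per mode: weight / spread ("how much mass, how concentrated").
        spreads = np.array([np.sqrt(np.trace(c)) for c in gmm.covariances_])
        priority = gmm.weights_ / spreads
        order = np.argsort(priority)[::-1]
        for rank, m in enumerate(order, start=1):
            print(f"rank {rank}: mode at {gmm.means_[m].round(1)}, priority {priority[m]:.3f}")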

  1. Students are Confident Using Federated Search Tools as much as Single Databases. A Review of: Armstrong, A. (2009. Student perceptions of federated searching vs. single database searching. Reference Services Review, 37(3, 291-303. doi:10.1108/00907320910982785

    Directory of Open Access Journals (Sweden)

    Deena Yanofsky

    2011-09-01

    Objective – To measure students’ perceptions of the ease-of-use and efficacy of a federated search tool versus a single multidisciplinary database. Design – An evaluation worksheet, employing a combination of quantitative and qualitative questions. Setting – A required, first-year English composition course taught at the University of Illinois at Chicago (UIC). Subjects – Thirty-one undergraduate students completed and submitted the worksheet. Methods – Students attended two library instruction sessions. The first session introduced participants to basic Boolean searching (using AND only), selecting appropriate keywords, and searching for books in the library catalogue. In the second library session, students were handed an evaluation worksheet and, with no introduction to the process of searching article databases, were asked to find relevant articles on a research topic of their own choosing using both a federated search tool and a single multidisciplinary database. The evaluation worksheet was divided into four sections: step-by-step instructions for accessing the single multidisciplinary database and the federated search tool; space to record search strings in both resources; space to record the titles of up to five relevant articles; and a series of quantitative and qualitative questions regarding ease-of-use, relevancy of results, overall preference (if any) between the two resources, likeliness of future use and other preferred research tools. Half of the participants received a worksheet with instructions to search the federated search tool before the single database; the order was reversed for the other half of the students. The evaluation worksheet was designed to be completed in one hour. Participant responses to qualitative questions were analyzed, codified and grouped into thematic categories. If a student mentioned more than one factor in responding to a question, their response was recorded in multiple categories. Main Results

  2. Personalized Search

    CERN Document Server

    AUTHOR|(SzGeCERN)749939

    2015-01-01

    As the volume of electronically available information grows, relevant items become harder to find. This work presents an approach to personalizing search results in scientific publication databases. This work focuses on re-ranking search results from existing search engines like Solr or ElasticSearch. This work also includes the development of Obelix, a new recommendation system used to re-rank search results. The project was proposed and performed at CERN, using the scientific publications available on the CERN Document Server (CDS). This work experiments with re-ranking using offline and online evaluation of users and documents in CDS. The experiments conclude that the personalized search result outperform both latest first and word similarity in terms of click position in the search result for global search in CDS.
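
    The re-ranking step described above can be pictured with a minimal blend of the engine's relevance score and a per-user preference score. The field names, the linear blend and the example data are assumptions for illustration; this is not Obelix's actual model or the CDS integration.

        from typing import Dict, List

        def rerank(results: List[Dict], user_scores: Dict[str, float], alpha: float = 0.7) -> List[Dict]:
            """Blend the search engine's relevance score with a personalized score.
            `results` items are assumed to carry 'id' and 'score' fields; `user_scores`
            maps document ids to a user-affinity value in [0, 1] (both hypothetical)."""
            def blended(doc):
                return alpha * doc["score"] + (1 - alpha) * user_scores.get(doc["id"], 0.0)
            return sorted(results, key=blended, reverse=True)

        hits = [{"id": "rec-1", "score": 0.91}, {"id": "rec-2", "score": 0.88}, {"id": "rec-3", "score": 0.60}]
        prefs = {"rec-2": 0.9, "rec-3": 0.1}      # e.g. derived from the user's click history
        print([h["id"] for h in rerank(hits, prefs)])   # rec-2 moves ahead of rec-1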

  3. Low-cost Radon Reduction Pilot Study

    Energy Technology Data Exchange (ETDEWEB)

    Rose, William B. [Univ. of Illinois, Urbana-Champaign, IL (United States); Francisco, Paul W. [Univ. of Illinois, Urbana-Champaign, IL (United States); Merrin, Zachary [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2015-09-01

    The U.S. Department of Energy's Building America research team Partnership for Advanced Residential Retrofits conducted a primary scoping study on the impact of air sealing between the foundation and the living space on radon transport reduction across the foundation and living space floor assembly. Fifteen homes in the Champaign, Illinois, area participated in the study. These homes were instrumented for hourly continuous radon measurements and simultaneous temperature and humidity measurements. Blower door and zone pressure diagnostics were conducted at each house. The treatments consisted of using air-sealing foams at the underside of the floor that separated the living space from the foundation and providing duct sealing on the ductwork that is situated in the foundation area. The hypothesis was that air sealing the floor system that separated the foundation from the living space should better isolate the living space from the foundation; this isolation should lead to less radon entering the living space from the foundation. If the hypothesis had been proven, retrofit energy-efficiency programs may have chosen to adopt these isolation methods for enhanced radon protection to the living space.

  4. Mastering Search Analytics Measuring SEO, SEM and Site Search

    CERN Document Server

    Chaters, Brent

    2011-01-01

    Many companies still approach Search Engine Optimization (SEO) and paid search as separate initiatives. This in-depth guide shows you how to use these programs as part of a comprehensive strategy, not just to improve your site's search rankings, but to attract the right people and increase your conversion rate. Learn how to measure, test, analyze, and interpret all of your search data with a wide array of analytic tools. Gain the knowledge you need to determine the strategy's return on investment. Ideal for search specialists, webmasters, and search marketing managers, Mastering Search Analyt

  5. Drag Reduction by Laminar Flow Control

    Directory of Open Access Journals (Sweden)

    Nils Beck

    2018-01-01

    Full Text Available The Energy System Transition in Aviation research project of the Aeronautics Research Center Niedersachsen (NFL) searches for potentially game-changing technologies to reduce the carbon footprint of aviation by promoting and enabling new propulsion and drag reduction technologies. The greatest potential for aerodynamic drag reduction is seen in laminar flow control by boundary layer suction. While most of the research so far has focused on partial laminarization by application of Natural Laminar Flow (NLF) and Hybrid Laminar Flow Control (HLFC) to wings, complete laminarization of wings, tails and fuselages promises much higher gains. The potential drag reduction and suction requirements, including the necessary compressor power, are calculated at the component level using a flow solver with viscous/inviscid coupling and a 3D Reynolds-Averaged Navier-Stokes (RANS) solver. The effect on total aircraft drag is estimated for a state-of-the-art mid-range aircraft configuration using preliminary aircraft design methods, showing that total cruise drag can be halved compared to today’s turbulent aircraft.

  6. Search for new particles with ALEPH

    International Nuclear Information System (INIS)

    Kasemann, M.

    1990-01-01

    Searches for the neutral Higgs particles in the Standard Model and the Minimal Supersymmetric Standard Model and for neutralinos are presented. The Higgs particle of the Standard Model can be excluded in the mass range from 0 to 24 GeV. The light scalar Higgs boson h and the pseudoscalar Higgs boson A of the Minimal Supersymmetric Standard Model can be excluded in a large domain of the parameter space. The Higgs masses m_h and m_A are excluded at 95% C.L. in the whole range from 0 to 38.8 GeV for large values of the ratio of the vacuum expectation values of the two Higgs fields (v_2/v_1). Limits on Z decay branching ratios into neutralinos are reported and the results obtained are used to restrict substantially the parameter space of the Minimal Supersymmetric Standard Model

  7. Macroscopic reality and the dynamical reduction program

    International Nuclear Information System (INIS)

    Ghirardi, G.C.

    1995-10-01

    With reference to recently proposed theoretical models accounting for reduction in terms of a unified dynamics governing all physical processes, we analyze the problem of working out a worldview accommodating our knowledge about natural phenomena. We stress the relevant conceptual differences between the considered models and standard quantum mechanics. In spite of the fact that both theories describe individual physical systems within a genuine Hilbert space framework, the nice features of spontaneous reduction theories drastically limit the class of states which are dynamically stable. This allows one to work out a description of the world in terms of a mass density function in ordinary configuration space. A topology based on this function and differing radically from the one characterizing the Hilbert space is introduced and in terms of it the idea of similarity of macroscopic situations is made precise. Finally it is shown how the formalism and the proposed interpretation yield a natural criterion for establishing the psychophysical parallelism. The conclusion is that, within the considered theoretical models and at the nonrelativistic level, one can satisfy all sensible requirements for a consistent, unified, and objective description of reality at the macroscopic level. (author). 16 refs

  8. Macroscopic reality and the dynamical reduction program

    Energy Technology Data Exchange (ETDEWEB)

    Ghirardi, G C

    1995-10-01

    With reference to recently proposed theoretical models accounting for reduction in terms of a unified dynamics governing all physical processes, we analyze the problem of working out a worldview accommodating our knowledge about natural phenomena. We stress the relevant conceptual differences between the considered models and standard quantum mechanics. In spite of the fact that both theories describe individual physical systems within a genuine Hilbert space framework, the nice features of spontaneous reduction theories drastically limit the class of states which are dynamically stable. This allows one to work out a description of the world in terms of a mass density function in ordinary configuration space. A topology based on this function and differing radically from the one characterizing the Hilbert space is introduced and in terms of it the idea of similarity of macroscopic situations is made precise. Finally it is shown how the formalism and the proposed interpretation yield a natural criterion for establishing the psychophysical parallelism. The conclusion is that, within the considered theoretical models and at the nonrelativistic level, one can satisfy all sensible requirements for a consistent, unified, and objective description of reality at the macroscopic level. (author). 16 refs.

  9. Kent in space: Cosmic dust to space debris

    Science.gov (United States)

    McDonnell, J. A. M.

    1994-10-01

    The dusty heritage of the University of Kent's Space Group commenced at Jodrell Bank, Cheshire, U.K., the home of the largest steerable radio telescope. While Professor Bernard Lovell's 250 ft. diameter telescope was used to command the U.S. deep space Pioneer spacecraft, Professor Tony McDonnell, as a research student in 1960, was developing a space dust detector for the US-UK Ariel program. It was successful. With a Ph.D. safely under the belt, it seemed an inevitable step to go for the next higher degree, a B.T.A.! Two years with NASA at Goddard Space Flight Center, Greenbelt, provided excellent qualifications for such a graduation ('Been to America'). A spirited return to the University of Kent at Canterbury followed, to one of the green field UK University sites springing from the Robbins Report on Higher Education. Swimming against the current of the brain drain, and taking a very considerable reduction in salary, it was with some disappointment that he found that the UK Premier Harold Wilson's 'white-hot technological revolution' never quite seemed to materialize in terms of research funding! Research expertise, centered initially on cosmic dust, enlarged to encompass planetology during the Apollo program, and rightly acquired international acclaim, notching up a history of space missions over 25 years. The group now comprises 38 people supported by four sources: the government's Research Councils, the University, the Space Agencies and Industry. This paper describes the thrust of the group's Research Plan in Space Science and Planetology; not so much based on existing international space missions, but more helping to shape the direction and selection of space missions ahead.

  10. Coset Space Dimensional Reduction approach to the Standard Model

    International Nuclear Information System (INIS)

    Farakos, K.; Kapetanakis, D.; Koutsoumbas, G.; Zoupanos, G.

    1988-01-01

    We present a unified theory in ten dimensions based on the gauge group E_8, which is dimensionally reduced to the Standard Model gauge group SU(3)_c × SU(2)_L × U(1), which breaks further spontaneously to SU(3)_c × U(1)_em. The model gives similar predictions for sin²θ_W and proton decay as the minimal SU(5) GUT, while a natural choice of the coset space radii predicts light Higgs masses à la Coleman-Weinberg

  11. Efficient and automatic image reduction framework for space debris detection based on GPU technology

    Science.gov (United States)

    Diprima, Francesco; Santoni, Fabio; Piergentili, Fabrizio; Fortunato, Vito; Abbattista, Cristoforo; Amoruso, Leonardo

    2018-04-01

    In recent years, the increasing number of space debris objects has triggered the need for a distributed monitoring system to prevent possible collisions in space. Space surveillance based on ground telescopes allows the traffic of Resident Space Objects (RSOs) in Earth orbit to be monitored. This space debris surveillance has several applications, such as orbit prediction and conjunction assessment. This paper proposes an optimized, performance-oriented pipeline for source extraction intended for the automatic detection of space debris in optical data. The detection method is based on morphological operations and the Hough transform for lines. Near real-time detection is obtained using General Purpose computing on Graphics Processing Units (GPGPU). The high degree of processing parallelism provided by GPGPU allows data analysis to be split over thousands of threads in order to process big datasets within a limited computational time. The implementation has been tested on a large and heterogeneous image data set, containing satellites from different orbit ranges imaged in multiple observation modes (i.e., sidereal and object tracking). These images were taken during an observation campaign performed from the EQUO (EQUatorial Observatory) observatory settled at the Broglio Space Center (BSC) in Kenya, which is part of the ASI-Sapienza Agreement.
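
    As a rough CPU-only illustration of the kind of pipeline described (the actual implementation is GPU-accelerated and far more elaborate), the sketch below thresholds a frame, removes small artifacts with a morphological opening, and extracts line-shaped streaks with the probabilistic Hough transform. The thresholds and kernel sizes are arbitrary assumptions.

```python
# CPU stand-in for the described GPU pipeline: morphological filtering followed by
# a Hough transform for lines to pick out debris streaks in an optical frame.
import cv2
import numpy as np

def detect_streaks(frame_path: str):
    img = cv2.imread(frame_path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(frame_path)

    # Rough background removal: keep pixels well above the median sky level.
    thresh_level = float(np.median(img) + 3 * img.std())
    _, binary = cv2.threshold(img, thresh_level, 255, cv2.THRESH_BINARY)

    # Morphological opening suppresses isolated hot pixels and small point sources.
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
    cleaned = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)

    # Probabilistic Hough transform: long, nearly straight segments are streak candidates.
    lines = cv2.HoughLinesP(cleaned, rho=1, theta=np.pi / 180, threshold=50,
                            minLineLength=30, maxLineGap=5)
    return [] if lines is None else [tuple(l[0]) for l in lines]  # (x1, y1, x2, y2)
```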

  12. A Declarative Design Approach to Modeling Traditional and Non-Traditional Space Systems

    Science.gov (United States)

    Hoag, Lucy M.

    The space system design process is known to be laborious, complex, and computationally demanding. It is highly multi-disciplinary, involving several interdependent subsystems that must be both highly optimized and reliable due to the high cost of launch. Satellites must also be capable of operating in harsh and unpredictable environments, so integrating high-fidelity analysis is important. To address each of these concerns, a holistic design approach is necessary. However, while the sophistication of space systems has evolved significantly in the last 60 years, the design process has remained comparatively stagnant. Space systems continue to be designed using a procedural, subsystem-by-subsystem approach. This method is inadequate since it generally requires extensive iteration and relies on limited or heuristic-based search, which can be slow, labor-intensive, and inaccurate. The use of a declarative design approach can potentially address these inadequacies. In the declarative programming style, the focus of a problem is placed on what the objective is, and not necessarily how it should be achieved. In the context of design, this entails knowledge expressed as a declaration of statements that are true about the desired artifact instead of explicit instructions on how to implement it. A well-known technique is through constraint-based reasoning, where a design problem is represented as a network of rules and constraints that are reasoned across by a solver to dynamically discover the optimal candidate(s). This enables implicit instantiation of the tradespace and allows for automatic generation of all feasible design candidates. As such, this approach also appears to be well-suited to modeling adaptable space systems, which generally have large tradespaces and possess configurations that are not well-known a priori. This research applied a declarative design approach to holistic satellite design and to tradespace exploration for adaptable space systems. The
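
    To make the constraint-based idea concrete, here is a toy sketch that declares a few design variables and constraints for a hypothetical small satellite and lets a brute-force solver discover every feasible candidate. Real tools would use a proper constraint solver and far richer models; every variable, bound, and constraint below is invented for illustration.

```python
# Toy declarative tradespace: state what must be true of the design, then let the
# solver enumerate candidates. All variables and constraints here are invented.
from itertools import product

solar_area_m2 = [0.5, 1.0, 1.5, 2.0]        # candidate solar array areas
battery_wh    = [100, 200, 400]             # candidate battery capacities
payload_w     = [20, 40, 60]                # candidate payload power draws

def feasible(area, batt, load):
    generated = area * 120.0                # ~120 W/m^2 orbit-average (assumed)
    bus_load = 30.0 + load                  # bus overhead + payload power (assumed)
    eclipse_need = bus_load * 0.6           # energy for a 0.6 h eclipse, in Wh (assumed)
    return generated >= 1.2 * bus_load and batt >= 2.0 * eclipse_need

candidates = [(a, b, p) for a, b, p in product(solar_area_m2, battery_wh, payload_w)
              if feasible(a, b, p)]
# Rank the automatically discovered candidates, e.g. by payload power per array area.
candidates.sort(key=lambda c: c[2] / c[0], reverse=True)
print(f"{len(candidates)} feasible designs; best: {candidates[0] if candidates else None}")
```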

  13. Discovery of gigantic molecular nanostructures using a flow reaction array as a search engine.

    Science.gov (United States)

    Zang, Hong-Ying; de la Oliva, Andreu Ruiz; Miras, Haralampos N; Long, De-Liang; McBurney, Roy T; Cronin, Leroy

    2014-04-28

    The discovery of gigantic molecular nanostructures like coordination and polyoxometalate clusters is extremely time-consuming since a vast combinatorial space needs to be searched, and even a systematic and exhaustive exploration of the available synthetic parameters relies on a great deal of serendipity. Here we present a synthetic methodology that combines a flow reaction array and algorithmic control to give a chemical 'real-space' search engine leading to the discovery and isolation of a range of new molecular nanoclusters based on [Mo(2)O(2)S(2)](2+)-based building blocks with either fourfold (C4) or fivefold (C5) symmetry templates and linkers. This engine leads us to isolate six new nanoscale cluster compounds: 1, {Mo(10)(C5)}; 2, {Mo(14)(C4)4(C5)2}; 3, {Mo(60)(C4)10}; 4, {Mo(48)(C4)6}; 5, {Mo(34)(C4)4}; 6, {Mo(18)(C4)9}; in only 200 automated experiments from a parameter space spanning ~5 million possible combinations.

  14. Hybrid Differential Dynamic Programming with Stochastic Search

    Science.gov (United States)

    Aziz, Jonathan; Parker, Jeffrey; Englander, Jacob

    2016-01-01

    Differential dynamic programming (DDP) has been demonstrated as a viable approach to low-thrust trajectory optimization, notably with the recent success of NASA's Dawn mission. The Dawn trajectory was designed with the DDP-based Static Dynamic Optimal Control algorithm used in the Mystic software. Another recently developed method, Hybrid Differential Dynamic Programming (HDDP), is a variant of the standard DDP formulation that leverages both first-order and second-order state transition matrices in addition to nonlinear programming (NLP) techniques. Areas of improvement over standard DDP include constraint handling, convergence properties, continuous dynamics, and multi-phase capability. DDP is a gradient-based method and will converge to a solution near an initial guess. In this study, monotonic basin hopping (MBH) is employed as a stochastic search method to overcome this limitation, by augmenting the HDDP algorithm for a wider search of the solution space.
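
    The sketch below shows the monotonic basin hopping idea in its simplest form (perturb the incumbent, run a local optimizer, keep the result only if it improves), applied to a generic cost function rather than to HDDP. The perturbation scale, the hop count, and the use of scipy's Nelder-Mead as the local solver are assumptions standing in for the inner trajectory optimizer.

```python
# Minimal monotonic basin hopping (MBH) sketch: a stochastic outer loop around a
# gradient-free local optimizer, standing in for the HDDP inner solve.
import numpy as np
from scipy.optimize import minimize  # assumed dependency

def monotonic_basin_hopping(cost, x0, hops=50, step=0.5, seed=0):
    rng = np.random.default_rng(seed)
    best = minimize(cost, x0, method="Nelder-Mead")
    for _ in range(hops):
        # Perturb the incumbent solution and re-solve locally.
        trial0 = best.x + step * rng.standard_normal(best.x.shape)
        trial = minimize(cost, trial0, method="Nelder-Mead")
        if trial.fun < best.fun:            # monotonic: accept improvements only
            best = trial
    return best

# Example on a multimodal test function (Rastrigin), where a single local solve
# from a poor initial guess would get stuck in a nearby basin.
rastrigin = lambda x: 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
result = monotonic_basin_hopping(rastrigin, x0=np.array([3.2, -2.7]))
print(result.x, result.fun)
```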

  15. Search Help

    Science.gov (United States)

    Guidance and search help resource listing examples of common queries that can be used in the Google Search Appliance search request, including examples of special characters and query term separators that the Google Search Appliance recognizes.

  16. The Delta Scuti star 38 Eri from the ground and from space

    Science.gov (United States)

    Paparó, M.; Kolláth, Z.; Shobbrook, R. R.; Matthews, J. M.; Antoci, V.; Benkő, J. M.; Park, N.-K.; Mirtorabi, M. T.; Luedeke, K.; Kusakin, A.; Bognár, Zs; Sódor, Á.; García-Hernández, A.; Peña, J. H.; Kuschnig, R.; Moffat, A. F. J.; Rowe, J.; Rucinski, S. M.; Sasselov, D.; Weiss, W. W.

    2018-04-01

    We present and discuss the pulsational characteristics of the Delta Scuti star 38 Eri from photometric data obtained at two widely spaced epochs, partly from the ground (1998) and partly from space (MOST, 2011). We found 18 frequencies, resolving the discrepancy among the previously published frequencies. Some of the frequencies appeared with different relative amplitudes at the two epochs; however, we investigated amplitude variability only for the MOST data. Amplitude variability was found for one of three frequencies that satisfy the necessary frequency criteria for linear-combination or resonant-mode coupling. Checking the criteria of beating and resonant-mode coupling, we excluded them as possible reasons for the amplitude variability. The two recently developed methods of rotational splitting and sequence search were applied to find regular spacings based only on frequencies. Doublets or incomplete multiplets with l = 1, 2 and 3 were found in the rotational splitting search. In the sequence search method we identified four sequences. The averaged spacing, probably a combination of the large separation and the rotational frequency, is 1.724 ± 0.092 d⁻¹. Using the spacing and the scaling relation, a mean density \bar{ρ} = [0.0394, 0.0554] g cm⁻³ was derived. The shift of the sequences proved to be an integer multiple of the rotational splitting spacing. Using the precise MOST frequencies and multi-colour photometry in a hybrid way, we identified four modes with l = 1, two modes with l = 2, two modes with l = 3, and two modes as l = 0 radial modes.

  17. A Pareto archive floating search procedure for solving multi-objective flexible job shop scheduling problem

    Directory of Open Access Journals (Sweden)

    J. S. Sadaghiani

    2014-04-01

    Full Text Available The flexible job shop scheduling problem is a key factor in the efficient use of production systems. This paper attempts to simultaneously optimize three objectives: minimization of the makespan, the total workload, and the maximum workload of jobs. Since the multi-objective flexible job shop scheduling problem is strongly NP-hard, an integrated heuristic approach has been used to solve it. The proposed approach is based on a floating search procedure that uses several heuristic algorithms. The floating search procedure employs local heuristic algorithms and divides the considered problem into two parts: an assignment subproblem and a sequencing subproblem. Search is first performed over the assignment space until an acceptable solution is reached, and then continues over the sequencing space using a heuristic algorithm. This paper uses a multi-objective approach to produce Pareto solutions. The proposed approach was therefore adapted to the NSGA-II algorithm and evaluated with Pareto archives. The elements and parameters of the proposed algorithms were adjusted based on preliminary experiments. Finally, computational results were used to analyze the efficiency of the proposed algorithm and showed that it is capable of producing efficient solutions.

  18. Thermodynamics of lunar ilmenite reduction

    Science.gov (United States)

    Altenberg, B. H.; Franklin, H. A.; Jones, C. H.

    1993-01-01

    With the prospect of returning to the moon, the development of a lunar occupation would fulfill one of the goals of the Space Exploration Initiative (SEI) of the late 1980's. Processing lunar resources into useful products, such as liquid oxygen for fuel and life support, would be one of many aspects of an active lunar base. Ilmenite (FeTiO3) is found on the lunar surface and can be used as a feedstock to produce oxygen. Understanding the various ilmenite-reduction reactions elucidates many processing options. Defining the thermodynamic chemical behavior at equilibrium under various conditions of temperature and pressure can be helpful in specifying optimal operating conditions. Differences between a previous theoretical analysis and experimentally determined results have sparked interest in trying to understand the effect of operating pressure on the hydrogen-reduction-of-ilmenite reaction. Various aspects of this reduction reaction are discussed.

  19. Reduction procedures for accurate analysis of MSX surveillance experiment data

    Science.gov (United States)

    Gaposchkin, E. Mike; Lane, Mark T.; Abbot, Rick I.

    1994-01-01

    Technical challenges of the Midcourse Space Experiment (MSX) science instruments require careful characterization and calibration of these sensors for analysis of surveillance experiment data. Procedures for reduction of Resident Space Object (RSO) detections will be presented which include refinement and calibration of the metric and radiometric (and photometric) data and calculation of a precise MSX ephemeris. Examples will be given which support the reduction, and these are taken from ground-test data similar in characteristics to the MSX sensors and from the IRAS satellite RSO detections. Examples to demonstrate the calculation of a precise ephemeris will be provided from satellites in similar orbits which are equipped with S-band transponders.

  20. Tales from the Field: Search Strategies Applied in Web Searching

    Directory of Open Access Journals (Sweden)

    Soohyung Joo

    2010-08-01

    Full Text Available In their web search processes users apply multiple types of search strategies, which consist of different search tactics. This paper identifies eight types of information search strategies with associated cases based on sequences of search tactics during the information search process. Thirty-one participants representing the general public were recruited for this study. Search logs and verbal protocols offered rich data for the identification of different types of search strategies. Based on the findings, the authors further discuss how to enhance web-based information retrieval (IR systems to support each type of search strategy.

  1. Space 2000 Symposium

    Science.gov (United States)

    1999-01-01

    The purpose of the Space 2000 Symposium is to present the creativity and achievements of key figures of the 20th century. It offers a retrospective discussion on space exploration. It considers the future of the enterprise, and the legacy that will be left for future generations. The symposium includes panel discussions, smaller session meetings with some panelists, exhibits, and displays. The first session entitled "From Science Fiction to Science Facts" commences after a brief overview of the symposium. The panel discussions include talks on space exploration over many decades, and the missions of the millennium to search for life on Mars. The second session, "Risks and Rewards of Human Space Exploration," focuses on the training and health risks that astronauts face on their exploratory missions to space. Session three, "Messages and Messengers: Informing and Inspiring Space Exploration and the Public," focuses on the use of the TV medium by educators and actors to inform and inspire a wide variety of audiences with adventures of space exploration. Session four, "The Legacy of Carl Sagan," discusses the influence Sagan had on scientific research and the general public. In session five, "Space Exploration for a New Generation," two student speakers and the NASA Administrator Daniel S. Goldin address the group. Session six, "Destiny or Delusion? -- Humankind's Place in the Cosmos," ends the symposium with issues of space exploration and some thought-provoking questions. Some of these issues and questions are: what will be the societal implications if we discover the origin of the universe, stars, or life; what will be the impact if scientists find clear evidence of life outside the domains of the Earth; should there be limits to what humans can or should learn; and what visionary steps should space-faring people take now for future generations.

  2. Black-hole ringdown search in TAMA300: matched filtering and event selections

    International Nuclear Information System (INIS)

    Tsunesada, Yoshiki; Kanda, Nobuyuki; Nakano, Hiroyuki; Tatsumi, Daisuke

    2005-01-01

    Detecting gravitational ringdown waves provides a probe for direct observation of astrophysical black holes. The masses and angular momenta of black holes can be determined from the waveforms by using black-hole perturbation theory. In this paper we present data analysis methods to search for black-hole ringdowns of fundamental quasi-normal modes with interferometric gravitational wave detectors, and report an application to the TAMA300 data. Our method is based upon matched filtering, by which we calculate cross-correlations between detector outputs and reference waveforms. In a search for gravitational-wave signals, rejection of fake events and reliable event identification are of the utmost importance. We developed two methods to reject spurious triggers in filter outputs in the time domain and examined their reduction power. It is shown that by using the methods presented here the number of fake triggers can be reduced by an order of magnitude with a false dismissal probability of 5%. We also discuss the possibility of using the higher order quasi-normal modes for event selection
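
    As a very simplified illustration of the matched-filtering step described above (whitening, detector-specific templates, and the paper's time-domain veto methods are omitted), the sketch below slides a normalized damped-sinusoid ringdown template across a data stream and reports the peak correlation. The sampling rate, template parameters, and injected amplitude are assumptions.

```python
# Simplified matched filter: cross-correlate data with a damped-sinusoid ringdown
# template and locate the peak. Whitening and the veto methods are omitted.
import numpy as np

fs = 4096.0                                   # sample rate in Hz (assumed)
t = np.arange(0, 0.05, 1 / fs)
template = np.exp(-t / 0.004) * np.sin(2 * np.pi * 1000.0 * t)   # f ~ 1 kHz, tau ~ 4 ms (assumed)
template /= np.linalg.norm(template)

rng = np.random.default_rng(1)
data = rng.standard_normal(int(fs))           # 1 s of unit-variance white noise
inject_at = 2000
data[inject_at:inject_at + template.size] += 8.0 * template      # injected test signal

# Sliding correlation against the unit-norm template; with unit-variance noise the
# output is already in rough SNR units.
snr = np.correlate(data, template, mode="valid")
peak = int(np.argmax(np.abs(snr)))
print(f"peak |SNR| = {np.abs(snr[peak]):.1f} at sample {peak} (injected at {inject_at})")
```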

  3. An indirect search for dark matter using antideuterons: the GAPS experiment

    International Nuclear Information System (INIS)

    Hailey, C J

    2009-01-01

    The general antiparticle spectrometer (GAPS) experiment is an indirect dark matter search. GAPS detects the antideuterons produced in WIMP-WIMP annihilation, a generic feature in many theories beyond the Standard Model. Antideuterons are a nearly background free signature of exotic physics. GAPS has substantial discovery potential for dark matter within the minimal supersymmetric model and its extensions, and models with universal extra dimensions. GAPS complements underground experiments, reaching parts of supersymmetric parameter space unavailable to them, and working to better constrain the properties of dark matter where they overlap in parameter space. GAPS is designed to be launched from a balloon. GAPS is funded for a prototype flight in 2011, to be followed by a long duration balloon flight to execute its science program. We discuss recent theoretical investigations on antideuteron searches, and their implications for experiment design. We describe the GAPS experiment placing particular emphasis on recent investigations that represent technical or conceptual extensions of the original GAPS concept.

  4. Searching Less Perturbed Circular Orbits for a Spacecraft Travelling around Europa

    Directory of Open Access Journals (Sweden)

    J. P. S. Carvalho

    2014-01-01

    Full Text Available Space missions to visit the natural satellite of Jupiter, Europa, constitute an important topic in space activities today, because missions to this moon are under study now. Several considerations have to be made for these missions. The present paper searches for less perturbed circular orbits around Europa. This search is based on the total effect of the perturbing forces over time, evaluated by integrating those forces over time; this value depends on the dynamical model and on the orbit of the spacecraft. The perturbing forces considered are the third-body perturbation that comes from Jupiter and the J2, J3, and C22 terms of the gravitational potential of Europa. Several numerical studies are performed and the results show the locations of the less perturbed orbits. Using those results, it is possible to find near-circular frozen orbits with smaller amplitudes of variation of the orbital elements.
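
    To illustrate the disturbance index described above (the time integral of the perturbing accelerations), the sketch below numerically integrates the magnitude of a perturbing-acceleration history with the trapezoidal rule. The acceleration samples would come from the full dynamical model (Jupiter third body plus Europa's J2, J3 and C22 terms), which is not reproduced here; placeholder data are used instead.

```python
# Sketch of the "total perturbation" figure of merit: integrate |a_pert(t)| over one
# or more orbital periods. The acceleration history is placeholder data standing in
# for the third-body + J2, J3, C22 model evaluated along a candidate orbit.
import numpy as np

def perturbation_index(times_s, accel_vectors):
    """times_s: (N,) sample times; accel_vectors: (N,3) perturbing accelerations [m/s^2].
    Returns the time integral of |a_pert|, i.e. the accumulated delta-v due to perturbations."""
    magnitudes = np.linalg.norm(accel_vectors, axis=1)
    return np.trapz(magnitudes, times_s)

# Placeholder usage: a lower index indicates a less perturbed candidate orbit.
t = np.linspace(0.0, 2 * 3600.0, 500)                      # two hours of samples
a = 1e-6 * np.column_stack([np.cos(t / 900), np.sin(t / 900), 0.05 * np.ones_like(t)])
print(f"disturbance index ~ {perturbation_index(t, a):.4e} m/s")
```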

  5. Internet Search Engines

    OpenAIRE

    Fatmaa El Zahraa Mohamed Abdou

    2004-01-01

    A general study of internet search engines. The study deals with 7 main points: the difference between search engines and search directories, the components of search engines, the percentage of sites covered by search engines, the cataloguing of sites, the time needed for sites to appear in search engines, search capabilities, and types of search engines.

  6. University Students' Online Information Searching Strategies in Different Search Contexts

    Science.gov (United States)

    Tsai, Meng-Jung; Liang, Jyh-Chong; Hou, Huei-Tse; Tsai, Chin-Chung

    2012-01-01

    This study investigates the role that search context plays in university students' online information searching strategies. A total of 304 university students in Taiwan were surveyed with questionnaires in which two search contexts were defined: searching for learning, and searching for daily life information. Students' online search strategies…

  7. Behavior and neural basis of near-optimal visual search

    Science.gov (United States)

    Ma, Wei Ji; Navalpakkam, Vidhya; Beck, Jeffrey M; van den Berg, Ronald; Pouget, Alexandre

    2013-01-01

    The ability to search efficiently for a target in a cluttered environment is one of the most remarkable functions of the nervous system. This task is difficult under natural circumstances, as the reliability of sensory information can vary greatly across space and time and is typically a priori unknown to the observer. In contrast, visual-search experiments commonly use stimuli of equal and known reliability. In a target detection task, we randomly assigned high or low reliability to each item on a trial-by-trial basis. An optimal observer would weight the observations by their trial-to-trial reliability and combine them using a specific nonlinear integration rule. We found that humans were near-optimal, regardless of whether distractors were homogeneous or heterogeneous and whether reliability was manipulated through contrast or shape. We present a neural-network implementation of near-optimal visual search based on probabilistic population coding. The network matched human performance. PMID:21552276
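
    The specific nonlinear integration rule mentioned above can be written down for a simple case: with Gaussian measurement noise whose variance may differ per item, the ideal observer computes a reliability-weighted local log-likelihood ratio at each location and combines them by marginalizing over the possible target location. The sketch below implements that textbook rule under stated simplifying assumptions (known target and distractor values, at most one target); it is not the exact model of the stimuli used in the study.

```python
# Ideal-observer target detection with per-item reliability, assuming Gaussian noise,
# a known target value s_T, a known distractor value s_D, and at most one target.
import numpy as np

def target_present_llr(x, sigma, s_T=1.0, s_D=0.0):
    """x: (N,) noisy measurements; sigma: (N,) per-item noise SD (low sigma = high reliability).
    Returns the log-likelihood ratio of 'target present' vs 'all distractors'."""
    x, sigma = np.asarray(x, float), np.asarray(sigma, float)
    # Local evidence d_i: reliability-weighted comparison of target vs distractor hypotheses.
    d = ((x - s_D) ** 2 - (x - s_T) ** 2) / (2.0 * sigma ** 2)
    # Marginalize over which of the N locations contains the target (uniform prior).
    return np.log(np.mean(np.exp(d)))

# Items measured with different reliabilities; the unreliable item is down-weighted.
measurements = np.array([0.1, 0.9, 0.2, 0.4])
noise_sd     = np.array([0.3, 0.3, 0.3, 1.5])
print("respond 'present'" if target_present_llr(measurements, noise_sd) > 0 else "respond 'absent'")
```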

  8. Dark Matter Searches with the Fermi Large Area Telescope

    International Nuclear Information System (INIS)

    Meurer, Christine

    2008-01-01

    The Fermi Gamma-Ray Space Telescope, successfully launched on June 11th, 2008, is the next generation satellite experiment for high-energy gamma-ray astronomy. The main instrument, the Fermi Large Area Telescope (LAT), with a wide field of view (>2 sr), a large effective area (>8000 cm² at 1 GeV), sub-arcminute source localization, a large energy range (20 MeV-300 GeV) and a good energy resolution (close to 8% at 1 GeV), has excellent potential to either discover or constrain a Dark Matter signal. The Fermi LAT team pursues complementary searches for signatures of particle Dark Matter in different search regions such as the galactic center, galactic satellites and subhalos, the Milky Way halo, and extragalactic regions, as well as in the search for spectral lines. In these proceedings we examine the potential of the LAT to detect gamma rays coming from Weakly Interacting Massive Particle annihilations in these regions, with special focus on the galactic center region.

  9. Automating Deep Space Network scheduling and conflict resolution

    Science.gov (United States)

    Johnston, Mark D.; Clement, Bradley

    2005-01-01

    The Deep Space Network (DSN) is a central part of NASA's infrastructure for communicating with active space missions, from Earth orbit to beyond the solar system. We describe our recent work in modeling the complexities of user requirements, and then scheduling and resolving conflicts on that basis. We emphasize our innovative use of background 'intelligent assistants' that carry out search asynchronously while the user is focusing on various aspects of the schedule.

  10. Searches for Dark Matter with the Fermi Large Area Telescope

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    The nature of dark matter is a longstanding enigma of physics; it may consist of particles beyond the Standard Model that are still elusive to experiments. Among indirect search techniques, which look for stable products from the annihilation or decay of dark matter particles, or from axions coupling to high-energy photons, observations of the gamma-ray sky have come to prominence over the last few years, because of the excellent sensitivity and full-sky coverage of the Large Area Telescope (LAT) on the Fermi Gamma-ray Space Telescope mission. The LAT energy range from 20 MeV to above 300 GeV is particularly well suited for searching for products of the interactions of dark matter particles. In this talk I will describe targets studied for evidence of dark matter with the LAT, and review the status of searches performed with up to six years of LAT data. I will also discuss the factors that determine the sensitivities of these searches, including the magnitudes of the signals and the relevant backgrounds, c...

  11. Scanners and drillers: Characterizing expert visual search through volumetric images

    Science.gov (United States)

    Drew, Trafton; Vo, Melissa Le-Hoa; Olwal, Alex; Jacobson, Francine; Seltzer, Steven E.; Wolfe, Jeremy M.

    2013-01-01

    Modern imaging methods like computed tomography (CT) generate 3-D volumes of image data. How do radiologists search through such images? Are certain strategies more efficient? Although there is a large literature devoted to understanding search in 2-D, relatively little is known about search in volumetric space. In recent years, with the ever-increasing popularity of volumetric medical imaging, this question has taken on increased importance as we try to understand, and ultimately reduce, errors in diagnostic radiology. In the current study, we asked 24 radiologists to search chest CTs for lung nodules that could indicate lung cancer. To search, radiologists scrolled up and down through a “stack” of 2-D chest CT “slices.” At each moment, we tracked eye movements in the 2-D image plane and coregistered eye position with the current slice. We used these data to create a 3-D representation of the eye movements through the image volume. Radiologists tended to follow one of two dominant search strategies: “drilling” and “scanning.” Drillers restrict eye movements to a small region of the lung while quickly scrolling through depth. Scanners move more slowly through depth and search an entire level of the lung before moving on to the next level in depth. Driller performance was superior to the scanners on a variety of metrics, including lung nodule detection rate, percentage of the lung covered, and the percentage of search errors where a nodule was never fixated. PMID:23922445

  12. Extended supersymmetry in four-dimensional Euclidean space

    International Nuclear Information System (INIS)

    McKeon, D.G.C.; Sherry, T.N.

    2000-01-01

    Since the generators of the two SU(2) groups which comprise SO(4) are not Hermitian conjugates of each other, the simplest supersymmetry algebra in four-dimensional Euclidean space more closely resembles the N=2 than the N=1 supersymmetry algebra in four-dimensional Minkowski space. An extended supersymmetry algebra in four-dimensional Euclidean space is considered in this paper; its structure resembles that of N=4 supersymmetry in four-dimensional Minkowski space. The relationship of this algebra to the algebra found by dimensionally reducing the N=1 supersymmetry algebra in ten-dimensional Euclidean space to four-dimensional Euclidean space is examined. The dimensional reduction of N=1 super Yang-Mills theory in ten-dimensional Minkowski space to four-dimensional Euclidean space is also considered

  13. Searching for Cost-Optimized Interstellar Beacons

    Science.gov (United States)

    Benford, Gregory; Benford, James; Benford, Dominic

    2010-06-01

    What would SETI beacon transmitters be like if built by civilizations that had a variety of motives but cared about cost? In a companion paper, we presented how, for fixed power density in the far field, a cost-optimum interstellar beacon system could be built. Here, we consider how we should search for a beacon if it were produced by a civilization similar to ours. High-power transmitters could be built for a wide variety of motives other than the need for two-way communication; this would include beacons built to be seen over thousands of light-years. Extraterrestrial beacon builders would likely have to contend with economic pressures just as their terrestrial counterparts do. Cost, spectral lines near 1 GHz, and interstellar scintillation favor radiating frequencies substantially above the classic "water hole." Therefore, the transmission strategy for a distant, cost-conscious beacon would be a rapid scan of the galactic plane with the intent to cover the angular space. Such pulses would be infrequent events for the receiver. Such beacons built by distant, advanced, wealthy societies would have very different characteristics from what SETI researchers seek. Future searches should pay special attention to areas along the galactic disk where SETI searches have seen coherent signals that have not recurred in the limited listening time intervals we have used. We will need to wait for recurring events that may arrive in intermittent bursts. Several new SETI search strategies have emerged from these ideas. We propose a new test for beacons that is based on the Life Plane hypotheses.

  14. Search for New Physics in SHiP and at future colliders

    CERN Document Server

    AUTHOR|(CDS)2080890; Serra, Nicola; Storaci, Barbara

    2015-01-01

    SHiP is a newly proposed fixed-target experiment at the CERN SPS with the aim of searching for hidden particles that interact very weakly with SM particles. The work presented in this document investigates SHiP's physics reach in the parameter space of the Neutrino Minimal Standard Model ($\

  15. Space Ethics and Protection of the Space Environment

    Science.gov (United States)

    Williamson, Mark

    2002-01-01

    The construction of the International Space Station in low Earth orbit and the formulation of plans to search for life on Mars - one day by means of manned missions - indicate that mankind is intent on making the space environment part of its domain. Publicity surrounding space tourism, in-space `burials' and the sale of lunar `real estate' suggests that, some time in the 21st century, the space environment will become an extraterrestrial extension of our current business and domestic environment. This prompts the question of our collective attitude towards the space environment and the degree to which we should regulate its use and protect it for future generations. What, indeed, are the ethical considerations of space exploration and development? Ethics can be defined as "the philosophical study of the moral value of human conduct, and of the rules or principles that ought to govern it". More practically, it represents "an approved code of behaviour" adopted, for example, by a group or profession. If a set of ethics is to be developed for space, it is important that what we refer to as the `space community', or `space profession', is intimately involved. Indeed, if it is not, the profession risks having the job done for it, for example by politicians and members of the general public, who for their own reasons may wish to place restrictions on space development, or ban it altogether. The terrestrial nuclear power industry, for example, has already suffered this fate, while widespread ignorance of the subject has led to a moratorium on the use of RTGs in spacecraft. However, there is a danger in the discussion of ethics that consideration is confined to the philosophical aspects, thus excusing those involved from providing practical solutions to the problems that emerge. The fact that mankind has already affected, and arguably damaged, the space environment transports the discussion beyond the philosophical realm. This paper offers a pragmatic analysis of one

  16. Radon mitigation experience in houses with basements and adjoining crawl spaces

    International Nuclear Information System (INIS)

    Messing, M.; Henschel, D.B.

    1990-01-01

    Active soil depressurization systems were installed in four basement houses with adjoining crawl spaces in Maryland. In addition, existing soil depressurization systems were modified in two additional basement-plus-crawl-space houses. These six houses were selected to include both good and poor communication beneath the basement slab, and different degrees of importance of the crawl space as a source of the indoor radon. The radon reduction effectiveness was compared for: depressurization only under the basement slab; depressurization only under a polyethylene liner over the unpaved crawl-space floor; and simultaneous depressurization under both the basement slab and the crawl-space liner. The objective of this paper is to identify under what conditions treatment of the basement alone might provide sufficient radon reductions in houses of this substructure, and what incremental benefits might be achieved by also treating the crawl space

  17. Very large virtual compound spaces: construction, storage and utility in drug discovery.

    Science.gov (United States)

    Peng, Zhengwei

    2013-09-01

    Recent activities in the construction, storage and exploration of very large virtual compound spaces are reviewed in this report. As expected, the systematic exploration of compound spaces at the highest resolution (individual atoms and bonds) is intrinsically intractable. By contrast, by staying within a finite number of reactions and a finite number of reactants or fragments, several virtual compound spaces have been constructed in a combinatorial fashion with sizes ranging from 10^11 to 10^20 compounds. Multiple search methods have been developed to perform searches (e.g., similarity, exact and substructure) in those compound spaces without the need for full enumeration. The up-front investment in synthetic feasibility during the construction of some of those virtual compound spaces enables wider adoption by medicinal chemists to design and synthesize important compounds for drug discovery. Recent activities in the area of exploring virtual compound spaces via the evolutionary approach based on genetic algorithms also suggest a positive shift of focus from method development to workflow, integration and ease of use, all of which are required for this approach to be widely adopted by medicinal chemists.

  18. Scalable unit commitment by memory-bounded ant colony optimization with A* local search

    Energy Technology Data Exchange (ETDEWEB)

    Saber, Ahmed Yousuf; Alshareef, Abdulaziz Mohammed [Department of Electrical and Computer Engineering, King Abdulaziz University, P.O. Box 80204, Jeddah 21589 (Saudi Arabia)

    2008-07-15

    Ant colony optimization (ACO) is successfully applied to optimization problems. The performance of basic ACO for small problems with moderate dimension and search space is satisfactory. As the search space grows exponentially in the large-scale unit commitment problem, however, basic ACO is not applicable because the pheromone matrix becomes too large for practical computation time and physical computer-memory limits. Memory-bounded methods prune the least-promising nodes to fit the system in computer memory. Therefore, the authors propose memory-bounded ant colony optimization (MACO) in this paper for the scalable (no restriction on system size) unit commitment problem. This MACO intelligently addresses the limitation of computer memory and does not permit the system to grow beyond a bound on memory. In the memory-bounded ACO implementation, an A* heuristic is introduced to increase local searching ability, and a probabilistic nearest neighbor method is applied to estimate the pheromone intensity for forgotten values. Finally, benchmark data sets and existing methods are used to show the effectiveness of the proposed method. (author)
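
    Without reproducing the full unit-commitment formulation, the sketch below illustrates only the memory-bounding idea: a pheromone table that, when it hits a fixed budget, prunes its least-promising entries and later falls back to an estimated value for states it has forgotten. The paper uses a probabilistic nearest-neighbour estimate for forgotten values; a simple mean is used here as a stand-in, and the class and its parameters are invented for illustration.

```python
# Memory-bounded pheromone store: prune least-promising entries at a fixed budget and
# estimate forgotten values on lookup. The mean-based fallback is a simplification of
# the paper's probabilistic nearest-neighbour estimate.
class BoundedPheromone:
    def __init__(self, max_entries=10000, initial=1.0):
        self.tau = {}                      # (stage, state) -> pheromone intensity
        self.max_entries = max_entries
        self.initial = initial

    def get(self, key):
        if key in self.tau:
            return self.tau[key]
        # Forgotten or never-visited state: estimate instead of storing it.
        return sum(self.tau.values()) / len(self.tau) if self.tau else self.initial

    def deposit(self, key, amount):
        self.tau[key] = self.tau.get(key, self.initial) + amount
        if len(self.tau) > self.max_entries:
            self._prune()

    def _prune(self, keep_fraction=0.8):
        # Keep only the most promising (highest-pheromone) entries.
        keep = int(self.max_entries * keep_fraction)
        best = sorted(self.tau.items(), key=lambda kv: kv[1], reverse=True)[:keep]
        self.tau = dict(best)

pher = BoundedPheromone(max_entries=5)
for hour, schedule in enumerate([(1, 0, 1), (1, 1, 1), (0, 0, 1), (1, 0, 0), (1, 1, 0), (0, 1, 1)]):
    pher.deposit((hour, schedule), amount=0.1 * (hour + 1))
print(len(pher.tau), pher.get((99, (1, 1, 1))))   # pruned table size and an estimated value
```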

  19. National Space Agencies vs. Commercial Space: Towards Improved Space Safety

    Science.gov (United States)

    Pelton, J.

    2013-09-01

    assume that this condition will not change. This seems particularly true for high profile, multi-billion dollar programs. The second part of the paper focuses on new commercial space programs that appear to be undertaken in a less restrictive manner; i.e., outside the constraints of politically-driven national space policies. Here the drivers—even within international consortia—seem to be reliable performance and commercial return. Since sustained accident-free performance is critical to commercial programs' very existence and profitability, the inherent role of safety in the commercial space industry would seem clear. The question of prime interest for this paper is whether or not it might be possible for smaller and more focused commercial space entities, free from space agency organizational and political constraints, to be more "risk averse" and thus more nimble in designing "safe" vehicles? If so, how can this "safety first" corporate philosophy and management practice be detected and even objectively measured? Could, in the future, risk reduction at the level of design, quality verification, etc., be objectively measured?

  20. The instrument PAMELA for antimatter and dark matter search in space

    International Nuclear Information System (INIS)

    Picozza, Piergiorgio; Sparvoli, Roberta

    2010-01-01

    The PAMELA satellite experiment is dedicated to the study of charged particles in cosmic radiation, with a particular focus on antiparticles for the search of antimatter and signals of dark matter, in the energy window from 100 MeV to some hundreds of GeV. PAMELA is installed on board of the Resurs DK1 satellite that was launched from the Baikonur cosmodrome on June 15th, 2006. The PAMELA apparatus comprises a magnetic spectrometer, a time-of-flight system, a silicon-tungsten electromagnetic calorimeter, an anticoincidence system, a shower tail catcher scintillator and a neutron detector. The combination of these devices allows antiparticles to be reliably identified from a large background of other charged particles.

  1. Improved quantum-behaved particle swarm optimization with local search strategy

    Directory of Open Access Journals (Sweden)

    Maolong Xi

    2017-03-01

    Full Text Available Quantum-behaved particle swarm optimization, which was motivated by analysis of particle swarm optimization and quantum systems, has shown performance comparable to other evolutionary algorithms in finding the optimal solutions for many optimization problems. To address the problem of premature convergence, a local search strategy is proposed to improve the performance of quantum-behaved particle swarm optimization. In the proposed local search strategy, a super particle is presented, which is a collection of dimension information randomly selected from particles in the swarm. The selection probability of each particle in the swarm differs and is determined by its fitness value. For minimization problems, the smaller a particle's fitness value, the higher its selection probability and the more information it contributes to constructing the super particle. In addition, in order to investigate the influence of different local search spaces on algorithm performance, four methods of computing the local search radius are applied in the local search strategy, giving four variants of local search quantum-behaved particle swarm optimization. Empirical studies on a suite of well-known benchmark functions are undertaken in order to make an overall performance comparison among the proposed methods and other quantum-behaved particle swarm optimization variants. The simulation results show that the proposed quantum-behaved particle swarm optimization variants have clear advantages over the original quantum-behaved particle swarm optimization.
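
    The construction of the super particle can be sketched as follows: for a minimization problem, each particle is selected with probability favoring better (lower) fitness, and each dimension of the super particle is copied from an independently selected particle. The linear fitness weighting below is one plausible reading of the description, not necessarily the exact selection formula used in the paper.

```python
# Sketch of the "super particle" used in the local search strategy: each dimension is
# drawn from a particle chosen with probability favoring better (lower) fitness.
# The linear fitness-based weighting is an assumption about the exact selection rule.
import numpy as np

def build_super_particle(positions, fitness, rng=None):
    """positions: (n_particles, n_dims) swarm positions; fitness: (n_particles,) values
    (minimization, so smaller is better). Returns one (n_dims,) super particle."""
    rng = np.random.default_rng() if rng is None else rng
    fitness = np.asarray(fitness, float)
    weights = (fitness.max() - fitness) + 1e-12            # better fitness -> larger weight
    probs = weights / weights.sum()
    n_particles, n_dims = positions.shape
    # Select a (possibly different) donor particle independently for every dimension.
    donors = rng.choice(n_particles, size=n_dims, p=probs)
    return positions[donors, np.arange(n_dims)]

swarm = np.array([[0.2, 1.5, -0.3], [1.1, 0.9, 0.4], [2.0, 2.2, 1.8]])
fit   = np.array([0.05, 0.40, 3.10])                       # the first particle is the best
print(build_super_particle(swarm, fit, rng=np.random.default_rng(0)))
```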

  2. Space nonweaponization. An urgent task for arms control

    International Nuclear Information System (INIS)

    Du Xiangwan; Pan Jusheng; Zhang Xinwei; Du Shuhua; Xu Changgen

    1990-05-01

    The authors attempt to expound the basic points of view and put forward a proposal on space nonweaponization. The authors analyse the nature of space weaponry and its impact on the arms race and point out that space nonweaponization is an urgent task for arms control. The relations between the prohibition of space weapons and ASAT weapons, between the prohibition of space weapons and the reduction of nuclear weapons, and between space weapons and nuclear testing are all analysed. The inadequacy of the existing space treaties is made clear based on this evaluation. It is hoped that a verifiable treaty on the prohibition of space weapons will be concluded; international cooperation on the peaceful use of outer space is also necessary

  3. Efficient computation of spaced seeds

    Directory of Open Access Journals (Sweden)

    Ilie Silvana

    2012-02-01

    Full Text Available Abstract Background The most frequently used tools in bioinformatics are those searching for similarities, or local alignments, between biological sequences. Since the exact dynamic programming algorithm is quadratic, linear-time heuristics such as BLAST are used. Spaced seeds are much more sensitive than the consecutive seed of BLAST, and using several seeds represents the current state of the art in approximate search for biological sequences. The most important aspect is computing highly sensitive seeds. Since the problem seems hard, heuristic algorithms are used. The leading software in the common Bernoulli model is the SpEED program. Findings SpEED uses a hill climbing method based on the overlap complexity heuristic. We propose a new algorithm for this heuristic that improves its speed by over one order of magnitude. We use the new implementation to compute improved seeds for several software programs. We also compute multiple seeds of the same weight as the MegaBLAST seed, which greatly improve its sensitivity. Conclusion Multiple spaced seeds are being successfully used in bioinformatics software programs. Enabling researchers to compute very fast high quality seeds will help expand the range of their applications.
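
    To illustrate what a spaced seed is operationally, the sketch below checks whether two DNA sequences share a hit under a given seed pattern: positions marked '1' must match, positions marked '0' are ignored. The example pattern is the well-known weight-11 PatternHunter-style seed; the scanning code itself is a straightforward illustration, not the SpEED implementation or its sensitivity computation.

```python
# Spaced-seed hit detection: '1' positions must match, '0' positions are wildcards.
# The example pattern is the classic weight-11, length-18 PatternHunter seed.
SEED = "111010010100110111"

def seed_hits(seq_a: str, seq_b: str, seed: str = SEED):
    """Yield (i, j) offsets where seq_a[i:] and seq_b[j:] match at every '1' of the seed."""
    span = len(seed)
    care = [k for k, c in enumerate(seed) if c == "1"]
    for i in range(len(seq_a) - span + 1):
        for j in range(len(seq_b) - span + 1):
            if all(seq_a[i + k] == seq_b[j + k] for k in care):
                yield i, j

a = "ACGTTGCACGTAGGCTTACGATCA"
b = "TTACGTTGCACGAAGGCTTACGAT"   # shares a similar region with a to within one mismatch
print(list(seed_hits(a, b)))      # hits occur where the mismatch falls on a '0' position
```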

  4. Designing the Search Service for Enterprise Portal based on Oracle Universal Content Management

    Science.gov (United States)

    Bauer, K. S.; Kuznetsov, D. Y.; Pominov, A. D.

    2017-01-01

    An enterprise portal is an important part of an organization's information and innovation space, providing collaboration between employees and the organization. This article gives valuable background on enterprise portals and the underlying technologies. The paper presents the integration of Oracle WebCenter Portal and the UCM Server in detail, with a focus on tools for the enterprise portal and on the Search Service in particular. The paper also presents several UML diagrams to describe the use cases for the Search Service and the main components of this application.

  5. INTERFACING GOOGLE SEARCH ENGINE TO CAPTURE USER WEB SEARCH BEHAVIOR

    OpenAIRE

    Fadhilah Mat Yamin; T. Ramayah

    2013-01-01

    The behaviour of the searcher when using a search engine, especially during query formulation, is crucial. Search engines capture users’ activities in the search log, which is stored at the search engine server. Due to the difficulty of obtaining this search log, this paper proposes and develops an interface framework to interface with the Google search engine. This interface captures users’ queries before redirecting them to Google. The analysis of the search log will show that users are utili...

  6. Data Transformation Functions for Expanded Search Spaces in Geographic Sample Supervised Segment Generation

    OpenAIRE

    Christoff Fourie; Elisabeth Schoepfer

    2014-01-01

    Sample supervised image analysis, in particular sample supervised segment generation, shows promise as a methodological avenue applicable within Geographic Object-Based Image Analysis (GEOBIA). Segmentation is acknowledged as a constituent component within typically expansive image analysis processes. A general extension to the basic formulation of an empirical discrepancy measure directed segmentation algorithm parameter tuning approach is proposed. An expanded search landscape is defined, c...

  7. Status and prospects for BSM ( (N)MSSM) Higgs searches at the LHC

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00071410; The ATLAS collaboration

    2016-01-01

    Searches for Beyond the Standard Model (BSM) Higgs processes in the context of the Minimal Supersymmetric Standard Model (MSSM) and the Next-to-MSSM (NMSSM) are presented. The results are based on the first LHC run of pp collision data recorded by the ATLAS and CMS experiments at the CERN Large Hadron Collider at centre-of-mass energies of 7 and 8 TeV, corresponding to integrated luminosities of about 5 and 20 fb⁻¹, respectively. Current searches constrain large parts of the parameter space. No evidence for a BSM Higgs boson is found.

  8. The Maslov index in symplectic Banach spaces

    CERN Document Server

    Booss-Bavnbek, Bernhelm

    2018-01-01

    The authors consider a curve of Fredholm pairs of Lagrangian subspaces in a fixed Banach space with continuously varying weak symplectic structures. Assuming vanishing index, they obtain intrinsically a continuously varying splitting of the total Banach space into pairs of symplectic subspaces. Using such decompositions the authors define the Maslov index of the curve by symplectic reduction to the classical finite-dimensional case. The authors prove the transitivity of repeated symplectic reductions and obtain the invariance of the Maslov index under symplectic reduction while recovering all the standard properties of the Maslov index. As an application, the authors consider curves of elliptic operators which have varying principal symbol, varying maximal domain and are not necessarily of Dirac type. For this class of operator curves, the authors derive a desuspension spectral flow formula for varying well-posed boundary conditions on manifolds with boundary and obtain the splitting formula of the spectral f...

  9. A best-first tree-searching approach for ML decoding in MIMO system

    KAUST Repository

    Shen, Chung-An

    2012-07-28

    In MIMO communication systems, maximum-likelihood (ML) decoding can be formulated as a tree-searching problem. This paper presents a tree-searching approach that combines the features of classical depth-first and breadth-first approaches to achieve close to ML performance while minimizing the number of visited nodes. A detailed outline of the algorithm is given, including the required storage. The effects of storage size on BER performance and on complexity in terms of search space are also studied. Our results demonstrate that with a proper choice of storage size the proposed method visits 40% fewer nodes than a sphere decoding algorithm at a signal-to-noise ratio (SNR) of 20 dB, and an order of magnitude fewer at 0 dB SNR.
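
    As a rough illustration of the combined depth-first/breadth-first idea (without the hardware-oriented details of the paper), the sketch below runs a best-first search over the detection tree defined by the QR decomposition of the channel, keeps the frontier in a priority queue capped at a fixed storage size, and returns the leaf with the smallest accumulated metric. The constellation, the storage cap, and the pruning rule are assumptions.

```python
# Best-first (priority-queue) tree search for ML MIMO detection, with the frontier
# capped at a fixed storage size. With an unlimited frontier the first full-length
# leaf popped is the exact ML solution; the cap trades a small loss for bounded memory.
import heapq
import numpy as np

def best_first_detect(y, H, constellation=(-1.0, 1.0), max_storage=64):
    n = H.shape[1]
    Q, R = np.linalg.qr(H)
    z = Q.T @ y
    # Each node: (partial metric, depth, decided symbols for antennas n-1 .. n-depth).
    frontier = [(0.0, 0, ())]
    while frontier:
        metric, depth, symbols = heapq.heappop(frontier)
        if depth == n:
            return np.array(symbols[::-1]), metric         # reorder to antennas 0..n-1
        k = n - 1 - depth                                   # next tree level (antenna index)
        for s in constellation:
            trial = symbols + (s,)
            interference = sum(R[k, n - 1 - d] * trial[d] for d in range(depth + 1))
            branch = (z[k] - interference) ** 2
            heapq.heappush(frontier, (metric + branch, depth + 1, trial))
        if len(frontier) > max_storage:                     # keep only the best partial paths
            frontier = heapq.nsmallest(max_storage, frontier)
            heapq.heapify(frontier)
    raise RuntimeError("frontier exhausted")

# 4x4 real-valued example with BPSK symbols.
rng = np.random.default_rng(0)
H = rng.standard_normal((4, 4))
x = rng.choice([-1.0, 1.0], size=4)
y = H @ x + 0.1 * rng.standard_normal(4)
print("sent:", x, "detected:", best_first_detect(y, H)[0])
```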

  10. Faculty Searches at a Christian University: Ethical and Practical Considerations

    Science.gov (United States)

    Steele, Richard B.

    2008-01-01

    In the space of four years, the School of Theology at Seattle Pacific University made eight faculty hires. But for various reasons, three of the eight new hires did not prove to be good "mission fits" for the institution. Suspecting that the regrettable outcome of these searches lay not in the persons hired, but in the deficiencies of the hiring…

  11. Alpenglow: A signature for chameleons in axionlike particle search experiments

    International Nuclear Information System (INIS)

    Ahlers, M.; Lindner, A.; Ringwald, A.; Schrempp, L.; Weniger, C.

    2008-01-01

    We point out that chameleon field theories might reveal themselves as an afterglow effect in axionlike particle search experiments due to chameleon-photon conversion in a magnetic field. We estimate the parameter space which is accessible by currently available technology and find that afterglow experiments could constrain this parameter space in a way complementary to gravitational and Casimir force experiments. In addition, one could reach photon-chameleon couplings which are beyond the sensitivity of common laser polarization experiments. We also sketch the idea of a Fabry-Perot cavity with chameleons which could increase the experimental sensitivity significantly

  12. Efficient Parameter Searches for Colloidal Materials Design with Digital Alchemy

    Science.gov (United States)

    Dodd, Paul, M.; Geng, Yina; van Anders, Greg; Glotzer, Sharon C.

    Optimal colloidal materials design is challenging, even for high-throughput or genomic approaches, because the design space provided by modern colloid synthesis techniques can easily have dozens of dimensions. In this talk we present the methodology of an inverse approach we term "digital alchemy" to perform rapid searches of design-parameter spaces with up to 188 dimensions that yield thermodynamically optimal colloid parameters for target crystal structures with up to 20 particles in a unit cell. The method relies only on fundamental principles of statistical mechanics and Metropolis Monte Carlo techniques, and yields particle attribute tolerances via analogues of familiar stress-strain relationships.
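
    The extended-ensemble idea behind such searches can be illustrated with a toy Metropolis Monte Carlo move set in which a design parameter is sampled alongside the particle configuration; the quadratic toy energy and all names below are illustrative assumptions, not the authors' model.

```python
import math
import random

def toy_energy(x, alpha):
    """Stand-in for the energy of a colloidal model whose shape/interaction
    is controlled by the design parameter alpha (purely illustrative)."""
    return sum((xi - alpha) ** 2 for xi in x)

def alchemical_mc(x, alpha, beta=1.0, steps=10000, dx=0.1, dalpha=0.05):
    """Metropolis MC in which the design parameter alpha is itself a move."""
    energy = toy_energy(x, alpha)
    for _ in range(steps):
        if random.random() < 0.5:                      # ordinary particle move
            i = random.randrange(len(x))
            trial_x = list(x)
            trial_x[i] += random.uniform(-dx, dx)
            trial_alpha = alpha
        else:                                          # "alchemical" parameter move
            trial_x = x
            trial_alpha = alpha + random.uniform(-dalpha, dalpha)
        trial_energy = toy_energy(trial_x, trial_alpha)
        if (trial_energy <= energy or
                random.random() < math.exp(-beta * (trial_energy - energy))):
            x, alpha, energy = trial_x, trial_alpha, trial_energy  # accept move
    return x, alpha

# Hypothetical usage: relax 20 particles and the design parameter together.
positions, alpha_opt = alchemical_mc([random.uniform(-1, 1) for _ in range(20)], 0.0)
```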

  13. [Advanced online search techniques and dedicated search engines for physicians].

    Science.gov (United States)

    Nahum, Yoav

    2008-02-01

    In recent years search engines have become an essential tool in the work of physicians. This article will review advanced search techniques from the world of information specialists, as well as some advanced search engine operators that may help physicians improve their online search capabilities, and maximize the yield of their searches. This article also reviews popular dedicated scientific and biomedical literature search engines.

  14. SPS batch spacing optimisation

    CERN Document Server

    Velotti, F M; Carlier, E; Goddard, B; Kain, V; Kotzian, G

    2017-01-01

    Until 2015, the LHC filling schemes used the batch spacing as specified in the LHC design report. The maximum number of bunches injectable in the LHC directly depends on the batch spacing at injection in the SPS and hence on the MKP rise time. As part of the LHC Injectors Upgrade project for LHC heavy ions, a reduction of the batch spacing is needed. In this direction, studies to approach the MKP design rise time of 150 ns (2-98%) have been carried out. These measurements gave clear indications that such optimisation, and beyond, could be done also for higher injection momentum beams, where the additional slower MKP (MKP-L) is needed. After the successful results from the 2015 SPS batch spacing optimisation for the Pb-Pb run [1], the same concept was thought to be used also for proton beams. In fact, thanks to the SPS transverse feedback, it was already observed that lower batch spacing than the design one (225 ns) could be achieved. For the 2016 p-Pb run, a batch spacing of 200 ns for the proton beam with 100 ns bunch spacing was reque...

  15. Searches for SUSY at LHC

    International Nuclear Information System (INIS)

    Kharchilava, A.

    1997-01-01

    One of the main motivations of experiments at the LHC is to search for SUSY particles. The talk is based on recent analyses, performed by the CMS Collaboration, within the framework of the Supergravity-motivated minimal SUSY extension of the Standard Model. The emphasis is put on leptonic channels. The strategies for obtaining experimental signatures for strongly and weakly interacting sparticle production, as well as examples of the determination of SUSY masses and model parameters, are discussed. The domain of parameter space where SUSY can be discovered is investigated. Results show that, if SUSY is relevant at the electroweak scale, it could hardly escape detection at the LHC. (author)

  16. The dimensional reduction in a multi-dimensional cosmology

    International Nuclear Information System (INIS)

    Demianski, M.; Golda, Z.A.; Heller, M.; Szydlowski, M.

    1986-01-01

    Einstein's field equations are solved for the case of the eleven-dimensional vacuum spacetime which is the product R x Bianchi V x T^7, where T^7 is a seven-dimensional torus. Among all possible solutions, the authors identify those in which the macroscopic space expands and the microscopic space contracts to a finite size. The solutions with this property are 'typical' within the considered class. They implement the idea of a purely dynamical dimensional reduction. (author)

  17. Wide-Range Motion Estimation Architecture with Dual Search Windows for High Resolution Video Coding

    Science.gov (United States)

    Dung, Lan-Rong; Lin, Meng-Chun

    This paper presents a memory-efficient motion estimation (ME) technique for high-resolution video compression. The main objective is to reduce external memory access, especially for limited local memory resources. Reducing memory access successfully saves the notorious power consumption. The key to reducing memory accesses is a center-biased algorithm, which performs the motion vector (MV) search with the minimum of search data. To preserve data reusability, the proposed dual-search-windowing (DSW) approach uses the secondary window only as an option, per searching necessity. By doing so, the loading of search windows can be alleviated, and hence the required external memory bandwidth is reduced. The proposed techniques can save up to 81% of external memory bandwidth and require only 135 MBytes/sec, while the quality degradation is less than 0.2 dB for 720p HDTV clips coded at 8 Mbits/sec.
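
    A minimal sketch of the dual-window idea, under assumed window sizes and a plain SAD block match: the motion vector is first sought in a small primary window around the predicted center, and the wider secondary window is loaded and searched only when the best vector lands on the primary window's border.

```python
import numpy as np

def sad(block, ref, x, y):
    """Sum of absolute differences between `block` and `ref` at (x, y)."""
    h, w = block.shape
    return np.abs(ref[y:y + h, x:x + w].astype(int) - block.astype(int)).sum()

def search_window(block, ref, cx, cy, radius):
    """Full search of a (2*radius+1)^2 window centred at (cx, cy)."""
    h, w = block.shape
    best = (float("inf"), (0, 0))
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            x, y = cx + dx, cy + dy
            if 0 <= x <= ref.shape[1] - w and 0 <= y <= ref.shape[0] - h:
                best = min(best, (sad(block, ref, x, y), (dx, dy)))
    return best

def dual_window_me(block, ref, cx, cy, r_primary=4, r_secondary=16):
    cost, (dx, dy) = search_window(block, ref, cx, cy, r_primary)
    if max(abs(dx), abs(dy)) == r_primary:
        # Best vector sits on the primary border: only now load and search
        # the wider secondary window.
        cost, (dx, dy) = search_window(block, ref, cx, cy, r_secondary)
    return (dx, dy), cost
```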

  18. Supervised learning of tools for content-based search of image databases

    Science.gov (United States)

    Delanoy, Richard L.

    1996-03-01

    A computer environment, called the Toolkit for Image Mining (TIM), is being developed with the goal of enabling users with diverse interests and varied computer skills to create search tools for content-based image retrieval and other pattern matching tasks. Search tools are generated using a simple paradigm of supervised learning that is based on the user pointing at mistakes of classification made by the current search tool. As mistakes are identified, a learning algorithm uses the identified mistakes to build up a model of the user's intentions, construct a new search tool, apply the search tool to a test image, display the match results as feedback to the user, and accept new inputs from the user. Search tools are constructed in the form of functional templates, which are generalized matched filters capable of knowledge-based image processing. The ability of this system to learn the user's intentions from experience contrasts with other existing approaches to content-based image retrieval that base searches on the characteristics of a single input example or on a predefined and semantically-constrained textual query. Currently, TIM is capable of learning spectral and textural patterns, but should be adaptable to the learning of shapes as well. Possible applications of TIM include not only content-based image retrieval, but also quantitative image analysis, the generation of metadata for annotating images, data prioritization or data reduction in bandwidth-limited situations, and the construction of components for larger, more complex computer vision algorithms.

  19. Characterising dark matter searches at colliders and direct detection experiments: Vector mediators

    International Nuclear Information System (INIS)

    Buchmueller, Oliver; Dolan, Matthew J.; Malik, Sarah A.; McCabe, Christopher

    2015-01-01

    We introduce a Minimal Simplified Dark Matter (MSDM) framework to quantitatively characterise dark matter (DM) searches at the LHC. We study two MSDM models where the DM is a Dirac fermion which interacts with a vector and axial-vector mediator. The models are characterised by four parameters: m_DM, M_med, g_DM and g_q, the DM and mediator masses, and the mediator couplings to DM and quarks respectively. The MSDM models accurately capture the full event kinematics, and the dependence on all masses and couplings can be systematically studied. The interpretation of mono-jet searches in this framework can be used to establish an equal-footing comparison with direct detection experiments. For theories with a vector mediator, LHC mono-jet searches possess better sensitivity than direct detection searches for light DM masses (≲5 GeV). For axial-vector mediators, LHC and direct detection searches generally probe orthogonal directions in the parameter space. We explore the projected limits of these searches from the ultimate reach of the LHC and multi-ton xenon direct detection experiments, and find that the complementarity of the searches remains. In conclusion, we provide a comparison of limits in the MSDM and effective field theory (EFT) frameworks to highlight the deficiencies of the EFT framework, particularly when exploring the complementarity of mono-jet and direct detection searches.

  20. Children's Search Engines from an Information Search Process Perspective.

    Science.gov (United States)

    Broch, Elana

    2000-01-01

    Describes cognitive and affective characteristics of children and teenagers that may affect their Web searching behavior. Reviews literature on children's searching in online public access catalogs (OPACs) and using digital libraries. Profiles two Web search engines. Discusses some of the difficulties children have searching the Web, in the…

  1. Iterated Local Search Algorithm with Strategic Oscillation for School Bus Routing Problem with Bus Stop Selection

    Directory of Open Access Journals (Sweden)

    Mohammad Saied Fallah Niasar

    2017-02-01

    Full Text Available The school bus routing problem (SBRP) represents a variant of the well-known vehicle routing problem. The main goal of this study is to pick up students allocated to some bus stops and generate routes, including the selected stops, in order to carry students to school. In this paper, we have proposed a simple but effective metaheuristic approach that employs two features: first, it utilizes large neighborhood structures for a deeper exploration of the search space; second, the proposed heuristic executes an efficient transition between the feasible and infeasible portions of the search space. Exploration of the infeasible area is controlled by a dynamic penalty function that converts an infeasible solution into a feasible one. Two metaheuristics, called N-ILS (a variant of Nearest Neighbourhood with Iterated Local Search) and I-ILS (a variant of Insertion with Iterated Local Search), are proposed to solve the SBRP. Our experimental procedure is based on two data sets. The results show that N-ILS is able to obtain better solutions in shorter computing times. Additionally, N-ILS appears to be very competitive in comparison with the best existing metaheuristics suggested for the SBRP.
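
    A generic Iterated Local Search skeleton with a dynamic infeasibility penalty, in the spirit of the approach above, might look as follows; the cost, violation, local-search and perturbation callables are placeholders to be supplied for an actual SBRP instance.

```python
def ils(initial, cost, violation, local_search, perturb,
        iterations=1000, penalty=1.0, penalty_growth=1.1):
    """Skeleton ILS: `cost`, `violation` (a measure of constraint violation),
    `local_search` and `perturb` are problem-specific callables."""
    def score(sol, pen):
        # Infeasible solutions are admitted but charged a growing penalty.
        return cost(sol) + pen * violation(sol)

    current = local_search(initial)
    best, best_cost = None, float("inf")
    for _ in range(iterations):
        candidate = local_search(perturb(current))
        if score(candidate, penalty) <= score(current, penalty):
            current = candidate               # accept improving (penalised) moves
        if violation(current) > 0:
            penalty *= penalty_growth         # steer the walk back toward feasibility
        if violation(candidate) == 0 and cost(candidate) < best_cost:
            best, best_cost = candidate, cost(candidate)
    return best if best is not None else current
```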

  2. Marshall Space Flight Center Technology Investments Overview

    Science.gov (United States)

    Tinker, Mike

    2014-01-01

    NASA is moving forward with prioritized technology investments that will support NASA's exploration and science missions, while benefiting other Government agencies and the U.S. aerospace enterprise. The plan provides the guidance for NASA's space technology investments during the next four years, within the context of a 20-year horizon. This plan will help ensure that NASA develops technologies that enable its four goals: 1. Sustain and extend human activities in space; 2. Explore the structure, origin, and evolution of the solar system, and search for life past and present; 3. Expand our understanding of the Earth and the universe and have a direct and measurable impact on how we work and live; and 4. Energize the domestic space enterprise and extend the benefits of space for the Nation.

  3. How Users Search the Library from a Single Search Box

    Science.gov (United States)

    Lown, Cory; Sierra, Tito; Boyer, Josh

    2013-01-01

    Academic libraries are turning increasingly to unified search solutions to simplify search and discovery of library resources. Unfortunately, very little research has been published on library user search behavior in single search box environments. This study examines how users search a large public university library using a prominent, single…

  4. A concept of volume rendering guided search process to analyze medical data set.

    Science.gov (United States)

    Zhou, Jianlong; Xiao, Chun; Wang, Zhiyan; Takatsuka, Masahiro

    2008-03-01

    This paper first presents a parallel-coordinates-based parameter control panel (PCP). The PCP is used to control the parameters of focal region-based volume rendering (FRVR) during data analysis. It uses a parallel-coordinates-style interface: different rendering parameters are represented by nodes on each axis, and renditions based on related parameters are connected using polylines to show the dependencies between renditions and parameters. Based on the PCP, a concept of volume rendering guided search process is proposed. The search pipeline is divided into four phases. Different parameters of FRVR are recorded and modulated in the PCP during the search phases. The concept shows that volume visualization can play the role of guiding a search process in the rendition space, helping users to efficiently find local structures of interest. The usability of the proposed approach is evaluated to show its effectiveness.

  5. A sparse grid based method for generative dimensionality reduction of high-dimensional data

    Science.gov (United States)

    Bohn, Bastian; Garcke, Jochen; Griebel, Michael

    2016-03-01

    Generative dimensionality reduction methods play an important role in machine learning applications because they construct an explicit mapping from a low-dimensional space to the high-dimensional data space. We discuss a general framework to describe generative dimensionality reduction methods, where the main focus lies on a regularized principal manifold learning variant. Since most generative dimensionality reduction algorithms exploit the representer theorem for reproducing kernel Hilbert spaces, their computational costs grow at least quadratically in the number n of data. Instead, we introduce a grid-based discretization approach which automatically scales just linearly in n. To circumvent the curse of dimensionality of full tensor product grids, we use the concept of sparse grids. Furthermore, in real-world applications, some embedding directions are usually more important than others and it is reasonable to refine the underlying discretization space only in these directions. To this end, we employ a dimension-adaptive algorithm which is based on the ANOVA (analysis of variance) decomposition of a function. In particular, the reconstruction error is used to measure the quality of an embedding. As an application, the study of large simulation data from an engineering application in the automotive industry (car crash simulation) is performed.
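
    For context, the dimension-adaptive refinement mentioned above rests on the standard ANOVA decomposition of a multivariate function (stated here in its usual textbook form, not quoted from the paper):

```latex
% ANOVA decomposition: f is split into constant, univariate, bivariate, ...
% contributions; dimension-adaptive sparse grids refine only the subsets u
% that contribute most to the reconstruction error.
f(x_1,\dots,x_d)
  = f_0 + \sum_{i=1}^{d} f_i(x_i)
        + \sum_{1 \le i < j \le d} f_{i,j}(x_i,x_j) + \dots
  = \sum_{u \subseteq \{1,\dots,d\}} f_u(x_u)
```

    Each term f_u depends only on the variables indexed by u, so the error can be attributed to individual directions and the discretization refined only where it matters.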

  6. Random searching

    International Nuclear Information System (INIS)

    Shlesinger, Michael F

    2009-01-01

    There are a wide variety of searching problems, from molecules seeking receptor sites to predators seeking prey. The optimal search strategy can depend on constraints on time, energy, supplies or other variables. We discuss a number of cases and especially remark on the usefulness of Lévy walk search patterns when the targets of the search are scarce.
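
    A minimal two-dimensional Lévy-walk sketch: heavy-tailed step lengths drawn from a power law combined with isotropic random headings. The exponent and minimum step length below are arbitrary illustrative choices.

```python
import math
import random

def levy_walk(n_steps=1000, mu=2.0, l_min=1.0):
    """Positions visited by a 2-D Levy walk with step-length pdf ~ l**(-mu)."""
    x = y = 0.0
    path = [(x, y)]
    for _ in range(n_steps):
        u = 1.0 - random.random()                     # u in (0, 1]
        step = l_min * u ** (-1.0 / (mu - 1.0))       # inverse-transform sampling
        theta = random.uniform(0.0, 2.0 * math.pi)    # isotropic heading
        x += step * math.cos(theta)
        y += step * math.sin(theta)
        path.append((x, y))
    return path
```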

  7. A reductive aminase from Aspergillus oryzae

    Science.gov (United States)

    Aleku, Godwin A.; France, Scott P.; Man, Henry; Mangas-Sanchez, Juan; Montgomery, Sarah L.; Sharma, Mahima; Leipold, Friedemann; Hussain, Shahed; Grogan, Gideon; Turner, Nicholas J.

    2017-10-01

    Reductive amination is one of the most important methods for the synthesis of chiral amines. Here we report the discovery of an NADP(H)-dependent reductive aminase from Aspergillus oryzae (AspRedAm, Uniprot code Q2TW47) that can catalyse the reductive coupling of a broad set of carbonyl compounds with a variety of primary and secondary amines with up to >98% conversion and with up to >98% enantiomeric excess. In cases where both carbonyl and amine show high reactivity, it is possible to employ a 1:1 ratio of the substrates, forming amine products with up to 94% conversion. Steady-state kinetic studies establish that the enzyme is capable of catalysing imine formation as well as reduction. Crystal structures of AspRedAm in complex with NADP(H) and also with both NADP(H) and the pharmaceutical ingredient (R)-rasagiline are reported. We also demonstrate preparative scale reductive aminations with wild-type and Q240A variant biocatalysts displaying total turnover numbers of up to 32,000 and space time yields up to 3.73 g l-1 d-1.

  8. Constraining supersymmetric models using Higgs physics, precision observables and direct searches

    International Nuclear Information System (INIS)

    Zeune, Lisa

    2014-08-01

    We present various complementary possibilities to exploit experimental measurements in order to test and constrain supersymmetric (SUSY) models. Direct searches for SUSY particles have not resulted in any signal so far, and limits on the SUSY parameter space have been set. Measurements of the properties of the observed Higgs boson at ~126 GeV as well as of the W boson mass (M_W) can provide valuable indirect constraints, supplementing the ones from direct searches. This thesis is divided into three major parts: In the first part we present the currently most precise prediction for M_W in the Minimal Supersymmetric Standard Model (MSSM) with complex parameters and in the Next-to-Minimal Supersymmetric Standard Model (NMSSM). The evaluation includes the full one-loop result and all relevant available higher order corrections of Standard Model (SM) and SUSY type. We perform a detailed scan over the MSSM parameter space, taking into account the latest experimental results, including the observation of a Higgs signal. We find that the current measurements for M_W and the top quark mass (m_t) slightly favour a non-zero SUSY contribution. The impact of different SUSY sectors on the prediction of M_W as well as the size of the higher-order SUSY corrections are analysed both in the MSSM and the NMSSM. We investigate the genuine NMSSM contribution from the extended Higgs and neutralino sectors and highlight differences between the M_W predictions in the two SUSY models. In the second part of the thesis we discuss possible interpretations of the observed Higgs signal in SUSY models. The properties of the observed Higgs boson are compatible with the SM so far, but many other interpretations are also possible. Performing scans over the relevant parts of the MSSM and the NMSSM parameter spaces and applying relevant constraints from Higgs searches, flavour physics and electroweak measurements, we find that a Higgs boson at ~126 GeV, which decays into two photons, can in

  9. User-assisted visual search and tracking across distributed multi-camera networks

    Science.gov (United States)

    Raja, Yogesh; Gong, Shaogang; Xiang, Tao

    2011-11-01

    Human CCTV operators face several challenges in their task which can lead to missed events, people or associations, including: (a) data overload in large distributed multi-camera environments; (b) short attention span; (c) limited knowledge of what to look for; and (d) lack of access to non-visual contextual intelligence to aid search. Developing a system to aid human operators and alleviate such burdens requires addressing the problem of automatic re-identification of people across disjoint camera views, a matching task made difficult by factors such as lighting, viewpoint and pose changes and for which absolute scoring approaches are not best suited. Accordingly, we describe a distributed multi-camera tracking (MCT) system to visually aid human operators in associating people and objects effectively over multiple disjoint camera views in a large public space. The system comprises three key novel components: (1) relative measures of ranking rather than absolute scoring to learn the best features for matching; (2) multi-camera behaviour profiling as higher-level knowledge to reduce the search space and increase the chance of finding correct matches; and (3) human-assisted data mining to interactively guide search and in the process recover missing detections and discover previously unknown associations. We provide an extensive evaluation of the greater effectiveness of the system as compared to existing approaches on industry-standard i-LIDS multi-camera data.

  10. Modeling and Analysis of Space Based Transceivers

    Science.gov (United States)

    Moore, Michael S.; Price, Jeremy C.; Abbott, Ben; Liebetreu, John; Reinhart, Richard C.; Kacpura, Thomas J.

    2007-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  11. Searching for gravitational waves from neutron stars

    Science.gov (United States)

    Idrisy, Ashikuzzaman

    generate the parameter space of a GW search so as to cover the largest physical range of parameters, while keeping the search computationally feasible. Finally we discuss the time-domain solar system barycentered resampling algorithm as a way to improve the computational cost of the analysis. In Chapter 4 we discuss a search for GWs from two supernova remnants, G65.7 and G330.2. The searches were conducted using data from the 6th science run of the LIGO detectors. Since the searches were modeled on the Cassiopeia A search paper, Abadie et al. [Astrophys. J. 722, 1504-1513, 2010], we also used the frequency and the first and second derivatives of the frequency as the parameter space of the search. There are two main differences from the previous search. The first is the use of the resampling algorithm, which sped up the calculation of the F-statistic by a factor of 3 and thus allowed for longer stretches of data to be coherently integrated. Being able to integrate more data meant that we could beat the indirect limit on GWs expected from these sources. We used a 51 day integration time for G65.7 and 24 days for G330.2. The second difference is that the analysis pipeline is now more automated. This allows for a more efficient data analysis process. We did not find a credible source of GWs and so we placed upper limits on the gravitational wave strain, ellipticity, and r-mode amplitude of the sources. The best upper limit for the strain was 3.0 x 10^-25, for ellipticity it was 7.0 x 10^-6, and for r-mode amplitude it was 2.2 x 10^-4.

  12. MEASURING THE PERFORMANCE OF SIMILARITY PROPAGATION IN A SEMANTIC SEARCH ENGINE

    Directory of Open Access Journals (Sweden)

    S. K. Jayanthi

    2013-10-01

    Full Text Available In the current scenario, web page result personalization plays a vital role. Nearly 80% of users expect the best results on the first page itself, without the persistence to browse further in URL mode. This research work focuses on two main themes: semantic web search through online search, and domain-based search through offline search. The first part is to find an effective method which allows grouping similar results together using a BookShelf data structure and organizing the various clusters. The second is focused on academic domain-based search through offline search. This paper focuses on finding documents which are similar, and on how a vector space model can be used to solve this. More weightage is therefore given to the principles and working methodology of similarity propagation. The cosine similarity measure is used for finding the relevancy among the documents.
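
    A minimal sketch of the vector-space relevance measure referred to above: documents are represented as term-frequency vectors and compared by cosine similarity (the tokenization and weighting here are deliberately naive placeholders).

```python
import math
from collections import Counter

def cosine_similarity(doc_a: str, doc_b: str) -> float:
    """Cosine of the angle between the term-frequency vectors of two documents."""
    a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values())) *
            math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Hypothetical usage: a higher score means the documents share more terms.
print(cosine_similarity("semantic web search", "web search engines"))
```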

  13. Active vibration reduction of a flexible structure bonded with optimised piezoelectric pairs using half and quarter chromosomes in genetic algorithms

    International Nuclear Information System (INIS)

    Daraji, A H; Hale, J M

    2012-01-01

    The optimal placement of sensors and actuators in active vibration control is limited by the number of candidates in the search space. The search space of a small structure discretized into one hundred elements, for optimising the location of ten actuators, gives 1.73 × 10^13 possible solutions, one of which is the global optimum. In this work, a new quarter- and half-chromosome technique based on symmetry is developed, by which the search space for optimisation of sensor/actuator locations in active vibration control of flexible structures may be greatly reduced. The technique is applied to the optimisation for eight and ten actuators located on a 500 × 500 mm square plate, in which the search space is reduced by up to 99.99%. The technique also allows the genetic algorithm program to be updated with the natural frequencies and mode shapes in each generation, so that the global optimal solution is found in a greatly reduced number of generations. An isotropic plate with piezoelectric sensor/actuator pairs bonded to its surface was investigated using the finite element method and Hamilton's principle based on first-order shear deformation theory. The placement and feedback gain of ten and eight sensor/actuator pairs were optimised for a cantilever and a clamped-clamped plate to attenuate the first six modes of vibration, using minimisation of a linear quadratic index as the objective function.
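
    The quarter-chromosome encoding can be pictured as follows: candidate actuator positions are stored for one quadrant only and mirrored onto the full symmetric plate, shrinking the chromosome length roughly fourfold; the plate size and 0/1 encoding below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def expand_quarter(quarter_genes, n):
    """Map a 0/1 chromosome of length (n//2)**2, defined on one quadrant of an
    n-by-n plate, to the full symmetric actuator layout."""
    half = n // 2
    quadrant = np.asarray(quarter_genes).reshape(half, half)
    top = np.hstack([quadrant, np.fliplr(quadrant)])   # mirror left-right
    return np.vstack([top, np.flipud(top)])            # mirror top-bottom

# Hypothetical usage: a random quarter chromosome for a 10x10 discretization.
layout = expand_quarter(np.random.randint(0, 2, size=25), 10)
print(int(layout.sum()), "candidate actuator elements selected on the full plate")
```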

  14. The impact of harm reduction programs and police interventions on the number of syringes collected from public spaces. A time series analysis in Barcelona, 2004-2014.

    Science.gov (United States)

    Espelt, A; Villalbí, J R; Bosque-Prous, M; Parés-Badell, O; Mari-Dell'Olmo, M; Brugal, M T

    2017-12-01

    To estimate the effect of opening two services for people who use drugs and three police interventions on the number of discarded syringes collected from public spaces in Barcelona between 2004 and 2014. We conducted an interrupted time-series analysis of the monthly number of syringes collected from public spaces during this period. The dependent variable was the number of syringes collected per month. The main independent variables were month and five dummy variables (the opening of two facilities with safe consumption rooms, and three police interventions). To examine which interventions affected the number of syringes collected, we performed an interrupted time-series analysis using a quasi-Poisson regression model, obtaining relative risks (RR) and 95% confidence intervals (CIs). The number of syringes collected per month in Barcelona decreased from 13,800 in 2004 to 1655 in 2014 after several interventions. For example, following the closure of an open drug scene in District A of the city, we observed a decreasing trend in the number of syringes collected [RR=0.88 (95% CI: 0.82-0.95)], but an increasing trend in the remaining districts [RR=1.11 (95% CI: 1.05-1.17) and 1.08 (95% CI: 0.99-1.18) for districts B and C, respectively]. Following the opening of a harm reduction facility in District C, we observed an initial increase in the number collected in this district [RR=2.72 (95% CI: 1.57-4.71)] and stabilization of the trend thereafter [RR=0.97 (95% CI: 0.91-1.03)]. The overall number of discarded syringes collected from public spaces has decreased consistently in parallel with a combination of police interventions and the opening of harm reduction facilities. Copyright © 2017 Elsevier B.V. All rights reserved.
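
    A hedged sketch of an interrupted time-series quasi-Poisson fit of monthly counts, with a linear trend and a single level/slope change; the column names, the single intervention and the statsmodels-based fitting choices are placeholders rather than the study's actual model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def fit_its(counts, intervention_month):
    """Quasi-Poisson interrupted time-series fit for a monthly count series."""
    counts = np.asarray(counts)
    months = np.arange(len(counts))
    df = pd.DataFrame({
        "month": months,                                     # underlying trend
        "post": (months >= intervention_month).astype(int),  # level change
    })
    df["post_trend"] = df["post"] * (df["month"] - intervention_month)  # slope change
    X = sm.add_constant(df[["month", "post", "post_trend"]])
    # Poisson GLM with the dispersion estimated from the Pearson chi-square,
    # i.e. a quasi-Poisson-style correction of the standard errors.
    res = sm.GLM(counts, X, family=sm.families.Poisson()).fit(scale="X2")
    return np.exp(res.params), np.exp(res.conf_int())        # RRs and 95% CIs

# Hypothetical usage: rr, ci = fit_its(monthly_syringes, intervention_month=48)
```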

  15. Optimization of Signal Region for Dark Matter Search at the ATLAS Detector

    CERN Document Server

    Yip, Long Sang Kenny

    2015-01-01

    This report focused on the optimization of the signal region for the search for dark matter produced in proton-proton collisions with final states of a single electron or muon, a minimum of four jets, one or two b-jets, and missing transverse momentum of at least 100 GeV. A brute-force approach was proposed to scan for the optimal signal region in a rectangularly discretized parameter space. Analysis of the leniency of signal regions motivated event-shortlisting and loop-breaking features that allowed efficient optimization of the signal region. With the refined algorithm for the brute-force search, the computation time fell from an estimated three months to one hour in a test run of a million Monte Carlo simulated events over a densely discretized parameter space of four million signal regions. Further studies could focus on manipulating random numbers, and on the interplay between the maximal figure of merit and the lower bound imposed on the background.
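
    The brute-force scan can be pictured as a loop over a grid of rectangular cuts that maximizes a simple figure of merit, with a loop-breaking shortcut once a cut leaves too little background to evaluate; the variables, cut grids and s/sqrt(b) figure of merit below are illustrative assumptions, not the report's exact criteria.

```python
import numpy as np

def scan_signal_regions(sig, bkg, met_cuts, njet_cuts, min_bkg=1.0):
    """Brute-force scan over rectangular (MET, N_jet) cuts.
    `sig`/`bkg` are record arrays with 'met' and 'njet' fields (unit weights);
    `met_cuts` must be sorted from loosest to tightest for the break to be safe."""
    best_fom, best_cuts = -np.inf, None
    for met_cut in met_cuts:
        if (bkg["met"] >= met_cut).sum() < min_bkg:
            # Loop-breaking: tighter MET cuts cannot recover any background,
            # so the remaining cut values need not be evaluated.
            break
        for njet_cut in njet_cuts:
            s = ((sig["met"] >= met_cut) & (sig["njet"] >= njet_cut)).sum()
            b = ((bkg["met"] >= met_cut) & (bkg["njet"] >= njet_cut)).sum()
            if b >= min_bkg:
                fom = s / np.sqrt(b)              # simple figure of merit
                if fom > best_fom:
                    best_fom, best_cuts = fom, (met_cut, njet_cut)
    return best_fom, best_cuts
```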

  16. Miniaturized high performance sensors for space plasmas

    International Nuclear Information System (INIS)

    Young, D.T.

    1996-01-01

    Operating under ever more constrained budgets, NASA has turned to a new paradigm for instrumentation and mission development in which smaller, faster, better, cheaper is of primary consideration for future space plasma investigations. The author presents several examples showing the influence of this new paradigm on sensor development and discusses certain implications for the scientific return from resource-constrained sensors. The author also discusses one way to improve space plasma sensor performance, which is to search out new technologies, measurement techniques and instrument analogs from related fields including, among others, laboratory plasma physics.

  17. Experimental Constraints of the Exotic Shearing of Space-Time

    Energy Technology Data Exchange (ETDEWEB)

    Richardson, Jonathan William [Univ. of Chicago, IL (United States)

    2016-08-01

    The Holometer program is a search for first experimental evidence that space-time has quantum structure. The detector consists of a pair of co-located 40-m power-recycled interferometers whose outputs are read out synchronously at 50 MHz, achieving sensitivity to spatially-correlated fluctuations in differential position on time scales shorter than the light-crossing time of the instruments. Unlike gravitational wave interferometers, which time-resolve transient geometrical disturbances in the spatial background, the Holometer is searching for a universal, stationary quantization noise of the background itself. This dissertation presents the final results of the Holometer Phase I search, an experiment configured for sensitivity to exotic coherent shearing fluctuations of space-time. Measurements of high-frequency cross-spectra of the interferometer signals obtain sensitivity to spatially-correlated effects far exceeding any previous measurement, in a broad frequency band extending to 7.6 MHz, twice the inverse light-crossing time of the apparatus. This measurement is the statistical aggregation of 2.1 petabytes of 2-byte differential position measurements obtained over a month-long exposure time. At 3σ significance, it places an upper limit on the coherence scale of spatial shear two orders of magnitude below the Planck length. The result demonstrates the viability of this novel spatially-correlated interferometric detection technique to reach unprecedented sensitivity to coherent deviations of space-time from classicality, opening the door for direct experimental tests of theories of relational quantum gravity.
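
    The basic statistic behind such a correlated-noise search is the averaged cross-spectrum of two synchronously sampled channels; the sketch below uses a Welch-averaged cross-spectral density on placeholder data, with sampling rate and segment length chosen for illustration rather than taken from the experiment.

```python
import numpy as np
from scipy.signal import csd

rng = np.random.default_rng(0)
fs = 50e6                                   # 50 MHz synchronous readout
n = int(fs * 0.01)                          # 10 ms of placeholder data
common = 1e-3 * rng.standard_normal(n)      # injected correlated component
ch1 = rng.standard_normal(n) + common       # fake interferometer output 1
ch2 = rng.standard_normal(n) + common       # fake interferometer output 2

# Welch-averaged cross-spectral density: a real correlated signal stands out
# as |Pxy| exceeding the level expected from averaging independent segments
# of uncorrelated noise.
freqs, Pxy = csd(ch1, ch2, fs=fs, nperseg=4096)
print(freqs[np.argmax(np.abs(Pxy))])
```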

  18. Assessing the search for information on three Rs methods, and their subsequent implementation: a national survey among scientists in the Netherlands

    NARCIS (Netherlands)

    Luijk, J. van; Cuijpers, Y.M.; Vaart, L. van der; Leenaars, M; Ritskes-Hoitinga, M.

    2011-01-01

    A local survey conducted among scientists into the current practice of searching for information on Three Rs (i.e. Replacement, Reduction and Refinement) methods has highlighted the gap between the statutory requirement to apply Three Rs methods and the lack of criteria to search for them. To verify

  19. Assessing the Search for Information on Three Rs Methods, and their Subsequent Implementation: A National Survey among Scientists in The Netherlands.

    NARCIS (Netherlands)

    Luijk, J. van; Cuijpers, Y.M.; Vaart, L. van der; Leenaars, M.; Ritskes-Hoitinga, M.

    2011-01-01

    A local survey conducted among scientists into the current practice of searching for information on Three Rs (i.e. Replacement, Reduction and Refinement) methods has highlighted the gap between the statutory requirement to apply Three Rs methods and the lack of criteria to search for them. To verify

  20. Search times and probability of detection in time-limited search

    Science.gov (United States)

    Wilson, David; Devitt, Nicole; Maurer, Tana

    2005-05-01

    When modeling the search and target acquisition process, the probability of detection as a function of time is important to war games and physical entity simulations. Recent US Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate modeling of search and detection has focused on time-limited search. Developing the relationship between detection probability and search time as a differential equation is explored. One of the parameters in the current formula for probability of detection in time-limited search corresponds to the mean time to detect in time-unlimited search. However, the mean time to detect in time-limited search is shorter than the mean time to detect in time-unlimited search, and a simple mathematical relationship between these two mean times is derived.
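
    For context, the classical time-unlimited model alluded to above is commonly written as an exponential approach to an asymptotic detection probability (a standard form, not necessarily the report's exact parameterization):

```latex
% Detections arrive at rate 1/tau, so the cumulative detection probability
% approaches its asymptote P_infinity exponentially.
\frac{dP}{dt} = \frac{P_\infty - P(t)}{\tau}
\quad\Longrightarrow\quad
P(t) = P_\infty\,\bigl(1 - e^{-t/\tau}\bigr)
```

    Here τ plays the role of the mean time to detect in time-unlimited search, the parameter related above to its time-limited counterpart.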