WorldWideScience

Sample records for detects regular spatial

  1. Selection of regularization parameter for l1-regularized damage detection

    Science.gov (United States)

    Hou, Rongrong; Xia, Yong; Bao, Yuequan; Zhou, Xiaoqing

    2018-06-01

The l1 regularization technique has been developed for structural health monitoring and damage detection by exploiting the sparsity of structural damage. The regularization parameter, which controls the trade-off between data fidelity and solution size in the regularization problem, exerts a crucial effect on the solution. However, the l1 regularization problem has no closed-form solution, and the regularization parameter is usually selected by experience. This study proposes two strategies for selecting the regularization parameter in l1-regularized damage detection. The first method utilizes the residual and solution norms of the optimization problem and ensures that both are small. The second is based on the discrepancy principle, which requires that the variance of the discrepancy between the calculated and measured responses be close to the variance of the measurement noise. The two methods are applied to a cantilever beam and a three-story frame. A range of the regularization parameter, rather than a single value, can be determined. When the regularization parameter is selected from this range, the damage can be accurately identified even in multiple-damage scenarios. This range also indicates how sensitive the damage identification problem is to the regularization parameter.
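The residual-vs-solution-norm and discrepancy-principle strategies described above can be sketched numerically. The following is a minimal illustration, not the authors' implementation: a toy sparse "damage" vector is recovered by an ISTA solver for the l1 problem, and the discrepancy principle keeps the lambdas whose mean squared residual is close to the known noise variance. The problem sizes, noise level, and acceptance band are arbitrary choices here.

```python
import numpy as np

def ista_lasso(X, y, lam, step=None, iters=500):
    """Solve min_w 0.5*||X w - y||^2 + lam*||w||_1 by ISTA (proximal gradient)."""
    n, p = X.shape
    if step is None:
        step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1/L, L = Lipschitz const. of the gradient
    w = np.zeros(p)
    for _ in range(iters):
        z = w - step * (X.T @ (X @ w - y))       # gradient step on the smooth part
        w = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return w

rng = np.random.default_rng(0)
n, p, sigma = 80, 40, 0.1
X = rng.standard_normal((n, p))
w_true = np.zeros(p); w_true[[3, 11, 25]] = [1.0, -0.8, 0.5]   # sparse "damage" vector
y = X @ w_true + sigma * rng.standard_normal(n)

# Sweep the regularization parameter; the discrepancy principle accepts lambdas
# whose residual variance is close to the noise variance sigma^2.
lams = np.logspace(-3, 1, 30)
residual_var = np.array([np.mean((X @ ista_lasso(X, y, lam) - y) ** 2) for lam in lams])
accepted = lams[(residual_var > 0.3 * sigma**2) & (residual_var < 5.0 * sigma**2)]
```

As in the abstract, the acceptance test typically yields a range of usable parameters rather than a single value.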

  2. Using Tikhonov Regularization for Spatial Projections from CSR Regularized Spherical Harmonic GRACE Solutions

    Science.gov (United States)

    Save, H.; Bettadpur, S. V.

    2013-12-01

    It has been demonstrated before that using Tikhonov regularization produces spherical harmonic solutions from GRACE that have very little residual stripes while capturing all the signal observed by GRACE within the noise level. This paper demonstrates a two-step process and uses Tikhonov regularization to remove the residual stripes in the CSR regularized spherical harmonic coefficients when computing the spatial projections. We discuss methods to produce mass anomaly grids that have no stripe features while satisfying the necessary condition of capturing all observed signal within the GRACE noise level.
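For readers unfamiliar with Tikhonov regularization itself (as opposed to the CSR two-step GRACE processing, which is not reproduced here), a generic sketch: the regularized solution solves (AᵀA + αI)x = Aᵀb, and increasing α trades data fit for a smaller, smoother solution. All matrices below are random stand-ins.

```python
import numpy as np

def tikhonov(A, b, alpha):
    """Regularized solution x_alpha = argmin ||A x - b||^2 + alpha * ||x||^2."""
    p = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(p), A.T @ b)

rng = np.random.default_rng(1)
A = rng.standard_normal((60, 30))
b = rng.standard_normal(60)

# Larger alpha damps the solution (smaller norm, smoother) at the cost of data fit.
norms = [np.linalg.norm(tikhonov(A, b, a)) for a in (1e-3, 1e-1, 10.0)]
```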

  3. Automatic Constraint Detection for 2D Layout Regularization

    KAUST Repository

    Jiang, Haiyong; Nan, Liangliang; Yan, Dongming; Dong, Weiming; Zhang, Xiaopeng; Wonka, Peter

    2015-01-01

In this paper, we address the problem of constraint detection for layout regularization. Layout regularization is important for digitizing plans or images, such as floor plans and facade images, and for the improvement of user-created contents, such as architectural drawings and slide layouts. To regularize a layout, we aim to improve the input by detecting and subsequently enforcing alignment, size, and distance constraints between layout elements.

  4. Automatic Constraint Detection for 2D Layout Regularization.

    Science.gov (United States)

    Jiang, Haiyong; Nan, Liangliang; Yan, Dong-Ming; Dong, Weiming; Zhang, Xiaopeng; Wonka, Peter

    2016-08-01

    In this paper, we address the problem of constraint detection for layout regularization. The layout we consider is a set of two-dimensional elements where each element is represented by its bounding box. Layout regularization is important in digitizing plans or images, such as floor plans and facade images, and in the improvement of user-created contents, such as architectural drawings and slide layouts. To regularize a layout, we aim to improve the input by detecting and subsequently enforcing alignment, size, and distance constraints between layout elements. Similar to previous work, we formulate layout regularization as a quadratic programming problem. In addition, we propose a novel optimization algorithm that automatically detects constraints. We evaluate the proposed framework using a variety of input layouts from different applications. Our results demonstrate that our method has superior performance to the state of the art.
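The quadratic-programming formulation can be illustrated on a toy example. This sketch is not the paper's solver: it handles a single coordinate axis and folds detected alignment constraints into the objective as stiff quadratic penalties, which reduces the QP to one linear solve. The weight w and the sample coordinates are invented.

```python
import numpy as np

# Toy layout: x-coordinates of four boxes; boxes (0, 1) and (2, 3) are detected
# as "nearly aligned" and should snap together.
x0 = np.array([10.0, 10.4, 25.1, 24.8])
pairs = [(0, 1), (2, 3)]   # detected alignment constraints
w = 1e4                    # penalty weight (constraints become hard as w -> inf)

# Minimize ||x - x0||^2 + w * sum_(i,j) (x_i - x_j)^2.
# Setting the gradient to zero gives the linear system (I + w*L) x = x0,
# where L is the graph Laplacian of the constraint pairs.
n = len(x0)
H = np.eye(n)
for i, j in pairs:
    H[i, i] += w; H[j, j] += w
    H[i, j] -= w; H[j, i] -= w
x = np.linalg.solve(H, x0)
```

Each constrained pair snaps to its common mean (10.2 and 24.95 here), while unconstrained coordinates would stay at their input values.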

  5. Automatic Constraint Detection for 2D Layout Regularization

    KAUST Repository

    Jiang, Haiyong

    2015-09-18

In this paper, we address the problem of constraint detection for layout regularization. As the layout, we consider a set of two-dimensional elements where each element is represented by its bounding box. Layout regularization is important for digitizing plans or images, such as floor plans and facade images, and for improving user-created contents, such as architectural drawings and slide layouts. To regularize a layout, we aim to improve the input by detecting and subsequently enforcing alignment, size, and distance constraints between layout elements. Similar to previous work, we formulate layout regularization as a quadratic programming problem. In addition, we propose a novel optimization algorithm to automatically detect constraints. We evaluate the proposed framework on a variety of input layouts from different applications, which demonstrates that our method has superior performance to the state of the art.

  6. Spatially-Variant Tikhonov Regularization for Double-Difference Waveform Inversion

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Youzuo [Los Alamos National Laboratory; Huang, Lianjie [Los Alamos National Laboratory; Zhang, Zhigang [Los Alamos National Laboratory

    2011-01-01

Double-difference waveform inversion is a potential tool for quantitative monitoring of geologic carbon storage. It jointly inverts time-lapse seismic data for changes in reservoir geophysical properties. Because of the ill-posedness of waveform inversion, it is a great challenge to obtain reservoir changes accurately and efficiently, particularly when using time-lapse seismic reflection data. Regularization techniques can be utilized to address this ill-posedness. The regularization parameter controls the smoothness of the inversion results. A constant regularization parameter is normally used in waveform inversion, and an optimal value has to be selected. The resulting images are then a trade-off among regions with different smoothness or noise levels: over-regularized in some regions and under-regularized in others. In this paper, we employ a spatially-variant parameter in the Tikhonov regularization scheme used in double-difference waveform tomography to improve inversion accuracy and robustness. We compare the results obtained using a spatially-variant parameter with those obtained using a constant regularization parameter and those produced without any regularization. We observe that, with a spatially-variant regularization scheme, the target regions are well reconstructed while noise is reduced elsewhere. We show that the spatially-variant regularization scheme provides the flexibility to regularize local regions based on a priori information without increasing computational cost or memory requirements.
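The idea of a spatially-variant Tikhonov parameter can be sketched in a few lines: replace the single αI with a diagonal matrix Λ holding a per-cell weight, so different regions of the model are damped differently. The example below uses random matrices and an arbitrary strong/weak split; it illustrates only the mechanism, not the waveform-tomography implementation.

```python
import numpy as np

def tikhonov_sv(A, b, alpha_diag):
    """Solve min ||A x - b||^2 + x^T Lambda x with Lambda = diag(alpha_diag),
    i.e. a per-parameter regularization weight instead of one constant alpha."""
    return np.linalg.solve(A.T @ A + np.diag(alpha_diag), A.T @ b)

rng = np.random.default_rng(2)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)

# First half of the model weakly regularized, second half strongly regularized:
alpha = np.concatenate([np.full(10, 1e-3), np.full(10, 1e4)])
x = tikhonov_sv(A, b, alpha)
```

The heavily weighted entries are driven toward zero while the weakly weighted ones remain close to their least-squares values, which mirrors regularizing noisy regions more aggressively than target regions.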

  7. Detecting spatial regimes in ecosystems

    Science.gov (United States)

    Sundstrom, Shana M.; Eason, Tarsha; Nelson, R. John; Angeler, David G.; Barichievy, Chris; Garmestani, Ahjond S.; Graham, Nicholas A.J.; Granholm, Dean; Gunderson, Lance; Knutson, Melinda; Nash, Kirsty L.; Spanbauer, Trisha; Stow, Craig A.; Allen, Craig R.

    2017-01-01

    Research on early warning indicators has generally focused on assessing temporal transitions with limited application of these methods to detecting spatial regimes. Traditional spatial boundary detection procedures that result in ecoregion maps are typically based on ecological potential (i.e. potential vegetation), and often fail to account for ongoing changes due to stressors such as land use change and climate change and their effects on plant and animal communities. We use Fisher information, an information theory-based method, on both terrestrial and aquatic animal data (U.S. Breeding Bird Survey and marine zooplankton) to identify ecological boundaries, and compare our results to traditional early warning indicators, conventional ecoregion maps and multivariate analyses such as nMDS and cluster analysis. We successfully detected spatial regimes and transitions in both terrestrial and aquatic systems using Fisher information. Furthermore, Fisher information provided explicit spatial information about community change that is absent from other multivariate approaches. Our results suggest that defining spatial regimes based on animal communities may better reflect ecological reality than do traditional ecoregion maps, especially in our current era of rapid and unpredictable ecological change.

  8. EEG/MEG Source Reconstruction with Spatial-Temporal Two-Way Regularized Regression

    KAUST Repository

    Tian, Tian Siva; Huang, Jianhua Z.; Shen, Haipeng; Li, Zhimin

    2013-01-01

In this work, we propose a spatial-temporal two-way regularized regression method for reconstructing neural source signals from EEG/MEG time course measurements. The proposed method estimates the dipole locations and amplitudes simultaneously through minimizing a single penalized least squares criterion.

  9. Real time QRS complex detection using DFA and regular grammar.

    Science.gov (United States)

    Hamdi, Salah; Ben Abdallah, Asma; Bedoui, Mohamed Hedi

    2017-02-28

The detection of the QRS complex (the sequence of Q, R, and S peaks) is a crucial procedure in electrocardiogram (ECG) processing and analysis. We propose a novel approach for QRS complex detection based on deterministic finite automata with the addition of some constraints. This paper confirms that regular grammar is useful for extracting QRS complexes and interpreting normalized ECG signals. A QRS is assimilated to a pair of adjacent peaks that meet certain criteria of standard deviation and duration. The proposed method was applied to several kinds of ECG signals taken from the standard MIT-BIH arrhythmia database; a total of 48 signals were used. For an input signal, several parameters were determined, such as QRS durations, RR distances, and peak amplitudes. The σRR and σQRS parameters were added to quantify the regularity of RR distances and QRS durations, respectively. The sensitivity of the suggested method was 99.74% and its specificity 99.86%. Moreover, the variation of the sensitivity and specificity with the signal-to-noise ratio was also examined. Regular grammar with added constraints and deterministic automata proved functional for ECG signal diagnosis. Compared to statistical methods, the use of grammar provides satisfactory and competitive results, with indices comparable to or even better than those cited in the literature.
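The regular-grammar idea, a QRS assimilated to a pair of adjacent peaks within a bounded gap, can be mimicked with a tiny scanner over a symbolized signal. This is a hypothetical simplification (the symbols 'P'/'o' and the max_gap parameter are inventions for illustration), not the paper's automaton.

```python
# A minimal sketch: the signal is first symbolized into 'P' (prominent peak)
# and 'o' (other sample); a QRS candidate is accepted when two adjacent peaks
# occur with at most max_gap 'o' symbols between them, mimicking the
# regular-grammar rule  P o{0,max_gap} P.
def detect_qrs(symbols, max_gap=3):
    """Scan a symbol string; return start indices of accepted peak pairs."""
    hits, i = [], 0
    while i < len(symbols):
        if symbols[i] == 'P':
            j = i + 1
            while j < len(symbols) and symbols[j] == 'o' and j - i - 1 < max_gap:
                j += 1
            if j < len(symbols) and symbols[j] == 'P':
                hits.append(i)
                i = j + 1        # consume the pair, resume after it
                continue
        i += 1
    return hits
```

For example, `detect_qrs("ooPoPooooPooPoo")` finds the pairs starting at indices 2 and 9, while a gap longer than max_gap is rejected.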

  10. Spatial filtering precedes motion detection.

    Science.gov (United States)

    Morgan, M J

    1992-01-23

    When we perceive motion on a television or cinema screen, there must be some process that allows us to track moving objects over time: if not, the result would be a conflicting mass of motion signals in all directions. A possible mechanism, suggested by studies of motion displacement in spatially random patterns, is that low-level motion detectors have a limited spatial range, which ensures that they tend to be stimulated over time by the same object. This model predicts that the direction of displacement of random patterns cannot be detected reliably above a critical absolute displacement value (Dmax) that is independent of the size or density of elements in the display. It has been inferred that Dmax is a measure of the size of motion detectors in the visual pathway. Other studies, however, have shown that Dmax increases with element size, in which case the most likely interpretation is that Dmax depends on the probability of false matches between pattern elements following a displacement. These conflicting accounts are reconciled here by showing that Dmax is indeed determined by the spacing between the elements in the pattern, but only after fine detail has been removed by a physiological prefiltering stage: the filter required to explain the data has a similar size to the receptive field of neurons in the primate magnocellular pathway. The model explains why Dmax can be increased by removing high spatial frequencies from random patterns, and simplifies our view of early motion detection.

  11. Mixture models with entropy regularization for community detection in networks

    Science.gov (United States)

    Chang, Zhenhai; Yin, Xianjun; Jia, Caiyan; Wang, Xiaoyang

    2018-04-01

Community detection is a key exploratory tool in network analysis and has received much attention in recent years. NMM (Newman's mixture model) is one of the best models for exploring a range of network structures, including community structure, bipartite and core-periphery structures, etc. However, NMM needs to know the number of communities in advance. Therefore, in this study, we propose an entropy-regularized mixture model (called EMM), which is capable of simultaneously inferring the number of communities and identifying the network structure. In the model, by minimizing the entropy of the mixing coefficients of NMM within an EM (expectation-maximization) solution, small clusters containing little information can be discarded step by step. An empirical study on both synthetic and real networks shows that the proposed EMM is superior to state-of-the-art methods.

  12. Functional dissociation between regularity encoding and deviance detection along the auditory hierarchy.

    Science.gov (United States)

    Aghamolaei, Maryam; Zarnowiec, Katarzyna; Grimm, Sabine; Escera, Carles

    2016-02-01

    Auditory deviance detection based on regularity encoding appears as one of the basic functional properties of the auditory system. It has traditionally been assessed with the mismatch negativity (MMN) long-latency component of the auditory evoked potential (AEP). Recent studies have found earlier correlates of deviance detection based on regularity encoding. They occur in humans in the first 50 ms after sound onset, at the level of the middle-latency response of the AEP, and parallel findings of stimulus-specific adaptation observed in animal studies. However, the functional relationship between these different levels of regularity encoding and deviance detection along the auditory hierarchy has not yet been clarified. Here we addressed this issue by examining deviant-related responses at different levels of the auditory hierarchy to stimulus changes varying in their degree of deviation regarding the spatial location of a repeated standard stimulus. Auditory stimuli were presented randomly from five loudspeakers at azimuthal angles of 0°, 12°, 24°, 36° and 48° during oddball and reversed-oddball conditions. Middle-latency responses and MMN were measured. Our results revealed that middle-latency responses were sensitive to deviance but not the degree of deviation, whereas the MMN amplitude increased as a function of deviance magnitude. These findings indicated that acoustic regularity can be encoded at the level of the middle-latency response but that it takes a higher step in the auditory hierarchy for deviance magnitude to be encoded, thus providing a functional dissociation between regularity encoding and deviance detection along the auditory hierarchy. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  13. From Matched Spatial Filtering towards the Fused Statistical Descriptive Regularization Method for Enhanced Radar Imaging

    Directory of Open Access Journals (Sweden)

    Shkvarko Yuriy

    2006-01-01

We address a new approach to solving the ill-posed nonlinear inverse problem of high-resolution numerical reconstruction of the spatial spectrum pattern (SSP) of the backscattered wavefield sources distributed over the remotely sensed scene. An array or synthesized-array radar (SAR) that employs digital data signal processing is considered. By combining the statistical minimum-risk estimation paradigm with numerical descriptive regularization techniques, we develop a new fused statistical descriptive regularization (SDR) strategy for enhanced radar imaging. Pursuing this approach, we establish a family of SDR-related SSP estimators that encompasses a manifold of existing beamforming techniques, ranging from the traditional matched filter to robust and adaptive spatial filtering and minimum-variance methods.

  14. Enhanced spatial resolution in fluorescence molecular tomography using restarted L1-regularized nonlinear conjugate gradient algorithm.

    Science.gov (United States)

    Shi, Junwei; Liu, Fei; Zhang, Guanglei; Luo, Jianwen; Bai, Jing

    2014-04-01

Owing to the high degree of scattering of light through tissue, the ill-posedness of the fluorescence molecular tomography (FMT) inverse problem results in relatively low spatial resolution in the reconstruction. Unlike L2 regularization, L1 regularization can preserve details and reduce noise effectively. Reconstruction is obtained through a restarted L1-regularization-based nonlinear conjugate gradient (re-L1-NCG) algorithm, which has been proven to increase computational speed with low memory consumption. The algorithm consists of inner and outer iterations. In the inner iteration, L1-NCG is used to obtain the L1-regularized results. In the outer iteration, a restart strategy is used to increase the convergence speed of L1-NCG. To demonstrate the performance of re-L1-NCG in terms of spatial resolution, simulation and physical phantom studies with fluorescent targets located at different edge-to-edge distances were carried out. The reconstruction results show that the re-L1-NCG algorithm can resolve targets with an edge-to-edge distance of 0.1 cm at a depth of 1.5 cm, which is a significant improvement for FMT.

  15. The regularized monotonicity method: detecting irregular indefinite inclusions

    DEFF Research Database (Denmark)

    Garde, Henrik; Staboulis, Stratos

    2018-01-01

…inclusions, where the conductivity distribution has both more and less conductive parts relative to the background conductivity; one such method is the monotonicity method of Harrach, Seo, and Ullrich. We formulate the method for irregular indefinite inclusions, meaning that we make no regularity assumptions…

  16. EEG/MEG Source Reconstruction with Spatial-Temporal Two-Way Regularized Regression

    KAUST Repository

    Tian, Tian Siva

    2013-07-11

    In this work, we propose a spatial-temporal two-way regularized regression method for reconstructing neural source signals from EEG/MEG time course measurements. The proposed method estimates the dipole locations and amplitudes simultaneously through minimizing a single penalized least squares criterion. The novelty of our methodology is the simultaneous consideration of three desirable properties of the reconstructed source signals, that is, spatial focality, spatial smoothness, and temporal smoothness. The desirable properties are achieved by using three separate penalty functions in the penalized regression framework. Specifically, we impose a roughness penalty in the temporal domain for temporal smoothness, and a sparsity-inducing penalty and a graph Laplacian penalty in the spatial domain for spatial focality and smoothness. We develop a computational efficient multilevel block coordinate descent algorithm to implement the method. Using a simulation study with several settings of different spatial complexity and two real MEG examples, we show that the proposed method outperforms existing methods that use only a subset of the three penalty functions. © 2013 Springer Science+Business Media New York.
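A single-time-point caricature of such a multi-penalty criterion can be written down directly. The sketch below keeps the sparsity-inducing (l1) and graph-Laplacian penalties and drops the temporal roughness term, solving the problem by proximal gradient descent; the lead field G, the chain Laplacian, and all penalty weights are invented for illustration and are not the authors' settings.

```python
import numpy as np

def solve_source(G, y, L, lam1, lam2, iters=300):
    """Sketch: min ||y - G x||^2 + lam1*||x||_1 + lam2 * x^T L x via proximal
    gradient; the l1 term favors focal sources, the Laplacian term smooth ones."""
    p = G.shape[1]
    lip = 2 * np.linalg.norm(G, 2) ** 2 + 2 * lam2 * np.linalg.norm(L, 2)
    t = 1.0 / lip                              # step size from the Lipschitz bound
    x = np.zeros(p)
    for _ in range(iters):
        grad = 2 * G.T @ (G @ x - y) + 2 * lam2 * (L @ x)
        z = x - t * grad
        x = np.sign(z) * np.maximum(np.abs(z) - t * lam1, 0.0)   # prox of lam1*||.||_1
    return x

# Chain-graph Laplacian over p "dipoles" encourages spatially smooth activity.
p = 15
L = 2 * np.eye(p) - np.eye(p, k=1) - np.eye(p, k=-1)
rng = np.random.default_rng(6)
G = rng.standard_normal((10, p))               # underdetermined toy lead field
x_true = np.zeros(p); x_true[6:9] = 1.0        # a focal, smooth source patch
y = G @ x_true + 0.01 * rng.standard_normal(10)
x_hat = solve_source(G, y, L, lam1=0.05, lam2=0.5)
```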

  17. Unsupervised seismic facies analysis with spatial constraints using regularized fuzzy c-means

    Science.gov (United States)

    Song, Chengyun; Liu, Zhining; Cai, Hanpeng; Wang, Yaojun; Li, Xingming; Hu, Guangmin

    2017-12-01

Seismic facies analysis techniques combine classification algorithms and seismic attributes to generate a map that describes the main reservoir heterogeneities. However, most current classification algorithms treat the seismic attributes as isolated data regardless of their spatial locations, and the resulting map is generally sensitive to noise. In this paper, a regularized fuzzy c-means (RegFCM) algorithm is used for unsupervised seismic facies analysis. Owing to the regularized term of the RegFCM algorithm, data whose adjacent locations belong to the same class play a more important role in the iterative process than other data. Therefore, this method can reduce the effect of seismic noise in discontinuous regions. Synthetic data with different signal-to-noise ratios are used to demonstrate the noise tolerance of the RegFCM algorithm. Meanwhile, the fuzzy factor, the neighbour window size and the regularized weight are tested over various values, to provide a reference for how to set these parameters. The new approach is also applied to a real seismic data set from the F3 block of the Netherlands. The results show improved spatial continuity, with clear facies boundaries and channel morphology, which reveals that the method is an effective seismic facies analysis tool.

  18. Topological chaos of the spatial prisoner's dilemma game on regular networks.

    Science.gov (United States)

    Jin, Weifeng; Chen, Fangyue

    2016-02-21

The spatial version of the evolutionary prisoner's dilemma on an infinitely large regular lattice, with purely deterministic strategies and no memory among players, is investigated in this paper. Statistical inference confirms that the frequency of cooperation, which characterizes the game's macroscopic behavior, is very sensitive to the initial conditions, the most practically significant property of chaos. Its intrinsic complexity is then justified on firm ground from the theory of symbolic dynamics; that is, the game is topologically mixing and possesses positive topological entropy on its subsystems. It follows that its frequency of cooperation cannot be obtained by simply averaging over several steps after the game reaches the equilibrium state. Furthermore, the chaotically changing spatial patterns observed empirically can be defined and justified in terms of symbolic dynamics. The procedure proposed in this work is also applicable to other deterministic spatial evolutionary games. Copyright © 2015 Elsevier Ltd. All rights reserved.
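For concreteness, a deterministic spatial prisoner's dilemma of the kind studied here can be simulated in a few lines: a Nowak–May-style imitate-the-best update on a torus. The lattice size, initial cooperator density, and temptation b = 1.65 are arbitrary choices, not the paper's settings.

```python
import numpy as np

def step(S, b):
    """One deterministic round on a torus: cooperators (1) earn 1 per cooperating
    neighbor, defectors (0) earn b per cooperating neighbor; every site then
    adopts the strategy of the best-scoring site in its Moore neighborhood
    (including itself)."""
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    coop_nbrs = sum(np.roll(np.roll(S, dx, 0), dy, 1) for dx, dy in shifts)
    payoff = np.where(S == 1, coop_nbrs.astype(float), b * coop_nbrs)
    best, best_strat = payoff.copy(), S.copy()
    for dx, dy in shifts:
        p = np.roll(np.roll(payoff, dx, 0), dy, 1)
        s = np.roll(np.roll(S, dx, 0), dy, 1)
        better = p > best
        best = np.where(better, p, best)
        best_strat = np.where(better, s, best_strat)
    return best_strat

rng = np.random.default_rng(3)
S = (rng.random((30, 30)) < 0.9).astype(int)   # 90% cooperators initially
freqs = []
for _ in range(20):
    S = step(S, b=1.65)
    freqs.append(S.mean())                      # cooperation frequency per round
```

Tracking `freqs` round by round (rather than a single long-run average) is exactly what the abstract argues for, since the trajectory never settles into a simple equilibrium.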

  19. Detecting regularities in soccer dynamics: A T-pattern approach

    Directory of Open Access Journals (Sweden)

    Valentino Zurloni

    2014-01-01

Game dynamics in professional soccer matches is a complex phenomenon that has not been handled satisfactorily by the traditional quantitative approaches used in team sports. The aim of this study is to detect this dynamics through temporal pattern analysis; specifically, to reveal the hidden but stable structures underlying the interactive situations that shape attacking actions in soccer. The methodological approach is based on an observational design, supported by digital recordings and computerized analysis. The data were analyzed with the Theme 6 beta software, which detects the temporal and sequential structure of data series, revealing patterns that occur repeatedly, regularly or irregularly, within an observation period. Theme detected many temporal patterns (T-patterns) in the soccer matches analyzed. Notable differences were found between won and lost matches: the number of distinct T-patterns detected was higher for lost matches and lower for won ones, whereas the number of coded events was similar. Theme and T-patterns extend research possibilities beyond frequency-based performance analysis, making this methodology effective for research and a procedural support in sport analysis. Our results indicate that further research is needed on possible connections between the detection of these temporal structures and human observations of soccer performance. This approach would support both team members and coaches, allowing a better understanding of game dynamics and providing information that traditional methods do not offer.

  20. Analysis of Regularly and Irregularly Sampled Spatial, Multivariate, and Multi-temporal Data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    1994-01-01

This thesis describes different methods that are useful in the analysis of multivariate data. Some methods focus on spatial data (sampled regularly or irregularly), others focus on multitemporal data or data from multiple sources. The thesis covers selected, not all, aspects of relevant data… …-variograms are described. As a new way of setting up a well-balanced kriging support, the Delaunay triangulation is suggested. Two case studies show the usefulness of 2-D semivariograms of geochemical data from areas in central Spain (with a geologist's comment) and South Greenland, and kriging/cokriging of an undersampled… …are considered as repetitions. Three case studies show the strength of the methods; one uses SPOT High Resolution Visible (HRV) multispectral (XS) data covering economically important pineapple and coffee plantations near Thika, Kiambu District, Kenya; the other two use Landsat Thematic Mapper (TM) data covering…

  1. Detecting spatial regimes in ecosystems

    Science.gov (United States)

Research on early warning indicators has generally focused on assessing temporal transitions, with limited application of these methods to detecting spatial regimes. Traditional spatial boundary detection procedures that result in ecoregion maps are typically based on ecological potential (i.e. potential vegetation), and often fail to account for ongoing changes due to stressors such as land use change and climate change and their effects on plant and animal communities. We use Fisher information, an information theory-based method, on both terrestrial and aquatic animal data (U.S. Breeding Bird Survey and marine zooplankton) to identify ecological boundaries, and compare our results to traditional early warning indicators, conventional ecoregion maps, and multivariate analyses such as nMDS (non-metric multidimensional scaling) and cluster analysis. We successfully detect spatial regimes and transitions in both terrestrial and aquatic systems using Fisher information. Furthermore, Fisher information provides explicit spatial information about community change that is absent from other multivariate approaches. Our results suggest that defining spatial regimes based on animal communities may better reflect ecological reality than do traditional ecoregion maps, especially in our current era of rapid and unpredictable ecological change.

  2. Spatial cluster detection using dynamic programming

    Directory of Open Access Journals (Sweden)

    Sverchkov Yuriy

    2012-03-01

Background: The task of spatial cluster detection involves finding spatial regions where some property deviates from the norm or the expected value. In a probabilistic setting this task can be expressed as finding a region where some event is significantly more likely than usual. Spatial cluster detection is of interest in fields such as biosurveillance, mining of astronomical data, military surveillance, and analysis of fMRI images. In almost all such applications we are interested both in whether a cluster exists in the data and, if it does, in finding the most accurate characterization of the cluster. Methods: We present a general dynamic programming algorithm for grid-based spatial cluster detection. The algorithm can be used both for Bayesian maximum a-posteriori (MAP) estimation of the most likely spatial distribution of clusters and for Bayesian model averaging over a large space of spatial cluster distributions to compute the posterior probability of an unusual spatial clustering. The algorithm is explained and evaluated in the context of a biosurveillance application, specifically the detection and identification of influenza outbreaks based on emergency department visits. A relatively simple underlying model is constructed for the purpose of evaluating the algorithm, and the algorithm is evaluated using the model and semi-synthetic test data. Results: When compared to baseline methods, tests indicate that the new algorithm can improve MAP estimates under certain conditions: the greedy algorithm we compared our method to was found to be more sensitive to smaller outbreaks, while as the size of the outbreaks increases, in terms of area affected and proportion of individuals affected, our method overtakes the greedy algorithm in spatial precision and recall. The new algorithm performs on par with baseline methods in the task of Bayesian model averaging. Conclusions: We conclude that the dynamic programming algorithm…
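The flavor of grid-based spatial cluster detection can be conveyed with a brute-force rectangle scan. This simple standardized-excess score is a stand-in for the Bayesian dynamic-programming machinery of the paper, and the grid, baseline, and planted outbreak are synthetic.

```python
import numpy as np

def best_rectangle(counts, baseline):
    """Brute-force scan over axis-aligned rectangles: return (r0, r1, c0, c1,
    score) maximizing the standardized excess (observed - expected)/sqrt(expected).
    A stand-in for the Bayesian dynamic-programming search described above."""
    R, C = counts.shape
    best = (0, 0, 0, 0, -np.inf)
    for r0 in range(R):
        for r1 in range(r0 + 1, R + 1):
            for c0 in range(C):
                for c1 in range(c0 + 1, C + 1):
                    obs = counts[r0:r1, c0:c1].sum()
                    exp = baseline[r0:r1, c0:c1].sum()
                    score = (obs - exp) / np.sqrt(exp)
                    if score > best[4]:
                        best = (r0, r1, c0, c1, score)
    return best

rng = np.random.default_rng(4)
baseline = np.full((12, 12), 5.0)                 # expected counts per grid cell
counts = rng.poisson(baseline).astype(float)
counts[3:6, 7:10] += 12                           # planted "outbreak" region
r0, r1, c0, c1, score = best_rectangle(counts, baseline)
```

The exhaustive scan is O(R²C²) rectangles; the paper's dynamic programming exists precisely to search far larger spaces of cluster configurations efficiently.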

  3. SPATIAL MODELING OF SOLID-STATE REGULAR POLYHEDRA (SOLIDS OF PLATON) IN AUTOCAD SYSTEM

    Directory of Open Access Journals (Sweden)

    P. V. Bezditko

    2009-03-01

This article describes the technology of modeling regular polyhedra by graphic methods. The authors conclude that the extrusion method is best suited to creating solid models of regular polyhedra.

  4. An Improved Fitness Evaluation Mechanism with Memory in Spatial Prisoner's Dilemma Game on Regular Lattices

    International Nuclear Information System (INIS)

    Wang Juan; Liu Li-Na; Dong En-Zeng; Wang Li

    2013-01-01

To deeply understand the emergence of cooperation in natural, social and economic systems, we present an improved fitness evaluation mechanism with memory in the spatial prisoner's dilemma game on regular lattices. In our model, the individual fitness is determined not only by the payoff in the current game round, but also by the payoffs in previous rounds. A tunable parameter, termed the memory strength (μ), which lies between 0 and 1, is introduced to regulate the ratio of current and previous payoffs in the individual fitness calculation. When μ = 0, our model reduces to the standard prisoner's dilemma game, while μ = 1 represents the case in which fitness is totally determined by the initial strategies, which is far from realistic. Extensive numerical simulations indicate that the memory effect can substantially promote the evolution of cooperation. For μ < 1, the stronger the memory effect, the higher the cooperation level; μ = 1 leads to a pathological state of cooperation, although it can partially enhance cooperation at very large temptation parameters. These results are of great significance for understanding the role of memory during the evolution of cooperation among selfish players.
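One plausible reading of the memory-weighted fitness, a linear blend of the current payoff and the mean of previous payoffs controlled by μ, can be written as a one-liner. The exact functional form used in the paper may differ, so treat this as an assumption.

```python
def fitness(current_payoff, previous_payoffs, mu):
    """Blend the current round's payoff with the average payoff of previous
    rounds, weighted by memory strength mu in [0, 1]:
    mu = 0 -> current payoff only (standard game); mu = 1 -> history only."""
    if not previous_payoffs:           # no history yet: fall back to current
        return current_payoff
    past = sum(previous_payoffs) / len(previous_payoffs)
    return (1.0 - mu) * current_payoff + mu * past
```

With mu = 0 this recovers the standard prisoner's dilemma fitness, matching the limiting cases in the abstract.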

  5. The patterning of retinal horizontal cells: normalizing the regularity index enhances the detection of genomic linkage

    Directory of Open Access Journals (Sweden)

    Patrick W. Keeley

    2014-10-01

Retinal neurons are often arranged as non-random distributions called mosaics, as their somata minimize proximity to neighboring cells of the same type. The horizontal cells are one example of such a mosaic, but little is known about the developmental mechanisms that underlie their patterning. To identify genes involved in this process, we have used three different spatial statistics to assess the patterning of the horizontal cell mosaic across a panel of genetically distinct recombinant inbred strains. To avoid the confounding effect of cell density, which varies two-fold across these strains, we computed the real/random regularity ratio, expressing the regularity of a mosaic relative to a randomly distributed simulation of similarly sized cells. To test whether this latter statistic better reflects the variation in biological processes that contribute to horizontal cell spacing, we then compared the genetic linkage for each of the two traits, the regularity index and the real/random regularity ratio, each computed from the distribution of nearest-neighbor (NN) distances and from the Voronoi domain (VD) areas. Finally, we compared each of these analyses with another index of patterning, the packing factor. Variation in the regularity indexes, their real/random regularity ratios, and the packing factor mapped quantitative trait loci (QTL) to the distal ends of Chromosomes 1 and 14. For the NN and VD analyses, we found that the degree of linkage was greater when using the real/random regularity ratio rather than the respective regularity index. Using informatic resources, we narrow the list of prospective genes positioned at these two intervals to a small collection of six genes that warrant further investigation to determine their potential role in shaping the patterning of the horizontal cell mosaic.
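The regularity index and the real/random regularity ratio based on nearest-neighbor distances are easy to compute. The sketch below ignores soma size and boundary effects and simulates randomness as uniform points in the unit square, a simplification of the authors' density-matched simulations.

```python
import numpy as np

def nn_distances(pts):
    """Distance from each point to its nearest neighbor (brute force)."""
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return d.min(axis=1)

def regularity_index(pts):
    """Mean / standard deviation of nearest-neighbor distances."""
    nn = nn_distances(pts)
    return nn.mean() / nn.std()

def real_random_ratio(pts, n_sims=50, rng=None):
    """Regularity index of the real mosaic divided by the mean index of random
    simulations with the same number of points in the unit square."""
    rng = np.random.default_rng(0) if rng is None else rng
    sims = [regularity_index(rng.random((len(pts), 2))) for _ in range(n_sims)]
    return regularity_index(pts) / np.mean(sims)

# A jittered grid (mosaic-like) versus a fully random point set of equal size:
rng = np.random.default_rng(5)
g = np.stack(np.meshgrid(np.linspace(0.05, 0.95, 10),
                         np.linspace(0.05, 0.95, 10)), -1).reshape(-1, 2)
mosaic = g + 0.01 * rng.standard_normal(g.shape)
random_pts = rng.random((100, 2))
ri_mosaic = regularity_index(mosaic)
rrr_mosaic = real_random_ratio(mosaic)
```

The normalization is the point of the abstract: the ratio discounts the index's dependence on density, so mosaics of different densities become comparable.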

  6. Filter Bank Regularized Common Spatial Pattern Ensemble for Small Sample Motor Imagery Classification.

    Science.gov (United States)

    Park, Sang-Hoon; Lee, David; Lee, Sang-Goog

    2018-02-01

    For the last few years, many feature extraction methods have been proposed based on biological signals. Among these, brain signals have the advantage that they can be obtained even from people with peripheral nervous system damage. Motor imagery electroencephalograms (EEG) are inexpensive to measure, offer a high temporal resolution, and are intuitive. Therefore, these have received a significant amount of attention in various fields, including signal processing, cognitive science, and medicine. The common spatial pattern (CSP) algorithm is a useful method for feature extraction from motor imagery EEG. However, performance degradation occurs in a small-sample setting (SSS), because the CSP depends on sample-based covariance. Since the active frequency range is different for each subject, it is also inconvenient to set the frequency range differently every time. In this paper, we propose a feature extraction method based on a filter bank to solve these problems. The proposed method consists of five steps. First, the motor imagery EEG is divided using a filter bank. Second, the regularized CSP (R-CSP) is applied to the divided EEG. Third, we select the features according to mutual information based on the individual feature algorithm. Fourth, parameter sets are selected for the ensemble. Finally, we classify using an ensemble based on the selected features. The brain-computer interface competition III data set IVa is used to evaluate the performance of the proposed method. The proposed method improves the mean classification accuracy by 12.34%, 11.57%, 9%, 4.95%, and 4.47% compared with CSP, SR-CSP, R-CSP, filter bank CSP (FBCSP), and SR-FBCSP. Compared with the filter bank R-CSP ( , ), which is a parameter selection version of the proposed method, the classification accuracy is improved by 3.49%. In particular, the proposed method shows a large improvement in performance in the SSS.
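
    The covariance-shrinkage and whitening core of a regularized CSP step can be sketched as follows. This is a minimal illustration, not the authors' implementation; the shrinkage form and the parameter names (e.g. `gamma`) are assumptions.

    ```python
    import numpy as np

    def regularized_csp(trials_a, trials_b, gamma=0.1, n_filters=2):
        """Regularized CSP sketch: trials are (channels x samples) arrays.

        Each class covariance is shrunk toward a scaled identity (the
        regularization that stabilizes small-sample estimates), then the
        composite covariance is whitened and the class-A covariance is
        diagonalized in the whitened space, which solves the underlying
        generalized eigenproblem. Returns spatial filters as columns.
        """
        def class_cov(trials):
            covs = []
            for x in trials:
                c = x @ x.T
                covs.append(c / np.trace(c))      # trial-normalized covariance
            c = np.mean(covs, axis=0)
            d = c.shape[0]
            return (1.0 - gamma) * c + gamma * (np.trace(c) / d) * np.eye(d)

        ca, cb = class_cov(trials_a), class_cov(trials_b)
        evals, evecs = np.linalg.eigh(ca + cb)
        p = np.diag(evals ** -0.5) @ evecs.T      # whitening matrix
        w_evals, w_evecs = np.linalg.eigh(p @ ca @ p.T)
        w = p.T @ w_evecs                         # columns = spatial filters
        order = np.argsort(w_evals)
        # keep filters with extreme variance ratios for the two classes
        keep = np.concatenate([order[:n_filters], order[-n_filters:]])
        return w[:, keep]
    ```

    Log-variances of the filtered trials would then serve as features for the classifier, one set per filter-bank sub-band.
    
    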

  7. Analysis of absence seizure generation using EEG spatial-temporal regularity measures.

    Science.gov (United States)

    Mammone, Nadia; Labate, Domenico; Lay-Ekuakille, Aime; Morabito, Francesco C

    2012-12-01

    Epileptic seizures are thought to be generated and to evolve through an underlying anomaly of synchronization in the activity of groups of neuronal populations. The related dynamic scenario of state transitions is revealed by detecting changes in the dynamical properties of Electroencephalography (EEG) signals. The recruitment procedure ending with the crisis can be explored through a spatial-temporal plot from which to extract suitable descriptors that are able to monitor and quantify the evolving synchronization level from the EEG tracings. In this paper, a spatial-temporal analysis of EEG recordings based on the concept of permutation entropy (PE) is proposed. The performance of PE is tested on a database of 24 patients affected by absence (generalized) seizures. The results achieved are compared to the dynamical behavior of the EEG of 40 healthy subjects. Since PE is a feature that depends on two parameters, an extensive study of the sensitivity of its performance to the parameter settings was carried out on scalp EEG. Once the optimal PE configuration was determined, its ability to detect the different brain states was evaluated. According to the results here presented, it seems that the widely accepted model of "jump" transition to absence seizure should in some cases be coupled with (or substituted by) a gradual transition model characteristic of self-organizing networks. Indeed, it appears that the transition to the epileptic status is heralded before the preictal state, ever since the interictal stages. As a matter of fact, within the limits of the analyzed database, the frontal-temporal scalp areas appear consistently associated with higher PE levels than the remaining electrodes, whereas the parieto-occipital areas appear associated with lower PE values. The EEG of healthy subjects neither shows any similar dynamic behavior nor exhibits any recurrent portrait in PE topography.
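
    Permutation entropy itself is straightforward to compute. A minimal sketch follows, with the two parameters the study tunes (the pattern order and the delay) exposed as arguments; the function name and defaults are illustrative, not taken from the paper.

    ```python
    import math

    def permutation_entropy(signal, order=3, delay=1):
        """Normalized permutation entropy of a 1-D signal.

        Counts the relative frequencies of ordinal patterns of length
        `order` (samples taken every `delay` steps) and returns the
        Shannon entropy of that distribution, normalized to [0, 1] by
        dividing by log(order!).
        """
        n = len(signal) - (order - 1) * delay
        if n <= 0:
            raise ValueError("signal too short for this order/delay")
        counts = {}
        for i in range(n):
            window = [signal[i + j * delay] for j in range(order)]
            # ordinal pattern: index ranks of the samples in the window
            pattern = tuple(sorted(range(order), key=lambda k: window[k]))
            counts[pattern] = counts.get(pattern, 0) + 1
        probs = [c / n for c in counts.values()]
        h = -sum(p * math.log(p) for p in probs)
        return h / math.log(math.factorial(order))
    ```

    A monotonic signal yields a single ordinal pattern and hence zero entropy, while an unpredictable signal approaches 1; this is the "regularity" scale the sliding-window EEG analysis tracks.
    
    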

  8. Iterative Method of Regularization with Application of Advanced Technique for Detection of Contours

    International Nuclear Information System (INIS)

    Niedziela, T.; Stankiewicz, A.

    2000-01-01

    This paper proposes a novel iterative method of regularization with application of an advanced technique for detection of contours. To eliminate noise, the properties of convolution of functions are utilized. The method can be implemented in a simple cellular neural network, which creates the possibility of extracting contours with automatic image recognition equipment. (author)

  9. Limited angle CT reconstruction by simultaneous spatial and Radon domain regularization based on TV and data-driven tight frame

    Science.gov (United States)

    Zhang, Wenkun; Zhang, Hanming; Wang, Linyuan; Cai, Ailong; Li, Lei; Yan, Bin

    2018-02-01

    Limited angle computed tomography (CT) reconstruction is widely performed in medical diagnosis and industrial testing because of the size of objects, engine/armor inspection requirements, and limited scan flexibility. Limited angle reconstruction necessitates the use of optimization-based methods that utilize additional sparse priors. However, most conventional methods solely exploit sparsity priors of the spatial domain. When the CT projection suffers from serious data deficiency or various noises, obtaining reconstructed images that meet quality requirements becomes difficult and challenging. To solve this problem, this paper develops an adaptive reconstruction method for the limited angle CT problem. The proposed method simultaneously uses a spatial and Radon domain regularization model based on total variation (TV) and a data-driven tight frame. The data-driven tight frame, derived from wavelet transformation, aims at exploiting sparsity priors of the sinogram in the Radon domain. Unlike existing works that utilize a pre-constructed sparse transformation, the framelets of the data-driven regularization model can be adaptively learned from the latest projection data in the process of iterative reconstruction to provide optimal sparse approximations for a given sinogram. At the same time, an effective alternating direction method is designed to solve the simultaneous spatial and Radon domain regularization model. Experiments on both simulated and real data demonstrate that the proposed algorithm performs better in artifact suppression and detail preservation than algorithms using only the spatial domain regularization model. Quantitative evaluations of the results also indicate that the proposed algorithm, applying the learning strategy, performs better than dual domain algorithms without a learned regularization model.

  10. WE-FG-207B-03: Multi-Energy CT Reconstruction with Spatial Spectral Nonlocal Means Regularization

    Energy Technology Data Exchange (ETDEWEB)

    Li, B [University of Texas Southwestern Medical Center, Dallas, TX (United States); Southern Medical University, Guangzhou, Guangdong (China); Shen, C; Ouyang, L; Yang, M; Jiang, S; Jia, X [University of Texas Southwestern Medical Center, Dallas, TX (United States); Zhou, L [Southern Medical University, Guangzhou, Guangdong (China)

    2016-06-15

    Purpose: Multi-energy computed tomography (MECT) is an emerging application in medical imaging due to its ability to differentiate materials and its potential for molecular imaging. In MECT, image correlations exist across different spatial locations and energy channels. It is desirable to incorporate these correlations in reconstruction to improve image quality. For this purpose, this study proposes a MECT reconstruction technique that employs spatial spectral non-local means (ssNLM) regularization. Methods: We consider a kVp-switching scanning method in which the source energy is rapidly switched during data acquisition. For each energy channel, this yields projection data acquired at a number of angles, whereas the projection angles differ among channels. We formulate the reconstruction task as an optimization problem. A least-squares term enforces data fidelity. An ssNLM term is used as regularization to encourage similarities among image patches at different spatial locations and channels. When comparing image patches at different channels, intensity differences were corrected by a transformation estimated via histogram equalization during the reconstruction process. Results: We tested our method in a simulation study with an NCAT phantom and an experimental study with a Gammex phantom. For comparison purposes, we also performed reconstructions using the conjugate-gradient least squares (CGLS) method and a conventional NLM method that only considers spatial correlation within an image. ssNLM is able to better suppress streak artifacts. The streaks run along different projection directions in images at different channels; ssNLM discourages this dissimilarity and hence removes them. True image structures are preserved in this process. Measurements in regions of interest yield 1.1 to 3.2 and 1.5 to 1.8 times higher contrast-to-noise ratio than the NLM approach. Improvements over CGLS are even more profound due to the lack of regularization in the CGLS method and hence amplified noise. Conclusion: The

  11. Detecting regular sound changes in linguistics as events of concerted evolution.

    Science.gov (United States)

    Hruschka, Daniel J; Branford, Simon; Smith, Eric D; Wilkins, Jon; Meade, Andrew; Pagel, Mark; Bhattacharya, Tanmoy

    2015-01-05

    Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  12. Spatial Outlier Detection of CO2 Monitoring Data Based on Spatial Local Outlier Factor

    OpenAIRE

    Liu Xin; Zhang Shaoliang; Zheng Pulin

    2015-01-01

    Spatial local outlier factor (SLOF) algorithm was adopted in this study for spatial outlier detection because of the limitations of the traditional static threshold detection. Based on the spatial characteristics of CO2 monitoring data obtained in the carbon capture and storage (CCS) project, the K-Nearest Neighbour (KNN) graph was constructed using the latitude and longitude information of the monitoring points to identify the spatial neighbourhood of the monitoring points. Then ...

  13. Self-Organisation in Spatial Systems-From Fractal Chaos to Regular Patterns and Vice Versa.

    Directory of Open Access Journals (Sweden)

    Michal Banaszak

    Full Text Available This study offers a new perspective on the evolutionary patterns of cities or urban agglomerations. Such developments can range from chaotic to fully ordered. We demonstrate that in a dynamic space of interactive human behaviour cities produce a wealth of gravitational attractors whose size and shape depend on the resistance of space emerging inter alia from transport friction costs. This finding offers original insights into the complex evolution of spatial systems and appears to be consistent with the principles of central place theory known from the spatial sciences and geography. Our approach is dynamic in nature and forms a generalisation of hierarchical principles in geographic space.

  14. The Regularized Iteratively Reweighted MAD Method for Change Detection in Multi- and Hyperspectral Data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2007-01-01

    This paper describes new extensions to the previously published multivariate alteration detection (MAD) method for change detection in bi-temporal, multi- and hypervariate data such as remote sensing imagery. Much like boosting methods often applied in data mining work, the iteratively reweighted...... to observations that show little change, i.e., for which the sum of squared, standardized MAD variates is small, and small weights are assigned to observations for which the sum is large. Like the original MAD method, the iterative extension is invariant to linear (affine) transformations of the original...... an agricultural region in Kenya, and hyperspectral airborne HyMap data from a small rural area in southeastern Germany are given. The latter case demonstrates the need for regularization....

  15. Regularized non-stationary morphological reconstruction algorithm for weak signal detection in microseismic monitoring: methodology

    Science.gov (United States)

    Huang, Weilin; Wang, Runqiu; Chen, Yangkang

    2018-05-01

    The microseismic signal is typically weak compared with the strong background noise. In order to effectively detect the weak signal in microseismic data, we propose a mathematical morphology based approach. We decompose the initial data into several morphological multiscale components. For detection of the weak signal, a non-stationary weighting operator is proposed and introduced into the reconstruction of the data from the morphological multiscale components. The non-stationary weighting operator can be obtained by solving an inversion problem. The regularized non-stationary method can be understood as a non-stationary matching filtering method, where the matching filter has the same size as the data to be filtered. We provide detailed algorithmic descriptions and analysis: the algorithm framework, parameter selection and computational issues for the regularized non-stationary morphological reconstruction (RNMR) method are presented. We validate the presented method through a comprehensive analysis of different data examples. We first test the proposed technique using a synthetic data set. Then the proposed technique is applied to a field project, where the signals induced by hydraulic fracturing are recorded by 12 three-component geophones in a monitoring well. The result demonstrates that RNMR can improve the detectability of the weak microseismic signals. Using the processed data, the short-term-average over long-term-average (STA/LTA) picking algorithm and Geiger's method are applied to obtain new locations of microseismic events. In addition, we show that the proposed RNMR method can be used not only on microseismic data but also on reflection seismic data to detect weak signals. We also discuss the extension of RNMR from 1-D to 2-D or higher dimensions.
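
    The short-term-average over long-term-average picking step applied to the processed data is a classic energy-ratio trigger. A minimal sketch follows; the window lengths and threshold are illustrative assumptions, not the paper's settings.

    ```python
    import numpy as np

    def sta_lta_pick(trace, n_sta, n_lta, threshold):
        """Classic STA/LTA onset picker (sketch).

        Returns the first sample index where the ratio of the short-term
        average (the n_sta samples ahead of i) to the long-term average
        (the n_lta samples behind i) of signal energy exceeds
        `threshold`, or None if no trigger occurs.
        """
        energy = np.asarray(trace, dtype=float) ** 2
        csum = np.concatenate([[0.0], np.cumsum(energy)])  # prefix sums
        for i in range(n_lta, len(energy) - n_sta):
            sta = (csum[i + n_sta] - csum[i]) / n_sta
            lta = (csum[i] - csum[i - n_lta]) / n_lta
            if lta > 0 and sta / lta > threshold:
                return i
        return None
    ```

    The prefix-sum formulation keeps the scan linear in the trace length; on a denoised trace the energy ratio jumps at the event onset, which is exactly the detectability improvement the reconstruction aims at.
    
    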

  16. Cardiac C-arm computed tomography using a 3D + time ROI reconstruction method with spatial and temporal regularization

    Energy Technology Data Exchange (ETDEWEB)

    Mory, Cyril, E-mail: cyril.mory@philips.com [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Lyon 1, F-69621 Villeurbanne Cedex (France); Philips Research Medisys, 33 rue de Verdun, 92156 Suresnes (France); Auvray, Vincent; Zhang, Bo [Philips Research Medisys, 33 rue de Verdun, 92156 Suresnes (France); Grass, Michael; Schäfer, Dirk [Philips Research, Röntgenstrasse 24–26, D-22335 Hamburg (Germany); Chen, S. James; Carroll, John D. [Department of Medicine, Division of Cardiology, University of Colorado Denver, 12605 East 16th Avenue, Aurora, Colorado 80045 (United States); Rit, Simon [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Lyon 1 (France); Centre Léon Bérard, 28 rue Laënnec, F-69373 Lyon (France); Peyrin, Françoise [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Lyon 1, F-69621 Villeurbanne Cedex (France); X-ray Imaging Group, European Synchrotron, Radiation Facility, BP 220, F-38043 Grenoble Cedex (France); Douek, Philippe; Boussel, Loïc [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Lyon 1 (France); Hospices Civils de Lyon, 28 Avenue du Doyen Jean Lépine, 69500 Bron (France)

    2014-02-15

    Purpose: Reconstruction of the beating heart in 3D + time in the catheter laboratory using only the available C-arm system would improve diagnosis, guidance, device sizing, and outcome control for intracardiac interventions, e.g., electrophysiology, valvular disease treatment, structural or congenital heart disease. To obtain such a reconstruction, the patient's electrocardiogram (ECG) must be recorded during the acquisition and used in the reconstruction. In this paper, the authors present a 4D reconstruction method aiming to reconstruct the heart from a single sweep 10 s acquisition. Methods: The authors introduce the 4D RecOnstructiOn using Spatial and TEmporal Regularization (short 4D ROOSTER) method, which reconstructs all cardiac phases at once, as a 3D + time volume. The algorithm alternates between a reconstruction step based on conjugate gradient and four regularization steps: enforcing positivity, averaging along time outside a motion mask that contains the heart and vessels, 3D spatial total variation minimization, and 1D temporal total variation minimization. Results: 4D ROOSTER recovers the different temporal representations of a moving Shepp and Logan phantom, and outperforms both ECG-gated simultaneous algebraic reconstruction technique and prior image constrained compressed sensing on a clinical case. It generates 3D + time reconstructions with sharp edges which can be used, for example, to estimate the patient's left ventricular ejection fraction. Conclusions: 4D ROOSTER can be applied for human cardiac C-arm CT, and potentially in other dynamic tomography areas. It can easily be adapted to other problems as regularization is decoupled from projection and back projection.

  17. Cardiac C-arm computed tomography using a 3D + time ROI reconstruction method with spatial and temporal regularization

    International Nuclear Information System (INIS)

    Mory, Cyril; Auvray, Vincent; Zhang, Bo; Grass, Michael; Schäfer, Dirk; Chen, S. James; Carroll, John D.; Rit, Simon; Peyrin, Françoise; Douek, Philippe; Boussel, Loïc

    2014-01-01

    Purpose: Reconstruction of the beating heart in 3D + time in the catheter laboratory using only the available C-arm system would improve diagnosis, guidance, device sizing, and outcome control for intracardiac interventions, e.g., electrophysiology, valvular disease treatment, structural or congenital heart disease. To obtain such a reconstruction, the patient's electrocardiogram (ECG) must be recorded during the acquisition and used in the reconstruction. In this paper, the authors present a 4D reconstruction method aiming to reconstruct the heart from a single sweep 10 s acquisition. Methods: The authors introduce the 4D RecOnstructiOn using Spatial and TEmporal Regularization (short 4D ROOSTER) method, which reconstructs all cardiac phases at once, as a 3D + time volume. The algorithm alternates between a reconstruction step based on conjugate gradient and four regularization steps: enforcing positivity, averaging along time outside a motion mask that contains the heart and vessels, 3D spatial total variation minimization, and 1D temporal total variation minimization. Results: 4D ROOSTER recovers the different temporal representations of a moving Shepp and Logan phantom, and outperforms both ECG-gated simultaneous algebraic reconstruction technique and prior image constrained compressed sensing on a clinical case. It generates 3D + time reconstructions with sharp edges which can be used, for example, to estimate the patient's left ventricular ejection fraction. Conclusions: 4D ROOSTER can be applied for human cardiac C-arm CT, and potentially in other dynamic tomography areas. It can easily be adapted to other problems as regularization is decoupled from projection and back projection.

  18. Redistribution population data across a regular spatial grid according to buildings characteristics

    Science.gov (United States)

    Calka, Beata; Bielecka, Elzbieta; Zdunkiewicz, Katarzyna

    2016-12-01

    Population data are generally provided by state census organisations at predefined census enumeration units. However, these datasets are very often required at user-defined spatial units that differ from the census output levels. A number of population estimation techniques have been developed to address this problem. This article is one such attempt, aimed at improving county-level population estimates by using spatial disaggregation models supported by building characteristics, derived from the national topographic database, and the average area of a flat. The experimental gridded population surface was created for Opatów county, a sparsely populated rural region located in Central Poland. The method relies on the geolocation of population counts in buildings, taking into account building volume and structural building type, and then aggregating the population totals in a 1 km quadrilateral grid. The overall quality of the population distribution surface, expressed by the RMSE, equals 9 persons, while the MAE equals 0.01. We also discovered that nearly 20% of the total county area is unpopulated and that 80% of the people live on 33% of the county territory.
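
    The volume-weighted redistribution at the heart of such a dasymetric approach can be illustrated with a toy sketch. The function and field names are hypothetical; the paper's actual model additionally weights by structural building type and average flat area.

    ```python
    def disaggregate_population(total_population, buildings):
        """Volume-weighted dasymetric disaggregation (sketch).

        `buildings` is a list of dicts with a 'cell' id (the 1 km grid
        cell containing the building) and a 'volume' weight. The county
        total is split proportionally to building volume and then
        aggregated per grid cell, so the cell totals sum back to the
        county total (pycnophylactic property).
        """
        total_volume = sum(b["volume"] for b in buildings)
        grid = {}
        for b in buildings:
            share = total_population * b["volume"] / total_volume
            grid[b["cell"]] = grid.get(b["cell"], 0.0) + share
        return grid
    ```

    
    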

  19. Grid-texture mechanisms in human vision: Contrast detection of regular sparse micro-patterns requires specialist templates.

    Science.gov (United States)

    Baker, Daniel H; Meese, Tim S

    2016-07-27

    Previous work has shown that human vision performs spatial integration of luminance contrast energy, where signals are squared and summed (with internal noise) over area at detection threshold. We tested that model here in an experiment using arrays of micro-pattern textures that varied in overall stimulus area and sparseness of their target elements, where the contrast of each element was normalised for sensitivity across the visual field. We found a power-law improvement in performance with stimulus area, and a decrease in sensitivity with sparseness. While the contrast integrator model performed well when target elements constituted 50-100% of the target area (replicating previous results), observers outperformed the model when texture elements were sparser than this. This result required the inclusion of further templates in our model, selective for grids of various regular texture densities. By assuming a MAX operation across these noisy mechanisms the model also accounted for the increase in the slope of the psychometric function that occurred as texture density decreased. Thus, for the first time, mechanisms that are selective for texture density have been revealed at contrast detection threshold. We suggest that these mechanisms have a role to play in the perception of visual textures.

  20. Drone-based Object Counting by Spatially Regularized Regional Proposal Network

    OpenAIRE

    Hsieh, Meng-Ru; Lin, Yen-Liang; Hsu, Winston H.

    2017-01-01

    Existing counting methods often adopt regression-based approaches and cannot precisely localize the target objects, which hinders further analysis (e.g., high-level understanding and fine-grained classification). In addition, most prior work mainly focuses on counting objects in static environments with fixed cameras. Motivated by the advent of unmanned flying vehicles (i.e., drones), we are interested in detecting and counting objects in such dynamic environments. We propose Layout Prop...

  1. Impacts of memory on a regular lattice for different population sizes with asynchronous update in spatial snowdrift game

    Science.gov (United States)

    Shu, Feng; Liu, Xingwen; Li, Min

    2018-05-01

    Memory is an important factor in the evolution of cooperation in spatial structures. For evolutionary biologists, the problem is often how acts of cooperation can emerge in an evolving system. In the case of the snowdrift game, it has been found that memory can boost the cooperation level for a large cost-to-benefit ratio r, while inhibiting cooperation for small r. Thus, how to enlarge the range of r over which cooperation is enhanced has recently become a hot issue. This paper proposes a new memory-based approach whose core is the following: each agent compares its own historical payoffs within a certain memory size and takes the maximal one as its virtual payoff. Each agent then randomly selects one of its neighbours, compares their virtual payoffs, and imitates the strategy of the better-performing one. Both constant-size and size-varying memory are investigated by means of an asynchronous updating algorithm on regular lattices of different sizes. Simulation results show that this approach effectively enhances the cooperation level in spatial structures and makes a high cooperation level emerge for both small and large r. Moreover, it is discovered that population size has a significant influence on the effects of cooperation.
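
    The virtual-payoff rule and one asynchronous update can be sketched as follows. The payoff values follow the standard snowdrift game; the function and variable names are illustrative, not the authors' code.

    ```python
    import random

    def snowdrift_payoff(s1, s2, r):
        """Pairwise snowdrift payoffs for cost-to-benefit ratio r (0 < r < 1).
        Strategies: 1 = cooperate, 0 = defect."""
        if s1 and s2:
            return 1.0 - r / 2          # cooperators share the cost
        if s1 and not s2:
            return 1.0 - r              # lone cooperator pays the full cost
        if not s1 and s2:
            return 1.0                  # defector free-rides
        return 0.0                      # mutual defection

    def virtual_payoff(history, memory_size):
        """The rule described above: the maximal payoff the agent
        earned within its memory window."""
        return max(history[-memory_size:])

    def imitation_step(strategies, histories, neighbours, agent,
                       memory_size, rng):
        """Asynchronous update: the focal agent compares virtual payoffs
        with one randomly chosen neighbour and imitates it if better."""
        rival = rng.choice(neighbours[agent])
        if (virtual_payoff(histories[rival], memory_size)
                > virtual_payoff(histories[agent], memory_size)):
            strategies[agent] = strategies[rival]
    ```

    On a lattice, `neighbours` would map each site to its von Neumann or Moore neighbourhood, and sites would be updated one at a time in random order.
    
    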

  2. Incorporating Spatial Information for Microaneurysm Detection in Retinal Images

    Directory of Open Access Journals (Sweden)

    Mohamed M. Habib

    2017-06-01

    Full Text Available The presence of microaneurysms (MAs) in retinal images is a pathognomonic sign of Diabetic Retinopathy (DR), one of the leading causes of blindness in the working population worldwide. This paper introduces a novel algorithm that combines information from spatial views of the retina for the purpose of MA detection. Most published research in the literature has addressed the problem of detecting MAs from single retinal images. This work proposes the incorporation of information from two spatial views during the detection process. The algorithm is evaluated using 160 images from 40 patients seen as part of a UK diabetic eye screening programme, which contained 207 MAs. An improvement in performance compared to an algorithm that relies on a single image is shown as an increase of 2% in ROC score, demonstrating the potential of this method.

  3. Spatial Outlier Detection of CO2 Monitoring Data Based on Spatial Local Outlier Factor

    Directory of Open Access Journals (Sweden)

    Liu Xin

    2015-12-01

    Full Text Available The spatial local outlier factor (SLOF) algorithm was adopted in this study for spatial outlier detection because of the limitations of traditional static threshold detection. Based on the spatial characteristics of CO2 monitoring data obtained in the carbon capture and storage (CCS) project, a K-Nearest Neighbour (KNN) graph was constructed using the latitude and longitude information of the monitoring points to identify the spatial neighbourhood of the monitoring points. SLOF was then used to calculate the outlier degree of each monitoring point, and the 3σ rule was employed to identify spatial outliers. Finally, the selection of the K value was analysed and the optimal one was selected. The results show that, compared with the static threshold method, the proposed algorithm has a higher detection precision. It can overcome the shortcomings of the static threshold method and improve the accuracy and diversity of local outlier detection, which provides a reliable reference for the safety assessment and warning of CCS monitoring.
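
    The pipeline (KNN spatial neighbourhood, per-point outlier degree, 3σ flagging) can be sketched as follows. This is a simplified stand-in that scores each point by its deviation from its neighbours' mean value, not the full SLOF density-ratio computation; all names are illustrative.

    ```python
    import math

    def spatial_local_outliers(points, k=3, sigma_rule=3.0):
        """Simplified spatial local outlier detection (sketch).

        `points` is a list of (lon, lat, value) monitoring records. Each
        point's outlier degree is its absolute deviation from the mean
        value of its k nearest spatial neighbours; points whose degree
        exceeds mean + sigma_rule * std of all degrees are flagged.
        Brute-force O(n^2) neighbour search, fine for small networks.
        """
        def dist(p, q):
            return math.hypot(p[0] - q[0], p[1] - q[1])

        degrees = []
        for i, p in enumerate(points):
            neighbours = sorted(
                (q for j, q in enumerate(points) if j != i),
                key=lambda q: dist(p, q))[:k]
            local_mean = sum(q[2] for q in neighbours) / k
            degrees.append(abs(p[2] - local_mean))

        n = len(degrees)
        mu = sum(degrees) / n
        sd = math.sqrt(sum((d - mu) ** 2 for d in degrees) / n)
        return [i for i, d in enumerate(degrees)
                if d > mu + sigma_rule * sd]
    ```

    Unlike a static threshold on the raw CO2 values, the score adapts to the local background level, which is the motivation the abstract gives for moving to a local-factor method.
    
    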

  4. Spatial-Temporal Event Detection from Geo-Tagged Tweets

    Directory of Open Access Journals (Sweden)

    Yuqian Huang

    2018-04-01

    Full Text Available As one of the most popular social networking services in the world, Twitter allows users to post messages along with their current geographic locations. Such georeferenced or geo-tagged Twitter datasets can benefit location-based services, targeted advertising and geosocial studies. Our study focused on the detection of small-scale spatial-temporal events and their textual content. First, we used Spatial-Temporal Density-Based Spatial Clustering of Applications with Noise (ST-DBSCAN) to spatially-temporally cluster the tweets. Then, the word frequencies were summarized for each cluster and the potential topics were modeled by the Latent Dirichlet Allocation (LDA) algorithm. Using two years of Twitter data from four college cities in the U.S., we were able to determine the spatial-temporal patterns of two known events, two unknown events and one recurring event, which were then further explored and modeled to identify the semantic content about the events. This paper presents our process and recommendations for both finding event-related tweets as well as understanding the spatial-temporal behaviors and semantic natures of the detected events.
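
    The clustering step can be illustrated with a minimal ST-DBSCAN-style sketch, in which a neighbour must be close in both space and time. Parameter names and the brute-force neighbour search are illustrative simplifications of the full algorithm.

    ```python
    from collections import deque

    def st_dbscan(points, eps_s, eps_t, min_pts):
        """Minimal ST-DBSCAN sketch over (x, y, t) tuples.

        A point j is a neighbour of i only if it lies within eps_s in
        space AND within eps_t in time. Core points (>= min_pts
        neighbours) grow clusters; unreachable points get label -1
        (noise). Returns one integer label per input point.
        """
        def neighbours(i):
            xi, yi, ti = points[i]
            return [j for j, (x, y, t) in enumerate(points)
                    if j != i
                    and (x - xi) ** 2 + (y - yi) ** 2 <= eps_s ** 2
                    and abs(t - ti) <= eps_t]

        labels = [None] * len(points)
        cluster = 0
        for i in range(len(points)):
            if labels[i] is not None:
                continue
            nbrs = neighbours(i)
            if len(nbrs) < min_pts:
                labels[i] = -1              # provisionally noise
                continue
            labels[i] = cluster
            queue = deque(nbrs)
            while queue:                    # breadth-first cluster expansion
                j = queue.popleft()
                if labels[j] == -1:
                    labels[j] = cluster     # noise becomes a border point
                    continue
                if labels[j] is not None:
                    continue
                labels[j] = cluster
                j_nbrs = neighbours(j)
                if len(j_nbrs) >= min_pts:  # only core points keep expanding
                    queue.extend(j_nbrs)
            cluster += 1
        return labels
    ```

    The per-cluster tweet texts would then feed the word-frequency summaries and LDA topic modeling described above.
    
    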

  5. Modelling aggregation on the large scale and regularity on the small scale in spatial point pattern datasets

    DEFF Research Database (Denmark)

    Lavancier, Frédéric; Møller, Jesper

    We consider a dependent thinning of a regular point process with the aim of obtaining aggregation on the large scale and regularity on the small scale in the resulting target point process of retained points. Various parametric models for the underlying processes are suggested and the properties...

  6. Detecting violations of temporal regularities in waking and sleeping two-month-old infants

    NARCIS (Netherlands)

    Otte, R.A.; Winkler, I.; Braeken, M.A.K.A.; Stekelenburg, J.J.; van der Stelt, O.; Van den Bergh, B.R.H.

    2013-01-01

    Correctly processing rapid sequences of sounds is essential for developmental milestones, such as language acquisition. We investigated the sensitivity of two-month-old infants to violations of a temporal regularity, by recording event-related brain potentials (ERPs) in an auditory oddball paradigm.

  7. Neutron detection in the frame of spatial magnetic spin resonance

    Energy Technology Data Exchange (ETDEWEB)

    Jericha, Erwin, E-mail: jericha@ati.ac.at [TU Wien, Atominstitut, Stadionallee 2, 1020 Wien (Austria); Bosina, Joachim [TU Wien, Atominstitut, Stadionallee 2, 1020 Wien (Austria); Austrian Academy of Sciences, Stefan Meyer Institute, Boltzmanngasse 3, 1090 Wien (Austria); Institut Laue–Langevin, 71 Avenue des Martyrs, 38042 Grenoble (France); Geltenbort, Peter [Institut Laue–Langevin, 71 Avenue des Martyrs, 38042 Grenoble (France); Hino, Masahiro [Kyoto University, Research Reactor Institute, Kumatori, Osaka 590-0494 (Japan); Mach, Wilfried [TU Wien, Atominstitut, Stadionallee 2, 1020 Wien (Austria); Oda, Tatsuro [Kyoto University, Department of Nuclear Engineering, Kyoto 615-8540 (Japan); Badurek, Gerald [TU Wien, Atominstitut, Stadionallee 2, 1020 Wien (Austria)

    2017-02-11

This work is related to neutron detection in the context of the polarised neutron optics technique of spatial magnetic spin resonance. By this technique neutron beams may be tailored in their spectral distribution and temporal structure. We have performed experiments with very cold neutrons (VCN) at the high-flux research reactor of the Institut Laue-Langevin (ILL) in Grenoble to demonstrate the potential of this method. A combination of spatially and temporally resolving neutron detection allowed us to characterise a prototype neutron resonator. With this detector we were able to record neutron time-of-flight spectra, assess and minimise neutron background and, at the same time, normalise the spectra to account for variations in reactor power and ambient conditions.

  8. Spatial-temporal event detection in climate parameter imagery.

    Energy Technology Data Exchange (ETDEWEB)

    McKenna, Sean Andrew; Gutierrez, Karen A.

    2011-10-01

Previously developed techniques comprising statistical parametric mapping, with applications focused on human brain imaging, are examined and tested here for new applications in anomaly detection within remotely sensed imagery. Two approaches to analysis are developed: online, regression-based anomaly detection and conditional differences. These approaches are applied to two example spatial-temporal data sets: data simulated with a Gaussian field deformation approach and weekly NDVI images derived from global satellite coverage. Results indicate that anomalies can be identified in spatial-temporal data with the regression-based approach. Additionally, La Niña and El Niño climatic conditions are used as different stimuli applied to the earth, and this comparison shows that El Niño conditions lead to significant decreases in NDVI in both the Amazon Basin and in Southern India.
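A per-pixel version of the regression-based idea can be sketched as follows: fit an ordinary least-squares trend to one pixel's time series and flag time steps whose residuals are extreme. This is our own minimal illustration, not the authors' statistical-parametric-mapping code; `z_thresh` is an assumed parameter:

```python
import statistics

def regression_anomalies(series, z_thresh=3.0):
    """Fit an OLS line to one pixel's time series and flag time steps whose
    residual exceeds z_thresh residual standard deviations."""
    n = len(series)
    t = list(range(n))
    t_mean = sum(t) / n
    y_mean = sum(series) / n
    slope = sum((ti - t_mean) * (yi - y_mean) for ti, yi in zip(t, series)) \
            / sum((ti - t_mean) ** 2 for ti in t)
    intercept = y_mean - slope * t_mean
    resid = [yi - (intercept + slope * ti) for ti, yi in zip(t, series)]
    sd = statistics.stdev(resid)           # sample standard deviation
    return [i for i, r in enumerate(resid) if abs(r) > z_thresh * sd]
```

Applied to every pixel of an NDVI image stack, the flagged indices form candidate spatial-temporal anomalies.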

  9. Fast entanglement detection for unknown states of two spatial qutrits

    International Nuclear Information System (INIS)

    Lima, G.; Gomez, E. S.; Saavedra, C.; Vargas, A.; Vianna, R. O.

    2010-01-01

We investigate the practicality of the method proposed by Maciel et al. [Phys. Rev. A 80, 032325 (2009)] for detecting the entanglement of two spatial qutrits (three-dimensional quantum systems), which are encoded in the discrete transverse momentum of single photons transmitted through a multi-slit aperture. The method is based on the acquisition of partial information about the quantum state through projective measurements and on data processing with semidefinite programs. The analysis relies on gradually generating an optimal entanglement witness operator, and numerical investigations have shown that it allows for the entanglement detection of unknown states at a cost much lower than that of full state tomography.

  10. Analysis of sea-surface radar signatures by means of wavelet-based edge detection and detection of regularities; Analyse von Radarsignaturen der Meeresoberflaeche mittels auf Wavelets basierender Kantenerkennung und Regularitaetsbestimmung

    Energy Technology Data Exchange (ETDEWEB)

    Wolff, U. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Gewaesserphysik

    2000-07-01

    The derivation and implementation of an algorithm for edge detection in images and for the detection of the Lipschitz regularity in edge points are described. The method is based on the use of the wavelet transform for edge detection at different resolutions. The Lipschitz regularity is a measure that characterizes the edges. The description of the derivation is first performed in one dimension. The approach of Mallat is formulated consistently and proved. Subsequently, the two-dimensional case is addressed, for which the derivation, as well as the description of the algorithm, is analogous. The algorithm is applied to detect edges in nautical radar images using images collected at the island of Sylt. The edges discernible in the images and the Lipschitz values provide information about the position and nature of spatial variations in the depth of the seafloor. By comparing images from different periods of measurement, temporal changes in the bottom structures can be localized at different resolutions and interpreted. The method is suited to the monitoring of coastal areas. It is an inexpensive way to observe long-term changes in the seafloor character. Thus, the results of this technique may be used by the authorities responsible for coastal protection to decide whether measures should be taken or not. (orig.)
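The core idea, following Mallat, is that for a signal that is Lipschitz-α at a point, the wavelet transform modulus at scale s decays like s^α, so the exponent can be read off as the log-log slope across dyadic scales. Below is a one-dimensional sketch using a Gaussian smoothing kernel and a discrete derivative (our simplification for illustration; the thesis works with the full two-dimensional algorithm):

```python
import math

def gaussian_kernel(s, radius):
    """Normalized discrete Gaussian of standard deviation s."""
    k = [math.exp(-0.5 * (x / s) ** 2) for x in range(-radius, radius + 1)]
    total = sum(k)
    return [v / total for v in k]

def smooth(signal, kernel):
    """Convolve with replicated borders."""
    r = len(kernel) // 2
    n = len(signal)
    out = []
    for i in range(n):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - r, 0), n - 1)
            acc += w * signal[idx]
        out.append(acc)
    return out

def lipschitz_exponent(signal, scales=(1, 2, 4, 8)):
    """Estimate the Lipschitz exponent at the strongest edge from the decay
    of s * max|d/dx (f * g_s)| across dyadic scales (Mallat's criterion:
    the modulus maxima behave like s^alpha)."""
    logs, logw = [], []
    for s in scales:
        sm = smooth(signal, gaussian_kernel(s, 4 * s))
        deriv = [(sm[i + 1] - sm[i - 1]) / 2 for i in range(1, len(sm) - 1)]
        w = s * max(abs(d) for d in deriv)
        logs.append(math.log2(s))
        logw.append(math.log2(w))
    # least-squares slope of log2 W against log2 s
    ls_m = sum(logs) / len(logs)
    lw_m = sum(logw) / len(logw)
    num = sum((a - ls_m) * (b - lw_m) for a, b in zip(logs, logw))
    den = sum((a - ls_m) ** 2 for a in logs)
    return num / den
```

A step discontinuity should give an exponent near 0, while a continuous ramp (Lipschitz 1) should give an exponent near 1; in the sea-floor application, the estimated exponent characterizes how sharp each detected depth transition is.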

  11. Improving PET spatial resolution and detectability for prostate cancer imaging

    International Nuclear Information System (INIS)

    Bal, H; Guerin, L; Casey, M E; Conti, M; Eriksson, L; Michel, C; Fanti, S; Pettinato, C; Adler, S; Choyke, P

    2014-01-01

Prostate cancer, one of the most common forms of cancer among men, can benefit from recent improvements in positron emission tomography (PET) technology. In particular, better spatial resolution, lower noise and higher detectability of small lesions could be greatly beneficial for early diagnosis and could provide strong support for guiding biopsy and surgery. In this article, the impact of improved PET instrumentation with superior spatial resolution and high sensitivity is discussed, together with the latest developments in PET technology: resolution recovery and time-of-flight reconstruction. Using simulated cancer lesions inserted into clinical PET images obtained with conventional protocols, we show that visual identification of the lesions and detectability via numerical observers can already be improved using state-of-the-art PET reconstruction methods. This was achieved using both resolution recovery and time-of-flight reconstruction, and a high-resolution image with 2 mm pixel size. Channelized Hotelling numerical observers showed an increase in the area under the LROC curve from 0.52 to 0.58. In addition, the relationship between the simulated input activity and the area under the LROC curve showed that the minimum detectable activity was reduced by more than 23%. (paper)

  12. Cluster Detection Tests in Spatial Epidemiology: A Global Indicator for Performance Assessment.

    Directory of Open Access Journals (Sweden)

    Aline Guttmann

In cluster detection of disease, the use of local cluster detection tests (CDTs) is common practice. These methods aim both at locating likely clusters and at testing for their statistical significance. New or improved CDTs are regularly proposed to epidemiologists and must be subjected to performance assessment. Because location accuracy has to be considered, performance assessment goes beyond the raw estimation of type I or II errors. As no consensus exists for performance evaluations, heterogeneous methods are used, and therefore studies are rarely comparable. A global indicator of performance, which assesses both spatial accuracy and usual power, would facilitate the exploration of CDTs' behaviour and help between-study comparisons. The Tanimoto coefficient (TC) is a well-known measure of similarity that can assess location accuracy, but only for one detected cluster. In a simulation study, performance is measured for many tests. From the TC, we here propose two statistics, the averaged TC and the cumulated TC, as indicators able to provide a global overview of CDT performance for both usual power and location accuracy. We evidence the properties of these two indicators and the superiority of the cumulated TC for assessing performance. We tested these indicators by conducting a systematic spatial assessment displayed through performance maps.

  13. Cluster Detection Tests in Spatial Epidemiology: A Global Indicator for Performance Assessment

    Science.gov (United States)

    Guttmann, Aline; Li, Xinran; Feschet, Fabien; Gaudart, Jean; Demongeot, Jacques; Boire, Jean-Yves; Ouchchane, Lemlih

    2015-01-01

In cluster detection of disease, the use of local cluster detection tests (CDTs) is common practice. These methods aim both at locating likely clusters and at testing for their statistical significance. New or improved CDTs are regularly proposed to epidemiologists and must be subjected to performance assessment. Because location accuracy has to be considered, performance assessment goes beyond the raw estimation of type I or II errors. As no consensus exists for performance evaluations, heterogeneous methods are used, and therefore studies are rarely comparable. A global indicator of performance, which assesses both spatial accuracy and usual power, would facilitate the exploration of CDTs' behaviour and help between-study comparisons. The Tanimoto coefficient (TC) is a well-known measure of similarity that can assess location accuracy, but only for one detected cluster. In a simulation study, performance is measured for many tests. From the TC, we here propose two statistics, the averaged TC and the cumulated TC, as indicators able to provide a global overview of CDT performance for both usual power and location accuracy. We evidence the properties of these two indicators and the superiority of the cumulated TC for assessing performance. We tested these indicators by conducting a systematic spatial assessment displayed through performance maps. PMID:26086911
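For reference, the Tanimoto coefficient between a detected and a true cluster, and the averaged-TC statistic built from it, can be sketched as follows, treating clusters as sets of spatial unit identifiers (an illustrative sketch, not the authors' code):

```python
def tanimoto(detected, true):
    """Tanimoto (Jaccard) similarity between a detected and a true cluster,
    each given as a collection of spatial unit identifiers."""
    detected, true = set(detected), set(true)
    if not detected and not true:
        return 1.0          # both empty: perfect agreement by convention
    return len(detected & true) / len(detected | true)

def averaged_tc(pairs):
    """Averaged TC over many simulated datasets, each contributing one
    (detected, true) cluster pair."""
    return sum(tanimoto(d, t) for d, t in pairs) / len(pairs)
```

A TC of 1 means the detected cluster coincides exactly with the simulated one; a TC of 0 means no overlap, so the average over simulations summarizes both power and location accuracy at once.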

  14. Temporal Modulation Detection Depends on Sharpness of Spatial Tuning.

    Science.gov (United States)

    Zhou, Ning; Cadmus, Matthew; Dong, Lixue; Mathews, Juliana

    2018-04-25

Prior research has shown that in electrical hearing, cochlear implant (CI) users' speech recognition performance is related in part to their ability to detect temporal modulation (i.e., modulation sensitivity). Previous studies have also shown better speech recognition when selectively stimulating sites with good modulation sensitivity rather than all stimulation sites. Site selection based on channel interaction measures, such as those using imaging or psychophysical estimates of the spread of neural excitation, has also been shown to improve speech recognition. This led to the question of whether temporal modulation sensitivity and spatial selectivity of neural excitation are two related variables. In the present study, CI users' modulation sensitivity was compared for sites with relatively broad or narrow neural excitation patterns. This was achieved by measuring temporal modulation detection thresholds (MDTs) at stimulation sites that differed significantly in the sharpness of their psychophysical spatial tuning curves (PTCs), and by measuring MDTs at the same sites in monopolar (MP) and bipolar (BP) stimulation modes. Nine postlingually deafened subjects implanted with a Cochlear Nucleus® device took part in the study. Results showed a significant correlation between the sharpness of PTCs and MDTs, indicating that modulation detection benefits from a more spatially restricted neural activation pattern. There was a significant interaction between stimulation site and mode: using BP stimulation only improved MDTs at stimulation sites with broad PTCs, but had no effect or sometimes a detrimental effect on MDTs at stimulation sites with sharp PTCs. This interaction could suggest that a criterion number of nerve fibers is needed to achieve optimal temporal resolution and that, to achieve optimized speech recognition outcomes, individualized selection of site-specific current focusing strategies may be necessary. These results also suggest that the removal of

  15. Label-Informed Non-negative Matrix Factorization with Manifold Regularization for Discriminative Subnetwork Detection.

    Science.gov (United States)

    Watanabe, Takanori; Tunc, Birkan; Parker, Drew; Kim, Junghoon; Verma, Ragini

    2016-10-01

In this paper, we present a novel method for obtaining a low dimensional representation of a complex brain network that: (1) can be interpreted in a neurobiologically meaningful way, (2) emphasizes group differences by accounting for label information, and (3) captures the variation in disease subtypes/severity by respecting the intrinsic manifold structure underlying the data. Our method is a supervised variant of non-negative matrix factorization (NMF), and achieves dimensionality reduction by extracting an orthogonal set of subnetworks that are interpretable, reconstructive of the original data, and also discriminative at the group level. In addition, the method includes a manifold regularizer that encourages the low dimensional representations to be smooth with respect to the intrinsic geometry of the data, allowing subjects with similar disease-severity to share similar network representations. While the method is generalizable to other types of non-negative network data, in this work we have used structural connectomes (SCs) derived from diffusion data to identify the cortical/subcortical connections that have been disrupted in an abnormal neurological state. Experiments on a traumatic brain injury (TBI) dataset demonstrate that our method can identify subnetworks that can reliably classify TBI from controls and also reveal insightful connectivity patterns that may be indicative of a biomarker.
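A minimal unsupervised variant of the manifold-regularized NMF objective can be sketched with standard multiplicative updates. This omits the paper's label/supervision and orthogonality terms and keeps only the graph-Laplacian smoothness penalty; `lam` and the function name are our assumptions, not the authors' code:

```python
import numpy as np

def graph_regularized_nmf(V, A, rank, lam=0.1, iters=500, seed=0):
    """Sketch of manifold-regularized NMF via multiplicative updates.
    V: (features x samples) non-negative data matrix.
    A: (samples x samples) non-negative affinity graph between samples.
    Minimizes ||V - W H||_F^2 + lam * Tr(H L H^T) with L = D - A, so the
    sample codes (columns of H) vary smoothly over the affinity graph."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + 0.1      # positive init keeps updates positive
    H = rng.random((rank, n)) + 0.1
    D = np.diag(A.sum(axis=1))
    eps = 1e-9
    for _ in range(iters):
        W *= (V @ H.T) / (W @ H @ H.T + eps)
        H *= (W.T @ V + lam * H @ A) / (W.T @ W @ H + lam * H @ D + eps)
    return W, H
```

The multiplicative form guarantees non-negativity throughout; in the supervised setting of the paper, additional terms would enter the numerators and denominators of these updates.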

  16. Detection and removal of spatial bias in multiwell assays.

    Science.gov (United States)

    Lachmann, Alexander; Giorgi, Federico M; Alvarez, Mariano J; Califano, Andrea

    2016-07-01

Multiplex readout assays are now increasingly being performed using microfluidic automation in multiwell format. For instance, the Library of Integrated Network-based Cellular Signatures (LINCS) has produced gene expression measurements for tens of thousands of distinct cell perturbations using a 384-well plate format. This dataset is by far the largest 384-well gene expression measurement assay ever performed. We investigated the gene expression profiles of a million samples from the LINCS dataset and found that the vast majority (96%) of the tested plates were affected by a significant 2D spatial bias. Using a novel algorithm combining spatial autocorrelation detection and principal component analysis, we could remove most of the spatial bias from the LINCS dataset and show in parallel a dramatic improvement of similarity between biological replicates assayed in different plates. The proposed methodology is fully general and can be applied to any highly multiplexed assay performed in multiwell format.
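Spatial autocorrelation on a plate can be quantified, for instance, with global Moran's I under rook adjacency. The sketch below is our illustration of the detection step only, not the authors' algorithm (which combines autocorrelation detection with principal component analysis for removal):

```python
def morans_i(plate):
    """Global Moran's I for a 2-D plate of well values with rook adjacency
    (binary weights to the four edge-sharing neighbours). Values near +1
    indicate strong positive spatial autocorrelation, i.e. plate bias."""
    rows, cols = len(plate), len(plate[0])
    vals = [v for row in plate for v in row]
    n = len(vals)
    mean = sum(vals) / n
    dev = [[v - mean for v in row] for row in plate]
    num = 0.0
    wsum = 0                       # total weight (directed neighbour pairs)
    for i in range(rows):
        for j in range(cols):
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols:
                    num += dev[i][j] * dev[ni][nj]
                    wsum += 1
    den = sum(d * d for row in dev for d in row)
    return (n / wsum) * (num / den)
```

A smooth gradient across the plate (the typical edge/drift artifact) yields a strongly positive I, whereas a checkerboard of independent values yields a negative I, so thresholding I per plate is one simple way to flag the 2D bias the abstract describes.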

  17. Spatial resolution and chest nodule detection: an interesting incidental finding

    Science.gov (United States)

    Toomey, R. J.; McEntee, M. F.; Ryan, J. T.; Evanoff, M. G.; Hayes, A.; Brennan, P. C.

    2010-02-01

This study reports an incidental finding from a larger work. It examines the relationship between spatial resolution and nodule detection for chest radiographs. Twelve radiologists certified by the American Board of Radiology read thirty chest radiographs in two conditions: full (1500 × 1500 pixel) resolution, and 300 × 300 pixel resolution linearly interpolated to 1500 × 1500 pixels. All images were surrounded by a 10-pixel sharp grey border to aid in focussing the observer's eye when viewing the comparatively unsharp interpolated images. Fifteen of the images contained a single simulated pulmonary nodule. Observers were asked to rate their confidence that a nodule was present on each radiograph on a scale of 1 (least confidence, certain no lesion is present) to 6 (most confidence, certain a lesion is present). All other abnormalities were to be ignored. No windowing, levelling or magnification of the images was permitted, and viewing distance was constrained to approximately 70 cm. Images were displayed on a 3-megapixel greyscale monitor. Receiver operating characteristic (ROC) analysis was applied to the results of the readings using the Dorfman-Berbaum-Metz multiple-reader, multiple-case method. No statistically significant differences were found either with readers and cases treated as random or with cases treated as fixed. Low spatial frequency information appears to be sufficient for the detection of chest lesions of the type used in this study.

  18. An Examination of Three Spatial Event Cluster Detection Methods

    Directory of Open Access Journals (Sweden)

    Hensley H. Mariathas

    2015-03-01

In spatial disease surveillance, geographic areas with large numbers of disease cases are to be identified, so that targeted investigations can be pursued. Geographic areas with high disease rates are called disease clusters, and statistical cluster detection tests are used to identify geographic areas with higher disease rates than expected by chance alone. In some situations, disease-related events rather than individuals are of interest for geographical surveillance, and methods to detect clusters of disease-related events are called event cluster detection methods. In this paper, we examine three distributional assumptions for the events in cluster detection: compound Poisson, approximate normal and multiple hypergeometric (exact). The methods differ in the choice of distributional assumption for the potentially multiple correlated events per individual. The methods are illustrated on emergency department (ED) presentations by children and youth (age < 18 years) because of substance use in the province of Alberta, Canada, from 1 April 2007 to 31 March 2008. Simulation studies are conducted to investigate the Type I error and power of the clustering methods.

  19. Spatial-Spectral Approaches to Edge Detection in Hyperspectral Remote Sensing

    Science.gov (United States)

    Cox, Cary M.

    also explores the concept of an edge within hyperspectral space, the relative importance of spatial and spectral resolutions as they pertain to HSI edge detection and how effectively compressed HSI data improves edge detection results. The HSI edge detection experiments yielded valuable insights into the algorithms' strengths, weaknesses and optimal alignment to remote sensing applications. The gradient-based edge operator produced strong edge planes across a range of evaluation measures and applications, particularly with respect to false negatives, unbroken edges, urban mapping, vegetation mapping and oil spill mapping applications. False positives and uncompressed HSI data presented occasional challenges to the algorithm. The HySPADE edge operator produced satisfactory results with respect to localization, single-point response, oil spill mapping and trace chemical detection, and was challenged by false positives, declining spectral resolution and vegetation mapping applications. The level set edge detector produced high-quality edge planes for most tests and demonstrated strong performance with respect to false positives, single-point response, oil spill mapping and mineral mapping. False negatives were a regular challenge for the level set edge detection algorithm. Finally, HSI data optimized for spectral information compression and noise was shown to improve edge detection performance across all three algorithms, while the gradient-based algorithm and HySPADE demonstrated significant robustness to declining spectral and spatial resolutions.

  20. Penalized likelihood and multi-objective spatial scans for the detection and inference of irregular clusters

    Directory of Open Access Journals (Sweden)

    Fonseca Carlos M

    2010-10-01

Background: Irregularly shaped spatial clusters are difficult to delineate. A cluster found by an algorithm often spreads through large portions of the map, impacting its geographical meaning. Penalized likelihood methods for Kulldorff's spatial scan statistic have been used to control the excessive freedom of the shape of clusters. Penalty functions based on cluster geometry and non-connectivity have been proposed recently. Another approach involves the use of a multi-objective algorithm to maximize two objectives: the spatial scan statistic and the geometric penalty function. Results and discussion: We present a novel scan statistic algorithm employing a function based on graph topology to penalize the presence of under-populated disconnection nodes in candidate clusters, the disconnection nodes cohesion function. A disconnection node is defined as a region within a cluster whose removal disconnects the cluster. By applying this function, the most geographically meaningful clusters are sifted through the immense set of possible irregularly shaped candidate cluster solutions. To evaluate the statistical significance of solutions for multi-objective scans, a statistical approach based on the concept of the attainment function is used. In this paper we compare different penalized likelihoods employing the geometric and non-connectivity regularity functions and the novel disconnection nodes cohesion function. We also build multi-objective scans using those three functions and compare them with the previous penalized likelihood scans. An application is presented using comprehensive state-wide data for Chagas' disease in puerperal women in Minas Gerais state, Brazil. Conclusions: We show that, compared to the other single-objective algorithms, multi-objective scans present better performance regarding power, sensitivity and positive predictive value. The multi-objective non-connectivity scan is faster and better suited for the
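The Kulldorff Poisson log-likelihood ratio that all of these penalized and multi-objective variants maximize can be sketched as follows. Candidate-zone generation, penalty functions and Monte Carlo significance testing are omitted, and the function names are ours:

```python
import math

def poisson_llr(c, e, C, E):
    """Kulldorff log-likelihood ratio for a candidate zone with c observed
    and e expected cases, given C total observed and E total expected cases.
    Only zones with an elevated rate (c > e) are of interest."""
    if c <= e:
        return 0.0
    inside = c * math.log(c / e)
    outside = (C - c) * math.log((C - c) / (E - e))
    return inside + outside

def best_zone(zones, cases, expected):
    """Scan a list of candidate zones (lists of region indices) and return
    (llr, zone) for the zone maximizing the LLR. In a full scan statistic,
    significance would come from Monte Carlo replication under the null."""
    C, E = sum(cases), sum(expected)
    scored = [(poisson_llr(sum(cases[i] for i in z),
                           sum(expected[i] for i in z), C, E), z)
              for z in zones]
    return max(scored)
```

Penalized variants multiply (or otherwise combine) this LLR with a shape regularity score, and the multi-objective variants of the paper treat the two as separate objectives.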

  1. EBSD spatial resolution for detecting sigma phase in steels

    Energy Technology Data Exchange (ETDEWEB)

    Bordín, S. Fernandez; Limandri, S. [Instituto de Física Enrique Gaviola, CONICET. M. Allende s/n, Ciudad Universitaria, 5000 Córdoba (Argentina); Ranalli, J.M. [Comisión Nacional de Energía Atómica, Av. Gral. Paz 1499, San Martín, 1650 Buenos Aires (Argentina); Castellano, G. [Instituto de Física Enrique Gaviola, CONICET. M. Allende s/n, Ciudad Universitaria, 5000 Córdoba (Argentina)

    2016-12-15

    The spatial resolution of the electron backscatter diffraction signal is explored by Monte Carlo simulation for the sigma phase in steel at a typical instrumental set-up. In order to estimate the active volume corresponding to the diffracted electrons, the fraction of the backscattered electrons contributing to the diffraction signal was inferred by extrapolating the Kikuchi pattern contrast measured by other authors, as a function of the diffracted electron energy. In the resulting estimation, the contribution of the intrinsic incident beam size and the software capability to deconvolve patterns were included. A strong influence of the beam size on the lateral resolution was observed, resulting in 20 nm for the aperture considered. For longitudinal and depth directions the resolutions obtained were 75 nm and 16 nm, respectively. The reliability of this last result is discussed in terms of the survey of the last large-angle deflection undergone by the backscattered electrons involved in the diffraction process. Bearing in mind the mean transversal resolution found, it was possible to detect small area grains of sigma phase by EBSD measurements, for a stabilized austenitic AISI 347 stainless steel under heat treatments, simulating post welding (40 h at 600 °C) and aging (284 h at 484 °C) effects—as usually occurring in nuclear reactor pressure vessels. - Highlights: • EBSD spatial resolution is studied by Monte Carlo simulation for σ-phase in steel. • The contribution of the intrinsic incident beam size was included. • A stabilized austenitic stainless steel under heat treatments was measured by EBSD. • With the transversal resolution found, small area σ-phase grains could be identified.

  2. Spatial- and Time-Correlated Detection of Fission Fragments

    Directory of Open Access Journals (Sweden)

    Platkevic M.

    2012-02-01

With the goal of measuring angular correlations of fission fragments in rare fission decays (e.g. ternary and quaternary fission), a multi-detector coincidence system based on two and up to four position-sensitive pixel detectors Timepix has been built. In addition to high granularity, a wide dynamic range and a per-pixel signal threshold, these devices are equipped with per-pixel energy and time sensitivity, providing more information (position, energy, time) and enhancing particle-type identification and the selectivity of event-by-event detection. Operation of the device with the integrated USB 2.0 based readout interface FITPix and the control and data acquisition software tool Pixelman enables online visualization and flexible/adjustable operation for different types of experiments. Spatially correlated fission fragments can thus be registered in coincidence. Similarly, triggered measurements are performed using an integrated spectrometric module with analogue signal chain electronics. The current status of development is presented, together with a demonstration of the technique with a 252Cf source.

  3. Detecting, anticipating, and predicting critical transitions in spatially extended systems.

    Science.gov (United States)

    Kwasniok, Frank

    2018-03-01

    A data-driven linear framework for detecting, anticipating, and predicting incipient bifurcations in spatially extended systems based on principal oscillation pattern (POP) analysis is discussed. The dynamics are assumed to be governed by a system of linear stochastic differential equations which is estimated from the data. The principal modes of the system together with corresponding decay or growth rates and oscillation frequencies are extracted as the eigenvectors and eigenvalues of the system matrix. The method can be applied to stationary datasets to identify the least stable modes and assess the proximity to instability; it can also be applied to nonstationary datasets using a sliding window approach to track the changing eigenvalues and eigenvectors of the system. As a further step, a genuinely nonstationary POP analysis is introduced. Here, the system matrix of the linear stochastic model is time-dependent, allowing for extrapolation and prediction of instabilities beyond the learning data window. The methods are demonstrated and explored using the one-dimensional Swift-Hohenberg equation as an example, focusing on the dynamics of stochastic fluctuations around the homogeneous stable state prior to the first bifurcation. The POP-based techniques are able to extract and track the least stable eigenvalues and eigenvectors of the system; the nonstationary POP analysis successfully predicts the timing of the first instability and the unstable mode well beyond the learning data window.
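The stationary part of the POP procedure can be sketched as a least-squares fit of a linear propagator followed by an eigendecomposition; decay rates and oscillation frequencies then follow from the eigenvalue moduli and arguments. This is a minimal illustration (function name and unit-time-step default are our assumptions):

```python
import numpy as np

def pop_analysis(X, dt=1.0):
    """Principal oscillation pattern sketch. X: (n_time, n_vars) array of a
    (near) zero-mean multivariate time series. Fits x_{t+1} = B x_t by least
    squares via lag-0 and lag-1 second-moment matrices, then eigendecomposes
    B. Returns eigenvalues, eigenvectors (the POPs), decay rates (1/time;
    positive for stable modes) and oscillation frequencies (cycles/time)."""
    X0, X1 = X[:-1], X[1:]
    C0 = X0.T @ X0               # lag-0 second moments
    C1 = X1.T @ X0               # lag-1 second moments
    B = C1 @ np.linalg.inv(C0)   # least-squares propagator estimate
    eigvals, eigvecs = np.linalg.eig(B)
    decay = -np.log(np.abs(eigvals)) / dt
    freq = np.angle(eigvals) / (2 * np.pi * dt)
    return eigvals, eigvecs, decay, freq
```

Tracking these eigenvalues over a sliding window, as described in the abstract, reveals a mode's decay rate approaching zero as a bifurcation nears.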

  4. Spatial Cluster Detection for Repeatedly Measured Outcomes while Accounting for Residential History

    OpenAIRE

    Cook, Andrea J.; Gold, Diane R.; Li, Yi

    2009-01-01

    Spatial cluster detection has become an important methodology in quantifying the effect of hazardous exposures. Previous methods have focused on cross-sectional outcomes that are binary or continuous. There are virtually no spatial cluster detection methods proposed for longitudinal outcomes. This paper proposes a new spatial cluster detection method for repeated outcomes using cumulative geographic residuals. A major advantage of this method is its ability to readily incorporate information ...

  5. Detection of Dendritic Spines Using Wavelet-Based Conditional Symmetric Analysis and Regularized Morphological Shared-Weight Neural Networks

    Directory of Open Access Journals (Sweden)

    Shuihua Wang

    2015-01-01

Identification and detection of dendritic spines in neuron images are of high interest in the diagnosis and treatment of neurological and psychiatric disorders (e.g., Alzheimer’s disease, Parkinson’s disease, and autism). In this paper, we propose a novel automatic approach using wavelet-based conditional symmetric analysis and regularized morphological shared-weight neural networks (RMSNN) for dendritic spine identification involving the following steps: backbone extraction, localization of dendritic spines, and classification. First, a new algorithm based on the wavelet transform and conditional symmetric analysis has been developed to extract the backbone and locate the dendrite boundary. Then, the RMSNN has been proposed to classify the spines into three predefined categories (mushroom, thin, and stubby). We have compared our proposed approach against existing methods. The experimental results demonstrate that the proposed approach can accurately locate the dendrite and classify the spines into the three categories with accuracies of 99.1% for “mushroom” spines, 97.6% for “stubby” spines, and 98.6% for “thin” spines.

  6. Troubles detected during regular inspection of No.1 plant in Oi Power Station, Kansai Electric Power Co., Inc

    International Nuclear Information System (INIS)

    1990-01-01

No. 1 plant in Oi Power Station, Kansai Electric Power Co., Inc. is a PWR plant with a rated output of 1175 MW, and its regular inspection has been carried out since August 14, 1989. When eddy current flaw detection was carried out on all heating tubes of the steam generators (11426 tubes, excluding those already plugged), significant indications were observed in the tube supporting plate part of 279 tubes, at the boundary of the tube plate expanded part of 34 tubes, and in the tube plate expanded part of 99 tubes, 411 heating tubes in total (all on the high temperature side). Consequently, it was decided to repair 367 tubes using sleeves and to plug the other 44 tubes. In addition, among the heating tubes plugged in the past, it was decided to remove the plugs from 161 tubes and, by repairing them with sleeves, to return them to service. Total number of heating tubes: 13552 (3388 tubes x 4 steam generators); number of plugged tubes: 2009 (a decrease of 117 this time); plugging ratio: 14.8%. (K.I.)

  7. Creation and detection of optical modes with spatial light modulators

    CSIR Research Space (South Africa)

    Forbes, A

    2016-06-01


  8. Rapid Fear Detection Relies on High Spatial Frequencies

    NARCIS (Netherlands)

    Stein, T.; Seymour, K.; Hebart, M.N.; Sterzer, P.

    Signals of threat—such as fearful faces—are processed with priority and have privileged access to awareness. This fear advantage is commonly believed to engage a specialized subcortical pathway to the amygdala that bypasses visual cortex and processes predominantly low-spatial-frequency information

  9. Enhancing spatial detection accuracy for syndromic surveillance with street level incidence data

    Directory of Open Access Journals (Sweden)

    Alemi Farrokh

    2010-01-01

Background: The Department of Defense Military Health System operates a syndromic surveillance system that monitors medical records at more than 450 non-combat Military Treatment Facilities (MTFs) worldwide. The Electronic Surveillance System for Early Notification of Community-based Epidemics (ESSENCE) uses both temporal and spatial algorithms to detect disease outbreaks. This study focuses on spatial detection and attempts to improve the effectiveness of the ESSENCE implementation of the spatial scan statistic by increasing the spatial resolution of incidence data from zip codes to street address level. Methods: Influenza-Like Illness (ILI) was used as a test syndrome to develop methods to improve the spatial accuracy of detected alerts. Simulated incident clusters of various sizes were superimposed on real ILI incidents from the 2008/2009 influenza season. Clusters were detected using the spatial scan statistic and their displacement from the simulated loci was measured. Detected cluster size distributions were also evaluated for compliance with the simulated cluster sizes. Results: Relative to the ESSENCE zip code based method, clusters detected using street level incidents were displaced on average 65% less for 2 and 5 mile radius clusters and 31% less for 10 mile radius clusters. Detected cluster size distributions for the street address method were quasi-normal, and sizes tended to slightly exceed the simulated radii. ESSENCE methods yielded fragmented distributions and had high rates of zero-radius and oversized clusters. Conclusions: Spatial detection accuracy improved notably with regard to both location and size when incidents were geocoded to street addresses rather than zip code centroids. Since street address geocoding success rates were only 73.5%, zip codes were still used for more than one quarter of ILI cases. Thus, further advances in spatial detection accuracy are dependent on systematic improvements in the collection of individual

  10. Minimum detection limit and spatial resolution of thin-sample field-emission electron probe microanalysis

    International Nuclear Information System (INIS)

    Kubo, Yugo; Hamada, Kotaro; Urano, Akira

    2013-01-01

    The minimum detection limit and spatial resolution for a thinned semiconductor sample were determined by electron probe microanalysis (EPMA) using a Schottky field emission (FE) electron gun and wavelength dispersive X-ray spectrometry. Comparison of the FE-EPMA results with those obtained using energy dispersive X-ray spectrometry in conjunction with scanning transmission electron microscopy, confirmed that FE-EPMA is largely superior in terms of detection sensitivity. Thin-sample FE-EPMA is demonstrated as a very effective method for high resolution, high sensitivity analysis in a laboratory environment because a high probe current and high signal-to-noise ratio can be achieved. - Highlights: • Minimum detection limit and spatial resolution determined for FE-EPMA. • Detection sensitivity of FE-EPMA greatly superior to that of STEM-EDX. • Minimum detection limit and spatial resolution controllable by probe current

  11. Detection of endolithic spatial distribution in marble stone.

    Science.gov (United States)

    Casanova Municchia, A; Percario, Z; Caneva, G

    2014-10-01

    The penetration of endolithic microorganisms, which develop to depths of several millimetres or even centimetres into the stone, and the diffusion of their extracellular substances speed up the stone deterioration process. The aim of this study was to investigate a marble rock sample using confocal laser scanning microscopy with double staining, observing the endolithic spatial distribution and quantifying the volume the microorganisms occupied within the stone, in order to understand the real impact of these microorganisms on the conservation of stone monuments. Often the only factors taken into account by biodeterioration studies of endolithic microorganisms are their spread and depth of penetration. However, knowledge of the three-dimensional spatial distribution and quantification of the occupied volume is indispensable for understanding the real damage caused by endolithic microorganisms to stone monuments. In this work, we analyzed a marble rock sample using confocal laser scanning microscopy after staining with propidium iodide and Concanavalin A conjugated with the fluorophore Alexa Fluor 488, comparing these results with other techniques (SEM, microphotographs of polished cross-sections and thin sections, and PAS staining). An image analysis approach has also been applied. The use of confocal laser scanning microscopy with double staining shows clear evidence of the presence of endolithic microorganisms (cyanobacteria and fungi) as well as the extracellular polymeric substance matrix in a three-dimensional architecture as part of the rock sample. This technique therefore seems very useful for restoration interventions on stone monuments when endolithic growth is suspected. © 2014 The Authors Journal of Microscopy © 2014 Royal Microscopical Society.

  12. Adaptive regularization

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Rasmussen, Carl Edward; Svarer, C.

    1994-01-01

    Regularization, e.g., in the form of weight decay, is important for training and optimization of neural network architectures. In this work the authors provide a tool based on asymptotic sampling theory, for iterative estimation of weight decay parameters. The basic idea is to do a gradient desce...

  13. Detecting abrupt dynamic change based on changes in the fractal properties of spatial images

    Science.gov (United States)

    Liu, Qunqun; He, Wenping; Gu, Bin; Jiang, Yundi

    2017-10-01

    Many abrupt climate change events often cannot be detected in a timely manner by conventional abrupt-change detection methods until a few years after they have occurred. The reason for this lag is that abundant, long-term observational data are required for accurate detection by these methods, especially for the detection of a regime shift. Consequently, these methods cannot help us understand and forecast the evolution of the climate system in a timely manner. Spatial images, generated by a coupled spatiotemporal dynamical model, contain more information about a dynamic system than a single time series, and we find that spatial images show fractal properties. The fractal properties of spatial images can be quantitatively characterized by the Hurst exponent, which can be estimated by two-dimensional detrended fluctuation analysis (TD-DFA). Based on this, TD-DFA is used to detect an abrupt dynamic change of a coupled spatiotemporal model. The results show that the TD-DFA method can effectively detect abrupt parameter changes in the coupled model by monitoring changes in the fractal properties of spatial images. The present method thus provides a new, timely and efficient way to detect abrupt dynamic changes.
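
    The Hurst-exponent estimate described above can be sketched as follows. This is a minimal, illustrative implementation of two-dimensional DFA, not the authors' exact procedure: the window scales, plane detrending, and profile definition are assumptions. The image is integrated into a two-dimensional profile, each s-by-s window is detrended by a least-squares plane, and the exponent is the slope of log F(s) versus log s.

```python
import numpy as np

def dfa2d_hurst(img, scales=(8, 16, 32)):
    """Estimate the Hurst exponent of a 2D field by two-dimensional
    detrended fluctuation analysis with plane detrending per window."""
    # cumulative "profile" of the mean-removed field over both axes
    prof = np.cumsum(np.cumsum(img - img.mean(), axis=0), axis=1)
    flucts = []
    for s in scales:
        ny, nx = prof.shape[0] // s, prof.shape[1] // s
        y, x = np.mgrid[0:s, 0:s]
        # design matrix for a least-squares plane a + b*x + c*y
        A = np.column_stack([np.ones(s * s), x.ravel(), y.ravel()])
        resid = []
        for i in range(ny):
            for j in range(nx):
                block = prof[i * s:(i + 1) * s, j * s:(j + 1) * s].ravel()
                coef, *_ = np.linalg.lstsq(A, block, rcond=None)
                resid.append(np.mean((block - A @ coef) ** 2))
        flucts.append(np.sqrt(np.mean(resid)))      # fluctuation F(s)
    # Hurst exponent = slope of log F(s) versus log s
    coeffs = np.polyfit(np.log(np.asarray(scales, float)), np.log(flucts), 1)
    return float(coeffs[0])
```

For an uncorrelated random field the estimate should come out near 0.5; fractal (long-range correlated) images give other values.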

  14. A flexible spatial scan statistic with a restricted likelihood ratio for detecting disease clusters.

    Science.gov (United States)

    Tango, Toshiro; Takahashi, Kunihiko

    2012-12-30

    Spatial scan statistics are widely used tools for the detection of disease clusters. In particular, the circular spatial scan statistic proposed by Kulldorff (1997) has been utilized in a wide variety of epidemiological studies and disease surveillance. However, as it cannot detect noncircular, irregularly shaped clusters, many authors have proposed alternative spatial scan statistics, including an elliptic version of Kulldorff's scan statistic. The flexible spatial scan statistic proposed by Tango and Takahashi (2005) has also been used for detecting irregularly shaped clusters. However, this method imposes a practical limit of a maximum of 30 nearest neighbors when searching candidate clusters because of its heavy computational load. In this paper, we present a flexible spatial scan statistic implemented with the restricted likelihood ratio proposed by Tango (2008), which (1) eliminates the 30-nearest-neighbor limitation and (2) requires far less computational time than the original flexible spatial scan statistic. As a side effect, Monte Carlo simulation shows that it can detect clusters of any shape reasonably well as the relative risk of the cluster becomes large. We illustrate the proposed spatial scan statistic with data on mortality from cerebrovascular disease in the Tokyo metropolitan area, Japan. Copyright © 2012 John Wiley & Sons, Ltd.
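
    Kulldorff-type scan statistics score every candidate window with a Poisson log-likelihood ratio. A minimal sketch of that per-window score (Tango's restricted version additionally screens out windows whose internal risk is not significantly elevated; that screening is omitted here):

```python
import math

def poisson_llr(cases_in, pop_in, cases_tot, pop_tot):
    """Kulldorff's Poisson log-likelihood ratio for one candidate window
    with cases_in cases out of population pop_in, against cases_tot cases
    in total population pop_tot. Windows at or below the expected count
    score zero (only elevated-risk windows are of interest)."""
    expected = pop_in * cases_tot / pop_tot
    if cases_in <= expected:
        return 0.0
    inside = cases_in * math.log(cases_in / expected)
    outside = (cases_tot - cases_in) * math.log(
        (cases_tot - cases_in) / (cases_tot - expected))
    return inside + outside
```

The scan then maximizes this score over candidate windows and assesses significance by Monte Carlo replication of the null.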

  15. A spatially offset Raman spectroscopy method for non-destructive detection of gelatin-encapsulated powders

    Science.gov (United States)

    Non-destructive subsurface detection of encapsulated, coated, or seal-packaged foods and pharmaceuticals can help prevent distribution and consumption of counterfeit or hazardous products. This study used a Spatially Offset Raman Spectroscopy (SORS) method to detect and identify urea, ibuprofen, and...

  16. Spatial cluster detection for repeatedly measured outcomes while accounting for residential history.

    Science.gov (United States)

    Cook, Andrea J; Gold, Diane R; Li, Yi

    2009-10-01

    Spatial cluster detection has become an important methodology in quantifying the effect of hazardous exposures. Previous methods have focused on cross-sectional outcomes that are binary or continuous. There are virtually no spatial cluster detection methods proposed for longitudinal outcomes. This paper proposes a new spatial cluster detection method for repeated outcomes using cumulative geographic residuals. A major advantage of this method is its ability to readily incorporate information on study participants' relocation, which most cluster detection statistics cannot. Application of these methods is illustrated with the Home Allergens and Asthma prospective cohort study, analyzing the relationship between environmental exposures and a repeatedly measured outcome, occurrence of wheeze in the last 6 months, while taking into account mobile locations.

  17. Spatially resolved detection of mutually locked Josephson junctions in arrays

    International Nuclear Information System (INIS)

    Keck, M.; Doderer, T.; Huebener, R.P.; Traeuble, T.; Dolata, R.; Weimann, T.; Niemeyer, J.

    1997-01-01

    Mutual locking due to the internal coupling in two-dimensional arrays of Josephson junctions was investigated. The appearance of Shapiro steps in the current versus voltage curve of a coupled on-chip detector junction is used to indicate coherent oscillations in the array. A highly coherent state is observed for some range of the array bias current. By scanning the array with a low-power electron beam, mutually locked junctions remain locked while the unlocked junctions generate a beam-induced additional voltage drop at the array. This imaging technique allows the detection of the nonlocked or weakly locked Josephson junctions in a (partially) locked array state. copyright 1997 American Institute of Physics

  18. A method of detecting spatial clustering of disease

    International Nuclear Information System (INIS)

    Openshaw, S.; Wilkie, D.; Binks, K.; Wakeford, R.; Gerrard, M.H.; Croasdale, M.R.

    1989-01-01

    A statistical technique has been developed to identify extreme groupings of a disease and is being applied to childhood cancers, initially to acute lymphoblastic leukaemia incidence in the Northern and North-Western Regions of England. The method covers the area with a square grid, the size of which is varied over a wide range and whose origin is moved in small increments in two directions. The population at risk within any square is estimated using the 1971 and 1981 censuses. The significance of an excess of disease is determined by random simulation. In addition, tests to detect a general departure from a background Poisson process are carried out. Available results will be presented at the conference. (author)

  19. Spatial biomarker of disease and detection of spatial organization of cellular receptors

    Science.gov (United States)

    Salaita, Khalid S.; Nair, Pradeep M.; Das, Debopriya; Gray, Joe W.; Groves, John T.

    2017-07-18

    A signature of a condition of a live cell is established in an assay that allows distribution of the receptors on the cell surface in response to binding a ligand. The receptors can be optically detected and quantified to provide a value for the condition. Test drugs can be screened for therapeutic potential in the assay: a potentially efficacious drug is identified by an ability to modulate an established signature. The receptor distribution signature can be corroborated with an mRNA expression profile of several genes, indicating, for example, metastasis.

  20. FIND: difFerential chromatin INteractions Detection using a spatial Poisson process.

    Science.gov (United States)

    Djekidel, Mohamed Nadhir; Chen, Yang; Zhang, Michael Q

    2018-02-12

    Polymer-based simulations and experimental studies indicate the existence of a spatial dependency between the adjacent DNA fibers involved in the formation of chromatin loops. However, the existing strategies for detecting differential chromatin interactions assume that the interacting segments are spatially independent from the other segments nearby. To resolve this issue, we developed a new computational method, FIND, which considers the local spatial dependency between interacting loci. FIND uses a spatial Poisson process to detect differential chromatin interactions that show a significant difference in their interaction frequency and the interaction frequency of their neighbors. Simulation and biological data analysis show that FIND outperforms the widely used count-based methods and has a better signal-to-noise ratio. © 2018 Djekidel et al.; Published by Cold Spring Harbor Laboratory Press.

  1. Spatial correlation analysis of urban traffic state under a perspective of community detection

    Science.gov (United States)

    Yang, Yanfang; Cao, Jiandong; Qin, Yong; Jia, Limin; Dong, Honghui; Zhang, Aomuhan

    2018-05-01

    Understanding the spatial correlation of urban traffic state is essential for identifying the evolution patterns of urban traffic state. However, the distribution of traffic state always has characteristics of large spatial span and heterogeneity. This paper adapts the concept of community detection to the correlation network of urban traffic state and proposes a new perspective to identify the spatial correlation patterns of traffic state. In the proposed urban traffic network, the nodes represent road segments, and an edge between a pair of nodes is added depending on the result of a significance test for the corresponding correlation of traffic state. Further, the process of community detection in the urban traffic network (named GWPA-K-means) is applied to analyze the spatial dependency of traffic state. The proposed method extends the traditional K-means algorithm in two steps: (i) it redefines the initial cluster centers by two properties of nodes (the GWPA value and the minimum shortest path length); (ii) it utilizes the weight signal propagation process to transfer the topological information of the urban traffic network into a node similarity matrix. Finally, numerical experiments are conducted on a simple network and a real urban road network in Beijing. The results show that the GWPA-K-means algorithm is valid for spatial correlation analysis of traffic state. Network science and community structure analysis perform well in describing the spatial heterogeneity of traffic state on a large spatial scale.
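
    The K-means extension can be illustrated in its simplest form: ordinary K-means iterations seeded with caller-chosen initial centers, standing in for the GWPA/shortest-path seeding of step (i). The GWPA scoring itself and the weight-propagation similarity matrix of step (ii) are omitted from this sketch.

```python
import numpy as np

def kmeans_fixed_init(features, init_centers, iters=50):
    """Plain K-means, except the initial centers are supplied by the
    caller (e.g. rows chosen by a centrality score) instead of at random."""
    x = np.asarray(features, float)
    centers = np.asarray(init_centers, float).copy()
    labels = np.zeros(len(x), dtype=int)
    for _ in range(iters):
        # assign each node to its nearest center
        d = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its members
        for k in range(len(centers)):
            if np.any(labels == k):
                centers[k] = x[labels == k].mean(axis=0)
    return labels
```

In the paper's setting the feature vectors would come from the similarity matrix produced by weight signal propagation; here any numeric features work.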

  2. Incrementally Detecting Change Types of Spatial Area Object: A Hierarchical Matching Method Considering Change Process

    Directory of Open Access Journals (Sweden)

    Yanhui Wang

    2018-01-01

    Full Text Available Detecting and extracting the change types of spatial area objects can track area objects' spatiotemporal change patterns and provide a change-backtracking mechanism for incrementally updating spatial datasets. To address the problems of the high complexity of detection methods, the high redundancy rate of detection factors, and the low degree of automation during the incremental update process, we consider the change process of area objects in an integrated way and propose a hierarchical matching method to detect the nine types of changes of area objects, while minimizing the complexity of the algorithm and the redundancy rate of detection factors. We illustrate in detail the identification, extraction, and database entry of change types, and how we achieve a close connection and organic coupling of incremental information extraction and object type-of-change detection so as to characterize the whole change process. The experimental results show that this method can successfully detect incremental information about area objects in practical applications, with an overall accuracy above 90%, which is much higher than the existing weighted matching method, making it feasible and applicable. It helps establish the correspondence between new-version and old-version objects, and facilitates linked update processing and quality control of spatial data.

  3. Clustered Regularly Interspaced Short Palindromic Repeats/Cas9 Triggered Isothermal Amplification for Site-Specific Nucleic Acid Detection.

    Science.gov (United States)

    Huang, Mengqi; Zhou, Xiaoming; Wang, Huiying; Xing, Da

    2018-02-06

    A novel CRISPR/Cas9 triggered isothermal exponential amplification reaction (CAS-EXPAR) strategy based on CRISPR/Cas9 cleavage and nicking endonuclease (NEase) mediated nucleic acid amplification was developed for rapid and site-specific nucleic acid detection. CAS-EXPAR was primed by the target DNA fragment produced by CRISPR/Cas9 cleavage, and the amplification reaction proceeded cyclically to generate a large number of DNA replicates, which were detected using a real-time fluorescence monitoring method. This strategy, which combines the advantages of CRISPR/Cas9 and exponential amplification, showed high specificity as well as rapid amplification kinetics. Unlike conventional nucleic acid amplification reactions, CAS-EXPAR does not require exogenous primers, which often cause target-independent amplification. Instead, primers are first generated by Cas9/sgRNA-directed site-specific cleavage of the target and accumulate during the reaction. It was demonstrated that this strategy gave a detection limit of 0.82 amol and showed excellent specificity in discriminating single-base mismatches. Moreover, the applicability of this method to the detection of DNA methylation and L. monocytogenes total RNA was also verified. Therefore, CAS-EXPAR may provide a new paradigm for efficient nucleic acid amplification and holds potential for molecular diagnostic applications.

  4. Web-based GIS for spatial pattern detection: application to malaria incidence in Vietnam.

    Science.gov (United States)

    Bui, Thanh Quang; Pham, Hai Minh

    2016-01-01

    There is great concern about how to build an interoperable health information system for public health and health information technology within the development of public information and health surveillance programmes. Technically, some major issues remain regarding health data visualization, spatial processing of health data, health information dissemination, data sharing, and the access of local communities to health information. In combination with GIS, we propose a technical framework for web-based health data visualization and spatial analysis. Data were collected from open map servers and geocoded with the Open Data Kit package and data geocoding tools. The web-based system is designed based on open-source frameworks and libraries. The system provides web-based analysis tools for pattern detection through three spatial tests: nearest neighbour, K function, and spatial autocorrelation. The result is a web-based GIS through which end users can detect disease patterns by selecting an area and spatial test parameters, and contribute results to managers and decision makers. The end users can be health practitioners, educators, local communities, health sector authorities, and decision makers. This web-based system allows for the improvement of health-related services to public sector users as well as citizens in a secure manner. The combination of spatial statistics and web-based GIS can be a solution that helps empower health practitioners to take direct and specific intersectional actions, thus providing for better analysis, control, and decision-making.
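
    Of the three spatial tests, global spatial autocorrelation is the easiest to sketch. The version below uses Moran's I with binary distance-band weights; the weight scheme and bandwidth parameter are illustrative assumptions, not necessarily what the described system uses.

```python
import numpy as np

def morans_i(values, coords, bandwidth):
    """Global Moran's I with binary distance-band weights:
    w_ij = 1 when two distinct sites lie within `bandwidth` of each other.
    Positive values indicate spatial clustering of similar values."""
    z = np.asarray(values, float)
    z = z - z.mean()
    xy = np.asarray(coords, float)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    w = ((d > 0) & (d <= bandwidth)).astype(float)
    return len(z) / w.sum() * (w * np.outer(z, z)).sum() / (z ** 2).sum()
```

Significance would then be judged against the null expectation E[I] = -1/(n-1), typically by permutation.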

  5. Tools for Multimode Quantum Information: Modulation, Detection, and Spatial Quantum Correlations

    DEFF Research Database (Denmark)

    Lassen, Mikael Østergaard; Delaubert, Vincent; Janousek, Jirí

    2007-01-01

    We present here all the tools required for continuous variable parallel quantum information protocols based on spatial multi-mode quantum correlations and entanglement. We describe techniques for encoding and detecting this quantum information with high efficiency in the individual modes. We use ...

  6. Spatially valid proprioceptive cues improve the detection of a visual stimulus

    DEFF Research Database (Denmark)

    Jackson, Carl P T; Miall, R Chris; Balslev, Daniela

    2010-01-01

    , which has been demonstrated for other modality pairings. The aim of this study was to test whether proprioceptive signals can spatially cue a visual target to improve its detection. Participants were instructed to use a planar manipulandum in a forward reaching action and determine during this movement...

  7. Spatial Modeling of Industrial Windfall on Soils to Detect Woody Species with Potential for Bioremediation

    Science.gov (United States)

    S. Salazar; M. Mendoza; A. M. Tejeda

    2006-01-01

    A spatial model is presented to explain the concentration of heavy metals (Fe, Cu, Zn, Ni, Cr, Co and Pb in the soils around the industrial complex near the Port of Veracruz, Mexico. Unexpectedly low-concentration sites were then tested for woody plant species that may have the capability to hyperaccumulate these contaminants, hence having a potential for...

  8. The spatial resolution of the porcine multifocal electroretinogram for detection of laser-induced retinal lesions

    DEFF Research Database (Denmark)

    Kyhn, Maria Voss; Kiilgaard, Jens Folke; Scherfig, Erik

    2008-01-01

    This study aimed to investigate the spatial resolution of a porcine multifocal electroretinogram (mfERG) protocol by testing its ability to detect laser-induced retinal lesions. Furthermore, we wanted to describe time-dependent changes in implicit time and amplitude of the different mfERG peaks...

  9. Moving target detection based on temporal-spatial information fusion for infrared image sequences

    Science.gov (United States)

    Toing, Wu-qin; Xiong, Jin-yu; Zeng, An-jun; Wu, Xiao-ping; Xu, Hao-peng

    2009-07-01

    Moving target detection and localization is one of the most fundamental tasks in visual surveillance. In this paper, after analyzing the advantages and disadvantages of traditional approaches to moving target detection, a novel approach based on temporal-spatial information fusion is proposed. The proposed method combines the spatial features in a single frame with the temporal properties across multiple frames of an image sequence containing a moving target. First, the method uses spatial image segmentation to separate the target from the background and uses the local temporal variance to extract targets and remove the trail artifact. Second, the logical "and" operator is used to fuse the temporal and spatial information. Finally, morphological filtering and blob analysis are applied to the fused image sequence to acquire the exact moving target. The algorithm not only requires minimal computation and memory but also adapts quickly to changes of background and environment. Compared with other methods, such as KDE and the mixture of K Gaussians, the simulation results show that the proposed method has better validity and higher adaptability for moving target detection, especially in infrared image sequences with complex illumination change, noise change, and so on.
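
    A minimal sketch of the temporal-spatial fusion idea: a spatial mask from segmenting the current frame is ANDed with a temporal mask built from per-pixel variance across the frame stack. The specific cues and thresholds below are illustrative, not the paper's exact operators, and the morphological post-filtering step is omitted.

```python
import numpy as np

def detect_moving_targets(frames, spatial_thresh=2.0, var_thresh=1.0):
    """Fuse a spatial cue (bright-region segmentation of the newest frame)
    with a temporal cue (high per-pixel variance over the stack) by a
    logical AND, returning a boolean detection mask."""
    stack = np.asarray(frames, dtype=float)
    current = stack[-1]
    # spatial cue: pixels far above the current frame's mean intensity
    spatial_mask = current > current.mean() + spatial_thresh * current.std()
    # temporal cue: pixels whose intensity varies strongly over time
    temporal_mask = stack.var(axis=0) > var_thresh
    return spatial_mask & temporal_mask
```

The AND suppresses the trail artifact: a pixel the target has left still has high temporal variance but no longer passes the spatial test.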

  10. Zero-crossing detection algorithm for arrays of optical spatial filtering velocimetry sensors

    DEFF Research Database (Denmark)

    Jakobsen, Michael Linde; Pedersen, Finn; Hanson, Steen Grüner

    2008-01-01

    This paper presents a zero-crossing detection algorithm for arrays of compact low-cost optical sensors based on spatial filtering for measuring fluctuations in angular velocity of rotating solid structures. The algorithm is applicable for signals with moderate signal-to-noise ratios, and delivers...... repeating the same measurement error for each revolution of the target, and to gain high performance measurement of angular velocity. The traditional zero-crossing detection is extended by 1) inserting an appropriate band-pass filter before the zero-crossing detection, 2) measuring time periods between zero...
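
    The extended zero-crossing scheme can be sketched as band-pass filtering followed by timing of the rising zero crossings. The FFT brick-wall filter below stands in for whatever band-pass stage the sensor electronics would actually use; the band edges are illustrative.

```python
import numpy as np

def zero_crossing_periods(signal, fs, band=(0.5, 50.0)):
    """Band-pass the signal, then return the time intervals (seconds)
    between successive rising zero crossings."""
    spec = np.fft.rfft(signal - np.mean(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spec[(freqs < band[0]) | (freqs > band[1])] = 0.0   # brick-wall band-pass
    filtered = np.fft.irfft(spec, n=len(signal))
    # rising zero crossings: sign changes from negative to non-negative
    rising = np.flatnonzero((filtered[:-1] < 0) & (filtered[1:] >= 0))
    return np.diff(rising) / fs
```

For a rotating target, each period corresponds to one passage of the spatial-filter fringe pattern, so fluctuations in the periods track fluctuations in angular velocity.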

  11. Research on photodiode detector-based spatial transient light detection and processing system

    Science.gov (United States)

    Liu, Meiying; Wang, Hu; Liu, Yang; Zhao, Hui; Nan, Meng

    2016-10-01

    In order to realize real-time signal identification and processing of spatial transient light, the features and the energy of the captured target light signal are first described and quantitatively calculated. Considering that a transient light signal occurs randomly, has a short duration, and has an evident beginning and ending, a photodiode-detector-based spatial transient light detection and processing system is proposed and designed in this paper. This system has a large field of view and is used to realize non-imaging energy detection of a random, transient and weak point target under the complex background of the space environment. Weak-signal extraction under a strong background is difficult. In this paper, considering that the background signal changes slowly while the target signal changes quickly, a filter is adopted for background subtraction. Variable-speed sampling is realized by sampling data points with a gradually increasing interval. This resolves two dilemmas: the real-time processing of a large amount of data, and the power consumption required to store it. The test results with a self-made simulated signal demonstrate the effectiveness of the design scheme. The practical system could be operated reliably, and detection and processing of the target signal under a strong sunlight background was realized. The results indicate that the system can realize real-time detection of the target signal's characteristic waveform and monitor the system's working parameters. The prototype design could be used in a variety of engineering applications.
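
    The slow-background/fast-signal separation can be sketched with a moving-average background estimate and a robust threshold. The window length and threshold factor are illustrative assumptions, not the system's actual parameters.

```python
import numpy as np

def transient_events(samples, bg_window=50, k=6.0):
    """Flag samples that rise more than k robust noise-sigmas above a
    moving-average estimate of the slowly varying background."""
    x = np.asarray(samples, float)
    kernel = np.ones(bg_window) / bg_window
    padded = np.pad(x, bg_window, mode="edge")          # soften edge bias
    background = np.convolve(padded, kernel, mode="same")[bg_window:-bg_window]
    resid = x - background
    # robust noise scale from the median absolute residual
    sigma = np.median(np.abs(resid)) / 0.6745 + 1e-12
    return np.flatnonzero(resid > k * sigma)
```

Because the background estimate tracks slow drifts (e.g. sunlight variation), only fast positive excursions survive the threshold.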

  12. Comparisons of lesion detectability in ultrasound images acquired using time-shift compensation and spatial compounding.

    Science.gov (United States)

    Lacefield, James C; Pilkington, Wayne C; Waag, Robert C

    2004-12-01

    The effects of aberration, time-shift compensation, and spatial compounding on the discrimination of positive-contrast lesions in ultrasound b-scan images are investigated using a two-dimensional (2-D) array system and tissue-mimicking phantoms. Images were acquired within an 8.8 x 12-mm2 field of view centered on one of four statistically similar 4-mm diameter spherical lesions. Each lesion was imaged in four planes offset by successive 45 degree rotations about the central scan line. Images of the lesions were acquired using conventional geometric focusing through a water path, geometric focusing through a 35-mm thick distributed aberration phantom, and time-shift compensated transmit and receive focusing through the aberration phantom. The views of each lesion were averaged to form sets of water path, aberrated, and time-shift compensated 4:1 compound images and 16:1 compound images. The contrast ratio and detectability index of each image were computed to assess lesion differentiation. In the presence of aberration representative of breast or abdominal wall tissue, time-shift compensation provided statistically significant improvements of contrast ratio but did not consistently affect the detectability index, and spatial compounding significantly increased the detectability index but did not alter the contrast ratio. Time-shift compensation and spatial compounding thus provide complementary benefits to lesion detection.
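
    The two image metrics can be illustrated with one common pair of definitions (the paper's exact formulas may differ): a contrast ratio in decibels from the mean lesion and background intensities, and a lesion-SNR-style detectability index that also accounts for speckle variance.

```python
import numpy as np

def lesion_metrics(lesion_px, background_px):
    """Contrast ratio (dB) and a detectability index computed from
    samples of lesion pixels and background pixels."""
    lp = np.asarray(lesion_px, float)
    bp = np.asarray(background_px, float)
    contrast_ratio_db = 20.0 * np.log10(lp.mean() / bp.mean())
    # lesion SNR: mean difference normalized by pooled standard deviation
    d_index = abs(lp.mean() - bp.mean()) / np.sqrt(0.5 * (lp.var() + bp.var()))
    return contrast_ratio_db, d_index
```

Under these definitions the paper's findings read naturally: time-shift compensation restores the mean-intensity difference (contrast ratio), while compounding reduces speckle variance (detectability index).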

  13. Experimental study on the crack detection with optimized spatial wavelet analysis and windowing

    Science.gov (United States)

    Ghanbari Mardasi, Amir; Wu, Nan; Wu, Christine

    2018-05-01

    In this paper, highly sensitive crack detection is experimentally realized and presented on a beam under certain deflection by optimizing spatial wavelet analysis. Due to the existence of a crack in the beam structure, a perturbation/slope singularity is induced in the deflection profile. Spatial wavelet transformation works as a magnifier to amplify the small perturbation signal at the crack location so that the damage can be detected and localized. The profile of a deflected aluminum cantilever beam is obtained for both intact and cracked beams by a high-resolution laser profile sensor. Gabor wavelet transformation is applied to the difference between the intact and cracked data sets. To improve detection sensitivity, the scale factor of the spatial wavelet transformation and the number of transformation repetitions are optimized. Furthermore, to detect possible cracks close to the measurement boundaries, the wavelet transformation edge effect, which induces large wavelet coefficients around the measurement boundaries, is efficiently reduced by introducing different windowing functions. The results show that a small crack with a depth of less than 10% of the beam height can be localized with a clear perturbation. Moreover, the perturbation caused by a crack 0.85 mm away from one end of the measurement range, which is covered by the wavelet transform edge effect, emerges by applying proper window functions.
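
    The wavelet-as-magnifier idea with windowing can be sketched as follows. A Ricker (Mexican-hat) wavelet stands in for the paper's Gabor wavelet, and a Hann window plays the role of the windowing function that tapers the profile against the edge effect; scale and window choice are illustrative.

```python
import numpy as np

def crack_location(deflection_diff, scale=8):
    """Locate a slope singularity in the difference between cracked and
    intact deflection profiles: taper with a Hann window, convolve with a
    Ricker wavelet, and return the index of the largest coefficient."""
    x = np.asarray(deflection_diff, float) * np.hanning(len(deflection_diff))
    t = np.arange(-4 * scale, 4 * scale + 1) / scale
    ricker = (1 - t ** 2) * np.exp(-t ** 2 / 2)        # Mexican-hat wavelet
    coeffs = np.convolve(x, ricker, mode="same")
    return int(np.argmax(np.abs(coeffs)))
```

A kink (slope discontinuity) in the profile produces a coefficient extremum centered on the kink, which is why the transform acts as a magnifier for the crack signature.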

  14. Analytical properties of an Ostrovsky-Whitham type dynamical system for a relaxing medium with spatial memory and its integrable regularization

    International Nuclear Information System (INIS)

    Bogoliubov, N.N. Jr.; Prykarpatsky, A.K.; Gucwa, I.; Golenia, J.

    2007-12-01

    Short-wave perturbations in a relaxing medium, governed by a special reduction of the Ostrovsky evolution equation, later derived by Whitham, are studied using the gradient-holonomic integrability algorithm. The bi-Hamiltonicity and complete integrability of the corresponding dynamical system are established, and an infinite hierarchy of mutually commuting conservation laws of dispersive type is found. The two- and four-dimensional invariant reductions are studied in detail. A well-defined regularization of the model is constructed and its Lax-type integrability is discussed. (author)

  15. Early non-destructive biofouling detection and spatial distribution: Application of oxygen sensing optodes

    KAUST Repository

    Farhat, Nadia

    2015-06-11

    Biofouling is a serious problem in reverse osmosis/nanofiltration (RO/NF) applications, reducing membrane performance. Early detection of biofouling plays an essential role in an adequate anti-biofouling strategy. Presently, fouling of membrane filtration systems is mainly determined by measuring changes in pressure drop, which is not exclusively linked to biofouling. Non-destructive imaging of oxygen concentrations (i) is specific for biological activity of biofilms and (ii) may enable earlier detection of biofilm accumulation than pressure drop. The objective of this study was to test whether transparent luminescent planar O2 optodes, in combination with a simple imaging system, can be used for early non-destructive biofouling detection. This biofouling detection is done by mapping the two-dimensional distribution of O2 concentrations and O2 decrease rates inside a membrane fouling simulator (MFS). Results show that at an early stage, biofouling development was detected by the oxygen sensing optodes while no significant increase in pressure drop was yet observed. Additionally, optodes could detect spatial heterogeneities in biofouling distribution at a micro scale. Biofilm development started mainly at the feed spacer crossings. The spatial and quantitative information on biological activity will lead to better understanding of the biofouling processes, contributing to the development of more effective biofouling control strategies.

  16. Breast cancer mitosis detection in histopathological images with spatial feature extraction

    Science.gov (United States)

    Albayrak, Abdülkadir; Bilgin, Gökhan

    2013-12-01

    In this work, cellular mitosis detection in histopathological images has been investigated. Mitosis detection is a very expensive and time-consuming process. The development of digital imaging in pathology has enabled a reasonable and effective solution to this problem. Segmentation of digital images provides easier analysis of cell structures in histopathological data. To differentiate normal and mitotic cells in histopathological images, the feature extraction step is crucial for system accuracy. A mitotic cell has more distinctive textural dissimilarities than other, normal cells. Hence, it is important to incorporate spatial information in the feature extraction or post-processing steps. As the main part of this study, the Haralick texture descriptor has been applied with different spatial window sizes in the RGB and La*b* color spaces, so that the spatial dependencies of normal and mitotic cellular pixels can be evaluated within different pixel neighborhoods. Extracted features are compared across various sample sizes by support vector machines using the k-fold cross-validation method. The results show that separation accuracy on mitotic and non-mitotic cellular pixels improves with increasing size of the spatial window.
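
    Windowed Haralick-style features can be sketched with a gray-level co-occurrence matrix (GLCM) over horizontal neighbor pairs inside each window, keeping a single Haralick statistic (contrast) per pixel. The window size, number of gray levels, quantization, and offset direction below are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def glcm_contrast_map(img, window=5, levels=8):
    """Per-pixel Haralick contrast: build a GLCM of horizontal neighbor
    pairs inside each window and record sum((i-j)^2 * p(i,j))."""
    a = np.asarray(img, float)
    q = np.minimum((a / a.max() * levels).astype(int), levels - 1)
    half = window // 2
    out = np.zeros_like(a)
    ii, jj = np.indices((levels, levels))
    for r in range(half, q.shape[0] - half):
        for c in range(half, q.shape[1] - half):
            patch = q[r - half:r + half + 1, c - half:c + half + 1]
            glcm = np.zeros((levels, levels))
            # count horizontal co-occurrences (left pixel, right pixel)
            np.add.at(glcm, (patch[:, :-1].ravel(), patch[:, 1:].ravel()), 1)
            out[r, c] = ((ii - jj) ** 2 * glcm).sum() / glcm.sum()
    return out
```

Smooth (e.g. non-mitotic) regions score near zero while strongly textured regions score high, which is the dissimilarity the classifier exploits; enlarging `window` smooths and strengthens this separation, consistent with the reported accuracy trend.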

  17. Mediastinal lymph node detection and station mapping on chest CT using spatial priors and random forest

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Jiamin; Hoffman, Joanne; Zhao, Jocelyn; Yao, Jianhua; Lu, Le; Kim, Lauren; Turkbey, Evrim B.; Summers, Ronald M., E-mail: rms@nih.gov [Imaging Biomarkers and Computer-aided Diagnosis Laboratory, Radiology and Imaging Sciences, National Institutes of Health Clinical Center Building, 10 Room 1C224 MSC 1182, Bethesda, Maryland 20892-1182 (United States)

    2016-07-15

    Purpose: To develop an automated system for mediastinal lymph node detection and station mapping for chest CT. Methods: The contextual organs, trachea, lungs, and spine are first automatically identified to locate the region of interest (ROI) (mediastinum). The authors employ shape features derived from Hessian analysis, local object scale, and circular transformation that are computed per voxel in the ROI. Eight more anatomical structures are simultaneously segmented by multiatlas label fusion. Spatial priors are defined as the relative multidimensional distance vectors corresponding to each structure. Intensity, shape, and spatial prior features are integrated and parsed by a random forest classifier for lymph node detection. The detected candidates are then segmented by a subsequent curve evolution process. Texture features are computed on the segmented lymph nodes and a support vector machine committee is used for final classification. For lymph node station labeling, based on the segmentation results of the above anatomical structures, the textual definitions of the mediastinal lymph node map according to the International Association for the Study of Lung Cancer are converted into a patient-specific color-coded CT image, where the lymph node station can be automatically assigned for each detected node. Results: The chest CT volumes from 70 patients with 316 enlarged mediastinal lymph nodes are used for validation. For lymph node detection, their system achieves 88% sensitivity at eight false positives per patient. For lymph node station labeling, 84.5% of lymph nodes are correctly assigned to their stations. Conclusions: Multiple-channel shape, intensity, and spatial prior features aggregated by a random forest classifier improve mediastinal lymph node detection on chest CT. Using the location information of segmented anatomic structures from the multiatlas formulation enables accurate identification of lymph node stations.

  18. Mediastinal lymph node detection and station mapping on chest CT using spatial priors and random forest

    International Nuclear Information System (INIS)

    Liu, Jiamin; Hoffman, Joanne; Zhao, Jocelyn; Yao, Jianhua; Lu, Le; Kim, Lauren; Turkbey, Evrim B.; Summers, Ronald M.

    2016-01-01

    Purpose: To develop an automated system for mediastinal lymph node detection and station mapping for chest CT. Methods: The contextual organs, trachea, lungs, and spine are first automatically identified to locate the region of interest (ROI) (mediastinum). The authors employ shape features derived from Hessian analysis, local object scale, and circular transformation that are computed per voxel in the ROI. Eight more anatomical structures are simultaneously segmented by multiatlas label fusion. Spatial priors are defined as the relative multidimensional distance vectors corresponding to each structure. Intensity, shape, and spatial prior features are integrated and parsed by a random forest classifier for lymph node detection. The detected candidates are then segmented by a subsequent curve evolution process. Texture features are computed on the segmented lymph nodes and a support vector machine committee is used for final classification. For lymph node station labeling, based on the segmentation results of the above anatomical structures, the textual definitions of the mediastinal lymph node map according to the International Association for the Study of Lung Cancer are converted into a patient-specific color-coded CT image, where the lymph node station can be automatically assigned for each detected node. Results: The chest CT volumes from 70 patients with 316 enlarged mediastinal lymph nodes are used for validation. For lymph node detection, their system achieves 88% sensitivity at eight false positives per patient. For lymph node station labeling, 84.5% of lymph nodes are correctly assigned to their stations. Conclusions: Multiple-channel shape, intensity, and spatial prior features aggregated by a random forest classifier improve mediastinal lymph node detection on chest CT. Using the location information of segmented anatomic structures from the multiatlas formulation enables accurate identification of lymph node stations.

  19. Evaluating species richness: biased ecological inference results from spatial heterogeneity in species detection probabilities

    Science.gov (United States)

    McNew, Lance B.; Handel, Colleen M.

    2015-01-01

    Accurate estimates of species richness are necessary to test predictions of ecological theory and to evaluate biodiversity for conservation purposes. However, species richness is difficult to measure in the field because some species will almost always be overlooked, due to their cryptic nature or the observer's failure to perceive their cues. Common measures of species richness that assume consistent observability across species are inviting because they may require only single counts of species at survey sites. However, single-visit estimation methods ignore spatial and temporal variation in species detection probabilities related to survey or site conditions, which may confound estimates of species richness. We used simulated and empirical data to evaluate the bias and precision of raw species counts, the limiting forms of the jackknife and Chao estimators, and multi-species occupancy models when estimating species richness, and to determine whether the choice of estimator can affect inferences about the relationships between environmental conditions and community size under variable detection processes. Four simulated scenarios with realistic and variable detection processes were considered. Results of the simulations indicated that (1) raw species counts were always biased low, (2) single-visit jackknife and Chao estimators were significantly biased regardless of the detection process, (3) multi-species occupancy models were more precise and generally less biased than the jackknife and Chao estimators, and (4) spatial heterogeneity resulting from the effects of a site covariate on species detection probabilities had significant impacts on the inferred relationships between species richness and a spatially explicit environmental condition. For a real dataset of bird observations in northwestern Alaska, the four estimation methods produced different estimates of local species richness, which severely affected inferences about the effects of shrubs on local avian richness. Overall, our results
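
The Chao and jackknife estimators compared above have simple closed forms. A sketch of the bias-corrected Chao1 (abundance-based) and first-order jackknife (incidence-based) estimators; the example counts are hypothetical:

```python
import numpy as np

def chao1(counts):
    """Bias-corrected Chao1 lower-bound estimator of species richness
    from abundance counts (one count per observed species)."""
    counts = np.asarray(counts)
    s_obs = (counts > 0).sum()
    f1 = (counts == 1).sum()   # singletons
    f2 = (counts == 2).sum()   # doubletons
    return s_obs + f1 * (f1 - 1) / (2 * (f2 + 1))

def jackknife1(incidence):
    """First-order jackknife estimator from a sites-by-species
    incidence (0/1) matrix."""
    incidence = np.asarray(incidence)
    n = incidence.shape[0]                     # number of sites
    s_obs = (incidence.sum(axis=0) > 0).sum()
    q1 = (incidence.sum(axis=0) == 1).sum()    # species seen at exactly one site
    return s_obs + q1 * (n - 1) / n

print(chao1([5, 3, 1, 1, 2, 1]))  # 6 observed, 3 singletons, 1 doubleton -> 7.5
```

Both estimators inflate the observed richness by a term driven by rarely detected species, which is why heterogeneous detection probabilities bias them, as the simulations above show.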

  20. UNFOLDED REGULAR AND SEMI-REGULAR POLYHEDRA

    Directory of Open Access Journals (Sweden)

    IONIŢĂ Elena

    2015-06-01

    Full Text Available This paper proposes a presentation of unfolded regular and semi-regular polyhedra. Regular polyhedra are convex polyhedra whose faces are regular and equal polygons, with the same number of sides, and whose polyhedral angles are also regular and equal. Semi-regular polyhedra are convex polyhedra with regular polygon faces of several types and equal solid angles of the same type. A net of a polyhedron is a collection of edges in the plane which are the unfolded edges of the solid. The modeling and unfolding of Platonic and Archimedean polyhedra is carried out using the 3ds Max program. This paper is intended as an example of descriptive geometry applications.

  1. Predicting detection performance with model observers: Fourier domain or spatial domain?

    Science.gov (United States)

    Chen, Baiyu; Yu, Lifeng; Leng, Shuai; Kofler, James; Favazza, Christopher; Vrieze, Thomas; McCollough, Cynthia

    2016-02-27

    The use of the Fourier domain model observer is challenged by iterative reconstruction (IR), because IR algorithms are nonlinear and IR images have a noise texture different from that of FBP. A modified Fourier domain model observer, which incorporates nonlinear noise and resolution properties, has been proposed for IR and needs to be validated against human detection performance. On the other hand, the spatial domain model observer is theoretically applicable to IR, but is more computationally intensive than the Fourier domain method. The purpose of this study is to compare the modified Fourier domain model observer to the spatial domain model observer on both FBP and IR images, using human detection performance as the gold standard. A phantom with inserts of various low contrast levels and sizes was repeatedly scanned 100 times on a third-generation, dual-source CT scanner at 5 dose levels and reconstructed using FBP and IR algorithms. Human detection performance for the inserts was measured via a 2-alternative-forced-choice (2AFC) test. In addition, two model observer performances were calculated: a Fourier domain non-prewhitening model observer and a spatial domain channelized Hotelling observer. The two model observers were compared in terms of how well they correlated with human observer performance. Our results demonstrated that the spatial domain model observer correlated well with human observers across various dose levels, object contrast levels, and object sizes. The Fourier domain observer correlated well with human observers on FBP images, but overestimated the detection performance on IR images.
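
The idea behind a spatial domain model observer can be illustrated with a non-prewhitening matched template on synthetic images. This is a simplified sketch: the Gaussian signal, contrast, and white noise are assumptions, and the study's channelized Hotelling observer and real CT noise textures are considerably more involved:

```python
import numpy as np

rng = np.random.default_rng(1)
n, size = 200, 32
yy, xx = np.mgrid[:size, :size]
signal = np.exp(-((xx - 16.0) ** 2 + (yy - 16.0) ** 2) / 18.0)  # low-contrast insert

noise_only = rng.normal(0.0, 1.0, (n, size, size))              # idealized white noise
signal_present = noise_only + 0.5 * signal

# non-prewhitening observer: the template is the expected signal itself
t = signal.ravel()
r0 = noise_only.reshape(n, -1) @ t          # responses, signal absent
r1 = signal_present.reshape(n, -1) @ t      # responses, signal present

# detectability index d' from the two response distributions
d_prime = (r1.mean() - r0.mean()) / np.sqrt(0.5 * (r1.var() + r0.var()))
print(d_prime)
```

A channelized Hotelling observer would first project each image onto a small set of channels (e.g. Gabor or Laguerre-Gauss) and build the template from the channel-output covariance, which is what makes it track nonlinear IR noise better.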

  2. A Spatially Offset Raman Spectroscopy Method for Non-Destructive Detection of Gelatin-Encapsulated Powders

    Directory of Open Access Journals (Sweden)

    Kuanglin Chao

    2017-03-01

    Full Text Available Non-destructive subsurface detection of encapsulated, coated, or seal-packaged foods and pharmaceuticals can help prevent distribution and consumption of counterfeit or hazardous products. This study used a Spatially Offset Raman Spectroscopy (SORS method to detect and identify urea, ibuprofen, and acetaminophen powders contained within one or more (up to eight layers of gelatin capsules to demonstrate subsurface chemical detection and identification. A 785-nm point-scan Raman spectroscopy system was used to acquire spatially offset Raman spectra for an offset range of 0 to 10 mm from the surfaces of 24 encapsulated samples, using a step size of 0.1 mm to obtain 101 spectral measurements per sample. As the offset distance was increased, the spectral contribution from the subsurface powder gradually outweighed that of the surface capsule layers, allowing for detection of the encapsulated powders. Containing mixed contributions from the powder and capsule, the SORS spectra for each sample were resolved into pure component spectra using self-modeling mixture analysis (SMA and the corresponding components were identified using spectral information divergence values. As demonstrated here for detecting chemicals contained inside thick capsule layers, this SORS measurement technique coupled with SMA has the potential to be a reliable non-destructive method for subsurface inspection and authentication of foods, health supplements, and pharmaceutical products that are prepared or packaged with semi-transparent materials.
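
The spectral information divergence used to identify the resolved components is a symmetric Kullback-Leibler divergence between spectra treated as probability distributions; the component with the smallest divergence from a library spectrum is the match. A minimal sketch (the Gaussian test spectra are hypothetical, not SORS data):

```python
import numpy as np

def sid(s1, s2, eps=1e-12):
    """Spectral information divergence: symmetric Kullback-Leibler
    divergence between spectra normalized to probability distributions."""
    p = np.asarray(s1, float) / np.sum(s1) + eps
    q = np.asarray(s2, float) / np.sum(s2) + eps
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

# hypothetical resolved component vs. two library spectra (Gaussian bands)
wn = np.arange(100)                               # pseudo wavenumber axis
component = np.exp(-((wn - 40) ** 2) / 30.0)
library_match = np.exp(-((wn - 40) ** 2) / 30.0)
library_other = np.exp(-((wn - 60) ** 2) / 30.0)
print(sid(component, library_match) < sid(component, library_other))
```

The `eps` floor keeps the logarithms finite where a spectrum is effectively zero.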

  3. Detection of auditory signals in quiet and noisy backgrounds while performing a visuo-spatial task

    Directory of Open Access Journals (Sweden)

    Vishakha W Rawool

    2016-01-01

    Full Text Available Context: The ability to detect important auditory signals while performing visual tasks may be further compounded by background chatter. Thus, it is important to know how task performance may interact with background chatter to hinder signal detection. Aim: To examine any interactive effects of speech spectrum noise and task performance on the ability to detect signals. Settings and Design: The setting was a sound-treated booth. A repeated measures design was used. Materials and Methods: Auditory thresholds of 20 normal adults were determined at 0.5, 1, 2 and 4 kHz in the following conditions presented in a random order: (1) quiet with attention; (2) quiet with a visuo-spatial task or puzzle (distraction); (3) noise with attention; and (4) noise with task. Statistical Analysis: Multivariate analyses of variance (MANOVA) with three repeated factors (quiet versus noise, visuo-spatial task versus no task, signal frequency). Results: MANOVA revealed significant main effects for noise and signal frequency and significant noise–frequency and task–frequency interactions. Distraction caused by performing the task worsened the thresholds for tones presented at the beginning of the experiment and had no effect on tones presented in the middle. At the end of the experiment, thresholds (4 kHz) were better while performing the task than those obtained without performing the task. These effects were similar across the quiet and noise conditions. Conclusion: Detection of auditory signals is difficult at the beginning of a distracting visuo-spatial task, but over time, task learning and auditory training effects can nullify the effect of distraction and may improve detection of high-frequency sounds.

  4. Regularization design for high-quality cone-beam CT of intracranial hemorrhage using statistical reconstruction

    Science.gov (United States)

    Dang, H.; Stayman, J. W.; Xu, J.; Sisniega, A.; Zbijewski, W.; Wang, X.; Foos, D. H.; Aygun, N.; Koliatsos, V. E.; Siewerdsen, J. H.

    2016-03-01

    Intracranial hemorrhage (ICH) is associated with pathologies such as hemorrhagic stroke and traumatic brain injury. Multi-detector CT is the current front-line imaging modality for detecting ICH (fresh blood contrast 40-80 HU, down to 1 mm). Flat-panel detector (FPD) cone-beam CT (CBCT) offers a potential alternative with a smaller scanner footprint, greater portability, and lower cost, potentially well suited to deployment at the point of care outside standard diagnostic radiology and emergency room settings. Previous studies have suggested reliable detection of ICH down to 3 mm in CBCT using high-fidelity artifact correction and penalized weighted least-squares (PWLS) image reconstruction with a post-artifact-correction noise model. However, ICH reconstructed by traditional image regularization exhibits nonuniform spatial resolution and noise due to interaction between the statistical weights and regularization, which potentially degrades the detectability of ICH. In this work, we propose three regularization methods designed to overcome these challenges. The first two compute spatially varying certainty for uniform spatial resolution and noise, respectively. The third computes spatially varying regularization strength to achieve uniform "detectability," combining both spatial resolution and noise in a manner analogous to a delta-function detection task. Experiments were conducted on a CBCT test-bench, and image quality was evaluated for simulated ICH in different regions of an anthropomorphic head. The first two methods improved the uniformity in spatial resolution and noise compared to traditional regularization. The third exhibited the highest uniformity in detectability among all methods and the best overall image quality. The proposed regularization provides a valuable means to achieve uniform image quality in CBCT of ICH and is being incorporated in a CBCT prototype for ICH imaging.

  5. Detecting Temporal and Spatial Effects of Epithelial Cancers with Raman Spectroscopy

    Directory of Open Access Journals (Sweden)

    Matthew D. Keller

    2008-01-01

    Full Text Available Epithelial cancers, including those of the skin and cervix, are the most common type of cancers in humans. Many recent studies have attempted to use Raman spectroscopy to diagnose these cancers. In this paper, Raman spectral markers related to the temporal and spatial effects of cervical and skin cancers are examined through four separate but related studies. Results from a clinical cervix study show that previous disease has a significant effect on the Raman signatures of the cervix, which allows for near-100% classification when discriminating previous disease from a true normal. A Raman microspectroscopy study showed that Raman can detect changes due to adjacent regions of dysplasia or HPV that cannot be detected histologically, while a clinical skin study showed that Raman spectra may be detecting malignancy-associated changes in tissues surrounding nonmelanoma skin cancers. Finally, results of an organotypic raft culture study provided support for both the skin and the in vitro cervix results. These studies add to the growing body of evidence that optical spectroscopy, in this case Raman spectral markers, can be used to detect subtle temporal and spatial effects in tissue near cancerous sites that would otherwise go undetected by conventional histology.

  6. A Mobile Outdoor Augmented Reality Method Combining Deep Learning Object Detection and Spatial Relationships for Geovisualization.

    Science.gov (United States)

    Rao, Jinmeng; Qiao, Yanjun; Ren, Fu; Wang, Junxing; Du, Qingyun

    2017-08-24

    The purpose of this study was to develop a robust, fast and markerless mobile augmented reality method for registration, geovisualization and interaction in uncontrolled outdoor environments. We propose a lightweight deep-learning-based object detection approach for mobile or embedded devices; the vision-based detection results of this approach are combined with spatial relationships by means of the host device's built-in Global Positioning System receiver, Inertial Measurement Unit and magnetometer. Virtual objects generated based on geospatial information are precisely registered in the real world, and an interaction method based on touch gestures is implemented. The entire method is independent of the network to ensure robustness to poor signal conditions. A prototype system was developed and tested on the Wuhan University campus to evaluate the method and validate its results. The findings demonstrate that our method achieves a high detection accuracy, stable geovisualization results and interaction.
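
Registering a geo-anchored virtual object from GPS and magnetometer readings, as described, reduces to computing the great-circle bearing and distance from the device to the object and comparing that bearing with the device heading. A sketch under the usual spherical-Earth assumptions (the coordinates and heading are made-up values, not the Wuhan University test data):

```python
import math

def bearing_and_distance(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing (deg) and haversine distance (m)
    from the device position to a geo-anchored virtual object."""
    R = 6371000.0                         # mean Earth radius, m
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = math.degrees(math.atan2(y, x)) % 360.0
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    dist = 2 * R * math.asin(math.sqrt(a))
    return bearing, dist

# device heading 90 deg (east, from the magnetometer); object due east of the device
b, d = bearing_and_distance(30.5400, 114.3600, 30.5400, 114.3610)
heading = 90.0
screen_azimuth = (b - heading + 180) % 360 - 180   # horizontal offset from view center, deg
print(b, d, screen_azimuth)
```

In the full method this azimuth (plus IMU pitch/roll) places the virtual object in the camera frame, while the deep-learning detections correct for GPS and compass drift.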

  7. A Mobile Outdoor Augmented Reality Method Combining Deep Learning Object Detection and Spatial Relationships for Geovisualization

    Directory of Open Access Journals (Sweden)

    Jinmeng Rao

    2017-08-01

    Full Text Available The purpose of this study was to develop a robust, fast and markerless mobile augmented reality method for registration, geovisualization and interaction in uncontrolled outdoor environments. We propose a lightweight deep-learning-based object detection approach for mobile or embedded devices; the vision-based detection results of this approach are combined with spatial relationships by means of the host device’s built-in Global Positioning System receiver, Inertial Measurement Unit and magnetometer. Virtual objects generated based on geospatial information are precisely registered in the real world, and an interaction method based on touch gestures is implemented. The entire method is independent of the network to ensure robustness to poor signal conditions. A prototype system was developed and tested on the Wuhan University campus to evaluate the method and validate its results. The findings demonstrate that our method achieves a high detection accuracy, stable geovisualization results and interaction.

  8. A Mobile Outdoor Augmented Reality Method Combining Deep Learning Object Detection and Spatial Relationships for Geovisualization

    Science.gov (United States)

    Rao, Jinmeng; Qiao, Yanjun; Ren, Fu; Wang, Junxing; Du, Qingyun

    2017-01-01

    The purpose of this study was to develop a robust, fast and markerless mobile augmented reality method for registration, geovisualization and interaction in uncontrolled outdoor environments. We propose a lightweight deep-learning-based object detection approach for mobile or embedded devices; the vision-based detection results of this approach are combined with spatial relationships by means of the host device’s built-in Global Positioning System receiver, Inertial Measurement Unit and magnetometer. Virtual objects generated based on geospatial information are precisely registered in the real world, and an interaction method based on touch gestures is implemented. The entire method is independent of the network to ensure robustness to poor signal conditions. A prototype system was developed and tested on the Wuhan University campus to evaluate the method and validate its results. The findings demonstrate that our method achieves a high detection accuracy, stable geovisualization results and interaction. PMID:28837096

  9. Uncertainty of a detected spatial cluster in 1D: quantification and visualization

    KAUST Repository

    Lee, Junho; Gangnon, Ronald E.; Zhu, Jun; Liang, Jingjing

    2017-01-01

    Spatial cluster detection is an important problem in a variety of scientific disciplines such as environmental sciences, epidemiology and sociology. However, there appears to be very limited statistical methodology for quantifying the uncertainty of a detected cluster. In this paper, we develop a new method for the quantification and visualization of the uncertainty associated with a detected cluster. Our approach is to define a confidence set for the true cluster and to visualize it, based on the maximum likelihood, in time or in one-dimensional space. We evaluate the pivotal property of the statistic used to construct the confidence set and the coverage rate for the true cluster via empirical distributions. For illustration, our methodology is applied to both simulated data and an Alaska boreal forest dataset. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Spatial-time-state fusion algorithm for defect detection through eddy current pulsed thermography

    Science.gov (United States)

    Xiao, Xiang; Gao, Bin; Woo, Wai Lok; Tian, Gui Yun; Xiao, Xiao Ting

    2018-05-01

    Eddy Current Pulsed Thermography (ECPT) has received extensive attention due to its high sensitivity in detecting surface and subsurface cracks. However, identifying defects without any prior knowledge remains a difficult challenge in unsupervised detection. This paper presents a spatial-time-state feature fusion algorithm to obtain a full profile of the defects by directional scanning. The proposed method conducts feature extraction using independent component analysis (ICA) and automatic feature selection embedding a genetic algorithm. Finally, the optimal feature of each step is fused to obtain a defect reconstruction by applying the common orthogonal basis extraction (COBE) method. Experiments have been conducted to validate the study and verify the efficacy of the proposed method for blind defect detection.

  11. Uncertainty of a detected spatial cluster in 1D: quantification and visualization

    KAUST Repository

    Lee, Junho

    2017-10-19

    Spatial cluster detection is an important problem in a variety of scientific disciplines such as environmental sciences, epidemiology and sociology. However, there appears to be very limited statistical methodology for quantifying the uncertainty of a detected cluster. In this paper, we develop a new method for the quantification and visualization of the uncertainty associated with a detected cluster. Our approach is to define a confidence set for the true cluster and to visualize it, based on the maximum likelihood, in time or in one-dimensional space. We evaluate the pivotal property of the statistic used to construct the confidence set and the coverage rate for the true cluster via empirical distributions. For illustration, our methodology is applied to both simulated data and an Alaska boreal forest dataset. Copyright © 2017 John Wiley & Sons, Ltd.

  12. Spatial and temporal control of thermal waves by using DMDs for interference based crack detection

    Science.gov (United States)

    Thiel, Erik; Kreutzbruck, Marc; Ziegler, Mathias

    2016-02-01

    Active Thermography is a well-established non-destructive testing method used to detect cracks, voids or material inhomogeneities. It is based on applying thermal energy to a sample's surface, whereas inner defects alter the nonstationary heat flow. Conventional excitation of a sample is done spatially, either planar (e.g. using a lamp) or local (e.g. using a focused laser), and temporally, either pulsed or periodic. In this work we combine a high power laser with a Digital Micromirror Device (DMD), allowing us to merge all degrees of freedom into a spatially and temporally controlled heat source. This enables us to exploit the possibilities of coherent thermal wave shaping. Exciting periodically while simultaneously controlling the phase and amplitude of the illumination source induces - via absorption at the sample's surface - a defined thermal wave propagation through the sample. That means thermal waves can be controlled almost like acoustical or optical waves. However, in contrast to optical or acoustical waves, thermal waves are highly damped due to the diffusive character of the thermal heat flow and are therefore limited in penetration depth in relation to the achievable resolution. Nevertheless, the coherence length of thermal waves can be chosen in the mm range for modulation frequencies below 10 Hz, which is perfectly met by DMD technology. This approach gives us the opportunity to transfer known wave shaping techniques to thermography methods. We will present experiments on spatial and temporal wave shaping, demonstrating interference based crack detection.
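
The mm-range coherence lengths quoted for modulation frequencies below 10 Hz follow from the thermal diffusion length mu = sqrt(alpha/(pi*f)), with the thermal wavelength 2*pi*mu. A quick check, assuming a diffusivity on the order of that of steel (alpha = 4e-6 m^2/s is an assumed value, not from the paper):

```python
import math

alpha = 4e-6                               # assumed thermal diffusivity, m^2/s
results = {}
for f in (0.1, 1.0, 10.0):                 # modulation frequency, Hz
    mu = math.sqrt(alpha / (math.pi * f))  # thermal diffusion length, m
    results[f] = (mu, 2 * math.pi * mu)    # (diffusion length, thermal wavelength), m
    print(f, round(mu * 1e3, 2), round(2 * math.pi * mu * 1e3, 1))
```

At 1 Hz the diffusion length is about 1.1 mm (wavelength ~7 mm), shrinking with the square root of frequency, which is why low modulation frequencies are needed for mm-scale thermal wave shaping.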

  13. Detection of the spatial accuracy of an O-arm in the region of surgical interest

    Science.gov (United States)

    Koivukangas, Tapani; Katisko, Jani P. A.; Koivukangas, John P.

    2013-03-01

    Medical imaging is an essential component of a wide range of surgical procedures. For image guided surgical (IGS) procedures, medical images are the main source of information. IGS procedures rely largely on the obtained image data, so the data needs to provide differentiation between normal and abnormal tissues, especially when other surgical guidance devices are used in the procedures. The image data also needs to provide an accurate spatial representation of the patient. This research has concentrated on the concept of accuracy assessment of IGS devices to meet the needs of quality assurance in the hospital environment. For this purpose, two precision-engineered accuracy assessment phantoms have been developed as advanced materials and methods for the community. The phantoms were designed to mimic the volume of a human head as the common region of surgical interest (ROSI). This paper introduces the utilization of the phantoms in the spatial accuracy assessment of a commercial surgical 3D CT scanner, the O-arm. The study presents methods and results for detecting possible geometrical distortions in the region of surgical interest. The results show that in the pre-determined ROSI there are clear image distortions and artefacts when excessively high imaging parameters are used to scan the objects. On the other hand, when using optimal parameters, the O-arm causes minimal error in IGS accuracy. The detected spatial inaccuracy of the O-arm with the parameters used was less than 1.00 mm.

  14. Detection of the power lines in UAV remote sensed images using spectral-spatial methods.

    Science.gov (United States)

    Bhola, Rishav; Krishna, Nandigam Hari; Ramesh, K N; Senthilnath, J; Anand, Gautham

    2018-01-15

    In this paper, detection of power lines in images acquired by Unmanned Aerial Vehicle (UAV) based remote sensing is carried out using spectral-spatial methods. Spectral clustering was performed using the Kmeans and Expectation Maximization (EM) algorithms to classify the pixels into power lines and non-power lines. The spectral clustering methods used in this study are parametric in nature; to automate the choice of the number of clusters, the Davies-Bouldin index (DBI) is used. The UAV remote sensed image is clustered into the number of clusters determined by the DBI. The k-clustered image is then merged into 2 clusters (power lines and non-power lines). Further, spatial segmentation was performed using morphological and geometric operations to eliminate the non-power line regions. In this study, UAV images acquired at different altitudes and angles were analyzed to validate the robustness of the proposed method. It was observed that EM with spatial segmentation (EM-Seg) performed better than Kmeans with spatial segmentation (Kmeans-Seg) on most of the UAV images. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Dim small targets detection based on self-adaptive caliber temporal-spatial filtering

    Science.gov (United States)

    Fan, Xiangsuo; Xu, Zhiyong; Zhang, Jianlin; Huang, Yongmei; Peng, Zhenming

    2017-09-01

    To boost the detectability of dim small targets, this paper began by using improved anisotropy for background prediction (IABP), followed by target enhancement using improved high-order cumulants (HQS). Finally, on the basis of this image pre-processing, to address the problem of missed and false detections caused by the fixed caliber of traditional pipeline filtering, this paper used the targets' multi-frame movement correlation in the time-space domain, combined with scale-space theory, to propose a temporal-spatial filtering algorithm that allows the caliber to adapt to changes in the targets' scale, effectively solving the detection issues caused by an unchanged caliber as target size decreases or increases. Experiments showed that the improved anisotropic background prediction stayed loyal to the true background of the original image to the maximum extent, presenting overall performance superior to other background prediction methods; the improved HQS significantly increased the signal-to-noise ratio of the images; and when the signal-to-noise ratio was lower than 2.6 dB, this detection algorithm could effectively eliminate noise and detect targets. For this algorithm, the lowest signal-to-noise ratio of a detectable target is 0.37.

  16. Coordinate-invariant regularization

    International Nuclear Information System (INIS)

    Halpern, M.B.

    1987-01-01

    A general phase-space framework for coordinate-invariant regularization is given. The development is geometric, with all regularization contained in regularized DeWitt Superstructures on field deformations. Parallel development of invariant coordinate-space regularization is obtained by regularized functional integration of the momenta. As representative examples of the general formulation, the regularized general non-linear sigma model and regularized quantum gravity are discussed. copyright 1987 Academic Press, Inc

  17. Analysis of laser-printed spatial resolution for mammographic microcalcification detection

    International Nuclear Information System (INIS)

    Smathers, R.L.; Kowarski, D.

    1987-01-01

    The detectability of microcalcifications in mammograms was compared between Kodak Min-R screen-film mammograms and digitized laser-printed films. Pulverized bone specks were used as phantoms to produce the original mammograms. The mammograms were then digitized at a spatial resolution of 2,048 x 2,048 with 4,096 gray levels and laser-printed at spatial resolutions of 512 x 512, 1,024 x 1,024, and 2,048 x 2,048 with 256 gray levels. The number of bone specks was determined on a region-by-region basis. The 512 x 512 laser-printed images were nondiagnostic, the 1,024 x 1,024 images were better, and the 2,048 x 2,048 images were quite comparable to the original screen-film mammograms

  18. Spatial-temporal features of thermal images for Carpal Tunnel Syndrome detection

    Science.gov (United States)

    Estupinan Roldan, Kevin; Ortega Piedrahita, Marco A.; Benitez, Hernan D.

    2014-02-01

    Disorders associated with repeated trauma account for about 60% of all occupational illnesses, with Carpal Tunnel Syndrome (CTS) the most frequently consulted today. Infrared Thermography (IT) has come to play an important role in the field of medicine. IT is non-invasive and detects diseases based on measured temperature variations. IT represents a possible alternative to the prevalent methods for diagnosis of CTS (i.e., nerve conduction studies and electromyography). This work presents a set of spatial-temporal features extracted from thermal images taken from healthy and ill patients. Support Vector Machine (SVM) classifiers test this feature space with Leave-One-Out (LOO) validation error. The results of the proposed approach show linear separability and lower validation errors when compared to features used in previous works that do not account for spatial variability of temperature.
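The Leave-One-Out validation scheme mentioned above is straightforward to sketch. The example below uses a nearest-centroid classifier in place of an SVM (to stay dependency-free); the synthetic "feature vectors" and all names are illustrative, not the authors' data or code.

```python
import numpy as np

def loo_error(X, y):
    """Leave-one-out validation error for a nearest-centroid classifier:
    each sample is held out in turn, class centroids are fit on the rest,
    and the held-out sample is assigned to the closer centroid."""
    errors = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        Xtr, ytr = X[mask], y[mask]
        cents = {c: Xtr[ytr == c].mean(axis=0) for c in np.unique(ytr)}
        pred = min(cents, key=lambda c: np.linalg.norm(X[i] - cents[c]))
        errors += int(pred != y[i])
    return errors / len(y)

# Two well-separated classes of synthetic 3-D "feature vectors".
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (20, 3)), rng.normal(4, 0.5, (20, 3))])
y = np.array([0] * 20 + [1] * 20)
err = loo_error(X, y)   # linearly separable data, so error should be 0
```

A linearly separable feature space, as reported in the abstract, is exactly the situation in which the LOO error approaches zero.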

  19. Improved QRD-M Detection Algorithm for Generalized Spatial Modulation Scheme

    Directory of Open Access Journals (Sweden)

    Xiaorong Jing

    2017-01-01

    Full Text Available Generalized spatial modulation (GSM) is a spectrally and energy-efficient multiple-input multiple-output (MIMO) transmission scheme. Directly applying the original QR-decomposition with M-algorithm (QRD-M) to the GSM scheme leads to imperfect detection performance with relatively high computational complexity. In this paper, an improved QRD-M algorithm is proposed for GSM signal detection, which achieves near-optimal performance with relatively low complexity. Based on the QRD, the improved algorithm first transforms the maximum likelihood (ML) detection of the GSM signals into a search over an inverted tree structure. Then, during the search over the M branches, the branches corresponding to illegitimate transmit antenna combinations (TACs) or to an invalid number of active antennas are pruned, exploiting the characteristics of GSM signals to improve the validity of the surviving branches at each level. Simulation results show that the improved QRD-M detection algorithm provides performance similar to maximum likelihood (ML) detection at reduced computational complexity compared to the original QRD-M algorithm, and that the optimal value of the parameter M of the improved QRD-M algorithm for detection of the GSM scheme is equal to the modulation order plus one.
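The breadth-limited tree search at the heart of QRD-M can be sketched for plain MIMO detection, without the paper's GSM-specific pruning of illegitimate TAC branches. Assuming a known channel matrix H and, for the demo, noiseless reception, the search keeps only the M best partial candidates per level:

```python
import numpy as np

def qrd_m_detect(H, y, constellation, M=4):
    """QRD-M detection: QR-decompose the channel, then search the
    detection tree from the last antenna to the first, keeping only
    the M partial candidates with the smallest accumulated metric."""
    Q, R = np.linalg.qr(H)
    z = Q.conj().T @ y
    n = H.shape[1]
    # Each survivor: (accumulated metric, partial symbols x_k .. x_{n-1}).
    survivors = [(0.0, [])]
    for k in range(n - 1, -1, -1):
        expanded = []
        for metric, part in survivors:
            for s in constellation:
                cand = [s] + part
                interf = sum(R[k, k + 1 + j] * cand[1 + j]
                             for j in range(len(part)))
                m = metric + abs(z[k] - R[k, k] * s - interf) ** 2
                expanded.append((m, cand))
        expanded.sort(key=lambda t: t[0])
        survivors = expanded[:M]           # keep the M best branches
    return np.array(survivors[0][1])

# Noiseless 4x4 demo with QPSK: the search recovers the sent symbols.
rng = np.random.default_rng(2)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
H = (rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))) / np.sqrt(2)
x = qpsk[rng.integers(0, 4, size=4)]
x_hat = qrd_m_detect(H, H @ x, qpsk, M=4)
```

Because R is upper triangular, each level's metric depends only on symbols already fixed below it, which is what makes the level-by-level pruning (and the paper's extra GSM-specific pruning) possible.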

  20. DARK MATTER SUBSTRUCTURE DETECTION USING SPATIALLY RESOLVED SPECTROSCOPY OF LENSED DUSTY GALAXIES

    International Nuclear Information System (INIS)

    Hezaveh, Yashar; Holder, Gilbert; Dalal, Neal; Kuhlen, Michael; Marrone, Daniel; Murray, Norman; Vieira, Joaquin

    2013-01-01

    We investigate how strong lensing of dusty, star-forming galaxies (DSFGs) by foreground galaxies can be used as a probe of dark matter halo substructure. We find that spatially resolved spectroscopy of lensed sources allows dramatic improvements to measurements of lens parameters. In particular, we find that modeling of the full, three-dimensional (angular position and radial velocity) data can significantly facilitate substructure detection, increasing the sensitivity of observables to lower mass subhalos. We carry out simulations of lensed dusty sources observed by early ALMA (Cycle 1) and use a Fisher matrix analysis to study the parameter degeneracies and mass detection limits of this method. We find that even with conservative assumptions, it is possible to detect galactic dark matter subhalos of ∼10⁸ M☉ with high significance in most lensed DSFGs. Specifically, we find that in typical DSFG lenses, there is a ∼55% probability of detecting a substructure with M > 10⁸ M☉ with more than 5σ detection significance in each lens, if the abundance of substructure is consistent with previous lensing results. The full ALMA array, with its significantly enhanced sensitivity and resolution, should improve these estimates considerably. Given the sample of ∼100 lenses provided by surveys such as the South Pole Telescope, our understanding of dark matter substructure in typical galaxy halos is poised to improve dramatically over the next few years.
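The Fisher matrix analysis used above to forecast detection limits follows a standard recipe: build F from the derivatives of the observable with respect to the model parameters, then read parameter uncertainties off the inverse. A minimal sketch with a toy linear model (not the paper's lens model; all names and values are illustrative) might look like:

```python
import numpy as np

def fisher_matrix(model, theta, sigma, eps=1e-6):
    """Fisher matrix for a model d(theta) observed with independent
    Gaussian noise of standard deviation sigma:
    F_ij = sum_k (dd_k/dtheta_i)(dd_k/dtheta_j) / sigma**2,
    with derivatives taken by central finite differences."""
    theta = np.asarray(theta, dtype=float)
    p = len(theta)
    derivs = []
    for i in range(p):
        dt = np.zeros(p)
        dt[i] = eps
        derivs.append((model(theta + dt) - model(theta - dt)) / (2 * eps))
    D = np.array(derivs)                  # shape (n_params, n_data)
    return D @ D.T / sigma**2

# Toy "observable": a line sampled at fixed points, parameters (a, b).
t = np.linspace(0.0, 1.0, 50)
model = lambda th: th[0] * t + th[1]
F = fisher_matrix(model, [1.0, 0.5], sigma=0.1)
# Forecast 1-sigma parameter uncertainties from the inverse Fisher matrix.
errs = np.sqrt(np.diag(np.linalg.inv(F)))
```

Off-diagonal terms of the inverse encode the parameter degeneracies the abstract refers to; adding the velocity dimension to the data vector is what shrinks them in the paper's analysis.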

  1. DARK MATTER SUBSTRUCTURE DETECTION USING SPATIALLY RESOLVED SPECTROSCOPY OF LENSED DUSTY GALAXIES

    Energy Technology Data Exchange (ETDEWEB)

    Hezaveh, Yashar; Holder, Gilbert [Department of Physics, McGill University, 3600 Rue University, Montreal, Quebec H3A 2T8 (Canada); Dalal, Neal [Astronomy Department, University of Illinois at Urbana-Champaign, 1002 West Green Street, Urbana, IL 61801 (United States); Kuhlen, Michael [Theoretical Astrophysics Center, University of California, Berkeley, CA 94720 (United States); Marrone, Daniel [Steward Observatory, University of Arizona, 933 North Cherry Avenue, Tucson, AZ 85721 (United States); Murray, Norman [CITA, University of Toronto, 60 St. George Street, Toronto, ON M5S 3H8 (Canada); Vieira, Joaquin [California Institute of Technology, 1200 East California Blvd, MC 249-17, Pasadena, CA 91125 (United States)

    2013-04-10

    We investigate how strong lensing of dusty, star-forming galaxies (DSFGs) by foreground galaxies can be used as a probe of dark matter halo substructure. We find that spatially resolved spectroscopy of lensed sources allows dramatic improvements to measurements of lens parameters. In particular, we find that modeling of the full, three-dimensional (angular position and radial velocity) data can significantly facilitate substructure detection, increasing the sensitivity of observables to lower mass subhalos. We carry out simulations of lensed dusty sources observed by early ALMA (Cycle 1) and use a Fisher matrix analysis to study the parameter degeneracies and mass detection limits of this method. We find that even with conservative assumptions, it is possible to detect galactic dark matter subhalos of ∼10⁸ M☉ with high significance in most lensed DSFGs. Specifically, we find that in typical DSFG lenses, there is a ∼55% probability of detecting a substructure with M > 10⁸ M☉ with more than 5σ detection significance in each lens, if the abundance of substructure is consistent with previous lensing results. The full ALMA array, with its significantly enhanced sensitivity and resolution, should improve these estimates considerably. Given the sample of ∼100 lenses provided by surveys such as the South Pole Telescope, our understanding of dark matter substructure in typical galaxy halos is poised to improve dramatically over the next few years.

  2. Deep Spatial-Temporal Joint Feature Representation for Video Object Detection

    Directory of Open Access Journals (Sweden)

    Baojun Zhao

    2018-03-01

    Full Text Available With the development of deep neural networks, many object detection frameworks have shown great success in the fields of smart surveillance, self-driving cars, and facial recognition. However, the data sources are usually videos, and the object detection frameworks are mostly established on still images and only use the spatial information, which means that the feature consistency cannot be ensured because the training procedure loses temporal information. To address these problems, we propose a single, fully-convolutional neural network-based object detection framework that involves temporal information by using Siamese networks. In the training procedure, first, the prediction network combines the multiscale feature map to handle objects of various sizes. Second, we introduce a correlation loss by using the Siamese network, which provides neighboring frame features. This correlation loss represents object co-occurrences across time to aid the consistent feature generation. Since the correlation loss should use the information of the track ID and detection label, our video object detection network has been evaluated on the large-scale ImageNet VID dataset where it achieves a 69.5% mean average precision (mAP).

  3. Deep Spatial-Temporal Joint Feature Representation for Video Object Detection.

    Science.gov (United States)

    Zhao, Baojun; Zhao, Boya; Tang, Linbo; Han, Yuqi; Wang, Wenzheng

    2018-03-04

    With the development of deep neural networks, many object detection frameworks have shown great success in the fields of smart surveillance, self-driving cars, and facial recognition. However, the data sources are usually videos, and the object detection frameworks are mostly established on still images and only use the spatial information, which means that the feature consistency cannot be ensured because the training procedure loses temporal information. To address these problems, we propose a single, fully-convolutional neural network-based object detection framework that involves temporal information by using Siamese networks. In the training procedure, first, the prediction network combines the multiscale feature map to handle objects of various sizes. Second, we introduce a correlation loss by using the Siamese network, which provides neighboring frame features. This correlation loss represents object co-occurrences across time to aid the consistent feature generation. Since the correlation loss should use the information of the track ID and detection label, our video object detection network has been evaluated on the large-scale ImageNet VID dataset where it achieves a 69.5% mean average precision (mAP).

  4. Intrinsic spatial resolution limitations due to differences between positron emission position and annihilation detection localization

    International Nuclear Information System (INIS)

    Perez, Pedro; Malano, Francisco; Valente, Mauro

    2012-01-01

    Since its successful implementation for clinical diagnostics, positron emission tomography (PET) has been one of the most promising medical imaging techniques. The recent major growth of PET imaging is mainly due to its ability to trace the biologic pathways of different compounds in the patient's body, provided the compound can be labeled with a PET isotope. Regardless of the type of isotope, the PET imaging method is based on the detection of two 511-keV gamma photons emitted in opposite directions, with almost 180 deg between them, as a consequence of electron-positron annihilation. Therefore, this imaging method is intrinsically limited by random uncertainties in spatial resolution, related to differences between the actual position of positron emission and the location of the detected annihilation. This study presents a Monte Carlo approach to analyze the influence of this effect for different isotopes of potential use in PET. (author)
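The positron-range effect the study quantifies can be illustrated with a toy one-dimensional Monte Carlo: the detected (annihilation) distribution is the source distribution convolved with the positron-range distribution, so its width exceeds the source width. The Gaussian range model and all parameter values here are illustrative assumptions, not the authors' simulation:

```python
import numpy as np

def simulate_annihilation_points(n, source_sigma, range_sigma, seed=0):
    """Toy 1-D Monte Carlo of the positron-range effect: emission
    positions are drawn from a Gaussian source profile, and each
    positron drifts a random distance before annihilating, so the
    detected (annihilation) distribution is broader than the source."""
    rng = np.random.default_rng(seed)
    emitted = rng.normal(0.0, source_sigma, n)   # emission positions (mm)
    drift = rng.normal(0.0, range_sigma, n)      # positron range (mm)
    return emitted, emitted + drift

emitted, detected = simulate_annihilation_points(
    100_000, source_sigma=1.0, range_sigma=0.5)
# For Gaussian convolution the widths add in quadrature:
# sigma_detected = sqrt(1.0**2 + 0.5**2), about 1.118 mm here.
blur_factor = detected.std() / emitted.std()
```

Isotopes with longer positron ranges correspond to a larger range_sigma, and thus to a larger intrinsic blur of the reconstructed image.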

  5. Blind detection of isolated astrophysical pulses in the spatial Fourier transform domain

    Science.gov (United States)

    Schmid, Natalia A.; Prestage, Richard M.

    2018-04-01

    We present a novel approach for the detection of isolated transients in pulsar surveys and fast radio transient observations. Rather than the conventional approach of performing a computationally expensive blind dispersion measure (DM) search, we take the spatial Fourier transform (SFT) of short (~ few seconds) sections of data. A transient will have a characteristic signature in the SFT domain, and we present a blind statistic which may be used to detect this signature at an empirical zero False Alarm Rate (FAR). The method has been evaluated using simulations, and also applied to two fast radio burst observations. In addition to its use for current observations, we expect this method will be extremely beneficial for future multi-beam observations made by telescopes equipped with phased array feeds.

  6. Spatial statistical analysis of organs for intelligent CAD and its application to disease detection

    International Nuclear Information System (INIS)

    Takizawa, Hotaka

    2009-01-01

    The present article reports our research performed in a research project supported by a Grant-in-Aid for Scientific Research on Priority Areas from the Ministry of Education, Culture, Sports, Science and Technology, Japan, from 2003 to 2006. Our method, developed in this research, acquired the trend of variation of spatial relations between true diseases, false positives and image features through statistical analysis of a set of medical images, and improved the accuracy of disease detection by predicting their occurrence positions in an image based on this trend. This article describes the formulation of the method in general form and shows the results obtained by applying the method to chest X-ray CT images for the detection of pulmonary nodules. (author)

  7. Speech detection in noise and spatial unmasking in children with simultaneous versus sequential bilateral cochlear implants.

    Science.gov (United States)

    Chadha, Neil K; Papsin, Blake C; Jiwani, Salima; Gordon, Karen A

    2011-09-01

    To measure speech detection in noise performance for children with bilateral cochlear implants (BiCI), to compare performance in children with simultaneous implants versus those with sequential implants, and to compare performance to normal-hearing children. Prospective cohort study. Tertiary academic pediatric center. Children with early-onset bilateral deafness and 2-year BiCI experience, comprising the "sequential" group (>2 yr interimplantation delay, n = 12) and "simultaneous" group (no interimplantation delay, n = 10), and normal-hearing controls (n = 8). Thresholds to speech detection (at 0-degree azimuth) were measured with noise at 0-degree azimuth or ±90-degree azimuth. Spatial unmasking (SU) as the noise condition changed from 0-degree azimuth to ±90-degree azimuth, and binaural summation advantage (BSA) of 2 over 1 CI. Speech detection in noise was significantly poorer than in controls for both BiCI groups (p < […]). Performance in the simultaneous group approached levels found in normal controls (7.2 ± 0.6 versus 8.6 ± 0.6 dB, p > 0.05) and was significantly better than in the sequential group (3.9 ± 0.4 dB, p < […]). […] in the simultaneous group but, in the sequential group, was significantly better when noise was moved to the second rather than the first implanted ear (4.8 ± 0.5 versus 3.0 ± 0.4 dB, p < […]) the sequential group's second rather than first CI. Children with simultaneously implanted BiCI demonstrated an advantage over children with sequential implants in using spatial cues to improve speech detection in noise.

  8. Crosstalk elimination in the detection of dual-beam optical tweezers by spatial filtering

    International Nuclear Information System (INIS)

    Ott, Dino; Oddershede, Lene B.; Reihani, S. Nader S.

    2014-01-01

    In dual-beam optical tweezers, the accuracy of position and force measurements is often compromised by crosstalk between the two detected signals, this crosstalk leading to systematic and significant errors on the measured forces and distances. This is true both for dual-beam optical traps where the splitting of the two traps is done by polarization optics and for dual optical traps constructed by other methods, e.g., holographic tweezers. If the two traps are orthogonally polarized, most often crosstalk is minimized by inserting polarization optics in front of the detector; however, this method is not perfect because of the de-polarization of the trapping beam introduced by the required high numerical aperture optics. Here we present a simple and easy-to-implement method to efficiently eliminate crosstalk. The method is based on spatial filtering by simply inserting a pinhole at the correct position and is highly compatible with standard back focal plane photodiode based detection of position and force. Our spatial filtering method reduces crosstalk up to five times better than polarization filtering alone. The effectiveness is dependent on pinhole size and distance between the traps and is here quantified experimentally and reproduced by theoretical modeling. The method here proposed will improve the accuracy of force-distance measurements, e.g., of single molecules, performed by dual-beam optical traps and hence give much more scientific value for the experimental efforts

  9. A spatial approach of magnitude-squared coherence applied to selective attention detection.

    Science.gov (United States)

    Bonato Felix, Leonardo; de Souza Ranaudo, Fernando; D'affonseca Netto, Aluizio; Ferreira Leite Miranda de Sá, Antonio Mauricio

    2014-05-30

    Auditory selective attention is the human ability to actively focus on a certain sound stimulus while ignoring all others. This ability can be used, for example, in behavioral studies and brain-machine interfaces. In this work we developed an objective method, called Spatial Coherence, to detect the side to which a subject is focusing attention. This method takes into consideration the Magnitude-Squared Coherence and the topographic distribution of responses among electroencephalogram electrodes. The individuals were stimulated binaurally with amplitude-modulated tones and were instructed to focus attention on only one of the stimuli. The results indicate a contralateral modulation of the auditory steady-state response (ASSR) in the attention condition and are in agreement with prior studies. Furthermore, the best combination of electrodes led to a hit rate of 82% at 5.03 commands per minute. Using a similar paradigm, a recent work achieved a maximum hit rate of 84.33%, but with a greater classification time (20 s, i.e., 3 commands per minute). Spatial Coherence thus appears to be a useful technique for detecting the focus of auditory selective attention. Copyright © 2014 Elsevier B.V. All rights reserved.
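For a single channel, the Magnitude-Squared Coherence underlying the Spatial Coherence method is a standard objective-response-detection statistic: split the recording into stimulus-synchronized epochs and measure phase consistency across epochs at the modulation frequency. A minimal single-channel sketch follows (the spatial, multi-electrode extension is not reproduced here; the synthetic signal and all names are illustrative):

```python
import numpy as np

def msc(signal, fs, n_epochs, freq):
    """Magnitude-squared coherence of an evoked response: the signal is
    split into stimulus-synchronized epochs, each epoch is Fourier
    transformed, and phase consistency across epochs is measured at the
    frequency of interest.  MSC approaches 1 for a phase-locked response
    and 1/M for pure noise (M = number of epochs)."""
    epochs = signal[: n_epochs * (len(signal) // n_epochs)]
    epochs = epochs.reshape(n_epochs, -1)
    Y = np.fft.rfft(epochs, axis=1)
    k = int(round(freq * epochs.shape[1] / fs))
    Yf = Y[:, k]
    return np.abs(Yf.sum()) ** 2 / (n_epochs * np.sum(np.abs(Yf) ** 2))

# 40 Hz "steady-state response" buried in noise, 20 one-second epochs.
fs, n_epochs = 256, 20
t = np.arange(fs * n_epochs) / fs
rng = np.random.default_rng(3)
x = 0.5 * np.sin(2 * np.pi * 40 * t) + rng.normal(0.0, 1.0, t.size)
coh_signal = msc(x, fs, n_epochs, freq=40.0)   # phase-locked: high MSC
coh_noise = msc(x, fs, n_epochs, freq=55.0)    # noise-only bin: low MSC
```

The Spatial Coherence method of the abstract adds the topographic dimension by combining such per-electrode MSC values across the scalp.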

  10. Remote sensing and spatial analysis based study for detecting deforestation and the associated drivers

    Science.gov (United States)

    El-Abbas, Mustafa M.; Csaplovics, Elmar; Deafalla, Taisser H.

    2013-10-01

    Nowadays, remote-sensing technologies are becoming increasingly interlinked with the issue of deforestation. They offer a systematized and objective strategy to document, understand and simulate the deforestation process and its associated causes. In this context, the main goal of this study, conducted in the Blue Nile region of Sudan, where most natural habitats have been dramatically destroyed, was to develop spatial methodologies to assess deforestation dynamics and the associated factors. To achieve this, optical multispectral satellite scenes (i.e., ASTER and LANDSAT), integrated with a field survey and multiple additional data sources, were used for the analyses. Spatiotemporal Object Based Image Analysis (STOBIA) was applied to assess the change dynamics within the period of study. Broadly, the above-mentioned analyses include Object Based (OB) classifications, post-classification change detection, data fusion, information extraction and spatial analysis. Hierarchical multi-scale segmentation thresholds were applied, and each class was delimited with semantic meanings by a set of rules associated with membership functions. Consequently, the fused multi-temporal data were used to create detailed objects of change classes from the input LU/LC classes. The dynamic changes were quantified and spatially located, and the spatial and contextual relations with adjacent areas were analyzed. The main finding of the present study is that forest areas drastically decreased, with the conversion of forest into agricultural fields and grassland being the main force of deforestation. In contrast, the capability of the area to recover was clearly observed. The study concludes with a brief assessment of an 'oriented' framework, focused on the alarming areas where serious dynamics are located and where urgent plans and interventions are most critical, guided by potential solutions based on the identified driving forces.

  11. Wind turbine extraction from high spatial resolution remote sensing images based on saliency detection

    Science.gov (United States)

    Chen, Jingbo; Yue, Anzhi; Wang, Chengyi; Huang, Qingqing; Chen, Jiansheng; Meng, Yu; He, Dongxu

    2018-01-01

    The wind turbine is a device that converts the wind's kinetic energy into electrical power. Accurate and automatic extraction of wind turbines helps government departments plan wind power plant projects. A hybrid and practical framework based on saliency detection for wind turbine extraction, using Google Earth imagery at a spatial resolution of 1 m, is proposed. It can be viewed as a two-phase procedure: coarse detection and fine extraction. In the first stage, we introduced a frequency-tuned saliency detection approach for initially detecting the area of interest of the wind turbines. This method exploits color and luminance features, is simple to implement, and is computationally efficient. Taking into account the complexity of remote sensing images, in the second stage we proposed a fast method for fine-tuning the results in the frequency domain and then extracted wind turbines from these salient objects by removing irrelevant salient areas according to the special properties of wind turbines. Experiments demonstrated that our approach consistently obtains higher precision and better recall rates. Our method was also compared with other techniques from the literature, proving more applicable and robust.
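Frequency-tuned saliency, as used in the first stage above, scores each pixel by its distance from the global image mean after Gaussian smoothing. A grayscale sketch follows (the published method operates in Lab color space; this simplification and all names are assumptions, not the authors' pipeline):

```python
import numpy as np

def gaussian_blur_separable(img, kernel):
    """Separable convolution of a 2-D image along both axes (reflect padding)."""
    k = len(kernel) // 2
    out = img.astype(float)
    for axis in (0, 1):
        padded = np.pad(out, [(k, k) if a == axis else (0, 0)
                              for a in (0, 1)], mode="reflect")
        out = np.apply_along_axis(
            lambda m: np.convolve(m, kernel, mode="valid"), axis, padded)
    return out

def ft_saliency(img, sigma=1.6):
    """Frequency-tuned saliency (grayscale variant): squared distance
    between the global mean intensity and a Gaussian-smoothed version of
    each pixel; large-scale deviations from the mean stand out."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    blurred = gaussian_blur_separable(img, kernel)
    return (blurred - img.mean()) ** 2

# Bright "turbine-like" blob on a dark background.
img = np.zeros((64, 64))
img[28:36, 28:36] = 1.0
sal = ft_saliency(img)
```

Thresholding such a saliency map yields the coarse regions of interest from which the second, fine-extraction stage then removes irrelevant salient areas.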

  12. The characteristics and spatial distributions of initially missed and rebiopsy-detected prostate cancers

    Directory of Open Access Journals (Sweden)

    Myung-Won You

    2016-07-01

    Full Text Available Purpose: The purpose of this study was to analyze the characteristics of initially missed and rebiopsy-detected prostate cancers following 12-core transrectal biopsy. Methods: A total of 45 patients with prostate cancers detected on rebiopsy and 45 patients with prostate cancers initially detected on transrectal ultrasound-guided biopsy were included in the study. For result analysis, the prostate was divided into six compartments, and the cancer-positive rates, estimated tumor burden, and agreement rates between biopsy and surgical specimens, along with clinical data, were evaluated. Results: The largest mean tumor burden was located in the medial apex in both groups. There were significantly more tumors in this location in the rebiopsy group (44.9%) than in the control group (30.1%, P=0.015). The overall sensitivity of biopsy was significantly lower in the rebiopsy group (22.5% vs. 43.4%, P<0.001). The agreement rate of cancer-positive cores between biopsy and surgical specimens was significantly lower in the medial apex in the rebiopsy group compared with that of the control group (50.0% vs. 65.6%, P=0.035). The cancer-positive rates of target biopsy cores and premalignant lesions in the rebiopsy group were 63.1% and 42.3%, respectively. Conclusion: Rebiopsy-detected prostate cancers showed a different spatial distribution and a lower cancer detection rate of biopsy cores compared with initially diagnosed cancers. To overcome the lower cancer detection rate, targeted biopsy of abnormal sonographic findings, premalignant lesions, and the medial apex, which showed a larger tumor burden, is recommended when performing rebiopsy.

  13. Temporal and spatial predictability of an irrelevant event differently affect detection and memory of items in a visual sequence

    Directory of Open Access Journals (Sweden)

    Junji eOhyama

    2016-02-01

    Full Text Available We examined how the temporal and spatial predictability of a task-irrelevant visual event affects the detection and memory of a visual item embedded in a continuously changing sequence. Participants observed 11 sequentially presented letters, during which a task-irrelevant visual event was either present or absent. Predictabilities of the spatial location and temporal position of the event were controlled in 2 × 2 conditions. In the spatially predictable conditions, the event occurred at the same location within the stimulus sequence or at another location, while, in the spatially unpredictable conditions, it occurred at random locations. In the temporally predictable conditions, the event timing was fixed relative to the order of the letters, while in the temporally unpredictable condition, it could not be predicted from the letter order. Participants performed a working memory task and a target detection reaction time task. Memory accuracy was higher for a letter simultaneously presented at the same location as the event in the temporally unpredictable conditions, irrespective of the spatial predictability of the event. On the other hand, the detection reaction times were only faster for a letter simultaneously presented at the same location as the event when the event was both temporally and spatially predictable. Thus, to facilitate ongoing detection processes, an event must be predictable both in space and time, while memory processes are enhanced by temporally unpredictable (i.e., surprising) events. Evidently, temporal predictability has differential effects on detection and memory of a visual item embedded in a sequence of images.

  14. Combining time-frequency and spatial information for the detection of sleep spindles

    Directory of Open Access Journals (Sweden)

    Christian eO'Reilly

    2015-02-01

    Full Text Available EEG sleep spindles are short (0.5-2.0 s) bursts of activity in the 11-16 Hz band occurring during non-rapid eye movement (NREM) sleep. This sporadic activity is thought to play a role in memory consolidation, brain plasticity, and protection of sleep integrity. Many automatic detectors have been proposed to assist or replace experts for sleep spindle scoring. However, these algorithms usually detect too many events, making it difficult to achieve a good tradeoff between sensitivity (Se) and false detection rate (FDr). In this work, we propose a semi-automatic detector comprising a sensitivity phase based on well-established criteria followed by a specificity phase using spatial and spectral criteria. In the sensitivity phase, selected events are those whose amplitude in the 10-16 Hz band and spectral ratio characteristics both reject a null hypothesis (p < 0.1) stating that the considered event is not a spindle. This null hypothesis is constructed from events occurring during rapid eye movement (REM) sleep epochs. In the specificity phase, a hierarchical clustering of the selected candidates is done based on the events' frequency and spatial position along the anterior-posterior axis. Only events from the classes grouping most (at least 80%) of the spindles scored by an expert are kept. We obtain Se = 93.2% and FDr = 93.0% in the first phase and Se = 85.4% and FDr = 86.2% in the second phase. For these two phases, Matthews correlation coefficients are respectively 0.228 and 0.324. Results suggest that spindles are defined by specific spatio-spectral properties and that automatic detection methods can be improved by considering these features.
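A minimal version of the sensitivity phase, band-limited amplitude plus a duration criterion, can be sketched as follows. The FFT-masking filter, envelope threshold, and all parameter values are illustrative assumptions, not the authors' detector:

```python
import numpy as np

def detect_spindle_candidates(x, fs, band=(11.0, 16.0),
                              min_dur=0.5, max_dur=2.0, thresh_mult=3.0):
    """Minimal sigma-band amplitude detector: band-pass filter by FFT
    masking, take the analytic-signal envelope, and keep supra-threshold
    runs whose duration falls in the 0.5-2.0 s spindle range."""
    n = len(x)
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    X = np.fft.rfft(x)
    X[(freqs < band[0]) | (freqs > band[1])] = 0.0
    filtered = np.fft.irfft(X, n)
    # Analytic signal via FFT (discrete Hilbert transform) for the envelope.
    F = np.fft.fft(filtered)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    env = np.abs(np.fft.ifft(F * h))
    above = env > thresh_mult * np.median(env)
    # Collect contiguous supra-threshold runs with spindle-like duration.
    events, start = [], None
    for i, a in enumerate(np.append(above, False)):
        if a and start is None:
            start = i
        elif not a and start is not None:
            dur = (i - start) / fs
            if min_dur <= dur <= max_dur:
                events.append((start / fs, i / fs))
            start = None
    return events

# 20 s of background noise with a 1 s, 13 Hz burst starting at t = 5 s.
fs = 100
t = np.arange(20 * fs) / fs
rng = np.random.default_rng(4)
x = rng.normal(0.0, 0.2, t.size)
burst = (t >= 5.0) & (t < 6.0)
x[burst] += 2.0 * np.sin(2 * np.pi * 13 * t[burst])
events = detect_spindle_candidates(x, fs)
```

Such an amplitude-and-duration stage deliberately over-detects; the paper's specificity phase then prunes candidates using their frequency and scalp position.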

  15. People Detection Based on Spatial Mapping of Friendliness and Floor Boundary Points for a Mobile Navigation Robot

    Directory of Open Access Journals (Sweden)

    Tsuyoshi Tasaki

    2011-01-01

    Full Text Available Navigation robots must single out partners requiring navigation and move in cluttered environments where people walk around. Developing such robots requires two different kinds of people detection: detecting partners and detecting all moving people around the robots. For detecting partners, we design divided spaces based on spatial relationships and sensing ranges. By mapping the friendliness of each divided space based on stimuli from multiple sensors, so as to detect people who actively call the robot, the robot detects partners in the space with the highest friendliness. For detecting moving people, we regard objects' floor boundary points in an omnidirectional image as obstacles. We classify obstacles as moving people by comparing the movement of each point with the robot's movement using odometry data, dynamically changing the detection thresholds. Our robot detected 95.0% of partners while standing by and interacting with people, and detected 85.0% of moving people while moving, a rate four times higher than that of previous methods.

  16. A hierarchical model for estimating the spatial distribution and abundance of animals detected by continuous-time recorders.

    Directory of Open Access Journals (Sweden)

    Robert M Dorazio

    Full Text Available Several spatial capture-recapture (SCR) models have been developed to estimate animal abundance by analyzing the detections of individuals in a spatial array of traps. Most of these models do not use the actual dates and times of detection, even though this information is readily available when using continuous-time recorders, such as microphones or motion-activated cameras. Instead most SCR models either partition the period of trap operation into a set of subjectively chosen discrete intervals and ignore multiple detections of the same individual within each interval, or they simply use the frequency of detections during the period of trap operation and ignore the observed times of detection. Both practices make inefficient use of potentially important information in the data.We developed a hierarchical SCR model to estimate the spatial distribution and abundance of animals detected with continuous-time recorders. Our model includes two kinds of point processes: a spatial process to specify the distribution of latent activity centers of individuals within the region of sampling and a temporal process to specify temporal patterns in the detections of individuals. We illustrated this SCR model by analyzing spatial and temporal patterns evident in the camera-trap detections of tigers living in and around the Nagarahole Tiger Reserve in India. We also conducted a simulation study to examine the performance of our model when analyzing data sets of greater complexity than the tiger data.Our approach provides three important benefits: First, it exploits all of the information in SCR data obtained using continuous-time recorders. Second, it is sufficiently versatile to allow the effects of both space use and behavior of animals to be specified as functions of covariates that vary over space and time. 
Third, it allows both the spatial distribution and abundance of individuals to be estimated, effectively providing a species distribution model, even in

  17. Detecting spatial patterns with the cumulant function – Part 1: The theory

    Directory of Open Access Journals (Sweden)

    P. Naveau

    2008-02-01

    Full Text Available In climate studies, detecting spatial patterns that largely deviate from the sample mean still remains a statistical challenge. Although a Principal Component Analysis (PCA), or equivalently an Empirical Orthogonal Functions (EOF) decomposition, is often applied for this purpose, it provides meaningful results only if the underlying multivariate distribution is Gaussian. Indeed, PCA is based on optimizing second-order moments, and the covariance matrix captures the full dependence structure of multivariate Gaussian vectors. Whenever the application at hand cannot satisfy this normality hypothesis (e.g., precipitation data), alternatives and/or improvements to PCA have to be developed and studied. To go beyond this second-order statistics constraint, which limits the applicability of the PCA, we take advantage of the cumulant function, which can produce higher-order moments information. The cumulant function, well known in the statistical literature, allows us to propose a new, simple and fast procedure to identify spatial patterns for non-Gaussian data. Our algorithm consists of maximizing the cumulant function. Three families of multivariate random vectors, for which explicit computations are obtained, are implemented to illustrate our approach. In addition, we show that our algorithm corresponds to selecting the directions along which projected data display the largest spread over the marginal probability density tails.
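The proposed procedure, maximizing the empirical cumulant function over projection directions, can be sketched in two dimensions. Here one axis carries skewed, heavier-tailed (exponential) data and the other carries Gaussian data of identical variance, so PCA cannot distinguish them but the cumulant function can. The fixed argument t and the angle grid are illustrative choices, not the paper's algorithm:

```python
import numpy as np

def cumulant_direction(X, t=0.45, n_angles=36):
    """Scan unit directions u and evaluate the empirical cumulant
    generating function K(t) = log mean(exp(t * <u, x>)) of the
    projected data; the direction maximizing K highlights the heaviest
    or most skewed tail (unlike PCA, which only sees variance)."""
    best, best_u = -np.inf, None
    for a in np.linspace(0.0, np.pi, n_angles, endpoint=False):
        u = np.array([np.cos(a), np.sin(a)])
        proj = X @ u
        k = np.log(np.mean(np.exp(t * proj)))
        if k > best:
            best, best_u = k, u
    return best_u

# Axis 0: centered, unit-variance exponential (skewed right tail);
# axis 1: standard Gaussian.  Equal variances, so PCA is blind to the
# difference, but the cumulant function favors the skewed axis.
rng = np.random.default_rng(5)
n = 100_000
X = np.column_stack([rng.exponential(1.0, n) - 1.0, rng.normal(0.0, 1.0, n)])
u = cumulant_direction(X)
```

Because K(t) of a standard Gaussian is only t²/2 while a skewed distribution's cumulant function grows faster, the scan picks out the non-Gaussian direction that a variance-based method would miss.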

  18. Using demographic characteristics of populations to detect spatial fragmentation following suspected ebola outbreaks in great apes.

    Science.gov (United States)

    Genton, Céline; Cristescu, Romane; Gatti, Sylvain; Levréro, Florence; Bigot, Elodie; Motsch, Peggy; Le Gouar, Pascaline; Pierre, Jean-Sébastien; Ménard, Nelly

    2017-09-01

    Demographic crashes due to emerging diseases can contribute to population fragmentation and increase extinction risk of small populations. Ebola outbreaks in 2002-2004 are suspected to have caused a decline of more than 80% in some Western lowland gorilla (Gorilla gorilla gorilla) populations. We investigated whether demographic indicators of this event allowed for the detection of spatial fragmentation in gorilla populations. We collected demographic data from two neighbouring populations: the Lokoué population, suspected to have been affected by an Ebola outbreak (followed from 2001 to 2014), and the Romani population, of unknown demographic status before Ebola outbreaks (followed from 2005 to 2014). Ten years after the outbreak, the Lokoué population is slowly recovering and the short-term demographic indicators of a population crash were no longer detectable. The Lokoué population has not experienced any additional demographic perturbation over the past decade. The Romani population did not show any of the demographic indicators of a population crash over the past decade. Its demographic structure remained similar to that of unaffected populations. Our results highlighted that the Ebola disease could contribute to fragmentation of gorilla populations due to the spatially heterogeneous impact of its outbreaks. The demographic structure of populations (i.e., age-sex and group structure) can be useful indicators of a possible occurrence of recent Ebola outbreaks in populations without known history, and may be more broadly used in other emerging disease/species systems. Longitudinal data are critical to our understanding of the impact of emerging diseases on wild populations and their conservation. © 2017 Wiley Periodicals, Inc.

  19. Temporal Data-Driven Sleep Scheduling and Spatial Data-Driven Anomaly Detection for Clustered Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Gang Li

    2016-09-01

    Full Text Available The spatial–temporal correlation is an important feature of sensor data in wireless sensor networks (WSNs). Most of the existing works based on the spatial–temporal correlation can be divided into two parts: redundancy reduction and anomaly detection. These two parts are pursued separately in existing works. In this work, the combination of temporal data-driven sleep scheduling (TDSS) and spatial data-driven anomaly detection is proposed, where TDSS can reduce data redundancy. The TDSS model is inspired by transmission control protocol (TCP) congestion control. Based on the long, linear cluster structure in the tunnel monitoring system, cooperative TDSS and spatial data-driven anomaly detection are then proposed. To realize synchronous acquisition in the same ring for analyzing the situation of every ring, TDSS is implemented in a cooperative way in the cluster. To keep the precision of sensor data, spatial data-driven anomaly detection based on the spatial correlation and the Kriging method is realized to generate an anomaly indicator. The experiment results show that cooperative TDSS can realize non-uniform sensing effectively to reduce the energy consumption. In addition, spatial data-driven anomaly detection is quite significant for maintaining and improving the precision of sensor data.
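The spatial anomaly-indicator step can be illustrated with a deliberately simplified interpolator. Here inverse-distance weighting stands in for the Kriging predictor used in the paper, and all names, coordinates and thresholds are made up for the sketch:

```python
import math

def idw_predict(target, neighbours, power=2.0):
    """Inverse-distance-weighted prediction of a sensor reading from its
    neighbours -- a simplified stand-in for the Kriging interpolation
    described in the abstract.  neighbours: [((x, y), value), ...]."""
    num = den = 0.0
    for (x, y), value in neighbours:
        d = math.hypot(target[0] - x, target[1] - y)
        w = 1.0 / (d ** power)
        num += w * value
        den += w
    return num / den

def anomaly_indicator(target, reading, neighbours, threshold=5.0):
    """Flag a reading whose deviation from the spatially predicted
    value exceeds the threshold."""
    return abs(reading - idw_predict(target, neighbours)) > threshold

neighbours = [((0.0, 0.0), 20.0), ((2.0, 0.0), 21.0), ((0.0, 2.0), 19.0)]
in_range = anomaly_indicator((1.0, 1.0), 20.5, neighbours)  # near the local trend
outlier = anomaly_indicator((1.0, 1.0), 45.0, neighbours)   # far from it
```

A real Kriging predictor would additionally fit a variogram and return a prediction variance, allowing the threshold to adapt to local uncertainty.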

  20. Reliability and Minimum Detectable Change of Temporal-Spatial, Kinematic, and Dynamic Stability Measures during Perturbed Gait.

    Directory of Open Access Journals (Sweden)

    Christopher A Rábago

    Full Text Available Temporal-spatial, kinematic variability, and dynamic stability measures collected during perturbation-based assessment paradigms are often used to identify dysfunction associated with gait instability. However, it remains unclear which measures are most reliable for detecting and tracking responses to perturbations. This study systematically determined the between-session reliability and minimum detectable change values of temporal-spatial, kinematic variability, and dynamic stability measures during three types of perturbed gait. Twenty young healthy adults completed two identical testing sessions two weeks apart, comprising an unperturbed and three perturbed (cognitive, physical, and visual) walking conditions in a virtual reality environment. Within each session, perturbation responses were compared to unperturbed walking using paired t-tests. Between-session reliability and minimum detectable change values were also calculated for each measure and condition. All temporal-spatial, kinematic variability, and dynamic stability measures demonstrated fair to excellent between-session reliability. Minimum detectable change values, normalized to mean values, ranged from 1-50%. Step width mean and variability measures demonstrated the greatest response to perturbations, with excellent between-session reliability and low minimum detectable change values. Orbital stability measures demonstrated specificity to perturbation direction and sensitivity, with excellent between-session reliability and low minimum detectable change values. We observed substantially greater between-session reliability and lower minimum detectable change values for local stability measures than previously described, which may be the result of averaging across trials within a session and using velocity versus acceleration data for reconstruction of state spaces. Across all perturbation types, temporal-spatial, orbital, and local measures were the most reliable measures with the
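Minimum detectable change values in reliability studies like this are conventionally derived from the standard error of measurement. A minimal sketch of the standard formulas, MDC95 = 1.96 * sqrt(2) * SEM with SEM = SD * sqrt(1 - ICC); the SD and ICC values below are invented for illustration:

```python
import math

def minimum_detectable_change(sd, icc, z=1.96):
    """MDC95 = z * sqrt(2) * SEM, with SEM = SD * sqrt(1 - ICC).
    Standard formulas used in between-session reliability studies."""
    sem = sd * math.sqrt(1.0 - icc)
    return z * math.sqrt(2.0) * sem

# A hypothetical measure with SD 4.0 and excellent reliability (ICC = 0.90):
mdc = minimum_detectable_change(4.0, 0.90)
```

Dividing the MDC by the mean of the measure gives the normalized (percentage) values quoted in the abstract.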

  1. Contribution to the study of position sensitive detectors with high spatial resolution for thermal neutron detection

    International Nuclear Information System (INIS)

    Idrissi Fakhr-Eddine, Abdellah.

    1978-01-01

    With a view to improving the spatial resolution of the localization of thermal neutrons, this work covers four position-sensitive detectors: an 800-cell multi-detector (1 dimension); a linear 'Jeu de Jacquet' detector (1 dimension); a multi-detector XYP 128x128 (2 dimensions); and a 'Jeu de Jacquet' detector with 2 dimensions. Mention is made of the various position-finding methods known so far, as well as the reasons for selecting BF3 as the detector gas. A study is then made of the parameters of the multiwire chamber whose principle forms the basis of most of the position-detecting appliances subsequently dealt with. Finally, a description is given of the detection tests of thermal neutrons in the multiwire chamber as a function of pressure, a parameter that greatly affects the accuracy of position finding. The single-dimension position tests on two kinds of appliance, the 800-cell multi-detector for wide-angle diffraction studies and the linear 'Jeu de Jacquet' detector designed for small-angle diffraction, are mentioned. A description is then given of two position appliances with two dimensions: the multi-detector XYP 128x128 and the two-dimensional 'Jeu de Jacquet' detector. In the case of this latter detector, only the hoped-for characteristics are indicated. [fr]

  2. Detection of precursory deformation using a TLS. Application to spatial prediction of rockfalls.

    Science.gov (United States)

    Abellán, Antonio; Vilaplana, Joan Manuel; Calvet, Jaume; Rodriguez, Xavier

    2010-05-01

    Different applications of Terrestrial Laser Scanners (TLS) on rock slopes are undergoing rapid development, mainly in the characterization of 3D discontinuities and the monitoring of rock slopes. The emphasis of this research is on the detection of precursory deformation and its application to the spatial prediction of rockfalls. The pilot study area corresponds to the main scarp of an old slide located at Puigcercós (Catalonia, Spain). 3D temporal variations of the terrain were analyzed by comparing sequential TLS datasets. Five areas characterized by centimetric precursory deformations were detected in the study area. Of these deformations, (a) growing deformation across three areas culminated in a rockfall occurrence; and (b) another growing deformation across two areas was detected, making a subsequent rockfall likely. The areas with precursory deformations detected in Puigcercós showed the following characteristics: (a) a sub-vertical fracture delimiting the moving part from the rest of the slope; (b) an increase in the horizontal displacement upwards, typical of a toppling failure mechanism (Muller, 1968; Goodman and Bray, 1976). In addition, decimetric-scale rockfalls were observed in the upper part of the moving areas, which is consistent with the observations of Rosser et al. (2007). The TLS ILRIS 3D technical characteristics are as follows: high accuracy (7.2 mm at a range of 50 meters); high angular resolution (e.g. 1 point every few cm); fast data acquisition (2,500 points/second); broad coverage; and a high maximum range on natural slopes (~600 m). The single-point distances between the surface of reference and the successive data point clouds were computed using a conventional methodology (data vs. reference comparison). The direction of comparison was defined as the normal vector of the rock face at its central part. We focused on the study of the small-scale displacements towards the origin of coordinates, which reflect the pre-failure deformation on part of
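The "data vs. reference" comparison along the face normal reduces, for matched points, to projecting the displacement vectors on the normal. A minimal sketch under the assumption that points are already matched between epochs (real TLS processing also needs registration and nearest-neighbour matching):

```python
import numpy as np

def displacement_along_normal(points, reference_points, normal):
    """Signed distances between matched points of two TLS epochs,
    projected on the rock-face normal -- a minimal version of the
    'data vs. reference' comparison described above."""
    n = np.asarray(normal, dtype=float)
    n /= np.linalg.norm(n)
    return (np.asarray(points) - np.asarray(reference_points)) @ n

reference = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
later = reference + np.array([0.0, 0.0, 0.02])  # uniform 2 cm movement
d = displacement_along_normal(later, reference, normal=(0.0, 0.0, 1.0))
```

Centimetric precursory deformation would appear here as a coherent patch of points whose projected displacement grows between successive scans.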

  3. Simultaneous detection of landmarks and key-frame in cardiac perfusion MRI using a joint spatial-temporal context model

    Science.gov (United States)

    Lu, Xiaoguang; Xue, Hui; Jolly, Marie-Pierre; Guetter, Christoph; Kellman, Peter; Hsu, Li-Yueh; Arai, Andrew; Zuehlsdorff, Sven; Littmann, Arne; Georgescu, Bogdan; Guehring, Jens

    2011-03-01

    Cardiac perfusion magnetic resonance imaging (MRI) has proven clinical significance in diagnosis of heart diseases. However, analysis of perfusion data is time-consuming, where automatic detection of anatomic landmarks and key-frames from perfusion MR sequences is helpful for anchoring structures and functional analysis of the heart, leading toward fully automated perfusion analysis. Learning-based object detection methods have demonstrated their capabilities to handle large variations of the object by exploring a local region, i.e., context. Conventional 2D approaches take into account spatial context only. Temporal signals in perfusion data present a strong cue for anchoring. We propose a joint context model to encode both spatial and temporal evidence. In addition, our spatial context is constructed not only based on the landmark of interest, but also the landmarks that are correlated in the neighboring anatomies. A discriminative model is learned through a probabilistic boosting tree. A marginal space learning strategy is applied to efficiently learn and search in a high dimensional parameter space. A fully automatic system is developed to simultaneously detect anatomic landmarks and key frames in both RV and LV from perfusion sequences. The proposed approach was evaluated on a database of 373 cardiac perfusion MRI sequences from 77 patients. Experimental results of a 4-fold cross validation show superior landmark detection accuracies of the proposed joint spatial-temporal approach to the 2D approach that is based on spatial context only. The key-frame identification results are promising.

  4. Major Depression Detection from EEG Signals Using Kernel Eigen-Filter-Bank Common Spatial Patterns.

    Science.gov (United States)

    Liao, Shih-Cheng; Wu, Chien-Te; Huang, Hao-Chuan; Cheng, Wei-Teng; Liu, Yi-Hung

    2017-06-14

    Major depressive disorder (MDD) has become a leading contributor to the global burden of disease; however, there are currently no reliable biological markers or physiological measurements for efficiently and effectively dissecting the heterogeneity of MDD. Here we propose a novel method based on scalp electroencephalography (EEG) signals and a robust spectral-spatial EEG feature extractor called kernel eigen-filter-bank common spatial pattern (KEFB-CSP). The KEFB-CSP first filters the multi-channel raw EEG signals into a set of frequency sub-bands covering the range from theta to gamma bands, then spatially transforms the EEG signals of each sub-band from the original sensor space to a new space where the new signals (i.e., CSPs) are optimal for the classification between MDD and healthy controls, and finally applies the kernel principal component analysis (kernel PCA) to transform the vector containing the CSPs from all frequency sub-bands to a lower-dimensional feature vector called KEFB-CSP. Twelve patients with MDD and twelve healthy controls participated in this study, and from each participant we collected 54 resting-state EEGs of 6 s length (5 min and 24 s in total). Our results show that the proposed KEFB-CSP outperforms other EEG features including the powers of EEG frequency bands, and fractal dimension, which had been widely applied in previous EEG-based depression detection studies. The results also reveal that the 8 electrodes from the temporal areas gave higher accuracies than other scalp areas. The KEFB-CSP was able to achieve an average EEG classification accuracy of 81.23% in single-trial analysis when only the 8-electrode EEGs of the temporal area and a support vector machine (SVM) classifier were used. We also designed a voting-based leave-one-participant-out procedure to test the participant-independent individual classification accuracy. 
The voting-based results show that the mean classification accuracy of about 80% can be achieved by the KEFP

  5. Major Depression Detection from EEG Signals Using Kernel Eigen-Filter-Bank Common Spatial Patterns

    Directory of Open Access Journals (Sweden)

    Shih-Cheng Liao

    2017-06-01

    Full Text Available Major depressive disorder (MDD) has become a leading contributor to the global burden of disease; however, there are currently no reliable biological markers or physiological measurements for efficiently and effectively dissecting the heterogeneity of MDD. Here we propose a novel method based on scalp electroencephalography (EEG) signals and a robust spectral-spatial EEG feature extractor called kernel eigen-filter-bank common spatial pattern (KEFB-CSP). The KEFB-CSP first filters the multi-channel raw EEG signals into a set of frequency sub-bands covering the range from theta to gamma bands, then spatially transforms the EEG signals of each sub-band from the original sensor space to a new space where the new signals (i.e., CSPs) are optimal for the classification between MDD and healthy controls, and finally applies the kernel principal component analysis (kernel PCA) to transform the vector containing the CSPs from all frequency sub-bands to a lower-dimensional feature vector called KEFB-CSP. Twelve patients with MDD and twelve healthy controls participated in this study, and from each participant we collected 54 resting-state EEGs of 6 s length (5 min and 24 s in total). Our results show that the proposed KEFB-CSP outperforms other EEG features including the powers of EEG frequency bands, and fractal dimension, which had been widely applied in previous EEG-based depression detection studies. The results also reveal that the 8 electrodes from the temporal areas gave higher accuracies than other scalp areas. The KEFB-CSP was able to achieve an average EEG classification accuracy of 81.23% in single-trial analysis when only the 8-electrode EEGs of the temporal area and a support vector machine (SVM) classifier were used. We also designed a voting-based leave-one-participant-out procedure to test the participant-independent individual classification accuracy. The voting-based results show that the mean classification accuracy of about 80% can be
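The CSP step at the heart of KEFB-CSP can be sketched with the classic two-step computation: whiten the composite covariance, then diagonalise one class in the whitened space. This shows only the CSP core on synthetic two-channel trials; the paper's full pipeline adds the filter bank and kernel PCA stages on top:

```python
import numpy as np

def csp_filters(trials_a, trials_b):
    """Common spatial patterns via whitening plus diagonalisation.
    trials_*: lists of (channels, samples) arrays.  Returns a matrix
    whose columns are spatial filters, most discriminative first."""
    ca = np.mean([np.cov(t) for t in trials_a], axis=0)
    cb = np.mean([np.cov(t) for t in trials_b], axis=0)
    # Whiten the composite covariance ca + cb ...
    vals, vecs = np.linalg.eigh(ca + cb)
    P = vecs @ np.diag(vals ** -0.5) @ vecs.T
    # ... then diagonalise class A in the whitened space.
    s_vals, s_vecs = np.linalg.eigh(P @ ca @ P.T)
    order = np.argsort(s_vals)[::-1]
    return (s_vecs[:, order].T @ P).T

rng = np.random.default_rng(1)
# Class A has high variance on channel 0, class B on channel 1.
a = [rng.normal(scale=[5.0, 1.0], size=(100, 2)).T for _ in range(20)]
b = [rng.normal(scale=[1.0, 5.0], size=(100, 2)).T for _ in range(20)]
W = csp_filters(a, b)
```

Projecting trials through the leading and trailing columns of `W` yields signals whose variances are maximally discriminative between the two classes, which is what makes log-variance CSP features effective inputs to an SVM.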

  6. Early detection of tuberculosis outbreaks among the San Francisco homeless: trade-offs between spatial resolution and temporal scale.

    Directory of Open Access Journals (Sweden)

    Brandon W Higgs

    Full Text Available BACKGROUND: San Francisco has the highest rate of tuberculosis (TB) in the U.S., with recurrent outbreaks among the homeless and marginally housed. It has been shown for syndromic data that when exact geographic coordinates of individual patients are used as the spatial base for outbreak detection, higher detection rates and accuracy are achieved compared to when data are aggregated into administrative regions such as zip codes and census tracts. We examine the effect of varying the spatial resolution in the TB data within the San Francisco homeless population on detection sensitivity, timeliness, and the amount of historical data needed to achieve better performance measures. METHODS AND FINDINGS: We apply a variation of the space-time permutation scan statistic to the TB data, in which a patient's location is either represented by its exact coordinates or by the centroid of its census tract. We show that the detection sensitivity and timeliness of the method generally improve when exact locations are used to identify real TB outbreaks. When outbreaks are simulated, while the detection timeliness is consistently improved when exact coordinates are used, the detection sensitivity varies depending on the size of the spatial scanning window and the number of tracts in which cases are simulated. Finally, we show that when exact locations are used, a smaller amount of historical data is required for training the model. CONCLUSION: Systematic characterization of the spatio-temporal distribution of TB cases can widely benefit real-time surveillance and guide public health investigations of TB outbreaks as to what level of spatial resolution results in improved detection sensitivity and timeliness. Trading higher spatial resolution for better performance is ultimately a tradeoff between maintaining patient confidentiality and improving public health when sharing data. Understanding such tradeoffs is critical to managing the complex interplay between public
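The expected counts in the space-time permutation scan statistic are computed from the data's own marginals, which is what lets the method run without population-at-risk denominators. A minimal sketch with invented tract labels and day numbers:

```python
def observed_count(cases, zone, window):
    """Cases inside a space-time cylinder (zone of locations, time window)."""
    return sum(1 for loc, t in cases
               if loc in zone and window[0] <= t <= window[1])

def expected_count(cases, zone, window):
    """Space-time permutation expectation for the cylinder: cases in the
    zone (all times) times cases in the window (all zones), over N."""
    in_zone = sum(1 for loc, t in cases if loc in zone)
    in_window = sum(1 for loc, t in cases if window[0] <= t <= window[1])
    return in_zone * in_window / len(cases)

# Toy data: locations are census-tract ids, times are day numbers.
cases = [("A", 1), ("A", 1), ("A", 2), ("B", 5), ("B", 6), ("C", 7),
         ("A", 8), ("B", 2), ("C", 3), ("C", 9)]
zone, window = {"A"}, (1, 2)
obs = observed_count(cases, zone, window)
exp_count = expected_count(cases, zone, window)
```

A full implementation scans many zone/window cylinders, scores each with a Poisson generalized likelihood ratio of observed versus expected, and assesses significance by Monte Carlo permutation of the case times.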

  7. Distance-regular graphs

    NARCIS (Netherlands)

    van Dam, Edwin R.; Koolen, Jack H.; Tanaka, Hajime

    2016-01-01

    This is a survey of distance-regular graphs. We present an introduction to distance-regular graphs for the reader who is unfamiliar with the subject, and then give an overview of some developments in the area of distance-regular graphs since the monograph 'BCN'[Brouwer, A.E., Cohen, A.M., Neumaier,

  8. LL-regular grammars

    NARCIS (Netherlands)

    Nijholt, Antinus

    1980-01-01

    Culik II and Cohen introduced the class of LR-regular grammars, an extension of the LR(k) grammars. In this paper we consider an analogous extension of the LL(k) grammars called the LL-regular grammars. The relation of this class of grammars to other classes of grammars will be shown. Any LL-regular

  9. Regular Expression Pocket Reference

    CERN Document Server

    Stubblebine, Tony

    2007-01-01

    This handy little book offers programmers a complete overview of the syntax and semantics of regular expressions that are at the heart of every text-processing application. Ideal as a quick reference, Regular Expression Pocket Reference covers the regular expression APIs for Perl 5.8, Ruby (including some upcoming 1.9 features), Java, PHP, .NET and C#, Python, vi, JavaScript, and the PCRE regular expression libraries. This concise and easy-to-use reference puts a very powerful tool for manipulating text and data right at your fingertips. Composed of a mixture of symbols and text, regular exp
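The flavour of what such a reference covers can be shown with Python's `re` module: character classes, quantifiers, groups and alternation combined in one pattern (the log line is a made-up example):

```python
import re

# \d{4}-\d{2}-\d{2} matches an ISO date; (ERROR|WARN) is alternation;
# parentheses capture submatches for later retrieval.
log_line = "2007-01-15 ERROR disk full on /dev/sda1"
m = re.search(r"(\d{4})-(\d{2})-(\d{2}) (ERROR|WARN)", log_line)
year, level = m.group(1), m.group(4)
```

The same pattern syntax carries over, with dialect differences the book catalogues, to Perl, Ruby, Java, PHP, .NET, JavaScript, and PCRE.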

  10. A spatial hazard model for cluster detection on continuous indicators of disease: application to somatic cell score.

    Science.gov (United States)

    Gay, Emilie; Senoussi, Rachid; Barnouin, Jacques

    2007-01-01

    Methods for spatial cluster detection dealing with diseases quantified by continuous variables are few, whereas several diseases are better approached by continuous indicators. For example, subclinical mastitis of the dairy cow is evaluated using a continuous marker of udder inflammation, the somatic cell score (SCS). Consequently, this study proposed to analyze spatialized risk and cluster components of herd SCS through a new method based on a spatial hazard model. The dataset included annual SCS for 34 142 French dairy herds for the year 2000, and important SCS risk factors: mean parity, percentage of winter and spring calvings, and herd size. The model allowed the simultaneous estimation of the effects of known risk factors and of potential spatial clusters on SCS, and the mapping of the estimated clusters and their range. Mean parity and winter and spring calvings were significantly associated with subclinical mastitis risk. The model with the presence of 3 clusters was highly significant, and the 3 clusters were attractive, i.e. closeness to cluster center increased the occurrence of high SCS. The three localizations were the following: close to the city of Troyes in the northeast of France; around the city of Limoges in the center-west; and in the southwest close to the city of Tarbes. The semi-parametric method based on spatial hazard modeling applies to continuous variables, and takes account of both risk factors and potential heterogeneity of the background population. This tool allows a quantitative detection but assumes a spatially specified form for clusters.

  11. INTERSECTION DETECTION BASED ON QUALITATIVE SPATIAL REASONING ON STOPPING POINT CLUSTERS

    Directory of Open Access Journals (Sweden)

    S. Zourlidou

    2016-06-01

    Full Text Available The purpose of this research is to propose and test a method for detecting intersections by analysing collectively acquired trajectories of moving vehicles. Instead of solely relying on the geometric features of the trajectories, such as heading changes, which may indicate turning points and consequently intersections, we extract semantic features of the trajectories in the form of sequences of stops and moves. Under this spatiotemporal prism, the extracted semantic information, which indicates where vehicles stop, can reveal important locations such as junctions. The advantage of the proposed approach in comparison with existing turning-point-oriented approaches is that it can detect intersections even when not all the crossing road segments are sampled and therefore no turning points are observed in the trajectories. The challenge with this approach is, first, that not all vehicles stop at the same location; the stop location is therefore blurred along the direction of the road. This, in turn, means that nearby junctions can induce similar stop locations. As a first step, a density-based clustering is applied to the layer of stop observations and clusters of stop events are found. Representative points of the clusters are determined (one per cluster), and in a last step the existence of an intersection is clarified based on spatial relational cluster reasoning, with which less informative geospatial clusters, in terms of whether a junction exists and where its centre lies, are transformed into more informative ones. Relational reasoning criteria, based on the relative orientation of the clusters with respect to their adjacent ones, are discussed for making sense of the relation that connects them, and finally for forming groups of stop events that belong to the same junction.
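The density-based first step can be illustrated with a miniature DBSCAN-like routine. This is a pared-down sketch (no border-point bookkeeping, brute-force neighbour search) with invented coordinates standing in for stop events:

```python
import math

def cluster_stops(points, eps=5.0, min_pts=3):
    """Tiny density-based clustering of stop locations, in the spirit of
    the paper's first step.  Returns one label per point; -1 = noise."""
    labels = [-1] * len(points)
    cluster = 0
    for i, p in enumerate(points):
        if labels[i] != -1:
            continue
        neigh = [j for j, q in enumerate(points) if math.dist(p, q) <= eps]
        if len(neigh) < min_pts:
            continue  # not a core point
        # Grow the cluster outwards from this core point.
        stack = list(neigh)
        while stack:
            j = stack.pop()
            if labels[j] != -1:
                continue
            labels[j] = cluster
            more = [k for k, q in enumerate(points)
                    if math.dist(points[j], q) <= eps]
            if len(more) >= min_pts:
                stack.extend(more)
        cluster += 1
    return labels

# Two bundles of stop events (e.g. two arms of a junction) plus one outlier.
stops = [(0, 0), (1, 0), (0, 1), (50, 50), (51, 50), (50, 51), (200, 200)]
labels = cluster_stops(stops)
```

Cluster centroids would then serve as the representative points fed into the relational reasoning stage described above.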

  12. The performance of spatially offset Raman spectroscopy for liquid explosive detection

    Science.gov (United States)

    Loeffen, Paul W.; Maskall, Guy; Bonthron, Stuart; Bloomfield, Matthew; Tombling, Craig; Matousek, Pavel

    2016-10-01

    Aviation security requirements adopted in 2014 require liquids to be screened at most airports throughout Europe, North America and Australia. Cobalt's unique Spatially Offset Raman Spectroscopy (SORS™) technology has proven extremely effective at screening liquids, aerosols and gels (LAGS) with extremely low false alarm rates. SORS is compatible with a wide range of containers, including coloured, opaque or clear plastics, glass and paper, as well as duty-free bottles in STEBs (secure tamper-evident bags). Our award-winning Insight range has been specially developed for table-top screening at security checkpoints. Insight systems use our patented SORS technology for rapid and accurate chemical analysis of substances in unopened non-metallic containers. The Insight100M™ and the latest member of the range, the Insight200M™, also screen metallic containers. Our unique systems screen liquids, aerosols and gels with the highest detection capability and lowest false alarm rates of any ECAC-approved scanner, with several hundred units already in use at airports, including eight of the top ten European hubs. This paper presents an analysis of real performance data for these systems.

  13. Detection of regularities in variation in geomechanical behavior of rock mass during multi-roadway preparation and mining of an extraction panel

    Science.gov (United States)

    Tsvetkov, AB; Pavlova, LD; Fryanov, VN

    2018-03-01

    The results of numerical simulation of the stress–strain state in a rock block and the surrounding rock mass during multi-roadway preparation for mining are presented. The numerical solutions obtained by nonlinear modeling and by using the constitutive relations of the theory of elasticity are compared. Regularities of the stress distribution in the vicinity of the pillars located in the abutment pressure zone are found.

  14. Non-Linear Detection for Joint Space-Frequency Block Coding and Spatial Multiplexing in OFDM-MIMO Systems

    DEFF Research Database (Denmark)

    Rahman, Imadur Mohamed; Marchetti, Nicola; Fitzek, Frank

    2005-01-01

    In this work, we have analyzed a joint spatial diversity and multiplexing transmission structure for a MIMO-OFDM system, where Orthogonal Space-Frequency Block Coding (OSFBC) is used across all spatial multiplexing branches. We have derived a BLAST-like non-linear Successive Interference Cancellation (SIC) receiver, where the detection is done on a subcarrier-by-subcarrier basis based on both Zero Forcing (ZF) and Minimum Mean Square Error (MMSE) nulling criteria for the system. In terms of Frame Error Rate (FER), the MMSE-based SIC receiver performs better than all other receivers compared in this paper. We have found that a linear two-stage receiver for the proposed system [1] performs very close to the non-linear receiver studied in this work. Finally, we compared the system performance in a spatially correlated scenario. It is found that higher amounts of spatial correlation at the transmitter

  15. Early non-destructive biofouling detection and spatial distribution: Application of oxygen sensing optodes

    KAUST Repository

    Farhat, Nadia; Staal, Marc; Siddiqui, Amber; Borisov, S.M.; Bucs, Szilard; Vrouwenvelder, Johannes S.

    2015-01-01

    The spatial and quantitative information on biological activity will lead to better understanding of the biofouling processes, contributing to the development of more effective biofouling control strategies.

  16. Single-trial detection of visual evoked potentials by common spatial patterns and wavelet filtering for brain-computer interface.

    Science.gov (United States)

    Tu, Yiheng; Huang, Gan; Hung, Yeung Sam; Hu, Li; Hu, Yong; Zhang, Zhiguo

    2013-01-01

    Event-related potentials (ERPs) are widely used in brain-computer interface (BCI) systems as input signals conveying a subject's intention. A fast and reliable single-trial ERP detection method can be used to develop a BCI system with both high speed and high accuracy. However, most single-trial ERP detection methods are developed for offline EEG analysis and thus have a high computational complexity and need manual operations. Therefore, they are not applicable to practical BCI systems, which require a low-complexity and automatic ERP detection method. This work presents a joint spatial-time-frequency filter that combines common spatial patterns (CSP) and wavelet filtering (WF) for improving the signal-to-noise ratio (SNR) of visual evoked potentials (VEPs), which can lead to a single-trial ERP-based BCI.

  17. Impact of respiratory motion correction and spatial resolution on lesion detection in PET: a simulation study based on real MR dynamic data

    Science.gov (United States)

    Polycarpou, Irene; Tsoumpas, Charalampos; King, Andrew P.; Marsden, Paul K.

    2014-02-01

    The aim of this study is to investigate the impact of respiratory motion correction and spatial resolution on lesion detectability in PET as a function of lesion size and tracer uptake. Real respiratory signals describing different breathing types are combined with a motion model formed from real dynamic MR data to simulate multiple dynamic PET datasets acquired from a continuously moving subject. Lung and liver lesions were simulated with diameters ranging from 6 to 12 mm and lesion to background ratio ranging from 3:1 to 6:1. Projection data for 6 and 3 mm PET scanner resolution were generated using analytic simulations and reconstructed without and with motion correction. Motion correction was achieved using motion compensated image reconstruction. The detectability performance was quantified by a receiver operating characteristic (ROC) analysis obtained using a channelized Hotelling observer and the area under the ROC curve (AUC) was calculated as the figure of merit. The results indicate that respiratory motion limits the detectability of lung and liver lesions, depending on the variation of the breathing cycle length and amplitude. Patients with large quiescent periods had a greater AUC than patients with regular breathing cycles and patients with long-term variability in respiratory cycle or higher motion amplitude. In addition, small (less than 10 mm diameter) or low contrast (3:1) lesions showed the greatest improvement in AUC as a result of applying motion correction. In particular, after applying motion correction the AUC is improved by up to 42% with current PET resolution (i.e. 6 mm) and up to 51% for higher PET resolution (i.e. 3 mm). Finally, the benefit of increasing the scanner resolution is small unless motion correction is applied. This investigation indicates high impact of respiratory motion correction on lesion detectability in PET and highlights the importance of motion correction in order to benefit from the increased resolution of future
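The figure of merit used here, the area under the ROC curve, can be computed directly from observer scores via the rank-sum interpretation: the probability that a random signal-present score exceeds a signal-absent one. The scores below are invented for illustration; the study obtains them from a channelized Hotelling observer:

```python
def auc(signal_scores, noise_scores):
    """Area under the ROC curve via the rank-sum interpretation:
    P(signal-present score > signal-absent score), ties counted half."""
    wins = ties = 0
    for s in signal_scores:
        for n in noise_scores:
            if s > n:
                wins += 1
            elif s == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(signal_scores) * len(noise_scores))

present = [2.1, 3.0, 1.8, 2.6]  # observer scores, lesion present
absent = [1.0, 1.9, 1.5, 2.2]   # observer scores, lesion absent
a = auc(present, absent)
```

An AUC of 0.5 corresponds to chance detection and 1.0 to perfect detection, so the 42-51% AUC improvements quoted above represent a substantial gain in lesion detectability.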

  18. Impact of respiratory motion correction and spatial resolution on lesion detection in PET: a simulation study based on real MR dynamic data

    International Nuclear Information System (INIS)

    Polycarpou, Irene; Tsoumpas, Charalampos; King, Andrew P; Marsden, Paul K

    2014-01-01

    The aim of this study is to investigate the impact of respiratory motion correction and spatial resolution on lesion detectability in PET as a function of lesion size and tracer uptake. Real respiratory signals describing different breathing types are combined with a motion model formed from real dynamic MR data to simulate multiple dynamic PET datasets acquired from a continuously moving subject. Lung and liver lesions were simulated with diameters ranging from 6 to 12 mm and lesion to background ratio ranging from 3:1 to 6:1. Projection data for 6 and 3 mm PET scanner resolution were generated using analytic simulations and reconstructed without and with motion correction. Motion correction was achieved using motion compensated image reconstruction. The detectability performance was quantified by a receiver operating characteristic (ROC) analysis obtained using a channelized Hotelling observer and the area under the ROC curve (AUC) was calculated as the figure of merit. The results indicate that respiratory motion limits the detectability of lung and liver lesions, depending on the variation of the breathing cycle length and amplitude. Patients with large quiescent periods had a greater AUC than patients with regular breathing cycles and patients with long-term variability in respiratory cycle or higher motion amplitude. In addition, small (less than 10 mm diameter) or low contrast (3:1) lesions showed the greatest improvement in AUC as a result of applying motion correction. In particular, after applying motion correction the AUC is improved by up to 42% with current PET resolution (i.e. 6 mm) and up to 51% for higher PET resolution (i.e. 3 mm). Finally, the benefit of increasing the scanner resolution is small unless motion correction is applied. This investigation indicates high impact of respiratory motion correction on lesion detectability in PET and highlights the importance of motion correction in order to benefit from the increased resolution of future

  19. The effects of incidentally learned temporal and spatial predictability on response times and visual fixations during target detection and discrimination.

    Directory of Open Access Journals (Sweden)

    Melissa R Beck

    Full Text Available Responses are quicker to predictable stimuli than to stimuli whose time and place of appearance are uncertain. Studies that manipulate target predictability often involve overt cues to speed up response times. However, less is known about whether individuals will exhibit faster response times when target predictability is embedded within the inter-trial relationships. The current research examined the combined effects of spatial and temporal target predictability on reaction time (RT) and the allocation of overt attention in a sustained attention task. Participants responded as quickly as possible to stimuli while their RT and eye movements were measured. Target temporal and spatial predictability were manipulated by altering the number of (1) different time intervals between a response and the next target, and (2) possible spatial locations of the target. The effects of target predictability on target detection (Experiment 1) and target discrimination (Experiment 2) were tested. In both experiments, RTs became shorter as target predictability increased across both space and time. In addition, the influences of spatial and temporal target predictability on RT and the overt allocation of attention were task dependent, suggesting that effective orienting of attention relies on both spatial and temporal predictability. These results indicate that stimulus predictability can be increased without overt cues and detected purely through inter-trial relationships over the course of repeated stimulus presentations.

  20. Regularization by External Variables

    DEFF Research Database (Denmark)

    Bossolini, Elena; Edwards, R.; Glendinning, P. A.

    2016-01-01

    Regularization was a big topic at the 2016 CRM Intensive Research Program on Advances in Nonsmooth Dynamics. There are many open questions concerning well-known kinds of regularization (e.g., by smoothing or hysteresis). Here, we propose a framework for an alternative and important kind of regularization, by external variables.

  1. Regular expressions cookbook

    CERN Document Server

    Goyvaerts, Jan

    2009-01-01

    This cookbook provides more than 100 recipes to help you crunch data and manipulate text with regular expressions. Every programmer can find uses for regular expressions, but their power doesn't come worry-free. Even seasoned users often suffer from poor performance, false positives, false negatives, or perplexing bugs. Regular Expressions Cookbook offers step-by-step instructions for some of the most common tasks involving this tool, with recipes for C#, Java, JavaScript, Perl, PHP, Python, Ruby, and VB.NET. With this book, you will: Understand the basics of regular expressions through a
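
A typical cookbook-style recipe in Python's `re` module (an illustrative example, not taken from the book) shows how anchoring guards against the false positives mentioned above:

```python
import re

# Recipe: extract ISO dates (YYYY-MM-DD) from free text. The \b word
# boundaries keep the pattern from matching digit runs embedded in
# longer tokens -- a classic source of false positives.
pattern = re.compile(r"\b(\d{4})-(\d{2})-(\d{2})\b")

text = "Released 2009-01-01; build id 42009-01-015 is not a date."
dates = [m.group(0) for m in pattern.finditer(text)]
print(dates)  # ['2009-01-01']
```

The same pattern without `\b` would also match inside the build identifier, which is exactly the kind of silent false positive the book's recipes are designed to prevent.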

  2. Effect of harmonicity on the detection of a signal in a complex masker and on spatial release from masking.

    Directory of Open Access Journals (Sweden)

    Astrid Klinge

    Full Text Available The amount of masking of sounds from one source (signals) by sounds from a competing source (maskers) heavily depends on the sound characteristics of the masker and the signal and on their relative spatial location. Numerous studies investigated the ability to detect a signal in a speech or a noise masker or the effect of spatial separation of signal and masker on the amount of masking, but there is a lack of studies investigating the combined effects of many cues on the masking as is typical for natural listening situations. The current study using free-field listening systematically evaluates the combined effects of harmonicity and inharmonicity cues in multi-tone maskers and cues resulting from spatial separation of target signal and masker on the detection of a pure tone in a multi-tone or a noise masker. A linear binaural processing model was implemented to predict the masked thresholds in order to estimate whether the observed thresholds can be accounted for by energetic masking in the auditory periphery or whether other effects are involved. Thresholds were determined for combinations of two target frequencies (1 and 8 kHz), two spatial configurations (masker and target either co-located or spatially separated by 90 degrees azimuth), and five different masker types (four complex multi-tone stimuli, one noise masker). A spatial separation of target and masker resulted in a release from masking for all masker types. The amount of masking significantly depended on the masker type and frequency range. The various harmonic and inharmonic relations between target and masker or between components of the masker resulted in a complex pattern of increased or decreased masked thresholds in comparison to the predicted energetic masking. The results indicate that harmonicity cues affect the detectability of a tonal target in a complex masker.

  3. Chromosphere of K giant stars. Geometrical extent and spatial structure detection

    Science.gov (United States)

    Berio, P.; Merle, T.; Thévenin, F.; Bonneau, D.; Mourard, D.; Chesneau, O.; Delaa, O.; Ligi, R.; Nardetto, N.; Perraut, K.; Pichon, B.; Stee, P.; Tallon-Bosc, I.; Clausse, J. M.; Spang, A.; McAlister, H.; ten Brummelaar, T.; Sturmann, J.; Sturmann, L.; Turner, N.; Farrington, C.; Goldfinger, P. J.

    2011-11-01

    Context. Interferometers provide accurate diameter measurements of stars by analyzing both the continuum and the lines formed in photospheres and chromospheres. Tests of the geometrical extent of the chromospheres are therefore possible by comparing the estimated radius in the continuum of the photosphere and the estimated radii in chromospheric lines. Aims: We aim to constrain the geometrical extent of the chromosphere of non-binary K giant stars and detect any spatial structures in the chromosphere. Methods: We performed observations with the CHARA interferometer and the VEGA beam combiner at optical wavelengths. We observed seven non-binary K giant stars (β and η Cet, δ Crt, ρ Boo, β Oph, 109 Her, and ι Cep). We measured the ratio of the radii of the photosphere to the chromosphere using the interferometric measurements in the Hα and the Ca II infrared triplet line cores. For β Cet, spectro-interferometric observations are compared to a non-local thermal equilibrium (NLTE) semi-empirical model atmosphere including a chromosphere. The NLTE computations provide line intensities and contribution functions that indicate the relative locations where the line cores are formed and can constrain the size of the limb-darkened disk of the stars with chromospheres. We measured the angular diameter of seven K giant stars and deduced their fundamental parameters: effective temperatures, radii, luminosities, and masses. We determined the geometrical extent of the chromosphere for four giant stars (β and η Cet, δ Crt and ρ Boo). Results: The chromosphere extents obtained range from 16% to 47% of the stellar radius. The NLTE computations confirm that the Ca II/849 nm line core is deeper in the chromosphere of β Cet than either of the Ca II/854 nm and Ca II/866 nm line cores. We present a modified version of a semi-empirical model atmosphere derived by fitting the Ca II triplet line cores of this star. In four of our targets, we also detect the signature of a

  4. Right hemisphere dominance during spatial selective attention and target detection occurs outside the dorsal fronto-parietal network

    Science.gov (United States)

    Shulman, Gordon L.; Pope, Daniel L. W.; Astafiev, Serguei V.; McAvoy, Mark P.; Snyder, Abraham Z.; Corbetta, Maurizio

    2010-01-01

    Spatial selective attention is widely considered to be right hemisphere dominant. Previous functional magnetic resonance imaging (fMRI) studies, however, have reported bilateral blood-oxygenation-level-dependent (BOLD) responses in dorsal fronto-parietal regions during anticipatory shifts of attention to a location (Kastner et al., 1999; Corbetta et al., 2000; Hopfinger et al., 2000). Right-lateralized activity has mainly been reported in ventral fronto-parietal regions for shifts of attention to an unattended target stimulus (Arrington et al., 2000; Corbetta et al., 2000). However, clear conclusions cannot be drawn from these studies because hemispheric asymmetries were not assessed using direct voxel-wise comparisons of activity in left and right hemispheres. Here, we used this technique to measure hemispheric asymmetries during shifts of spatial attention evoked by a peripheral cue stimulus and during target detection at the cued location. Stimulus-driven shifts of spatial attention in both visual fields evoked right-hemisphere dominant activity in temporo-parietal junction (TPJ). Target detection at the attended location produced a more widespread right hemisphere dominance in frontal, parietal, and temporal cortex, including the TPJ region asymmetrically activated during shifts of spatial attention. However, hemispheric asymmetries were not observed during either shifts of attention or target detection in the dorsal fronto-parietal regions (anterior precuneus, medial intraparietal sulcus, frontal eye fields) that showed the most robust activations for shifts of attention. Therefore, right hemisphere dominance during stimulus-driven shifts of spatial attention and target detection reflects asymmetries in cortical regions that are largely distinct from the dorsal fronto-parietal network involved in the control of selective attention. PMID:20219998

  5. Detection and Classification of Multiple Objects using an RGB-D Sensor and Linear Spatial Pyramid Matching

    DEFF Research Database (Denmark)

    Dimitriou, Michalis; Kounalakis, Tsampikos; Vidakis, Nikolaos

    2013-01-01

    This paper presents a complete system for multiple object detection and classification in a 3D scene using an RGB-D sensor such as the Microsoft Kinect sensor. Successful multiple object detection and classification are crucial features in many 3D computer vision applications; the main goal is making machines see and understand objects like humans do. To this end, the new RGB-D sensors can be utilized, since they provide a real-time depth map which can be used along with the RGB images for our tasks. In our system we employ effective depth map processing techniques, along with edge detection, connected components detection and filtering approaches, in order to design a complete image processing algorithm for efficient detection of multiple individual objects in a single scene, even in complex scenes with many objects. Besides, we apply the Linear Spatial Pyramid Matching (LSPM) [1] method...

  6. High-resolution seismic data regularization and wavefield separation

    Science.gov (United States)

    Cao, Aimin; Stump, Brian; DeShon, Heather

    2018-04-01

    We present a new algorithm, non-equispaced fast antileakage Fourier transform (NFALFT), for irregularly sampled seismic data regularization. Synthetic tests from 1-D to 5-D show that the algorithm may efficiently remove leaked energy in the frequency wavenumber domain, and its corresponding regularization process is accurate and fast. Taking advantage of the NFALFT algorithm, we suggest a new method (wavefield separation) for the detection of the Earth's inner core shear wave with irregularly distributed seismic arrays or networks. All interfering seismic phases that propagate along the minor arc are removed from the time window around the PKJKP arrival. The NFALFT algorithm is developed for seismic data, but may also be used for other irregularly sampled temporal or spatial data processing.
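
The antileakage idea behind such algorithms can be sketched in a toy 1-D form (an illustration of iterative leakage subtraction, not the authors' NFALFT implementation; sizes and names below are arbitrary):

```python
import cmath
import random

def alft(x, f, n_freqs, n_iter):
    """Antileakage loop: estimate all Fourier components from the current
    residual, subtract the strongest one in the data domain, and repeat.
    This suppresses the spectral leakage caused by irregular sampling."""
    residual = list(f)
    coeffs = {}
    freqs = range(-n_freqs, n_freqs + 1)
    for _ in range(n_iter):
        est = {k: sum(r * cmath.exp(-2j * cmath.pi * k * xi)
                      for r, xi in zip(residual, x)) / len(x)
               for k in freqs}
        k = max(est, key=lambda m: abs(est[m]))   # strongest component
        coeffs[k] = coeffs.get(k, 0) + est[k]
        residual = [r - est[k] * cmath.exp(2j * cmath.pi * k * xi)
                    for r, xi in zip(residual, x)]
    return coeffs

def on_regular_grid(coeffs, n):
    """Regularization step: evaluate the recovered model on n equispaced points."""
    return [sum(c * cmath.exp(2j * cmath.pi * k * i / n)
                for k, c in coeffs.items()) for i in range(n)]

# A single tone at wavenumber 3, sampled at 200 irregular positions in [0, 1).
random.seed(0)
x = sorted(random.random() for _ in range(200))
f = [cmath.exp(2j * cmath.pi * 3 * xi) for xi in x]

# A naive discrete Fourier sum over the irregular samples leaks energy
# into neighbouring wavenumbers; the antileakage loop removes it.
naive = {k: sum(fi * cmath.exp(-2j * cmath.pi * k * xi)
                for fi, xi in zip(f, x)) / len(x) for k in range(-5, 6)}
coeffs = alft(x, f, n_freqs=5, n_iter=10)
regular = on_regular_grid(coeffs, 16)
```

After the loop, essentially all energy sits at the true wavenumber, and the data can be evaluated on a regular grid without the leaked components.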

  7. Regularities of Multifractal Measures

    Indian Academy of Sciences (India)

    First, we prove the decomposition theorem for the regularities of multifractal Hausdorff measure and packing measure in R^d. This decomposition theorem enables us to split a set into regular and irregular parts, so that we can analyze each separately, and recombine them without affecting density properties. Next, we ...

  8. Stochastic analytic regularization

    International Nuclear Information System (INIS)

    Alfaro, J.

    1984-07-01

    Stochastic regularization is reexamined, pointing out a restriction on its use due to a new type of divergence which is not present in the unregulated theory. Furthermore, we introduce a new form of stochastic regularization which permits the use of a minimal subtraction scheme to define the renormalized Green functions. (author)

  9. Spatial recurrence analysis: A sensitive and fast detection tool in digital mammography

    International Nuclear Information System (INIS)

    Prado, T. L.; Galuzio, P. P.; Lopes, S. R.; Viana, R. L.

    2014-01-01

    Efficient diagnostics of breast cancer requires fast digital mammographic image processing. Many breast lesions, both benign and malignant, are barely visible to the untrained eye and require accurate and reliable methods of image processing. We propose a new method of digital mammographic image analysis that meets both needs. It uses the concept of spatial recurrence as the basis of a spatial recurrence quantification analysis, which is the spatial extension of the well-known time recurrence analysis. The recurrence-based quantifiers are able to evidence breast lesions as well as the best standard image-processing methods available, but with better control over spurious fragments in the image
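
A minimal spatial recurrence quantifier can be written down directly (a toy illustration of the idea, not the authors' full recurrence quantification analysis):

```python
def recurrence_rate(image, eps):
    """Fraction of pixel pairs whose intensities lie within eps of each
    other -- the simplest spatial recurrence quantifier. A lesion
    changes this rate relative to homogeneous tissue."""
    pixels = [p for row in image for p in row]
    n = len(pixels)
    recurrent = sum(1 for i in range(n) for j in range(i + 1, n)
                    if abs(pixels[i] - pixels[j]) < eps)
    return recurrent / (n * (n - 1) / 2)

# A hypothetical homogeneous 4x4 patch vs. one containing a bright spot.
uniform = [[10, 11, 10, 11] for _ in range(4)]
with_spot = [[10, 11, 10, 11] for _ in range(3)] + [[10, 90, 90, 10]]
print(recurrence_rate(uniform, eps=2))    # 1.0
print(recurrence_rate(with_spot, eps=2))  # drops below 0.8
```

Mapping such quantifiers over a sliding window turns local deviations from the surrounding tissue texture into a detection map.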

  10. Quantifying Forest Spatial Pattern Trends at Multiple Extents: An Approach to Detect Significant Changes at Different Scales

    Directory of Open Access Journals (Sweden)

    Ludovico Frate

    2014-09-01

    Full Text Available We propose a procedure to detect significant changes in forest spatial patterns and relevant scales. Our approach consists of four sequential steps. First, based on a series of multi-temporal forest maps, a set of geographic windows of increasing extents are extracted. Second, for each extent and date, specific stochastic simulations that replicate real-world spatial pattern characteristics are run. Third, by computing pattern metrics on both simulated and real maps, their empirical distributions and confidence intervals are derived. Finally, multi-temporal scalograms are built for each metric. Based on cover maps (1954, 2011) with a resolution of 10 m, we analyze forest pattern changes in a central Apennines (Italy) reserve at multiple spatial extents (128, 256 and 512 pixels). We identify three types of multi-temporal scalograms, depending on pattern metric behaviors, describing different dynamics of natural reforestation process. The statistical distribution and variability of pattern metrics at multiple extents offers a new and powerful tool to detect forest variations over time. Similar procedures can (i) help to identify significant changes in spatial patterns and provide the bases to relate them to landscape processes; (ii) minimize the bias when comparing pattern metrics at a single extent and (iii) be extended to other landscapes and scales.
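
Steps two to four of such a procedure (stochastic simulation, empirical confidence intervals, comparison with the observed map) can be sketched for a single extent and a single metric; the maps, metric, and parameters below are hypothetical:

```python
import random

def edge_density(grid):
    """Pattern metric: fraction of 4-neighbour cell pairs whose classes
    differ (forest = 1, non-forest = 0)."""
    rows, cols = len(grid), len(grid[0])
    edges = total = 0
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):
                if r + dr < rows and c + dc < cols:
                    total += 1
                    edges += grid[r][c] != grid[r + dr][c + dc]
    return edges / total

def null_interval(p_forest, size, n_sims, rng):
    """Simulate random maps with the same forest proportion and return
    an empirical 95% interval of the metric."""
    values = sorted(
        edge_density([[int(rng.random() < p_forest) for _ in range(size)]
                      for _ in range(size)])
        for _ in range(n_sims))
    return values[int(0.025 * n_sims)], values[int(0.975 * n_sims)]

rng = random.Random(1954)
# Hypothetical clumped map: a single 8x8 forest block in a 16x16 window.
observed = [[int(r < 8 and c < 8) for c in range(16)] for r in range(16)]
lo, hi = null_interval(p_forest=0.25, size=16, n_sims=200, rng=rng)
# The observed edge density falls outside the null interval, so the
# clumped pattern is significantly non-random at this extent.
significant = not (lo <= edge_density(observed) <= hi)
```

Repeating this over several window extents and dates, and recording for each whether the observed metric leaves the simulated interval, is what builds the multi-temporal scalograms described above.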

  11. Charge-coupled devices for particle detection with high spatial resolution

    International Nuclear Information System (INIS)

    Farley, F.J.; Damerell, C.J.S.; Gillman, A.R.; Wickens, F.J.

    1980-10-01

    The results of a study of the possible application of a thin microelectronic device (the charge-coupled device) to high energy physics, as a particle detector with good spatial resolution which can distinguish between tracks emerging from the primary vertex and those from secondary vertices due to the decay of short-lived particles with heavier flavours, are reported. Performance characteristics indicating the spatial resolution, particle discrimination, time resolution, readout time and lifetime of such detectors have been obtained. (U.K.)

  12. Near-real-time radiography detects 0.1% changes in areal density with 1-millimeter spatial resolution

    International Nuclear Information System (INIS)

    Stupin, D.M.

    1987-06-01

    Using digital subtraction radiography, the author detects a 0.1% change in areal density in a phantom. Areal density is the product rho x, where rho is the material density and x is the material thickness. Therefore, it is possible to detect a 0.1% difference in either density or thickness in unknown samples. A special x-ray television camera detects the areal density change on the phantom. In a difference image, formed by subtracting the 128-television-frame averages of the phantom image from the phantom-and-step image, the step is resolved with a 1-mm spatial resolution. Surprisingly, crossed 2-μm-diam tungsten wires that overlie the phantom are also detected. This procedure takes a few seconds. The performance of any digital imaging x-ray system will improve by using the averaging and digital subtraction techniques. 8 refs., 6 figs
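
The gain from 128-frame averaging follows from noise scaling as 1/sqrt(N); a toy numerical sketch with simulated pixel readings (not real radiographic data):

```python
import random

def acquire(mean, n_frames, noise, rng):
    """Average n_frames noisy readings of one detector pixel:
    residual noise falls as noise / sqrt(n_frames)."""
    return sum(rng.gauss(mean, noise) for _ in range(n_frames)) / n_frames

rng = random.Random(42)
background = 1000.0   # areal-density signal (arbitrary units)
step = 1.0            # the 0.1% step to be detected
noise = 5.0           # per-frame noise, much larger than the step

# Single-frame difference: the step is buried in the per-frame noise.
single = rng.gauss(background + step, noise) - rng.gauss(background, noise)

# Difference of two 128-frame averages, as in the subtraction procedure:
# residual noise is roughly 5 * sqrt(2/128) ~ 0.6, so the 1.0 step emerges.
averaged = (acquire(background + step, 128, noise, rng)
            - acquire(background, 128, noise, rng))
```

The same averaging-then-subtracting logic applies pixel by pixel to the full image, which is why the technique transfers to any digital x-ray system.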

  13. Sparse structure regularized ranking

    KAUST Repository

    Wang, Jim Jing-Yan; Sun, Yijun; Gao, Xin

    2014-01-01

    Learning ranking scores is critical for the multimedia database retrieval problem. In this paper, we propose a novel ranking score learning algorithm by exploring the sparse structure and using it to regularize ranking scores. To explore the sparse

  14. Regular expression containment

    DEFF Research Database (Denmark)

    Henglein, Fritz; Nielsen, Lasse

    2011-01-01

    We present a new sound and complete axiomatization of regular expression containment. It consists of the conventional axiomatization of concatenation, alternation, empty set and (the singleton set containing) the empty string as an idempotent semiring, the fixed-point rule E* = 1 + E × E* for Kleene-star, and a general coinduction rule as the only additional rule. Our axiomatization gives rise to a natural computational interpretation of regular expressions as simple types that represent parse trees, and of containment proofs as coercions. This gives the axiomatization a Curry-Howard-style constructive interpretation: containment proofs do not only certify a language-theoretic containment but, under our computational interpretation, constructively transform a membership proof of a string in one regular expression into a membership proof of the same string in another regular expression.

  15. Supersymmetric dimensional regularization

    International Nuclear Information System (INIS)

    Siegel, W.; Townsend, P.K.; van Nieuwenhuizen, P.

    1980-01-01

    There is a simple modification of dimensional regularization which preserves supersymmetry: dimensional reduction to real D < 4, followed by analytic continuation to complex D. In terms of component fields, this means fixing the ranges of all indices on the fields (and therefore the numbers of Fermi and Bose components). For superfields, it means continuing in the dimensionality of x-space while fixing the dimensionality of theta-space. This regularization procedure allows the simple manipulation of spinor derivatives in supergraph calculations. The resulting rules are: (1) First do all algebra exactly as in D = 4; (2) Then do the momentum integrals as in ordinary dimensional regularization. This regularization procedure needs extra rules before one can say that it is consistent. Such extra rules needed for superconformal anomalies are discussed. Problems associated with renormalizability and higher order loops are also discussed

  16. Regularized maximum correntropy machine

    KAUST Repository

    Wang, Jim Jing-Yan; Wang, Yunji; Jing, Bing-Yi; Gao, Xin

    2015-01-01

    In this paper we investigate the usage of regularized correntropy framework for learning of classifiers from noisy labels. The class label predictors learned by minimizing traditional loss functions are sensitive to the noisy and outlying labels of training samples, because the traditional loss functions are equally applied to all the samples. To solve this problem, we propose to learn the class label predictors by maximizing the correntropy between the predicted labels and the true labels of the training samples, under the regularized Maximum Correntropy Criteria (MCC) framework. Moreover, we regularize the predictor parameter to control the complexity of the predictor. The learning problem is formulated by an objective function considering the parameter regularization and MCC simultaneously. By optimizing the objective function alternately, we develop a novel predictor learning algorithm. The experiments on two challenging pattern classification tasks show that it significantly outperforms the machines with traditional loss functions.
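
The robustness argument can be illustrated with the loss induced by Gaussian-kernel correntropy (the Welsch loss); a sketch with hypothetical residuals, not the authors' full alternating optimization:

```python
import math

def correntropy_loss(errors, sigma):
    """Loss induced by Gaussian-kernel correntropy: approximately
    quadratic for small residuals but saturating at 1, so a wrongly
    labelled sample exerts only bounded influence on the predictor."""
    return [1.0 - math.exp(-e * e / (2.0 * sigma * sigma)) for e in errors]

# Hypothetical prediction residuals; the last sample has a noisy label.
errors = [0.1, -0.2, 0.1, 8.0]
squared = [e * e for e in errors]
robust = correntropy_loss(errors, sigma=1.0)
# The outlier dominates the total squared loss, while its correntropy
# loss saturates near 1 and leaves the fit to the clean samples.
```

Maximizing correntropy between predictions and labels is equivalent to minimizing this saturating loss summed over the training set, which is what makes the MCC framework tolerant of label noise.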

  18. Electrolyte system strategies for anionic isotachophoresis with electrospray-ionization mass-spectrometric detection. 1. Regular isotachophoresis and free-acid isotachophoresis

    Czech Academy of Sciences Publication Activity Database

    Malá, Zdeňka; Gebauer, Petr; Boček, Petr

    2013-01-01

    Roč. 34, 20-21 (2013), s. 3072-3078 ISSN 0173-0835 R&D Projects: GA ČR(CZ) GA13-05762S Institutional support: RVO:68081715 Keywords : Diclofenac * ESI-MS detection * Ibuprofen * isotachophoresis * water analysis Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 3.161, year: 2013

  19. Spatial statistics detect clustering patterns of kidney diseases in south-eastern Romania

    Directory of Open Access Journals (Sweden)

    Ruben I.

    2016-02-01

    Full Text Available Medical geography was conceptualized almost ten years ago owing to its obvious usefulness in epidemiological research. Still, numerous diseases and regions have been neglected in this line of research; the prevalence of kidney diseases in Eastern Europe is one example. We evaluated the spatial patterns of the main kidney diseases in south-eastern Romania, and highlighted the importance of spatial modeling in medical management in Romania. We found two statistically significant hotspots of kidney disease prevalence. We also found differences in the spatial patterns between categories of diseases. We propose to speed up the process of creating a national database of records on kidney diseases. Offering researchers access to a national database will allow further epidemiology studies in Romania and finally lead to better management of medical services.

  20. Energy functions for regularization algorithms

    Science.gov (United States)

    Delingette, H.; Hebert, M.; Ikeuchi, K.

    1991-01-01

    Regularization techniques are widely used for inverse problem solving in computer vision, such as surface reconstruction, edge detection, or optical flow estimation. Energy functions used for regularization algorithms measure how smooth a curve or surface is, and to render acceptable solutions these energies must verify certain properties such as invariance under Euclidean transformations or invariance under parameterization. The notion of smoothness energy is extended here to the notion of a differential stabilizer, and it is shown that to avoid the systematic underestimation of curvature for planar curve fitting, it is necessary that circles be the curves of maximum smoothness. A set of stabilizers is proposed that meet this condition as well as invariance under rotation and parameterization.

  1. Patterns of drug abuse among drug users with regular and irregular attendance for treatment as detected by comprehensive UHPLC-HR-TOF-MS.

    Science.gov (United States)

    Sundström, Mira; Pelander, Anna; Simojoki, Kaarlo; Ojanperä, Ilkka

    2016-01-01

    The most severe consequences of drug abuse include infectious diseases, overdoses, and drug-related deaths. As the range of toxicologically relevant compounds is continually changing due to the emergence of new psychoactive substances (NPS), laboratories are encountering analytical challenges. Current immunoassays are insufficient for determining the whole range of the drugs abused, and a broad-spectrum screening method is therefore needed. Here, the patterns of drug abuse in two groups of drug users were studied from urine samples using a comprehensive screening method based on high-resolution time-of-flight mass spectrometry. The two groups comprised drug abusers undergoing opioid maintenance treatment (OMT) or drug withdrawal therapy and routinely visiting a rehabilitation clinic, and drug abusers with irregular attendance at a harm reduction unit (HRU) and suspected of potential NPS abuse. Polydrug abuse was observed in both groups, but was more pronounced among the HRU subjects with a mean number of concurrent drugs per sample of 3.9, whereas among the regularly treated subjects the corresponding number was 2.1. NPS and pregabalin were more frequent among HRU subjects, and their abuse was always related to drug co-use. The most common drug combination for an HRU subject included amphetamine, cannabis, buprenorphine, benzodiazepine, and alpha-pyrrolidinovalerophenone. A typical set of drugs for treated subjects was buprenorphine, benzodiazepine, and occasionally amphetamine. Abuse of several concurrent drugs poses a higher risk of drug intoxication and a threat of premature termination of OMT. Since the subjects attending treatment used fewer concurrent drugs, this treatment could be valuable in reducing polydrug abuse. Copyright © 2015 John Wiley & Sons, Ltd.

  2. Automated Detection of Geomorphic Features in LiDAR Point Clouds of Various Spatial Density

    Science.gov (United States)

    Dorninger, Peter; Székely, Balázs; Zámolyi, András.; Nothegger, Clemens

    2010-05-01

    varying considerably because of the various base points that were needed to cover the whole landslide. The resulting point spacing is approximately 20 cm. The achievable accuracy was about 10 cm. The airborne data was acquired with mean point densities of 2 points per square meter. The accuracy of this dataset was about 15 cm. The second testing site is an area of the Leithagebirge in Burgenland, Austria. The data was acquired by an airborne Riegl LMS-Q560 laser scanner mounted on a helicopter. The mean point density was 6-8 points per square meter, with an accuracy better than 10 cm. We applied our processing chain on the datasets individually. First, they were transformed to local reference frames and fine adjustments of the individual scans and flight strips, respectively, were applied. Subsequently, the local regression planes were determined for each point of the point clouds and planar features were extracted by means of the proposed approach. It turned out that even small displacements can be detected if the number of points used for the fit is enough to define a parallel but somewhat displaced plane. Smaller cracks and erosional incisions do not disturb the plane fitting, because mostly they are filtered out as outliers. A comparison of the different campaigns of the Doren site showed close matches of the detected geomorphic structures. Although the geomorphic structure of the Leithagebirge differs from the Doren landslide, and the scales of the two studies were also different, reliable results were achieved in both cases. Additionally, the approach turned out to be highly robust against points which were not located on the terrain. Hence, no false positives were determined within the dense vegetation above the terrain, while it was possible to cover the investigated areas completely with reliable planes. 
In some cases, however, some structures in the tree crowns were also recognized, but these small patches could be very well sorted out from the geomorphically

  3. Shaping and detecting mid-IR light with a spatial light modulator

    CSIR Research Space (South Africa)

    Maweza, Elijah L; Gailele, Lucas M; Strauss, Hencharl J; Litvin, Ihar; Forbes, Andrew; Dudley, Angela L

    2016-10-01

    Full Text Available We demonstrate the operation and calibration of a spatial light modulator in the mid-IR region by creating and measuring...

  4. An iterative detection method of MIMO over spatial correlated frequency selective channel: using list sphere decoding for simplification

    Science.gov (United States)

    Shi, Zhiping; Yan, Bing

    2010-08-01

    In multiple-input multiple-output (MIMO) wireless systems, combining good channel codes (e.g., non-binary repeat-accumulate codes) with adaptive turbo equalization is a good option for achieving better performance and lower complexity over a spatially correlated frequency selective (SCFS) channel. The key of this method is, after joint-antenna MMSE detection (JAD/MMSE) based on interference cancellation using soft information, to treat the detection result as the output of an equivalent Gaussian flat-fading channel and to perform maximum likelihood (ML) detection to obtain a more accurate estimate. But using ML brings a great increase in complexity, which is not acceptable. In this paper, a low-complexity method called list sphere decoding is introduced and applied to replace ML in order to simplify the adaptive iterative turbo equalization system.

  5. Spatial-Temporal Synchrophasor Data Characterization and Analytics in Smart Grid Fault Detection, Identification, and Impact Causal Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Huaiguang; Dai, Xiaoxiao; Gao, David Wenzhong; Zhang, Jun Jason; Zhang, Yingchen; Muljadi, Eduard

    2016-09-01

    An approach of big data characterization for smart grids (SGs) and its applications in fault detection, identification, and causal impact analysis is proposed in this paper, which aims to provide substantial data volume reduction while keeping comprehensive information from synchrophasor measurements in the spatial and temporal domains. Specifically, based on secondary voltage control (SVC) and a local SG observation algorithm, a two-layer dynamic optimal synchrophasor measurement devices selection algorithm (OSMDSA) is proposed to determine SVC zones, their corresponding pilot buses, and the optimal synchrophasor measurement devices. Combining the two-layer dynamic OSMDSA and matching pursuit decomposition, the synchrophasor data is completely characterized in the spatial-temporal domain. To demonstrate the effectiveness of the proposed characterization approach, SG situational awareness is investigated based on hidden Markov model based fault detection and identification using the spatial-temporal characteristics generated from the reduced data. To identify the major impact buses, the weighted Granger causality for SGs is proposed to investigate the causal relationship of buses during system disturbance. The IEEE 39-bus system and IEEE 118-bus system are employed to validate and evaluate the proposed approach.

  6. Spatial frequency characteristics at image decision-point locations for observers with different radiological backgrounds in lung nodule detection

    Science.gov (United States)

    Pietrzyk, Mariusz W.; Manning, David J.; Dix, Alan; Donovan, Tim

    2009-02-01

    Aim: The goal of the study is to determine the spatial frequency characteristics at image locations of observers' overt and covert decisions, and to find out whether there are any similarities within observer groups sharing the same radiological experience or the same accuracy level. Background: The radiological task is a visual-search decision-making procedure involving visual perception and cognitive processing. Humans perceive the world through a number of spatial frequency channels, each sensitive to visual information carried by different spatial frequency ranges and orientations. Recent studies have shown that particular physical properties of local and global image-based elements correlate with the performance and the level of experience of human observers in breast cancer and lung nodule detection. Neurological findings in visual perception inspired wavelet applications in vision research, because the methodology tries to mimic the brain's processing algorithms. Methods: A wavelet approach to the analysis of a set of postero-anterior chest radiographs has been used to characterize the perceptual preferences of observers with different levels of experience in the radiological task. Psychophysical methodology has been applied to track eye movements over the image, and the ROIs related to the observers' fixation clusters have been analysed in spaces framed by Daubechies functions. Results: Significant differences have been found between the spatial frequency characteristics at the locations of different decisions.

  7. Fractional Regularization Term for Variational Image Registration

    Directory of Open Access Journals (Sweden)

    Rafael Verdú-Monedero

    2009-01-01

    Full Text Available Image registration is a widely used task of image analysis with applications in many fields. Its classical formulation and current improvements are given in the spatial domain. In this paper a regularization term based on fractional order derivatives is formulated. This term is defined and implemented in the frequency domain by translating the energy functional into the frequency domain and obtaining the Euler-Lagrange equations which minimize it. The new regularization term leads to a simple formulation and design, and is applicable to higher dimensions by using the corresponding multidimensional Fourier transform. The proposed regularization term allows for a genuinely gradual transition from a diffusion registration to a curvature registration, which is best suited to some applications and is not possible in the spatial domain. Results with actual 3D images show the validity of this approach.
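The frequency-domain principle behind such a fractional regularizer can be illustrated in 1-D: a fractional Laplacian is applied by scaling each Fourier mode by |k|^(2*alpha), so alpha slides continuously between a diffusion-like (alpha = 1/2 in this operator's terms) and a curvature-like penalty. A hedged sketch of the operator itself, not of the paper's registration functional:

```python
import numpy as np

def fractional_laplacian(u, alpha, L=2 * np.pi):
    """Apply (-Laplacian)^alpha to a 1-D periodic signal via the FFT:
    multiply each Fourier mode by |k|^(2*alpha)."""
    n = u.size
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)  # integer wavenumbers for L=2*pi
    return np.real(np.fft.ifft(np.abs(k) ** (2 * alpha) * np.fft.fft(u)))

x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
u = np.sin(3 * x)
v = fractional_laplacian(u, 0.75)   # order between diffusion and curvature
```

For a pure mode sin(kx) the operator simply returns k^(2*alpha) * sin(kx), which is what makes the frequency-domain formulation so cheap compared with a spatial-domain fractional derivative.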

  8. Attentional effects on preattentive vision: Spatial precues affect the detection of simple features

    NARCIS (Netherlands)

    Theeuwes, J.; Kramer, A.F.; Atchley, P.

    1999-01-01

    Most accounts of visual perception hold that the detection of primitive features occurs preattentively, in parallel across the visual field. Evidence that preattentive vision operates without attentional limitations comes from visual search tasks in which the detection of the presence or absence of

  9. Detecting high spatial variability of ice shelf basal mass balance, Roi Baudouin Ice Shelf, Antarctica

    Directory of Open Access Journals (Sweden)

    S. Berger

    2017-11-01

    Full Text Available Ice shelves control the dynamic mass loss of ice sheets through buttressing and their integrity depends on the spatial variability of their basal mass balance (BMB, i.e. the difference between refreezing and melting. Here, we present an improved technique – based on satellite observations – to capture the small-scale variability in the BMB of ice shelves. As a case study, we apply the methodology to the Roi Baudouin Ice Shelf, Dronning Maud Land, East Antarctica, and derive its yearly averaged BMB at 10 m horizontal gridding. We use mass conservation in a Lagrangian framework based on high-resolution surface velocities, atmospheric-model surface mass balance and hydrostatic ice-thickness fields (derived from TanDEM-X surface elevation). Spatial derivatives are implemented using total-variation differentiation, which preserves abrupt changes in flow velocities and their spatial gradients. Such changes may reflect a dynamic response to localized basal melting and should be included in the mass budget. Our BMB field exhibits much spatial detail and ranges from −14.7 to 8.6 m a−1 ice equivalent. The highest melt rates are found close to the grounding line, where the pressure melting point is high and the ice-shelf slope is steep. The BMB field agrees well with on-site measurements from phase-sensitive radar, although independent radar profiling indicates unresolved spatial variations in firn density. We show that an elliptical surface depression (10 m deep and with an extent of 0.7 km × 1.3 km lowers by 0.5 to 1.4 m a−1, which we tentatively attribute to a transient adaptation to hydrostatic equilibrium. We find evidence for elevated melting beneath ice shelf channels (with melting concentrated on the channels' flanks). However, farther downstream from the grounding line, the majority of ice shelf channels advect passively (i.e. neither melting nor refreezing) toward the ice shelf front. Although the absolute, satellite
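The mass-conservation bookkeeping behind a BMB estimate can be illustrated with a toy Eulerian version: dH/dt + div(H*v) = SMB + BMB, so BMB falls out as a residual. The paper works in a Lagrangian frame with total-variation derivatives; the uniform-flow grid below is purely illustrative:

```python
import numpy as np

def basal_mass_balance(H, vx, vy, smb, dHdt, dx):
    """Eulerian ice-shelf mass conservation:
    dH/dt + div(H*v) = SMB + BMB  =>  BMB = dH/dt + div(H*v) - SMB.
    Negative BMB means net basal melting."""
    flux_div = (np.gradient(H * vx, dx, axis=1)
                + np.gradient(H * vy, dx, axis=0))
    return dHdt + flux_div - smb

# Steady, uniform flow with constant thickness: flux divergence vanishes,
# so basal melt must exactly balance the surface mass balance.
H = np.full((5, 5), 300.0)        # ice thickness, m
vx = np.full((5, 5), 100.0)       # m/yr
vy = np.zeros((5, 5))
bmb = basal_mass_balance(H, vx, vy, smb=0.5, dHdt=0.0, dx=100.0)
```

The abstract's point about total-variation differentiation is that `np.gradient`-style central differences would smear the abrupt velocity changes near pinning points, biasing exactly this divergence term.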

  10. Improvements in GRACE Gravity Fields Using Regularization

    Science.gov (United States)

    Save, H.; Bettadpur, S.; Tapley, B. D.

    2008-12-01

    The unconstrained global gravity field models derived from GRACE are susceptible to systematic errors that show up as broad "stripes" aligned in a North-South direction on the global maps of mass flux. These errors are believed to be a consequence of both systematic and random errors in the data that are amplified by the nature of the gravity field inverse problem. These errors impede scientific exploitation of the GRACE data products, and limit the realizable spatial resolution of the GRACE global gravity fields in certain regions. We use regularization techniques to reduce these "stripe" errors in the gravity field products. The regularization criteria are designed such that there is no attenuation of the signal and that the solutions fit the observations as well as an unconstrained solution. We have used a computationally inexpensive method, normally referred to as the "L-ribbon", to find the regularization parameter. This paper discusses the characteristics and statistics of a 5-year time-series of regularized gravity field solutions. The solutions show markedly reduced stripes, are of uniformly good quality over time, and leave little or no systematic observation residuals, which are a frequent consequence of signal suppression by regularization. Up to degree 14, the signal in the regularized solutions shows correlation greater than 0.8 with the un-regularized CSR Release-04 solutions. Signals from large-amplitude, small-spatial-extent events - such as the Great Sumatra-Andaman Earthquake of 2004 - are visible in the global solutions without using the special post-facto error reduction techniques employed previously in the literature. Hydrological signals as small as 5 cm water-layer equivalent in small river basins, such as the Indus and the Nile, are clearly evident, in contrast to the noisy estimates from RL04. 
The residual variability over the oceans relative to a seasonal fit is small except at higher latitudes, and is evident without the need for de-striping or
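The core regularization step these solutions (and the Tikhonov approach in the records above) rely on is a damped least-squares solve, x = (A^T A + lambda*I)^{-1} A^T b. A minimal sketch, with a toy ill-conditioned Vandermonde system standing in for the actual GRACE inverse problem:

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Tikhonov-regularized least squares:
    x = argmin ||Ax - b||^2 + lam * ||x||^2 = (A^T A + lam*I)^{-1} A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Ill-conditioned system: regularization damps the noise-amplified solution.
rng = np.random.default_rng(1)
A = np.vander(np.linspace(0, 1, 20), 8)          # nearly collinear columns
x_true = rng.standard_normal(8)
b = A @ x_true + 1e-3 * rng.standard_normal(20)  # noisy observations
x0 = tikhonov_solve(A, b, 0.0)    # unconstrained solution
x1 = tikhonov_solve(A, b, 1e-3)   # regularized: strictly smaller norm
```

The regularization-parameter selection methods discussed in these records (L-curve/L-ribbon, discrepancy principle) are all strategies for choosing `lam` in exactly this trade-off between the residual norm and the solution norm.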

  11. Data, data everywhere: detecting spatial patterns in fine-scale ecological information collected across a continent

    Science.gov (United States)

    Kevin M. Potter; Frank H. Koch; Christopher M. Oswalt; Basil V. Iannone

    2016-01-01

    Context: Fine-scale ecological data collected across broad regions are becoming increasingly available. Appropriate geographic analyses of these data can help identify locations of ecological concern. Objectives: We present one such approach, spatial association of scalable hexagons (SASH), which identifies locations where ecological phenomena occur at greater...

  12. Detecting spatial memory deficits beyond blindness in tg2576 Alzheimer mice.

    Science.gov (United States)

    Yassine, Nour; Lazaris, Anelise; Dorner-Ciossek, Cornelia; Després, Olivier; Meyer, Laurence; Maitre, Michel; Mensah-Nyagan, Ayikoe Guy; Cassel, Jean-Christophe; Mathis, Chantal

    2013-03-01

    The retinal degeneration Pde6b(rd1) (rd) mutation can be a major pitfall in behavioral studies using tg2576 mice bred on a B6:SJL genetic background, one of the most widely used models of Alzheimer's disease. After a pilot study in wild-type mice, the performance of 8- and 16-month-old tg2576 mice was assessed in several behavioral tasks, with the challenge of selecting one or more tasks showing robust memory deficits on this genetic background. Water maze acquisition was impossible in rd homozygotes, whereas Y-maze alternation, object recognition, and olfactory discrimination were unaffected by both the transgene and the rd mutation. Spatial memory retention of 8- and 16-month-old tg2576 mice, however, was dramatically affected, independently of the rd mutation, when mice had to recognize a spatial configuration of objects or to perform the Barnes maze. Thus, the latter tasks appear extremely useful for evaluating spatial memory deficits and testing cognitive therapies in tg2576 mice and other mouse models bred on a background susceptible to visual impairment. Copyright © 2013 Elsevier Inc. All rights reserved.

  13. Importance of spatial and spectral data reduction in the detection of internal defects in food products.

    Science.gov (United States)

    Zhang, Xuechen; Nansen, Christian; Aryamanesh, Nader; Yan, Guijun; Boussaid, Farid

    2015-04-01

    Despite the importance of data reduction as part of the processing of reflection-based classifications, this study represents one of the first in which the effects of both spatial and spectral data reductions on classification accuracies are quantified. Furthermore, the effects of approaches to data reduction were quantified for two separate classification methods, linear discriminant analysis (LDA) and support vector machine (SVM). As the model dataset, reflection data were acquired using a hyperspectral camera in 230 spectral channels from 401 to 879 nm (spectral resolution of 2.1 nm) from field pea (Pisum sativum) samples with and without internal pea weevil (Bruchus pisorum) infestation. We deployed five levels of spatial data reduction (binning) and eight levels of spectral data reduction (40 datasets). Forward stepwise LDA was used to select and include only spectral channels contributing the most to the separation of pixels from non-infested and infested field peas. Classification accuracies obtained with LDA and SVM were based on the classification of independent validation datasets. Overall, SVMs had significantly higher classification accuracies than LDAs (P food products with internal defects, and it highlights that spatial and spectral data reductions can (1) improve classification accuracies, (2) vastly decrease computer constraints, and (3) reduce analytical concerns associated with classifications of large and high-dimensional datasets.
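Spatial binning and spectral-channel averaging of the kind evaluated in this study reduce to a single reshape-and-mean on the hyperspectral cube. A generic sketch (the bin sizes below are arbitrary examples, not the study's 40 evaluated datasets):

```python
import numpy as np

def bin_cube(cube, spatial=2, spectral=4):
    """Reduce a hyperspectral cube (rows, cols, bands) by averaging
    spatial blocks and runs of adjacent spectral channels."""
    r, c, b = cube.shape
    # Trim so every dimension divides evenly into whole bins.
    r, c, b = r - r % spatial, c - c % spatial, b - b % spectral
    cube = cube[:r, :c, :b]
    cube = cube.reshape(r // spatial, spatial, c // spatial, spatial,
                        b // spectral, spectral)
    return cube.mean(axis=(1, 3, 5))

cube = np.arange(8 * 8 * 8, dtype=float).reshape(8, 8, 8)
small = bin_cube(cube)   # shape (4, 4, 2): 16x fewer pixels, 4x fewer bands
```

Because whole blocks are averaged, the global mean reflectance is preserved while the data volume, and hence the classifier's computational load, drops by the product of the bin sizes.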

  14. Effect of spatial noise of medical grade Liquid Crystal Displays (LCD) on the detection of micro-calcification

    Science.gov (United States)

    Roehrig, Hans; Fan, Jiahua; Dallas, William J.; Krupinski, Elizabeth A.; Johnson, Jeffrey

    2009-08-01

    This presentation describes work in progress that is the result of an NIH SBIR Phase 1 project that addresses the widespread concern over the large number of breast cancers and cancer victims [1,2]. The primary goal of the project is to increase the detection rate of microcalcifications by decreasing the spatial noise of the LCDs used to display the mammograms [3,4]. Noise reduction is to be accomplished with the aid of a high-performance CCD camera and subsequent application of local-mean equalization and error diffusion [5,6]. A second goal of the project is the actual detection of breast cancer. In contrast to the usual approach to mammography, where the mammograms typically have a pixel matrix of approximately 1900 x 2300 pixels (otherwise known as FFDM, or Full-Field Digital Mammograms), we will use only sections of mammograms with a pixel matrix of 256 x 256 pixels. This is because, at this time, reduction of spatial noise on an LCD can only be done on relatively small areas such as 256 x 256 pixels. In addition, the efficacy for detection of breast cancer will be judged using two methods: one is a conventional ROC study [7]; the other is a vision model developed over several years, starting at the Sarnoff Research Center and continuing at Siemens Corporate Research in Princeton, NJ [8].
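Error diffusion, one of the two noise-reduction ingredients mentioned above, can be sketched with the classic Floyd-Steinberg kernel (the project's camera-based local-mean equalization is not reproduced here, and the mid-gray patch is a made-up test input):

```python
import numpy as np

def error_diffusion(img):
    """Floyd-Steinberg error diffusion: binarize a grayscale image while
    pushing each pixel's quantization error onto unvisited neighbors."""
    out = img.astype(float).copy()
    h, w = out.shape
    for y in range(h):
        for x in range(w):
            old = out[y, x]
            new = 255.0 if old >= 128 else 0.0
            out[y, x] = new
            err = old - new
            if x + 1 < w:               out[y, x + 1]     += err * 7 / 16
            if y + 1 < h and x > 0:     out[y + 1, x - 1] += err * 3 / 16
            if y + 1 < h:               out[y + 1, x]     += err * 5 / 16
            if y + 1 < h and x + 1 < w: out[y + 1, x + 1] += err * 1 / 16
    return out.astype(np.uint8)

gray = np.full((32, 32), 100.0)      # uniform mid-gray test patch
halftone = error_diffusion(gray)     # binary pattern, mean preserved
```

The output contains only the two display levels, yet its local mean tracks the input, which is the property that lets a noisy LCD render subtle gray levels more faithfully.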

  15. Noise-induced tinnitus using individualized gap detection analysis and its relationship with hyperacusis, anxiety, and spatial cognition.

    Directory of Open Access Journals (Sweden)

    Edward Pace

    Full Text Available Tinnitus has a complex etiology that involves auditory and non-auditory factors and may be accompanied by hyperacusis, anxiety and cognitive changes. Thus far, investigations of the interrelationship between tinnitus and auditory and non-auditory impairment have yielded conflicting results. To further address this issue, we noise-exposed rats and assessed them for tinnitus using a gap detection behavioral paradigm combined with statistically-driven analysis to diagnose tinnitus in individual rats. We also tested rats for hearing detection, responsivity, and loss using prepulse inhibition and auditory brainstem response, and for spatial cognition and anxiety using the Morris water maze and elevated plus maze. We found that our tinnitus diagnosis method reliably separated noise-exposed rats into tinnitus(+) and tinnitus(-) groups and detected no evidence of tinnitus in tinnitus(-) and control rats. In addition, the tinnitus(+) group demonstrated enhanced startle amplitude, indicating hyperacusis-like behavior. Despite these results, neither tinnitus, hyperacusis nor hearing loss yielded any significant effects on spatial learning and memory or anxiety, though a majority of the rats with the highest anxiety levels had tinnitus. These findings showed that we were able to develop a clinically relevant tinnitus(+) group and that our diagnosis method is sound. At the same time, like clinical studies, we found that tinnitus does not always result in cognitive-emotional dysfunction, although tinnitus may predispose subjects to certain impairments such as anxiety. Other behavioral assessments may be needed to further define the relationship between tinnitus and anxiety, cognitive deficits, and other impairments.

  16. Integrated circuit-based electrochemical sensor for spatially resolved detection of redox-active metabolites in biofilms.

    Science.gov (United States)

    Bellin, Daniel L; Sakhtah, Hassan; Rosenstein, Jacob K; Levine, Peter M; Thimot, Jordan; Emmett, Kevin; Dietrich, Lars E P; Shepard, Kenneth L

    2014-01-01

    Despite advances in monitoring spatiotemporal expression patterns of genes and proteins with fluorescent probes, direct detection of metabolites and small molecules remains challenging. A technique for spatially resolved detection of small molecules would benefit the study of redox-active metabolites that are produced by microbial biofilms and can affect their development. Here we present an integrated circuit-based electrochemical sensing platform featuring an array of working electrodes and parallel potentiostat channels. 'Images' over a 3.25 × 0.9 mm² area can be captured with a diffusion-limited spatial resolution of 750 μm. We demonstrate that square wave voltammetry can be used to detect, identify and quantify (for concentrations as low as 2.6 μM) four distinct redox-active metabolites called phenazines. We characterize phenazine production in both wild-type and mutant Pseudomonas aeruginosa PA14 colony biofilms, and find correlations with fluorescent reporter imaging of phenazine biosynthetic gene expression.

  17. Manifold Regularized Reinforcement Learning.

    Science.gov (United States)

    Li, Hongliang; Liu, Derong; Wang, Ding

    2018-04-01

    This paper introduces a novel manifold regularized reinforcement learning scheme for continuous Markov decision processes. Smooth feature representations for value function approximation can be automatically learned using the unsupervised manifold regularization method. The learned features are data-driven, and can be adapted to the geometry of the state space. Furthermore, the scheme provides a direct basis representation extension for novel samples during policy learning and control. The performance of the proposed scheme is evaluated on two benchmark control tasks, i.e., the inverted pendulum and the energy storage problem. Simulation results illustrate the concepts of the proposed scheme and show that it can obtain excellent performance.
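Manifold regularization of the kind described above typically penalizes f^T L f for a graph Laplacian L built from sampled states, so that value functions varying quickly between nearby samples are discouraged. A minimal construction (the Gaussian kernel and bandwidth are illustrative assumptions, not the paper's specification):

```python
import numpy as np

def graph_laplacian(X, sigma=1.0):
    """Unnormalized graph Laplacian L = D - W with a Gaussian affinity;
    f^T L f penalizes functions that vary quickly between nearby samples."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. dists
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(1)) - W

X = np.random.default_rng(2).standard_normal((10, 2))  # sampled states
L = graph_laplacian(X)
f_smooth = np.ones(10)   # a constant function incurs zero penalty
```

The zero penalty on constant functions and the positive semi-definiteness of L are what make f^T L f a well-behaved smoothness regularizer for value-function approximation.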

  18. Detecting spatial patterns with the cumulant function – Part 2: An application to El Niño

    Directory of Open Access Journals (Sweden)

    P. Yiou

    2008-02-01

    Full Text Available The spatial coherence of a measured variable (e.g. temperature or pressure) is often studied to determine the regions of high variability or to find teleconnections, i.e. correlations between specific regions. While usual methods to find spatial patterns, such as Principal Components Analysis (PCA), are constrained by linear symmetries, the dependence of variables such as temperature or pressure at different locations is generally nonlinear. In particular, large deviations from the sample mean are expected to be strongly affected by such nonlinearities. Here we apply a newly developed nonlinear technique (Maxima of Cumulant Function, MCF) for the detection of typical spatial patterns that deviate largely from the mean. In order to test the technique and to introduce the methodology, we focus on the El Niño/Southern Oscillation and its spatial patterns. We find nonsymmetric temperature patterns corresponding to El Niño and La Niña, and we compare the results of MCF with other techniques, such as the symmetric solutions of PCA and the nonsymmetric solutions of Nonlinear PCA (NLPCA). We find that MCF solutions are more reliable than the NLPCA fits, and can capture mixtures of principal components. Finally, we apply Extreme Value Theory to the temporal variations extracted with our methodology. We find that the tail of the distribution of extreme temperatures during La Niña episodes is bounded, while the tail during El Niño episodes is less likely to be bounded. This implies that the mean spatial patterns of the two phases are asymmetric, as is the behaviour of their extremes.

  19. Low Frequency Waves Detected in a Large Wave Flume under Irregular Waves with Different Grouping Factor and Combination of Regular Waves

    Directory of Open Access Journals (Sweden)

    Luigia Riefolo

    2018-02-01

    Full Text Available This paper describes a set of experiments undertaken at Universitat Politècnica de Catalunya in the large wave flume of the Maritime Engineering Laboratory. The purpose of this study is to highlight the effects of wave grouping and long-wave/short-wave combination regimes on low-frequency wave generation. An eigen-value decomposition has been performed to discriminate low frequencies. In particular, measured eigen modes, determined through spectral analysis, have been compared with modes calculated by means of eigen analysis. The detected low frequencies appear to confirm the dependence on groupiness of the modal amplitudes generated in the wave flume. Some evidence of the influence of low-frequency waves on runup and transport patterns is shown. In particular, the generation and evolution of secondary bedforms are consistent with energy transferred between the standing wave modes.
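The eigen analysis above compares measured spectral peaks against the natural frequencies of longitudinal standing waves in a closed basin. In the shallow-water approximation these follow f_n = n * sqrt(g*h) / (2L); the flume dimensions below are made up for illustration, not the Barcelona flume's actual geometry:

```python
import math

def seiche_frequencies(length, depth, n_modes, g=9.81):
    """Natural frequencies of longitudinal standing waves in a closed
    flume, shallow-water approximation: f_n = n * sqrt(g*h) / (2*L)."""
    c = math.sqrt(g * depth)          # shallow-water wave celerity
    return [n * c / (2 * length) for n in range(1, n_modes + 1)]

# e.g. a hypothetical 100 m flume with 2.5 m water depth
modes = seiche_frequencies(100.0, 2.5, 3)
```

Spectral peaks of the measured free-surface signal that line up with these harmonics are the "eigen modes" the comparison is after; the modal amplitudes, not the frequencies, are what depend on groupiness.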

  20. Detection of Tuberculosis Infection Hotspots Using Activity Spaces Based Spatial Approach in an Urban Tokyo, from 2003 to 2011.

    Directory of Open Access Journals (Sweden)

    Kiyohiko Izumi

    Full Text Available Identifying ongoing tuberculosis infection sites is crucial for breaking chains of transmission in tuberculosis-prevalent urban areas. Previous studies have pointed out that detection of local accumulation of tuberculosis patients based on their residential addresses may be limited by a lack of matching between residences and tuberculosis infection sites. This study aimed to identify possible tuberculosis hotspots using TB genotype clustering statuses and the concept of "activity space", a place where patients spend most of their waking hours. We further compared the spatial distribution by different residential statuses and described the urban environmental features of the detected hotspots. Culture-positive tuberculosis patients notified to Shinjuku city from 2003 to 2011 were enrolled in this case-based cross-sectional study, and their demographic and clinical information, TB genotype clustering statuses, and activity spaces were collected. Spatial statistics (Global Moran's I and Getis-Ord Gi* statistics) identified significant hotspots in 152 census tracts, and urban environmental features and tuberculosis patients' characteristics in these hotspots were assessed. Of the enrolled 643 culture-positive tuberculosis patients, 416 (64.2%) were general inhabitants, 42 (6.5%) were foreign-born people, and 184 (28.6%) were homeless people. The percentage of overall genotype clustering was 43.7%. Genotype-clustered general inhabitants and homeless people formed significant hotspots around a major railway station, whereas the non-clustered general inhabitants formed no hotspots. This suggested that the detected hotspots of activity spaces may reflect ongoing tuberculosis transmission sites; these hotspots were characterized by smaller residential floor size and a higher proportion of non-working households. Activity space-based spatial analysis suggested possible TB transmission sites around the major railway station and it can assist in further comprehension of TB transmission
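The Getis-Ord Gi* statistic used to flag hotspots is a z-score comparing each unit's weighted local sum of values against the global mean. A one-dimensional toy example (nine tracts on a line with a self-inclusive binary neighborhood; the values are invented, not the Shinjuku data):

```python
import numpy as np

def getis_ord_gi_star(x, W):
    """Getis-Ord Gi* z-score for each spatial unit, given values x and a
    binary spatial-weights matrix W that includes the unit itself."""
    n = x.size
    xbar, s = x.mean(), x.std()           # global mean and (population) sd
    wx = W @ x                            # local weighted sums
    wsum = W.sum(1)
    w2sum = (W ** 2).sum(1)
    denom = s * np.sqrt((n * w2sum - wsum ** 2) / (n - 1))
    return (wx - xbar * wsum) / denom

# Nine tracts on a line; a cluster of high case counts in the middle.
x = np.array([1, 1, 1, 8, 9, 8, 1, 1, 1], dtype=float)
W = np.eye(9)
for i in range(8):                        # rook neighbors on a line, plus self
    W[i, i + 1] = W[i + 1, i] = 1
gi = getis_ord_gi_star(x, W)
```

The central tract scores above 1.96, i.e. a significant hotspot at roughly the 95% level, while tracts away from the cluster do not; in the study the same logic runs over census tracts weighted by activity-space overlap rather than a line graph.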

  1. Detection of Tuberculosis Infection Hotspots Using Activity Spaces Based Spatial Approach in an Urban Tokyo, from 2003 to 2011.

    Science.gov (United States)

    Izumi, Kiyohiko; Ohkado, Akihiro; Uchimura, Kazuhiro; Murase, Yoshiro; Tatsumi, Yuriko; Kayebeta, Aya; Watanabe, Yu; Ishikawa, Nobukatsu

    2015-01-01

    Identifying ongoing tuberculosis infection sites is crucial for breaking chains of transmission in tuberculosis-prevalent urban areas. Previous studies have pointed out that detection of local accumulation of tuberculosis patients based on their residential addresses may be limited by a lack of matching between residences and tuberculosis infection sites. This study aimed to identify possible tuberculosis hotspots using TB genotype clustering statuses and a concept of "activity space", a place where patients spend most of their waking hours. We further compared the spatial distribution by different residential statuses and describe urban environmental features of the detected hotspots. Culture-positive tuberculosis patients notified to Shinjuku city from 2003 to 2011 were enrolled in this case-based cross-sectional study, and their demographic and clinical information, TB genotype clustering statuses, and activity space were collected. Spatial statistics (Global Moran's I and Getis-Ord Gi* statistics) identified significant hotspots in 152 census tracts, and urban environmental features and tuberculosis patients' characteristics in these hotspots were assessed. Of the enrolled 643 culture-positive tuberculosis patients, 416 (64.2%) were general inhabitants, 42 (6.5%) were foreign-born people, and 184 were homeless people (28.6%). The percentage of overall genotype clustering was 43.7%. Genotype-clustered general inhabitants and homeless people formed significant hotspots around a major railway station, whereas the non-clustered general inhabitants formed no hotspots. This suggested the detected hotspots of activity spaces may reflect ongoing tuberculosis transmission sites and were characterized by smaller residential floor size and a higher proportion of non-working households. 
Activity space-based spatial analysis suggested possible TB transmission sites around the major railway station and it can assist in further comprehension of TB transmission dynamics in an

  2. Fast EEG spike detection via eigenvalue analysis and clustering of spatial amplitude distribution

    Science.gov (United States)

    Fukami, Tadanori; Shimada, Takamasa; Ishikawa, Bunnoshin

    2018-06-01

    Objective. In the current study, we tested a proposed method for fast spike detection in electroencephalography (EEG). Approach. We performed eigenvalue analysis in a two-dimensional space spanned by gradients calculated from two neighboring samples to detect high-amplitude negative peaks. We extracted spike candidates by imposing restrictions on parameters regarding spike shape and on eigenvalues reflecting the detection characteristics of individual medical doctors. We subsequently performed clustering, classifying detected peaks by considering the amplitude distribution at 19 scalp electrodes. Clusters with a small number of candidates were excluded. We then defined a score for eliminating spike candidates whose pattern of detected electrodes differed from the overall pattern in a cluster. Spikes were detected by setting a score threshold. Main results. Based on visual inspection by a psychiatrist experienced in EEG, we evaluated the detection performance of the proposed method using two statistical measures, precision and recall. We found that precision and recall exhibited a trade-off relationship. The average recall value was 0.708 in eight subjects at the score threshold that maximized the F-measure, with 58.6 ± 36.2 spikes per subject. Under this condition, the average precision was 0.390, corresponding to a false positive rate 2.09 times higher than the true positive rate. Analysis of the required processing time revealed that, using a general-purpose computer, our method could perform spike detection in 12.1% of the recording time. The process of narrowing down spike candidates based on shape occupied most of the processing time. Significance. Although the average recall value was comparable with that of other studies, the proposed method significantly shortened the processing time.
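The first stage above, finding high-amplitude negative peaks from the gradients of two neighboring samples, can be caricatured with a simple two-slope test. This is a crude stand-in for the paper's eigenvalue criterion, with an invented slope threshold and synthetic signal:

```python
import numpy as np

def negative_peaks(x, slope_thresh):
    """Flag sharp negative peaks: samples whose left-hand gradient falls
    steeply and whose right-hand gradient rises steeply (a simplified
    stand-in for the paper's two-gradient eigenvalue criterion)."""
    left = np.diff(x)[:-1]    # x[i+1] - x[i], aligned to candidate i+1
    right = np.diff(x)[1:]    # x[i+2] - x[i+1]
    idx = np.where((left < -slope_thresh) & (right > slope_thresh))[0] + 1
    return idx

sig = np.zeros(50)
sig[20] = -80.0                   # one sharp negative spike
peaks = negative_peaks(sig, 50.0)
```

Working only with two neighboring-sample gradients is what makes this class of detector fast: each sample is screened with a couple of comparisons before any clustering across the 19 electrodes is attempted.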

  3. Land cover mapping and change detection in urban watersheds using QuickBird high spatial resolution satellite imagery

    Science.gov (United States)

    Hester, David Barry

    The objective of this research was to develop methods for urban land cover analysis using QuickBird high spatial resolution satellite imagery. Such imagery has emerged as a rich commercially available remote sensing data source and has enjoyed high-profile broadcast news media and Internet applications, but methods of quantitative analysis have not been thoroughly explored. The research described here consists of three studies focused on the use of pan-sharpened 61-cm spatial resolution QuickBird imagery, the spatial resolution of which is the highest of any commercial satellite. In the first study, a per-pixel land cover classification method is developed for use with this imagery. This method utilizes a per-pixel classification approach to generate an accurate six-category high spatial resolution land cover map of a developing suburban area. The primary objective of the second study was to develop an accurate land cover change detection method for use with QuickBird land cover products. This work presents an efficient fuzzy framework for transforming map uncertainty into accurate and meaningful high spatial resolution land cover change analysis. The third study described here is an urban planning application of the high spatial resolution QuickBird-based land cover product developed in the first study. This work both meaningfully connects this exciting new data source to urban watershed management and makes an important empirical contribution to the study of suburban watersheds. Its analysis of residential roads and driveways as well as retail parking lots sheds valuable light on the impact of transportation-related land use on the suburban landscape. Broadly, these studies provide new methods for using state-of-the-art remote sensing data to inform land cover analysis and urban planning. These methods are widely adaptable and produce land cover products that are both meaningful and accurate. 
As additional high spatial resolution satellites are launched and the
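The post-classification change detection this dissertation builds on reduces, at its core, to cross-tabulating per-pixel labels from two dates into a from-to change matrix. A minimal sketch with invented 3-class label maps (the fuzzy uncertainty framework itself is not reproduced):

```python
import numpy as np

def change_matrix(before, after, n_classes):
    """Post-classification comparison: cross-tabulate per-pixel labels from
    two dates into an n_classes x n_classes from-to change matrix."""
    pair = before.ravel() * n_classes + after.ravel()  # encode (from, to)
    return np.bincount(pair, minlength=n_classes ** 2).reshape(n_classes,
                                                               n_classes)

before = np.array([[0, 0, 1], [1, 2, 2]])   # e.g. 0=forest, 1=grass, 2=built
after  = np.array([[0, 1, 1], [2, 2, 2]])
m = change_matrix(before, after, 3)
changed = m.sum() - np.trace(m)   # off-diagonal count = pixels that changed
```

The diagonal of `m` holds stable pixels and each off-diagonal cell a specific from-to transition, which is what downstream analyses (e.g. impervious-surface growth) read off directly.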

  4. Locating sensors for detecting source-to-target patterns of special nuclear material smuggling: a spatial information theoretic approach.

    Science.gov (United States)

    Przybyla, Jay; Taylor, Jeffrey; Zhou, Xuesong

    2010-01-01

    In this paper, a spatial information-theoretic model is proposed to locate sensors for detecting source-to-target patterns of special nuclear material (SNM) smuggling. In order to ship the nuclear materials from a source location with SNM production to a target city, the smugglers must employ global and domestic logistics systems. This paper focuses on locating a limited set of fixed and mobile radiation sensors in a transportation network, with the intent to maximize the expected information gain and minimize the estimation error for the subsequent nuclear material detection stage. A Kalman filtering-based framework is adapted to assist the decision-maker in quantifying the network-wide information gain and SNM flow estimation accuracy.
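The expected information gain that drives the sensor-placement objective can be quantified, in the Kalman-filtering framework the paper adapts, as the log-determinant reduction from prior to posterior covariance after one linear measurement update. A toy two-link sketch (the state, sensor matrix, and noise values are invented for illustration):

```python
import numpy as np

def information_gain(P, H, R):
    """One Kalman measurement update; returns the entropy-style information
    gain 0.5 * log(det(P_prior) / det(P_post)) and the posterior covariance."""
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    P_post = (np.eye(P.shape[0]) - K @ H) @ P
    gain = 0.5 * (np.log(np.linalg.det(P)) - np.log(np.linalg.det(P_post)))
    return gain, P_post

P = np.diag([4.0, 1.0])      # prior uncertainty about SNM flow on two links
H = np.array([[1.0, 0.0]])   # a sensor observing only the first link
R = np.array([[0.5]])        # sensor noise covariance
gain, P_post = information_gain(P, H, R)
```

A placement algorithm would evaluate this gain for each candidate sensor location and keep the set maximizing the total expected gain, which is the spirit of the paper's network-wide objective.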

  5. Locating Sensors for Detecting Source-to-Target Patterns of Special Nuclear Material Smuggling: A Spatial Information Theoretic Approach

    Directory of Open Access Journals (Sweden)

    Xuesong Zhou

    2010-08-01

    Full Text Available In this paper, a spatial information-theoretic model is proposed to locate sensors for detecting source-to-target patterns of special nuclear material (SNM smuggling. In order to ship the nuclear materials from a source location with SNM production to a target city, the smugglers must employ global and domestic logistics systems. This paper focuses on locating a limited set of fixed and mobile radiation sensors in a transportation network, with the intent to maximize the expected information gain and minimize the estimation error for the subsequent nuclear material detection stage. A Kalman filtering-based framework is adapted to assist the decision-maker in quantifying the network-wide information gain and SNM flow estimation accuracy.

  6. Land Cover Change Detection in Urban Lake Areas Using Multi-Temporal Very High Spatial Resolution Aerial Images

    Directory of Open Access Journals (Sweden)

    Wenyuan Zhang

    2018-01-01

    Full Text Available The availability of very high spatial resolution (VHR) remote sensing imagery provides unique opportunities to exploit meaningful change information in detail with object-oriented image analysis. This study investigated land cover (LC) changes in the Shahu Lake area of Wuhan using multi-temporal VHR aerial images from the years 1978, 1981, 1989, 1995, 2003, and 2011. A multi-resolution segmentation algorithm and a CART (classification and regression trees) classifier were employed to perform highly accurate LC classification of the individual images, while a post-classification comparison method was used to detect changes. The experiments demonstrated that significant changes in LC occurred along with the rapid urbanization during 1978–2011. The dominant changes in the study area were the shrinking of the lake and vegetation, replaced by high-density buildings and roads. The total area of Shahu Lake decreased from ~7.64 km² to ~3.60 km² during the past 33 years, losing 52.91% of its original extent. The presented results also indicated that urban expansion and inadequate legislative protection are the main factors in Shahu Lake's shrinking. The object-oriented change detection scheme presented in this manuscript enables us to better understand the specific spatial changes of Shahu Lake, which can be used to make reasonable decisions for lake protection and urban development.

  7. A novel airport extraction model based on saliency region detection for high spatial resolution remote sensing images

    Science.gov (United States)

    Lv, Wen; Zhang, Libao; Zhu, Yongchun

    2017-06-01

    The airport is one of the most crucial traffic facilities in military and civil fields. Automatic airport extraction in high spatial resolution remote sensing images has many applications, such as regional planning and military reconnaissance. Traditional airport extraction strategies are usually based on prior knowledge and locate the airport target by template matching and classification, which incurs high computational complexity and large computing-resource costs for high spatial resolution remote sensing images. In this paper, we propose a novel automatic airport extraction model based on saliency region detection, airport runway extraction and adaptive threshold segmentation. For saliency region detection, we choose the frequency-tuned (FT) model, which computes airport saliency from low-level color and luminance features, is easy and fast to implement, and provides full-resolution saliency maps. For airport runway extraction, the Hough transform is adopted to count the number of parallel line segments. For adaptive threshold segmentation, the Otsu threshold segmentation algorithm is applied to obtain more accurate airport regions. The experimental results demonstrate that the proposed model outperforms existing saliency analysis models and shows good performance in airport extraction.
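    A minimal sketch of two of the stages named above, under simplifying assumptions: FT saliency reduced to |global mean − blurred pixel| on a grayscale image (the original FT model works in Lab colour space with a Gaussian blur), followed by Otsu thresholding of the saliency map. The synthetic "runway" image is invented for illustration:

```python
import numpy as np

def box_blur(img):
    """Crude 3x3 box blur standing in for the Gaussian blur of FT saliency."""
    pad = np.pad(img, 1, mode="edge")
    out = sum(pad[i:i + img.shape[0], j:j + img.shape[1]]
              for i in range(3) for j in range(3))
    return out / 9.0

def otsu_threshold(values, bins=64):
    """Otsu's method: pick the threshold maximizing between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    best_t, best_var = edges[0], -1.0
    for k in range(1, bins):
        w0, w1 = p[:k].sum(), p[k:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (p[:k] * centers[:k]).sum() / w0
        m1 = (p[k:] * centers[k:]).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2   # between-class variance
        if var > best_var:
            best_var, best_t = var, edges[k]
    return best_t

img = np.zeros((32, 32))
img[12:20, 8:26] = 1.0                          # bright "runway" strip
saliency = np.abs(img.mean() - box_blur(img))   # FT-style saliency map
mask = saliency > otsu_threshold(saliency.ravel())
print(mask.sum())   # pixels segmented as salient
```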

  8. Regularity and predictability of human mobility in personal space.

    Directory of Open Access Journals (Sweden)

    Daniel Austin

    Fundamental laws governing human mobility have many important applications, such as forecasting and controlling epidemics or optimizing transportation systems. These mobility patterns, studied in the context of out-of-home activity during travel or social interactions, with observations recorded from cell phone use or the diffusion of money, suggest that in extra-personal space humans follow a high degree of temporal and spatial regularity - most often in the form of time-independent universal scaling laws. Here we show that mobility patterns of older individuals in their home also show a high degree of predictability and regularity, although in a different way than has been reported for out-of-home mobility. Studying a data set of almost 15 million observations from 19 adults spanning up to 5 years of unobtrusive longitudinal home activity monitoring, we find that in-home mobility is not well represented by a universal scaling law, but that significant structure (predictability and regularity) is uncovered when explicitly accounting for contextual data in a model of in-home mobility. These results suggest that human mobility in personal space is highly stereotyped, and that monitoring discontinuities in routine room-level mobility patterns may provide an opportunity to predict individual human health and functional status or detect adverse events and trends.

  9. Detection of User Independent Single Trial ERPs in Brain Computer Interfaces: An Adaptive Spatial Filtering Approach

    DEFF Research Database (Denmark)

    Leza, Cristina; Puthusserypady, Sadasivan

    2017-01-01

    Brain Computer Interfaces (BCIs) use brain signals to communicate with the external world. The main challenges to address are speed, accuracy and adaptability. Here, a novel algorithm for a P300-based BCI spelling system is presented, specifically suited for single-trial detection of Event...

  10. Spatially distributed damage detection in CMC thermal protection materials using thin-film piezoelectric sensors

    Science.gov (United States)

    Kuhr, Samuel J.; Blackshire, James L.; Na, Jeong K.

    2009-03-01

    Thermal protection systems (TPS) of aerospace vehicles are subjected to impacts during in-flight use and vehicle refurbishment. The damage resulting from such impacts can produce localized regions that are unable to resist extreme temperatures. Therefore, it is essential to have a reliable method to detect, locate, and quantify the damage occurring from such impacts. The objective of this research is to demonstrate a capability that could lead to detecting, locating and quantifying impact events for ceramic matrix composite (CMC) wrapped tile TPS via sensors embedded in the TPS material. Previous research had shown a correlation between impact energies, material damage state, and polyvinylidene fluoride (PVDF) sensor response for impact energies between 0.07 and 1.00 Joules, where impact events were located directly over the sensor positions [1]. In this effort, the effectiveness of a sensor array is evaluated for detecting and locating low-energy impacts on a CMC wrapped TPS. The sensor array, which is adhered to the internal surface of the TPS tile, is used to detect low-energy impact events that occur at different locations. The analysis includes an evaluation of signal amplitude levels, time-of-flight measurements, and signal frequency content. Multiple impacts are performed at each location to study the repeatability of each measurement.

  11. Designing efficient surveys: spatial arrangement of sample points for detection of invasive species

    Czech Academy of Sciences Publication Activity Database

    Berec, Luděk; Kean, J. M.; Epanchin-Niell, R.; Liebhold, A. M.; Haight, R. G.

    2015-01-01

    Vol. 17, No. 1 (2015), pp. 445-459. ISSN 1387-3547. Grant support: National Science Foundation (US) DEB-0553768. Institutional support: RVO:60077344. Keywords: biosecurity; early pest detection; eradication. Subject RIV: EH - Ecology, Behaviour. Impact factor: 2.855, year: 2015. http://link.springer.com/article/10.1007%2Fs10530-014-0742-x

  12. Genet-specific DNA methylation probabilities detected in a spatial epigenetic analysis of a clonal plant population.

    Directory of Open Access Journals (Sweden)

    Kiwako S Araki

    In sessile organisms such as plants, spatial genetic structures of populations show long-lasting patterns. These structures have been analyzed across diverse taxa to understand the processes that determine the genetic makeup of organismal populations. For many sessile organisms that mainly propagate via clonal spread, epigenetic status can vary between clonal individuals in the absence of genetic changes. However, fewer previous studies have explored the epigenetic properties, in comparison to the genetic properties, of natural plant populations. Here, we report the simultaneous evaluation of the spatial structure of genetic and epigenetic variation in a natural population of the clonal plant Cardamine leucantha. We applied a hierarchical Bayesian model to evaluate the effects of membership of a genet (a group of individuals clonally derived from a single seed) and vegetation cover on the epigenetic variation between ramets (clonal plants that are physiologically independent individuals). We sampled 332 ramets in a 20 m × 20 m study plot that contained 137 genets (identified using eight SSR markers). We detected epigenetic variation in DNA methylation at 24 methylation-sensitive amplified fragment length polymorphism (MS-AFLP) loci. There were significant genet effects at all 24 MS-AFLP loci in the distribution of subepiloci. Vegetation cover had no statistically significant effect on variation at the majority of MS-AFLP loci. The spatial aggregation of epigenetic variation is therefore largely explained by the aggregation of ramets that belong to the same genets. By applying hierarchical Bayesian analyses, we successfully identified a number of genet-specific changes in epigenetic status within a natural plant population in a complex context, where genotypes and environmental factors are unevenly distributed. This finding suggests that further studies are required on the spatial epigenetic structure of natural populations of diverse organisms.

  13. Genet-specific DNA methylation probabilities detected in a spatial epigenetic analysis of a clonal plant population.

    Science.gov (United States)

    Araki, Kiwako S; Kubo, Takuya; Kudoh, Hiroshi

    2017-01-01

    In sessile organisms such as plants, spatial genetic structures of populations show long-lasting patterns. These structures have been analyzed across diverse taxa to understand the processes that determine the genetic makeup of organismal populations. For many sessile organisms that mainly propagate via clonal spread, epigenetic status can vary between clonal individuals in the absence of genetic changes. However, fewer previous studies have explored the epigenetic properties, in comparison to the genetic properties, of natural plant populations. Here, we report the simultaneous evaluation of the spatial structure of genetic and epigenetic variation in a natural population of the clonal plant Cardamine leucantha. We applied a hierarchical Bayesian model to evaluate the effects of membership of a genet (a group of individuals clonally derived from a single seed) and vegetation cover on the epigenetic variation between ramets (clonal plants that are physiologically independent individuals). We sampled 332 ramets in a 20 m × 20 m study plot that contained 137 genets (identified using eight SSR markers). We detected epigenetic variation in DNA methylation at 24 methylation-sensitive amplified fragment length polymorphism (MS-AFLP) loci. There were significant genet effects at all 24 MS-AFLP loci in the distribution of subepiloci. Vegetation cover had no statistically significant effect on variation at the majority of MS-AFLP loci. The spatial aggregation of epigenetic variation is therefore largely explained by the aggregation of ramets that belong to the same genets. By applying hierarchical Bayesian analyses, we successfully identified a number of genet-specific changes in epigenetic status within a natural plant population in a complex context, where genotypes and environmental factors are unevenly distributed. This finding suggests that further studies are required on the spatial epigenetic structure of natural populations of diverse organisms, particularly for

  14. Spatial filtering velocimetry revisited: exact short-time detecting schemes from arbitrarily small-size reticles

    International Nuclear Information System (INIS)

    Ando, S; Nara, T; Kurihara, T

    2014-01-01

    Spatial filtering velocimetry was proposed in 1963 by Ator as a velocity-sensing technique for aerial camera-control systems. The total intensity of a moving surface is observed through a set of parallel-slit reticles, resulting in a narrow-band temporal signal whose frequency is directly proportional to the image velocity. However, despite its historical importance and inherent technical advantages, the mathematical formulation of this technique is only valid when infinite-length observation in both space and time is possible, which causes significant errors in most applications, where a small receptive window and high resolution in both axes are desired. In this study, we apply a novel mathematical technique, the weighted integral method, to solve this problem, and obtain exact sensing schemes and algorithms for finite-size (arbitrarily small but non-zero) reticles and short-time estimation. Practical considerations for utilizing these schemes are also explored both theoretically and experimentally. (paper)
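    The classical relation behind the technique is f = v / p: a surface moving at velocity v seen through a slit reticle of pitch p yields a signal at frequency f. A toy numerical illustration of that ideal, long-observation case (all parameter values are made up for the sketch; the short-window errors the paper addresses are exactly what this idealization ignores):

```python
import numpy as np

v_true = 0.12          # image velocity, m/s (illustrative)
pitch = 0.004          # slit pitch, m
fs = 1000.0            # sampling rate, Hz
t = np.arange(0, 2.0, 1 / fs)

# Idealized narrow-band slit output at frequency f = v / p = 30 Hz.
signal = np.sin(2 * np.pi * (v_true / pitch) * t)

# Recover the velocity from the spectral peak.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
f_peak = freqs[spectrum.argmax()]
v_est = f_peak * pitch

print(round(v_est, 3))   # close to the true 0.12 m/s
```

    With a short observation window the spectral peak broadens and shifts, which is the estimation-error problem the weighted integral method is designed to remove.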

  15. Low contrast detectability and spatial resolution with model-based iterative reconstructions of MDCT images: a phantom and cadaveric study

    Energy Technology Data Exchange (ETDEWEB)

    Millon, Domitille; Coche, Emmanuel E. [Universite Catholique de Louvain, Department of Radiology and Medical Imaging, Cliniques Universitaires Saint Luc, Brussels (Belgium); Vlassenbroek, Alain [Philips Healthcare, Brussels (Belgium); Maanen, Aline G. van; Cambier, Samantha E. [Universite Catholique de Louvain, Statistics Unit, King Albert II Cancer Institute, Brussels (Belgium)

    2017-03-15

    To compare the image quality [low-contrast (LC) detectability, noise, contrast-to-noise ratio (CNR) and spatial resolution (SR)] of MDCT images reconstructed with an iterative reconstruction (IR) algorithm and a filtered back projection (FBP) algorithm. The experimental study was performed on a 256-slice MDCT. LC detectability, noise, CNR and SR were measured on a Catphan phantom scanned with decreasing doses (48.8 down to 0.7 mGy) and parameters typical of a chest CT examination. Images were reconstructed with FBP and a model-based IR algorithm. Additionally, human chest cadavers were scanned and reconstructed using the same technical parameters, and the images were analyzed to illustrate the phantom results. LC detectability and noise were statistically significantly different between the techniques, favouring the model-based IR algorithm (p < 0.0001). At low doses, the noise in FBP images only enabled SR measurements of high-contrast objects. The superior CNR of the model-based IR algorithm enabled lower-dose measurements, which showed that SR was dose and contrast dependent. Cadaver images reconstructed with model-based IR illustrated that the visibility and delineation of anatomical structure edges could deteriorate at low doses. Model-based IR improved LC detectability and enabled dose reduction. At low dose, SR became dose and contrast dependent. (orig.)
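    The noise and CNR metrics reported in studies like this are typically computed from regions of interest (ROIs). A sketch with synthetic ROI values (the HU means and SD below are invented, not the Catphan measurements):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic pixel samples: a low-contrast insert vs. its background, in HU.
roi_object = 60 + 5 * rng.standard_normal(2000)
roi_background = 40 + 5 * rng.standard_normal(2000)

# Noise is conventionally the standard deviation in a uniform ROI.
noise = roi_background.std(ddof=1)

# CNR: absolute mean difference divided by the noise.
cnr = abs(roi_object.mean() - roi_background.mean()) / noise

print(round(cnr, 2))   # about 4 for a 20 HU contrast and 5 HU noise
```

    Halving the noise (as an IR algorithm may do at fixed dose) doubles the CNR at the same contrast, which is why LC detectability improves.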

  16. Diverse Regular Employees and Non-regular Employment (Japanese)

    OpenAIRE

    MORISHIMA Motohiro

    2011-01-01

    Currently there are high expectations for the introduction of policies related to diverse regular employees. These policies are a response to the problem of disparities between regular and non-regular employees (part-time, temporary, contract and other non-regular employees) and will make it more likely that workers can balance work and their private lives while companies benefit from the advantages of regular employment. In this paper, I look at two issues that underlie this discussion. The ...

  17. Sparse structure regularized ranking

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-04-17

    Learning ranking scores is critical for the multimedia database retrieval problem. In this paper, we propose a novel ranking score learning algorithm by exploring the sparse structure and using it to regularize ranking scores. To explore the sparse structure, we assume that each multimedia object could be represented as a sparse linear combination of all other objects, and combination coefficients are regarded as a similarity measure between objects and used to regularize their ranking scores. Moreover, we propose to learn the sparse combination coefficients and the ranking scores simultaneously. A unified objective function is constructed with regard to both the combination coefficients and the ranking scores, and is optimized by an iterative algorithm. Experiments on two multimedia database retrieval data sets demonstrate the significant improvements of the proposed algorithm over state-of-the-art ranking score learning algorithms.

  18. 'Regular' and 'emergency' repair

    International Nuclear Information System (INIS)

    Luchnik, N.V.

    1975-01-01

    Experiments on the combined action of radiation and a DNA inhibitor using Crepis roots and on split-dose irradiation of human lymphocytes lead to the conclusion that there are two types of repair. The 'regular' repair takes place twice in each mitotic cycle and ensures the maintenance of genetic stability. The 'emergency' repair is induced at all stages of the mitotic cycle by high levels of injury. (author)

  19. Regularization of divergent integrals

    OpenAIRE

    Felder, Giovanni; Kazhdan, David

    2016-01-01

    We study the Hadamard finite part of divergent integrals of differential forms with singularities on submanifolds. We give formulae for the dependence of the finite part on the choice of regularization and express them in terms of a suitable local residue map. The cases where the submanifold is a complex hypersurface in a complex manifold and where it is a boundary component of a manifold with boundary, arising in string perturbation theory, are treated in more detail.

  20. Regularizing portfolio optimization

    International Nuclear Information System (INIS)

    Still, Susanne; Kondor, Imre

    2010-01-01

    The optimization of large portfolios displays an inherent instability due to estimation error. This poses a fundamental problem, because solutions that are not stable under sample fluctuations may look optimal for a given sample, but are, in effect, very far from optimal with respect to the average risk. In this paper, we approach the problem from the point of view of statistical learning theory. The occurrence of the instability is intimately related to over-fitting, which can be avoided using known regularization methods. We show how regularized portfolio optimization with the expected shortfall as a risk measure is related to support vector regression. The budget constraint dictates a modification. We present the resulting optimization problem and discuss the solution. The L2 norm of the weight vector is used as a regularizer, which corresponds to a diversification 'pressure'. This means that diversification, besides counteracting downward fluctuations in some assets by upward fluctuations in others, is also crucial because it improves the stability of the solution. The approach we provide here allows for the simultaneous treatment of optimization and diversification in one framework that enables the investor to trade off between the two, depending on the size of the available dataset.

  1. Regularizing portfolio optimization

    Science.gov (United States)

    Still, Susanne; Kondor, Imre

    2010-07-01

    The optimization of large portfolios displays an inherent instability due to estimation error. This poses a fundamental problem, because solutions that are not stable under sample fluctuations may look optimal for a given sample, but are, in effect, very far from optimal with respect to the average risk. In this paper, we approach the problem from the point of view of statistical learning theory. The occurrence of the instability is intimately related to over-fitting, which can be avoided using known regularization methods. We show how regularized portfolio optimization with the expected shortfall as a risk measure is related to support vector regression. The budget constraint dictates a modification. We present the resulting optimization problem and discuss the solution. The L2 norm of the weight vector is used as a regularizer, which corresponds to a diversification 'pressure'. This means that diversification, besides counteracting downward fluctuations in some assets by upward fluctuations in others, is also crucial because it improves the stability of the solution. The approach we provide here allows for the simultaneous treatment of optimization and diversification in one framework that enables the investor to trade off between the two, depending on the size of the available dataset.
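    As a hedged numerical sketch of the stabilizing effect described above: adding an L2 penalty to the weight vector shrinks the noisy inverse of an estimated covariance matrix and spreads the portfolio toward equal weights. For brevity this uses closed-form minimum-variance weights under the budget constraint rather than the paper's expected-shortfall objective; all data are synthetic:

```python
import numpy as np

def min_variance_weights(cov, ridge=0.0):
    """Fully invested minimum-variance weights with an L2 (ridge) penalty.
    The ridge term acts as the diversification 'pressure' discussed above."""
    n = cov.shape[0]
    inv = np.linalg.inv(cov + ridge * np.eye(n))
    w = inv @ np.ones(n)
    return w / w.sum()          # budget constraint: weights sum to 1

# A noisy sample covariance estimated from few observations (30 days,
# 10 assets) -- the regime where estimation error destabilizes weights.
rng = np.random.default_rng(1)
returns = rng.standard_normal((30, 10)) * 0.02
cov = np.cov(returns, rowvar=False)

w_raw = min_variance_weights(cov)
w_reg = min_variance_weights(cov, ridge=0.01)

# The regularized solution has smaller extreme weights (more diversified).
print(float(np.abs(w_raw).max()), float(np.abs(w_reg).max()))
```

    As the ridge parameter grows, the weights tend to the equal-weight portfolio 1/n, illustrating the trade-off between fitting the sample and diversifying.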

  2. Regular Single Valued Neutrosophic Hypergraphs

    Directory of Open Access Journals (Sweden)

    Muhammad Aslam Malik

    2016-12-01

    In this paper, we define the regular and totally regular single valued neutrosophic hypergraphs, and discuss the order and size along with properties of regular and totally regular single valued neutrosophic hypergraphs. We also extend work on completeness of single valued neutrosophic hypergraphs.

  3. The geometry of continuum regularization

    International Nuclear Information System (INIS)

    Halpern, M.B.

    1987-03-01

    This lecture is primarily an introduction to coordinate-invariant regularization, a recent advance in the continuum regularization program. In this context, the program is seen as fundamentally geometric, with all regularization contained in regularized DeWitt superstructures on field deformations

  4. Detecting the Land-Cover Changes Induced by Large-Physical Disturbances Using Landscape Metrics, Spatial Sampling, Simulation and Spatial Analysis

    Directory of Open Access Journals (Sweden)

    Hone-Jay Chu

    2009-08-01

    The objectives of the study are to integrate conditional Latin Hypercube Sampling (cLHS), sequential Gaussian simulation (SGS) and spatial analysis of remotely sensed images, to monitor the effects of large chronological disturbances on spatial characteristics of landscape changes, including spatial heterogeneity and variability. The multiple NDVI images demonstrate that spatial patterns of disturbed landscapes were successfully delineated by spatial analyses such as the variogram, Moran's I and landscape metrics in the study area. The hybrid method delineates the spatial patterns and spatial variability of landscapes caused by these large disturbances. The cLHS approach is applied to select samples from Normalized Difference Vegetation Index (NDVI) images derived from SPOT HRV images in the Chenyulan watershed of Taiwan, and SGS with sufficient samples is then used to generate maps of the NDVI images. Finally, the simulated NDVI maps are verified using indexes such as the correlation coefficient and mean absolute error (MAE). The statistics and spatial structures of the multiple NDVI images present very robust behavior, which advocates the use of the index for quantifying landscape spatial patterns and land cover change. In addition, the results, transferred by Open Geospatial techniques, can be accessed from web-based and end-user applications for watershed management.
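    Global Moran's I, one of the spatial-structure statistics named above, can be sketched on a small synthetic grid with rook-contiguity weights (the grids below are illustrative, not NDVI data):

```python
import numpy as np

def morans_i(grid):
    """Global Moran's I with binary rook-contiguity weights on a 2D grid:
    I = (n / W) * sum_ij w_ij z_i z_j / sum_i z_i**2."""
    rows, cols = grid.shape
    z = grid - grid.mean()
    num, w_sum = 0.0, 0.0
    for i in range(rows):
        for j in range(cols):
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # rook neighbours
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols:
                    num += z[i, j] * z[ni, nj]
                    w_sum += 1.0
    return (grid.size / w_sum) * num / (z ** 2).sum()

clustered = np.array([[1., 1., 0.],
                      [1., 1., 0.],
                      [0., 0., 0.]])              # like values clumped together
checker = np.indices((3, 3)).sum(axis=0) % 2 * 1.0  # alternating values

print(morans_i(clustered) > 0)   # clustered map: positive autocorrelation
print(morans_i(checker) < 0)     # checkerboard: negative autocorrelation
```

    A disturbance that fragments a formerly homogeneous landscape shows up as a drop in Moran's I between image dates, which is how the statistic supports change monitoring here.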

  5. Noise-Induced Tinnitus Using Individualized Gap Detection Analysis and Its Relationship with Hyperacusis, Anxiety, and Spatial Cognition

    Science.gov (United States)

    Pace, Edward; Zhang, Jinsheng

    2013-01-01

    Tinnitus has a complex etiology that involves auditory and non-auditory factors and may be accompanied by hyperacusis, anxiety and cognitive changes. Thus far, investigations of the interrelationship between tinnitus and auditory and non-auditory impairment have yielded conflicting results. To further address this issue, we noise exposed rats and assessed them for tinnitus using a gap detection behavioral paradigm combined with statistically-driven analysis to diagnose tinnitus in individual rats. We also tested rats for hearing detection, responsivity, and loss using prepulse inhibition and auditory brainstem response, and for spatial cognition and anxiety using Morris water maze and elevated plus maze. We found that our tinnitus diagnosis method reliably separated noise-exposed rats into tinnitus(+) and tinnitus(−) groups and detected no evidence of tinnitus in tinnitus(−) and control rats. In addition, the tinnitus(+) group demonstrated enhanced startle amplitude, indicating hyperacusis-like behavior. Despite these results, neither tinnitus, hyperacusis nor hearing loss yielded any significant effects on spatial learning and memory or anxiety, though a majority of rats with the highest anxiety levels had tinnitus. These findings showed that we were able to develop a clinically relevant tinnitus(+) group and that our diagnosis method is sound. At the same time, like clinical studies, we found that tinnitus does not always result in cognitive-emotional dysfunction, although tinnitus may predispose subjects to certain impairment like anxiety. Other behavioral assessments may be needed to further define the relationship between tinnitus and anxiety, cognitive deficits, and other impairments. PMID:24069375

  6. Spatial-temporal Detection of Sea-breeze Penetration Over Megacities from Himawari-8

    Science.gov (United States)

    Ferdiansyah, M. R.; Inagaki, A.; Kanda, M.

    2017-12-01

    For a coastal urban region, the sea breeze is important for air ventilation and cooling; however, most sea-breeze monitoring is temporally and spatially inadequate. Japan's new geostationary meteorological satellite (Himawari-8) provides high-resolution imagery that enables better monitoring of mesoscale weather phenomena such as sea breezes. In this study, we first assess the feasibility of acquiring temporal-spatial information on sea breezes in coastal urban regions using Himawari-8. Tokyo (Japan) and Jakarta (Indonesia) were selected as representative coastal urban regions; the two cities lie at very different latitudes. Sea-breeze events (Tokyo: 16 cases; Jakarta: 17 cases) in the JAS (July-September) seasons of 2015 and 2016 were analyzed. Convergence zones of two sea-breeze systems and delayed sea-breeze penetration were found for both Tokyo and Jakarta. The primary objective is the estimation of the inland penetration speed and convergence area for each sea-breeze event accompanied by the formation of a non-precipitating cumulus-type cloudline. From the visible-band images of Himawari-8, the cumulus cloudline for each sea-breeze event was extracted, and the inland penetration speed was then estimated automatically from the temporal evolution of these cloudlines. For Tokyo, the sea breeze from Tokyo Bay had a slower penetration speed than the sea breeze from Sagami Bay, which comes from a less urbanized area: the average penetration speed of the sea-breeze front was estimated to be 3.6 m/s for Sagami Bay and 1.3 m/s for Tokyo Bay. This difference could be attributed to the difference in urbanization levels between the coastal areas of Sagami and Tokyo Bay. For Jakarta, the convergence of the two sea-breeze systems persisted slightly east of the center of Jakarta. Interestingly, the sea-breeze delay was more pronounced

  7. Spatial-temporal detection of risk factors for bacillary dysentery in Beijing, Tianjin and Hebei, China

    Directory of Open Access Journals (Sweden)

    Chengdong Xu

    2017-09-01

    Background: Bacillary dysentery is the third leading notifiable disease and remains a major public health concern in China. The Beijing–Tianjin–Hebei urban region is the biggest urban agglomeration in northern China, and it is one of the areas in the country most heavily affected by bacillary dysentery. The objective of the study was to analyze the spatial-temporal pattern of the disease and to determine contributory risk factors. Methods: Bacillary dysentery case data from 1 January 2012 to 31 December 2012 in Beijing–Tianjin–Hebei were employed. The GeoDetector method was used to determine the impact of potential risk factors, and to identify regions and seasons at high risk of the disease. Results: There were 36,472 cases of bacillary dysentery in 2012 in the study region. The incidence of bacillary dysentery varies widely among age groups; the highest incidence occurs in the population under the age of five. Bacillary dysentery shows apparent seasonal variation, with the highest incidence occurring from June to September. Among the potential meteorological risk factors, mean temperature, relative humidity, precipitation, mean wind speed and sunshine hours explain the temporal variation of bacillary dysentery by 83%, 31%, 25%, 17% and 13%, respectively. The interactive effect between temperature and humidity has an explanatory power of 87%, indicating that a hot and humid environment is more likely to lead to the occurrence of bacillary dysentery. Socio-economic factors affect the spatial distribution of bacillary dysentery. The top four factors are age group, per capita GDP, population density and rural population proportion, with determinant powers of 61%, 27%, 25% and 21%, respectively. The interactive effect between age group and the other factors accounts for more than 60% of bacillary dysentery transmission. Conclusions: Bacillary dysentery poses a
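    The "determinant power" reported by GeoDetector is the q-statistic, q = 1 − SSW/SST: the share of the variance of incidence explained by stratifying the study area on a factor. A sketch on invented data (six hypothetical districts, not the study's records):

```python
import numpy as np

def q_statistic(values, strata):
    """GeoDetector q-statistic: 1 - (within-stratum SS / total SS)."""
    values, strata = np.asarray(values, float), np.asarray(strata)
    sst = ((values - values.mean()) ** 2).sum()
    ssw = sum(((values[strata == s] - values[strata == s].mean()) ** 2).sum()
              for s in np.unique(strata))
    return 1.0 - ssw / sst

# Illustrative incidence (cases per 10,000) in six districts, stratified
# by a hypothetical temperature zone.
incidence = [2.0, 2.2, 1.9, 8.0, 8.3, 7.9]
temperature_zone = ["cool", "cool", "cool", "hot", "hot", "hot"]

print(round(q_statistic(incidence, temperature_zone), 3))  # 0.998
```

    q ranges from 0 (the factor explains nothing) to 1 (the stratification explains all spatial variance); the 83% figure for mean temperature above corresponds to q = 0.83.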

  8. Quantifying and mapping spatial variability in simulated forest plots

    Science.gov (United States)

    Gavin R. Corral; Harold E. Burkhart

    2016-01-01

    We used computer simulations to test the efficacy of multivariate statistical methods to detect, quantify, and map the spatial variability of forest stands. Simulated stands were developed as regularly spaced plantations of loblolly pine (Pinus taeda L.). We assumed no effects of competition or mortality, but random variability was added to individual tree characteristics...

  9. Multiwire proportional chambers with a high spatial resolution for X radiation detection and localization

    International Nuclear Information System (INIS)

    Algre, J.-L.

    1975-01-01

    A multiwire proportional counter with a high spatial resolution has been developed, and some basic characteristics of this type of detector have been specified. A method was defined for calculating the potential, and hence the field, at each point of the volume bounded by the counter. The method allows the problems of the outer wires to be solved and the consequences of a wire displacement to be predicted. The analysis of the pulses observed on both cathode and anode showed that they were formed almost solely by the ion migration from the cathode to the anode. An estimation of the formation time established that in argon mixtures with a low percentage of methane, Ar+ ions are in the majority. It can then be predicted that anode wires should not be spaced by less than 1 mm when conventional electronics is used. The study of the multiplication factor as a function of the main geometric parameters of the counter gave a relation between the multiplication coefficient and the geometric parameters of the chamber; consequently, the optimal operating conditions can be predicted. In particular, the diameter of the multiplying wires must be as small as possible to improve the energy resolution of the detector. A localization method showed that an interpolation may be done between two wires so that, with wires spaced 1 mm apart, the two-dimensional position of an event can be determined with a resolution better than 0.5 mm in both directions

  10. Brain-wide mapping of axonal connections: workflow for automated detection and spatial analysis of labeling in microscopic sections

    Directory of Open Access Journals (Sweden)

    Eszter Agnes Papp

    2016-04-01

    Axonal tracing techniques are powerful tools for exploring the structural organization of neuronal connections. Tracers such as biotinylated dextran amine (BDA) and Phaseolus vulgaris leucoagglutinin (Pha-L) allow brain-wide mapping of connections through analysis of large series of histological section images. We present a workflow for efficient collection and analysis of tract-tracing datasets with a focus on newly developed modules for image processing and assignment of anatomical location to tracing data. New functionality includes automatic detection of neuronal labeling in large image series, alignment of images to a volumetric brain atlas, and analytical tools for measuring the position and extent of labeling. To evaluate the workflow, we used high-resolution microscopic images from axonal tracing experiments in which different parts of the rat primary somatosensory cortex had been injected with BDA or Pha-L. Parameters from a set of representative images were used to automate detection of labeling in image series covering the entire brain, resulting in binary maps of the distribution of labeling. For high to medium labeling densities, automatic detection was found to provide reliable results when compared to manual analysis, whereas weak labeling required manual curation for optimal detection. To identify brain regions corresponding to labeled areas, section images were aligned to the Waxholm Space (WHS) atlas of the Sprague Dawley rat brain (v2) by custom-angle slicing of the MRI template to match individual sections. Based on the alignment, WHS coordinates were obtained for labeled elements and transformed to stereotaxic coordinates. The new workflow modules increase the efficiency and reliability of labeling detection in large series of images from histological sections, and enable anchoring to anatomical atlases for further spatial analysis and comparison with other data.

  11. Development of a neutron detector with high detection efficiency and high spatial resolution and its applications to reactor physics experiments

    International Nuclear Information System (INIS)

    Tojo, Takao

    1979-09-01

    For detection of thermal neutrons in multiplying systems, a scintillator mixture of ZnS(Ag), ⁶LiF and polyethylene was prepared, and its characteristics were shown. A scintillation detector using the mixture and a long acrylic-resin light guide was developed for measuring thermal neutrons in a U-H₂O subcritical assembly (JAERISA). The detector was applied in the following reactor physics measurements with JAERISA: (1) cadmium ratio, (2) infinite multiplication factor, (3) material buckling, and (4) prompt neutron lifetime by the pulsed neutron method. These experiments revealed that neutrons in the assembly are successfully detected by the detector owing to its outstanding characteristics of gamma-ray insensitivity, high detection efficiency and high spatial resolution. In the course of activity measurements of foil activation detectors with a GM counter, it was shown that accurate counting-loss correction is difficult with the usual method, because the resolving time depends appreciably on the counting rate. For accurate correction, a new method was introduced for precise measurement of the resolving time, and this dependence was characterized. A new correction method was developed which enables direct reading of the corrected counting rates, even at high counting rates. (author)
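
    The "usual method" of counting-loss correction referred to is the non-paralyzable dead-time model, true rate = m/(1 − mτ). A minimal sketch with illustrative numbers; the thesis's point is that τ itself varies with counting rate, which this simple model ignores:

```python
def deadtime_correct(measured_rate, tau):
    """Non-paralyzable dead-time correction: true rate = m / (1 - m*tau),
    with m the measured count rate (1/s) and tau the resolving time (s)."""
    loss = measured_rate * tau
    if loss >= 1.0:
        raise ValueError("measured rate inconsistent with this model")
    return measured_rate / (1.0 - loss)

# A GM counter with a 100-microsecond resolving time counting 2000 counts/s
# misses 20% of events; the corrected rate is 2500 counts/s.
print(deadtime_correct(2000.0, 1e-4))
```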

  12. Annotation of Regular Polysemy

    DEFF Research Database (Denmark)

    Martinez Alonso, Hector

    Regular polysemy has received a lot of attention from the theory of lexical semantics and from computational linguistics. However, there is no consensus on how to represent the sense of underspecified examples at the token level, namely when annotating or disambiguating senses of metonymic words...... and metonymic. We have conducted an analysis in English, Danish and Spanish. Later on, we have tried to replicate the human judgments by means of unsupervised and semi-supervised sense prediction. The automatic sense-prediction systems have been unable to find empiric evidence for the underspecified sense, even...

  13. Regularity of Minimal Surfaces

    CERN Document Server

    Dierkes, Ulrich; Tromba, Anthony J; Kuster, Albrecht

    2010-01-01

    "Regularity of Minimal Surfaces" begins with a survey of minimal surfaces with free boundaries. Following this, the basic results concerning the boundary behaviour of minimal surfaces and H-surfaces with fixed or free boundaries are studied. In particular, the asymptotic expansions at interior and boundary branch points are derived, leading to general Gauss-Bonnet formulas. Furthermore, gradient estimates and asymptotic expansions for minimal surfaces with only piecewise smooth boundaries are obtained. One of the main features of free boundary value problems for minimal surfaces is t

  14. Regularities of radiation heredity

    International Nuclear Information System (INIS)

    Skakov, M.K.; Melikhov, V.D.

    2001-01-01

    Regularities of radiation heredity in metals and alloys are analyzed. It is concluded that irradiation induces thermodynamically irreversible changes in the structure of materials. Possible mechanisms are proposed for the transmission of radiation effects through high-temperature transformations in the materials. The phenomenon of radiation heredity may be put to practical use to control the structure of liquid metal and, correspondingly, the structure of the ingot via preliminary radiation treatment of the charge. Concentration microheterogeneities in the material defect structure induced by preliminary irradiation represent the genetic factor of radiation heredity.

  15. Contour Propagation With Riemannian Elasticity Regularization

    DEFF Research Database (Denmark)

    Bjerre, Troels; Hansen, Mads Fogtmann; Sapru, W.

    2011-01-01

    Purpose/Objective(s): Adaptive techniques allow for correction of spatial changes during the time course of the fractionated radiotherapy. Spatial changes include tumor shrinkage and weight loss, causing tissue deformation and residual positional errors even after translational and rotational image...... the planning CT onto the rescans and correcting to reflect actual anatomical changes. For deformable registration, a free-form, multi-level, B-spline deformation model with Riemannian elasticity, penalizing non-rigid local deformations, and volumetric changes, was used. Regularization parameters were defined...... on the original delineation and tissue deformation in the time course between scans form a better starting point than rigid propagation. There was no significant difference of locally and globally defined regularization. The method used in the present study suggests that deformed contours need to be reviewed...

  16. An Object-Based Image Analysis Approach for Detecting Penguin Guano in very High Spatial Resolution Satellite Images

    Directory of Open Access Journals (Sweden)

    Chandi Witharana

    2016-04-01

    Full Text Available The logistical challenges of Antarctic field work and the increasing availability of very high resolution commercial imagery have driven an interest in more efficient search and classification of remotely sensed imagery. This exploratory study employed geographic object-based image analysis (GEOBIA) methods to classify guano stains, indicative of chinstrap and Adélie penguin breeding areas, from very high spatial resolution (VHSR) satellite imagery and closely examined the transferability of knowledge-based GEOBIA rules across different study sites focusing on the same semantic class. We systematically gauged the segmentation quality, classification accuracy, and the reproducibility of fuzzy rules. A master ruleset was developed based on one study site and it was re-tasked “without adaptation” and “with adaptation” on candidate image scenes comprising guano stains. Our results suggest that object-based methods incorporating the spectral, textural, spatial, and contextual characteristics of guano are capable of successfully detecting guano stains. Reapplication of the master ruleset on candidate scenes without modifications produced inferior classification results, while adapted rules produced comparable or superior results compared to the reference image. This work provides a road map to an operational “image-to-assessment pipeline” that will enable Antarctic wildlife researchers to seamlessly integrate VHSR imagery into on-demand penguin population census.
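
    Knowledge-based GEOBIA rulesets of this kind typically AND-combine fuzzy memberships over spectral, textural, and spatial object features. A generic sketch; the feature names and thresholds below are invented for illustration and are not the authors' ruleset:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: 0 below a, ramps to 1 on [a, b],
    stays 1 on [b, c], ramps back down to 0 on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def guano_membership(obj):
    """AND-combine (min) per-feature memberships for a candidate object."""
    return min(
        trapezoid(obj["mean_red"], 0.25, 0.35, 0.55, 0.65),      # spectral
        trapezoid(obj["glcm_homogeneity"], 0.4, 0.6, 1.0, 1.1),  # textural
        trapezoid(obj["area_m2"], 20, 50, 5000, 8000),           # spatial
    )

obj = {"mean_red": 0.45, "glcm_homogeneity": 0.7, "area_m2": 300}
print(guano_membership(obj))  # 1.0 -- all features in their full-support range
```

    "Rule adaptation" to a new scene then amounts to re-tuning the breakpoints a, b, c, d per feature rather than redesigning the ruleset.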

  17. High Spatial resolution remote sensing for salt marsh change detection on Fire Island National Seashore

    Science.gov (United States)

    Campbell, A.; Wang, Y.

    2017-12-01

    Salt marshes are under increasing pressure from anthropogenic stressors including sea level rise, nutrient enrichment, herbivory and disturbances. Salt marsh losses jeopardize the important ecosystem services marshes provide, including biodiversity, water filtration, wave attenuation, and carbon sequestration. This study determines salt marsh change on Fire Island National Seashore, a barrier island along the south shore of Long Island, New York. Object-based image analysis was used to classify Worldview-2 high-resolution satellite imagery and topobathymetric LiDAR. The site was impacted by Hurricane Sandy in October of 2012, which caused a breach in the barrier island and extensive overwash. In situ training data from vegetation plots were used to train the Random Forest classifier. The object-based Worldview-2 classification achieved an overall accuracy of 92.75%. Salt marsh change for the study site was determined by comparing the 2015 classification with a 1997 classification. The study found a shift from high marsh to low marsh and a reduction in Phragmites on Fire Island. Vegetation losses were observed along the edge of the marsh and in the marsh interior. The analysis agreed with many of the trends found throughout the region, including the reduction of high marsh and the decline of salt marsh. The reduction in Phragmites could be due to the species' shrinking niche between rising seas and dune vegetation on barrier islands. The complex management issues facing salt marshes across the United States, including sea level rise and eutrophication, necessitate very high resolution classification and change detection of salt marsh to inform management decisions such as restoration, salt marsh migration, and nutrient inputs.

  18. Accounting for regional background and population size in the detection of spatial clusters and outliers using geostatistical filtering and spatial neutral models: the case of lung cancer in Long Island, New York

    Directory of Open Access Journals (Sweden)

    Goovaerts Pierre

    2004-07-01

    methodology allows one to identify geographic pattern above and beyond background variation. The implementation of this approach in spatial statistical software will facilitate the detection of spatial disparities in mortality rates, establishing the rationale for targeted cancer control interventions, including consideration of health services needs, and resource allocation for screening and diagnostic testing. It will allow researchers to systematically evaluate how sensitive their results are to assumptions implicit under alternative null hypotheses.

  19. Investigating the effect of pixel size of high spatial resolution FTIR imaging for detection of colorectal cancer

    Science.gov (United States)

    Lloyd, G. R.; Nallala, J.; Stone, N.

    2016-03-01

    FTIR is a well-established technique and there is significant interest in applying it to medical diagnostics, e.g. to detect cancer. The introduction of focal plane array (FPA) detectors means that FTIR is particularly suited to rapid imaging of biopsy sections as an adjunct to digital pathology. Until recently, however, each pixel in the image has been limited to a minimum of 5.5 µm, which results in a comparatively low-magnification image for histology applications and potentially the loss of important diagnostic information. The recent introduction of higher magnification optics gives image pixels that cover approx. 1.1 µm. This reduction in image pixel size gives images of higher magnification in which improved spatial detail can be observed. However, the effect of increasing the magnification on spectral quality and the ability to discriminate between disease states is not well studied. In this work we test the discriminatory performance of FTIR imaging at both standard (5.5 µm) and high (1.1 µm) magnification for the detection of colorectal cancer, and explore the effect of binning to degrade high-resolution images in order to determine whether similar diagnostic information and performance can be obtained at both magnifications. Results indicate that diagnostic performance using high magnification may be reduced compared to standard magnification when using existing multivariate approaches. Reduction of the high-magnification data to standard magnification via binning can potentially recover some of the lost performance.
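
    Degrading 1.1 µm pixels toward the 5.5 µm standard amounts to block-averaging each spectral image; a sketch with a 5 × 5 block (the array shapes are illustrative):

```python
import numpy as np

def bin_image(cube, factor=5):
    """Average `factor` x `factor` spatial blocks of a (rows, cols, bands)
    hyperspectral cube, e.g. to degrade 1.1 um high-magnification FTIR
    pixels to ~5.5 um standard-magnification pixels."""
    h, w, bands = cube.shape
    h, w = h - h % factor, w - w % factor  # crop to a multiple of factor
    c = cube[:h, :w].reshape(h // factor, factor, w // factor, factor, bands)
    return c.mean(axis=(1, 3))

cube = np.ones((64, 64, 100))       # toy FTIR image cube, 100 wavenumbers
print(bin_image(cube).shape)        # (12, 12, 100)
```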

  20. A Low-Cost Imaging Method for the Temporal and Spatial Colorimetric Detection of Free Amines on Maize Root Surfaces

    Directory of Open Access Journals (Sweden)

    Truc H. Doan

    2017-08-01

    Full Text Available Plant root exudates are important mediators in the interactions that occur between plants and microorganisms in the soil, yet much remains to be learned about spatial and temporal variation in their production. This work outlines a method utilizing a novel colorimetric paper to detect spatial and temporal changes in the production of nitrogen-containing compounds on the root surface. While existing methods have made it possible to conduct detailed analysis of root exudate composition, relatively less is known about where in the root system exudates are produced and how this localization changes as the root grows. Furthermore, there is much to learn about how exudate localization and composition varies in response to stress. Root exudates are chemically diverse secretions composed of organic acids, amino acids, proteins, sugars, and other metabolites. The sensor utilized for the method, ninhydrin, is a colorless substance in solution that reacts with free amino groups to form a purple dye. A detection paper was developed by formulating ninhydrin into a print solution that was uniformly deposited onto paper with a commercial inkjet printer. This “ninhydrin paper” was used to analyze the chemical makeup of root surfaces from maize seedlings grown vertically on germination paper. Through contact between the ninhydrin paper and seedling root surfaces, combined with images of both the seedlings and dried ninhydrin papers captured using a standard flatbed scanner, nitrogen-containing substances on the root surface can be localized and their signal concentration estimated over 2 weeks of development. The method was found to be non-inhibiting to plant growth over the analysis period, although damage to root hairs was observed. The method is sensitive in the detection of free amines at concentrations as low as 140 μM. Furthermore, ninhydrin paper is stable, showing consistent color changes up to 2 weeks after printing. This relatively simple, low

  1. Laplacian manifold regularization method for fluorescence molecular tomography

    Science.gov (United States)

    He, Xuelei; Wang, Xiaodong; Yi, Huangjian; Chen, Yanrong; Zhang, Xu; Yu, Jingjing; He, Xiaowei

    2017-04-01

    Sparse regularization methods have been widely used in fluorescence molecular tomography (FMT) for stable three-dimensional reconstruction. Generally, ℓ1-regularization-based methods allow for utilizing the sparsity nature of the target distribution. However, in addition to sparsity, the spatial structure information should be exploited as well. A joint ℓ1 and Laplacian manifold regularization model is proposed to improve the reconstruction performance, and two algorithms (with and without the Barzilai-Borwein strategy) are presented to solve the regularization model. Numerical studies and an in vivo experiment demonstrate that the proposed gradient-projection-resolved Laplacian manifold regularization method for the joint model performed better than the comparative ℓ1-minimization method in both spatial aggregation and location accuracy.
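
    A joint model of this general form, minimizing 0.5‖Ax − b‖² + λ₁‖x‖₁ + λ₂xᵀLx, can be handled by proximal-gradient iteration. The toy sketch below is not the authors' gradient-projection algorithm; the chain-graph Laplacian and problem sizes are arbitrary:

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista_l1_laplacian(A, b, L, lam1, lam2, n_iter=500):
    """Minimize 0.5*||Ax - b||^2 + lam1*||x||_1 + lam2*x'Lx by ISTA:
    gradient step on the smooth terms, then soft-threshold."""
    step = 1.0 / (np.linalg.norm(A, 2) ** 2 + 2 * lam2 * np.linalg.norm(L, 2))
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b) + 2 * lam2 * (L @ x)
        x = soft(x - step * grad, step * lam1)
    return x

# Toy problem: 20 unknowns on a chain graph, two adjacent active nodes.
rng = np.random.default_rng(0)
A = rng.normal(size=(15, 20))
L = 2 * np.eye(20) - np.eye(20, k=1) - np.eye(20, k=-1)  # chain-graph Laplacian
x_true = np.zeros(20)
x_true[8:10] = 1.0
x_hat = ista_l1_laplacian(A, A @ x_true, L, lam1=0.05, lam2=0.05)
print(sorted(np.argsort(np.abs(x_hat))[-2:]))
```

    The Laplacian term rewards solutions whose active entries sit on neighboring graph nodes, which is the "spatial aggregation" behavior the abstract describes.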

  2. Effective field theory dimensional regularization

    International Nuclear Information System (INIS)

    Lehmann, Dirk; Prezeau, Gary

    2002-01-01

    A Lorentz-covariant regularization scheme for effective field theories with an arbitrary number of propagating heavy and light particles is given. This regularization scheme leaves the low-energy analytic structure of Greens functions intact and preserves all the symmetries of the underlying Lagrangian. The power divergences of regularized loop integrals are controlled by the low-energy kinematic variables. Simple diagrammatic rules are derived for the regularization of arbitrary one-loop graphs and the generalization to higher loops is discussed

  3. Effective field theory dimensional regularization

    Science.gov (United States)

    Lehmann, Dirk; Prézeau, Gary

    2002-01-01

    A Lorentz-covariant regularization scheme for effective field theories with an arbitrary number of propagating heavy and light particles is given. This regularization scheme leaves the low-energy analytic structure of Greens functions intact and preserves all the symmetries of the underlying Lagrangian. The power divergences of regularized loop integrals are controlled by the low-energy kinematic variables. Simple diagrammatic rules are derived for the regularization of arbitrary one-loop graphs and the generalization to higher loops is discussed.

  4. 75 FR 76006 - Regular Meeting

    Science.gov (United States)

    2010-12-07

    ... FARM CREDIT SYSTEM INSURANCE CORPORATION Regular Meeting AGENCY: Farm Credit System Insurance Corporation Board. ACTION: Regular meeting. SUMMARY: Notice is hereby given of the regular meeting of the Farm Credit System Insurance Corporation Board (Board). Date and Time: The meeting of the Board will be held...

  5. The detection and mapping of the spatial distribution of insect defense compounds by desorption atmospheric pressure photoionization Orbitrap mass spectrometry.

    Science.gov (United States)

    Rejšek, Jan; Vrkoslav, Vladimír; Hanus, Robert; Vaikkinen, Anu; Haapala, Markus; Kauppila, Tiina J; Kostiainen, Risto; Cvačka, Josef

    2015-07-30

    Many insects use chemicals synthesized in exocrine glands and stored in reservoirs to protect themselves. Two chemically defended insects were used as models for the development of a new rapid analytical method based on desorption atmospheric pressure photoionization-mass spectrometry (DAPPI-MS). The distribution of defensive chemicals on the insect body surface was studied. Since these chemicals are predominantly nonpolar, DAPPI was a suitable analytical method. Repeatability of DAPPI-MS signals and effects related to non-planarity and roughness of samples were investigated using acrylic sheets uniformly covered with an analyte. After that, analytical figures of merit of the technique were determined. The spatial distribution of (E)-1-nitropentadec-1-ene, a toxic nitro compound synthesized by soldiers of the termite Prorhinotermes simplex, was investigated. Then, the spatial distribution of the unsaturated aldehydes (E)-hex-2-enal, (E)-4-oxohex-2-enal, (E)-oct-2-enal, (E,E)-deca-2,4-dienal and (E)-dec-2-enal was monitored in the stink bug Graphosoma lineatum. Chemicals present on the body surface were scanned along the median line of the insect from the head to the abdomen and vice versa, employing either the MS or MS(2) mode. In this fast and simple way, the opening of the frontal gland on the frons of termite soldiers and the position of the frontal gland reservoir, extending deep into the abdominal cavity, were localized. In the stink bug, the opening of the metathoracic scent glands (ostiole) on the ventral side of the thorax as well as the gland reservoir in the median position under the ventral surface of the anterior abdomen were detected and localized. The developed method has future prospects in routine laboratory use in life sciences. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Novel Ordered Stepped-Wedge Cluster Trial Designs for Detecting Ebola Vaccine Efficacy Using a Spatially Structured Mathematical Model.

    Directory of Open Access Journals (Sweden)

    Ibrahim Diakite

    2016-08-01

    Full Text Available During the 2014 Ebola virus disease (EVD) outbreak, policy-makers were confronted with difficult decisions on how best to test the efficacy of EVD vaccines. On one hand, many were reluctant to withhold a vaccine that might prevent a fatal disease from study participants randomized to a control arm. On the other, regulatory bodies called for rigorous placebo-controlled trials to permit direct measurement of vaccine efficacy prior to approval of the products. A stepped-wedge cluster study (SWCT) was proposed as an alternative to a more traditional randomized controlled vaccine trial to address these concerns. Here, we propose novel "ordered stepped-wedge cluster trial" (OSWCT) designs to further mitigate tradeoffs between ethical concerns, logistics, and statistical rigor. We constructed a spatially structured mathematical model of the EVD outbreak in Sierra Leone. We used the output of this model to simulate and compare a series of stepped-wedge cluster vaccine studies. Our model reproduced the observed order of first case occurrence within districts of Sierra Leone. Depending on the infection risk within the trial population and the trial start dates, the statistical power to detect a vaccine efficacy of 90% varied from 14% to 32% for the standard SWCT, and from 67% to 91% for OSWCTs, at an alpha error of 5%. The model's projection of first case occurrence was robust to changes in disease natural history parameters. Ordering clusters in a stepped-wedge trial based on each cluster's underlying risk of infection, as predicted by a spatial model, can increase the statistical power of a SWCT. In the event of another hemorrhagic fever outbreak, implementation of our proposed OSWCT designs could improve statistical power when a stepped-wedge study is desirable based on either ethical concerns or logistical constraints.
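
    Statistical power in such designs is typically estimated by Monte Carlo: simulate many trials under an assumed attack rate and vaccine efficacy, and count how often the null is rejected. A deliberately simplified two-arm sketch, using a one-sided two-proportion z-test in place of the paper's spatially structured model:

```python
import math
import random

def trial_rejects(n_per_arm, attack_rate, efficacy):
    """One simulated two-arm trial; reject H0 (no vaccine effect) with a
    one-sided two-proportion z-test at the 5% level."""
    c = sum(random.random() < attack_rate for _ in range(n_per_arm))
    v = sum(random.random() < attack_rate * (1 - efficacy) for _ in range(n_per_arm))
    p = (c + v) / (2 * n_per_arm)
    se = math.sqrt(2 * p * (1 - p) / n_per_arm)
    return se > 0 and (c - v) / n_per_arm / se > 1.645

def power(n_per_arm, attack_rate, efficacy, n_sim=2000):
    """Estimated power = fraction of simulated trials that reject H0."""
    random.seed(1)
    hits = sum(trial_rejects(n_per_arm, attack_rate, efficacy) for _ in range(n_sim))
    return hits / n_sim

# Power to detect 90% efficacy rises sharply with the underlying attack
# rate -- the reason ordering clusters by predicted risk helps an SWCT.
print(power(500, 0.05, 0.90), power(500, 0.005, 0.90))
```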

  7. Ensemble manifold regularization.

    Science.gov (United States)

    Geng, Bo; Tao, Dacheng; Xu, Chao; Yang, Linjun; Hua, Xian-Sheng

    2012-06-01

    We propose an automatic approximation of the intrinsic manifold for general semi-supervised learning (SSL) problems. Unfortunately, it is not trivial to define an optimization function to obtain optimal hyperparameters. Usually, cross validation is applied, but it does not necessarily scale up. Other problems derive from the suboptimality incurred by discrete grid search and the overfitting. Therefore, we develop an ensemble manifold regularization (EMR) framework to approximate the intrinsic manifold by combining several initial guesses. Algorithmically, we designed EMR carefully so it 1) learns both the composite manifold and the semi-supervised learner jointly, 2) is fully automatic for learning the intrinsic manifold hyperparameters implicitly, 3) is conditionally optimal for intrinsic manifold approximation under a mild and reasonable assumption, and 4) is scalable for a large number of candidate manifold hyperparameters, from both time and space perspectives. Furthermore, we prove the convergence property of EMR to the deterministic matrix at rate root-n. Extensive experiments over both synthetic and real data sets demonstrate the effectiveness of the proposed framework.
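
    The core EMR construction approximates the unknown intrinsic manifold by a convex combination of candidate graph Laplacians, e.g. k-NN graphs over a range of k. In EMR the simplex weights are learned jointly with the classifier; below they are fixed by hand for illustration:

```python
import numpy as np

def knn_laplacian(X, k):
    """Unnormalized graph Laplacian L = D - W of a symmetrized k-NN graph."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    n = len(X)
    W = np.zeros((n, n))
    for i in range(n):
        for j in np.argsort(d2[i])[1:k + 1]:  # skip self (distance 0)
            W[i, j] = W[j, i] = 1.0
    return np.diag(W.sum(1)) - W

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))                       # toy data points
candidates = [knn_laplacian(X, k) for k in (3, 5, 8)]
mu = np.array([0.2, 0.5, 0.3])                     # simplex weights (learned in EMR)
L_composite = sum(m * L for m, L in zip(mu, candidates))
print(L_composite.shape)
```

    Because each candidate is a valid Laplacian and the weights lie on the simplex, the composite is again symmetric positive semi-definite and can be dropped into any manifold-regularized learner.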

  8. Applying spatial analysis tools in public health: an example using SaTScan to detect geographic targets for colorectal cancer screening interventions.

    Science.gov (United States)

    Sherman, Recinda L; Henry, Kevin A; Tannenbaum, Stacey L; Feaster, Daniel J; Kobetz, Erin; Lee, David J

    2014-03-20

    Epidemiologists are gradually incorporating spatial analysis into health-related research as geocoded cases of disease become widely available and health-focused geospatial computer applications are developed. One health-focused application of spatial analysis is cluster detection. Using cluster detection to identify geographic areas with high-risk populations and then screening those populations for disease can improve cancer control. SaTScan is a free cluster-detection software application used by epidemiologists around the world to describe spatial clusters of infectious and chronic disease, as well as disease vectors and risk factors. The objectives of this article are to describe how spatial analysis can be used in cancer control to detect geographic areas in need of colorectal cancer screening intervention, identify issues commonly encountered by SaTScan users, detail how to select the appropriate methods for using SaTScan, and explain how method selection can affect results. As an example, we used various methods to detect areas in Florida where the population is at high risk for late-stage diagnosis of colorectal cancer. We found that much of our analysis was underpowered and that no single method detected all clusters of statistical or public health significance. However, all methods detected 1 area as high risk; this area is potentially a priority area for a screening intervention. Cluster detection can be incorporated into routine public health operations, but the challenge is to identify areas in which the burden of disease can be alleviated through public health intervention. Reliance on SaTScan's default settings does not always produce pertinent results.
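
    For its discrete Poisson model, SaTScan maximizes Kulldorff's log-likelihood ratio over candidate zones and assesses significance by Monte Carlo replication. The per-zone statistic itself is simple (the numbers below are illustrative):

```python
import math

def poisson_llr(c, e, C):
    """Kulldorff's log-likelihood ratio for a candidate zone with c observed
    and e expected cases, out of C total cases (high-rate scan only)."""
    if c <= e:
        return 0.0
    return c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))

# A zone with 30 observed cases where 15 were expected, out of 200 total.
# SaTScan ranks zones by this value, then compares the best zone against
# the best zones found in many random replications of the case data.
print(round(poisson_llr(30, 15.0, 200), 4))
```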

  9. Detection and spatial distribution of multiple-contaminants in agro-ecological Mediterranean wetlands (Marjal de Pego-Oliva, Spain)

    Science.gov (United States)

    Pascual-Aguilar, Juan Antonio; Andreu, Vicente; Gimeno-García, Eugenia; Picó, Yolanda; Masia, Ana

    2015-04-01

    Socioeconomic activities increasingly release undesirable chemical substances (contaminants) into open-air environments. Because many of these products persist and circulate among environmental compartments, the cumulative incidence of such combinations of multiple contaminants may pose a threat that cannot be assessed by considering the concentration of each contaminant individually, since the number and type of compounds, as well as their cumulative and interaction effects, are not known. Thus, prior to any further work analyzing the environmental risk of multiple contaminants, their identification and concentration levels are required. In this work, the potential presence of multiple contaminants of anthropogenic origin in a protected agro-ecological Mediterranean wetland is studied: the Pego-Oliva Marsh Natural Park (Valencian Community, Spain), which is characterized by a long history of human pressures, such as marsh transformation for agricultural uses. Two major groups of relevant pollutants were targeted according to two distinct environmental matrices: seven heavy metals in soils (Cd, Co, Cr, Cu, Ni, Pb and Zn) and fourteen emerging contaminants/drugs of abuse in surface waters of the natural lagoon, rivers and artificial irrigation networks (6-ACMOR, AMP, BECG, COC, ECGME, HER, KET, MAMP, MDA, MDMA, MET, MOR, THC, THC-COOH). The wetland was divided into nine representative zones with different types of land cover and land use. For soils, 24 samples were collected, and for waters 33, taking into consideration the spatial representativeness of the above-mentioned nine environments. Spatial analysis applying Geographical Information Systems to determine areas with greater incidence of both types of contaminants was also performed. With regard to heavy metals, Zn showed values under the detection limits in all samples; the remaining metals appeared in concentrations surpassing the

  10. DETECTION OF THE VELOCITY SHEAR EFFECT ON THE SPATIAL DISTRIBUTIONS OF THE GALACTIC SATELLITES IN ISOLATED SYSTEMS

    International Nuclear Information System (INIS)

    Lee, Jounghun; Choi, Yun-Young

    2015-01-01

    We report a detection of the effect of the large-scale velocity shear on the spatial distributions of the galactic satellites around the isolated hosts. Identifying the isolated galactic systems, each of which consists of a single host galaxy and its satellites, from the Seventh Data Release of the Sloan Digital Sky Survey and reconstructing linearly the velocity shear field in the local universe, we measure the alignments between the relative positions of the satellites from their isolated hosts and the principal axes of the local velocity shear tensors projected onto the plane of sky. We find a clear signal that the galactic satellites in isolated systems are located preferentially along the directions of the minor principal axes of the large-scale velocity shear field. Those galactic satellites that are spirals, are brighter, are located at distances larger than the projected virial radii of the hosts, and belong to the spiral hosts yield stronger alignment signals, which implies that the alignment strength depends on the formation and accretion epochs of the galactic satellites. It is also shown that the alignment strength is quite insensitive to the cosmic web environment, as well as the size and luminosity of the isolated hosts. Although this result is consistent with the numerical finding of Libeskind et al. based on an N-body experiment, owing to the very low significance of the observed signals, it remains inconclusive whether or not the velocity shear effect on the satellite distribution is truly universal.
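
    The alignment measurement reduces to comparing each satellite's projected position vector with the minor eigenvector of the 2 × 2 projected shear tensor; randomly oriented satellites would give a mean |cos θ| of 2/π ≈ 0.64, so a mean in excess of that is the signal. A sketch:

```python
import numpy as np

def alignment_cos(shear_2x2, r_vec):
    """|cos| of the angle between a satellite's projected position vector
    and the minor principal axis (smallest eigenvalue) of the projected
    velocity shear tensor.  The axis has no preferred sign, hence abs()."""
    w, v = np.linalg.eigh(shear_2x2)   # eigenvalues in ascending order
    minor_axis = v[:, 0]
    r = np.asarray(r_vec, dtype=float)
    r = r / np.linalg.norm(r)
    return abs(r @ minor_axis)

# A satellite lying along the minor (slowest-collapse) axis scores ~1.
shear = np.array([[0.1, 0.0], [0.0, 1.0]])
print(alignment_cos(shear, [3.0, 0.0]))
```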

  11. Detection of the Velocity Shear Effect on the Spatial Distributions of the Galactic Satellites in Isolated Systems

    Science.gov (United States)

    Lee, Jounghun; Choi, Yun-Young

    2015-02-01

    We report a detection of the effect of the large-scale velocity shear on the spatial distributions of the galactic satellites around the isolated hosts. Identifying the isolated galactic systems, each of which consists of a single host galaxy and its satellites, from the Seventh Data Release of the Sloan Digital Sky Survey and reconstructing linearly the velocity shear field in the local universe, we measure the alignments between the relative positions of the satellites from their isolated hosts and the principal axes of the local velocity shear tensors projected onto the plane of sky. We find a clear signal that the galactic satellites in isolated systems are located preferentially along the directions of the minor principal axes of the large-scale velocity shear field. Those galactic satellites that are spirals, are brighter, are located at distances larger than the projected virial radii of the hosts, and belong to the spiral hosts yield stronger alignment signals, which implies that the alignment strength depends on the formation and accretion epochs of the galactic satellites. It is also shown that the alignment strength is quite insensitive to the cosmic web environment, as well as the size and luminosity of the isolated hosts. Although this result is consistent with the numerical finding of Libeskind et al. based on an N-body experiment, owing to the very low significance of the observed signals, it remains inconclusive whether or not the velocity shear effect on the satellite distribution is truly universal.

  12. Bypassing the Limits of ℓ1 Regularization: Convex Sparse Signal Processing Using Non-Convex Regularization

    Science.gov (United States)

    Parekh, Ankit

    Sparsity has become the basis of some important signal processing methods over the last ten years. Many signal processing problems (e.g., denoising, deconvolution, non-linear component analysis) can be expressed as inverse problems. Sparsity is invoked through the formulation of an inverse problem with suitably designed regularization terms. The regularization terms alone encode sparsity into the problem formulation. Often, the ℓ1 norm is used to induce sparsity, so much so that ℓ1 regularization is considered to be `modern least-squares'. The use of ℓ1 norm, as a sparsity-inducing regularizer, leads to a convex optimization problem, which has several benefits: the absence of extraneous local minima, well developed theory of globally convergent algorithms, even for large-scale problems. Convex regularization via the ℓ1 norm, however, tends to under-estimate the non-zero values of sparse signals. In order to estimate the non-zero values more accurately, non-convex regularization is often favored over convex regularization. However, non-convex regularization generally leads to non-convex optimization, which suffers from numerous issues: convergence may be guaranteed to only a stationary point, problem specific parameters may be difficult to set, and the solution is sensitive to the initialization of the algorithm. The first part of this thesis is aimed toward combining the benefits of non-convex regularization and convex optimization to estimate sparse signals more effectively. To this end, we propose to use parameterized non-convex regularizers with designated non-convexity and provide a range for the non-convex parameter so as to ensure that the objective function is strictly convex. By ensuring convexity of the objective function (sum of data-fidelity and non-convex regularizer), we can make use of a wide variety of convex optimization algorithms to obtain the unique global minimum reliably. The second part of this thesis proposes a non-linear signal
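
    One standard way to realize this idea, not necessarily the thesis's exact parameterization, is the minimax concave penalty (MCP), whose proximal map is the firm threshold: for γ > 1 the scalar cost 0.5(y − x)² + MCP(x; λ, γ) remains convex even though the penalty is not, and large coefficients pass through unshrunk:

```python
import numpy as np

def firm_threshold(y, lam, gamma):
    """Proximal map of the MCP penalty (firm thresholding).  Unlike the
    soft threshold (the l1 prox), entries with |y| > gamma*lam are kept
    exactly, so non-zero values are not systematically under-estimated."""
    a = np.abs(y)
    return np.where(a <= lam, 0.0,
           np.where(a <= gamma * lam,
                    np.sign(y) * gamma * (a - lam) / (gamma - 1.0),
                    y))

y = np.array([-3.0, -0.8, 0.5, 1.5, 4.0])
# Soft thresholding at lam=1 would shrink this to [-2, 0, 0, 0.5, 3];
# firm thresholding leaves the large entries untouched.
print(firm_threshold(y, lam=1.0, gamma=2.0))
```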

  13. Adaptive Regularization of Neural Classifiers

    DEFF Research Database (Denmark)

    Andersen, Lars Nonboe; Larsen, Jan; Hansen, Lars Kai

    1997-01-01

    We present a regularization scheme which iteratively adapts the regularization parameters by minimizing the validation error. It is suggested to use the adaptive regularization scheme in conjunction with optimal brain damage pruning to optimize the architecture and to avoid overfitting. Furthermore, we propose an improved neural classification architecture eliminating an inherent redundancy in the widely used SoftMax classification network. Numerical results demonstrate the viability of the method...

  14. EIT image reconstruction with four dimensional regularization.

    Science.gov (United States)

    Dai, Tao; Soleimani, Manuchehr; Adler, Andy

    2008-09-01

    Electrical impedance tomography (EIT) reconstructs internal impedance images of the body from electrical measurements on the body surface. The temporal resolution of EIT data can be very high, although the spatial resolution of the images is relatively low. Most EIT reconstruction algorithms calculate images from data frames independently, although the data are actually highly correlated, especially in high speed EIT systems. This paper proposes a 4-D EIT image reconstruction approach for functional EIT. The new approach is developed to directly use prior models of the temporal correlations among images and 3-D spatial correlations among image elements. A fast algorithm is also developed to reconstruct the regularized images. Image reconstruction is posed in terms of an augmented image and measurement vector which are concatenated from a specific number of previous and future frames. The reconstruction is then based on an augmented regularization matrix which reflects the a priori constraints on temporal and 3-D spatial correlations of image elements. A temporal factor reflecting the relative strength of the image correlation is objectively calculated from measurement data. Results show that image reconstruction models which account for inter-element correlations, in both space and time, show improved resolution and noise performance, in comparison to simpler image models.
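    One way to picture the augmented regularization matrix over a window of concatenated frames is as a Kronecker product of a temporal correlation prior and a spatial prior. This is a simplified illustration under assumed conventions (exponential inter-frame correlation `gamma**|i-j|`, a toy spatial prior `R`), not the paper's exact construction.

```python
def kron(A, B):
    """Kronecker product of two matrices given as lists of lists."""
    n, m, p, q = len(A), len(A[0]), len(B), len(B[0])
    return [[A[i][j] * B[k][l] for j in range(m) for l in range(q)]
            for i in range(n) for k in range(p)]

def temporal_prior(frames, gamma):
    """Assumed inter-frame correlation matrix with entries gamma**|i-j|;
    gamma plays the role of the temporal factor estimated from data."""
    return [[gamma ** abs(i - j) for j in range(frames)] for i in range(frames)]

# Augmented prior for 3 concatenated frames with a 2-element spatial prior R:
R = [[2.0, -1.0], [-1.0, 2.0]]           # toy spatial smoothness prior
R_aug = kron(temporal_prior(3, 0.5), R)  # 6 x 6 augmented regularization matrix
```

    Regularizing the stacked image vector with `R_aug` couples each pixel to its spatial neighbors within a frame and to itself across frames, which is the structural idea behind the 4-D reconstruction.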

  15. Spatial and temporal resolution requirements for quench detection in (RE)Ba2Cu3Ox magnets using Rayleigh-scattering-based fiber optic distributed sensing

    International Nuclear Information System (INIS)

    Chan, W K; Schwartz, J; Flanagan, G

    2013-01-01

    One of the key remaining challenges to safe and reliable operation of large, high temperature superconductor (HTS)-based magnet systems is quench detection and protection. Due to the slow quench propagation in HTS systems, the conventional discrete voltage-tap approach developed for NbTi and Nb3Sn magnets may not be sufficient. In contrast, a distributed temperature profile, generated by a distributed temperature sensor and facilitating continuous monitoring of the temperature at any monitored location within a magnet with high spatial resolution, may be required. One such distributed temperature sensing option is the use of Rayleigh-based fiber optic sensors (FOS), which are immune to electromagnetic interference. The detection of a quench via Rayleigh-based FOS relies on converting the spectral shifts in the Rayleigh scattering spectra into temperature variations. As a result, the higher the spatial sampling resolution, the larger the data processing volume, and thus the lower the temporal sampling resolution. So, for effective quench detection, which requires the quick and accurate identification of a hot spot, it is important to find a balance between the spatial and temporal resolutions executable on a given data acquisition and processing (DAQ) system. This paper discusses a method for finding an appropriate DAQ technology that matches the characteristics of a superconducting coil, and determining the acceptable resolutions for efficient and safe quench detection. A quench detection algorithm based on distributed temperature sensing is proposed and its implementation challenges are discussed. (paper)
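    A minimal sketch of a hot-spot criterion over a distributed temperature profile follows. The function name, the 5 K threshold, and the run-length requirement are hypothetical illustration choices, not the paper's algorithm; real criteria depend on the conductor, cooling, and DAQ characteristics.

```python
def detect_hotspot(temps, baseline, threshold=5.0, min_points=3):
    """Flag a possible quench when at least `min_points` consecutive
    sensing points exceed their baseline temperature by `threshold`
    kelvin. Requiring a run of points rejects single-point noise,
    trading a little temporal latency for robustness."""
    run = 0
    for t, b in zip(temps, baseline):
        run = run + 1 if (t - b) >= threshold else 0
        if run >= min_points:
            return True
    return False
```

    The spatial/temporal trade-off discussed in the abstract shows up directly here: finer spatial sampling lengthens `temps` and raises the per-scan processing cost, while a larger `min_points` delays detection.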

  16. Spatial distribution

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Hendrichsen, Ditte Katrine; Nachman, Gøsta Støger

    2008-01-01

    Living organisms are distributed over the entire surface of the planet. The distribution of the individuals of each species is not random; on the contrary, it is strongly dependent on the biology and ecology of the species, and varies over different spatial scales. The structure of whole populations reflects the location and fragmentation pattern of the habitat types preferred by the species, and the complex dynamics of migration, colonization, and population growth taking place over the landscape. Within these, individuals are distributed among each other in regular or clumped patterns, depending on the nature of intraspecific interactions between them: while the individuals of some species repel each other and partition the available area, others form groups of varying size, determined by the fitness of each group member. The spatial distribution pattern of individuals again strongly...

  17. Effort variation regularization in sound field reproduction

    DEFF Research Database (Denmark)

    Stefanakis, Nick; Jacobsen, Finn; Sarris, Ioannis

    2010-01-01

    In this paper, active control is used in order to reproduce a given sound field in an extended spatial region. A method is proposed which minimizes the reproduction error at a number of control positions with the reproduction sources holding a certain relation within their complex strengths..., and adaptive wave field synthesis (AWFS), both under free-field conditions and in reverberant rooms. It is shown that effort variation regularization overcomes the problems associated with small spaces and with a low ratio of direct to reverberant energy, thus improving the reproduction accuracy...

  18. 75 FR 53966 - Regular Meeting

    Science.gov (United States)

    2010-09-02

    ... FARM CREDIT SYSTEM INSURANCE CORPORATION Regular Meeting AGENCY: Farm Credit System Insurance Corporation Board. SUMMARY: Notice is hereby given of the regular meeting of the Farm Credit System Insurance Corporation Board (Board). DATE AND TIME: The meeting of the Board will be held at the offices of the Farm...

  19. Online co-regularized algorithms

    NARCIS (Netherlands)

    Ruijter, T. de; Tsivtsivadze, E.; Heskes, T.

    2012-01-01

    We propose an online co-regularized learning algorithm for classification and regression tasks. We demonstrate that by sequentially co-regularizing prediction functions on unlabeled data points, our algorithm provides improved performance in comparison to supervised methods on several UCI benchmarks

  20. Spatial distribution of trachoma cases in the City of Bauru, State of São Paulo, Brazil, detected in 2006: defining key areas for improvement of health resources

    Directory of Open Access Journals (Sweden)

    Carlos Alberto Macharelli

    2013-04-01

    Full Text Available Introduction: The objective of this study was to analyze the spatial behavior of the occurrence of trachoma cases detected in the City of Bauru, State of São Paulo, Brazil, in 2006, in order to use the information collected to set priority areas for optimization of health resources. Methods: The trachoma cases identified in 2006 were georeferenced. The data evaluated were: the schools attended by the trachoma cases studied, data from the 2000 Census, census tract, type of housing, water supply conditions, distribution of income, and level of education of household heads. Descriptive spatial analyses and kernel density estimates were produced with the Google Earth® and TerraView® software. Each area was studied by interpolation of the event density surfaces to facilitate recognition of clusters. Results: Of the 66 cases detected, only one (1.5%) was not a resident of the city's outskirts. A positive association was detected between trachoma cases and the percentage of heads of household with income below three minimum wages and with schooling under eight years of education. Conclusions: The spatial distribution of trachoma cases coincided with the areas of greatest social inequality in the City of Bauru. The micro-areas identified are those that should be prioritized in the rationalization of health resources. There is the possibility of using the detected trachoma cases as a performance indicator for micro-area priority health programs.

  2. Continuum-regularized quantum gravity

    International Nuclear Information System (INIS)

    Chan Huesum; Halpern, M.B.

    1987-01-01

    The recent continuum regularization of d-dimensional Euclidean gravity is generalized to arbitrary power-law measure and studied in some detail as a representative example of coordinate-invariant regularization. The weak-coupling expansion of the theory illustrates a generic geometrization of regularized Schwinger-Dyson rules, generalizing previous rules in flat space and flat superspace. The rules are applied in a non-trivial explicit check of Einstein invariance at one loop: the cosmological counterterm is computed and its contribution is included in a verification that the graviton mass is zero. (orig.)

  3. Multi-class geospatial object detection based on a position-sensitive balancing framework for high spatial resolution remote sensing imagery

    Science.gov (United States)

    Zhong, Yanfei; Han, Xiaobing; Zhang, Liangpei

    2018-04-01

    Multi-class geospatial object detection from high spatial resolution (HSR) remote sensing imagery is attracting increasing attention in a wide range of object-related civil and engineering applications. However, the distribution of objects in HSR remote sensing imagery is location-variable and complicated, and how to accurately detect the objects in HSR remote sensing imagery is a critical problem. Due to the powerful feature extraction and representation capability of deep learning, the deep learning based region proposal generation and object detection integrated framework has greatly promoted the performance of multi-class geospatial object detection for HSR remote sensing imagery. However, due to the translation caused by the convolution operation in the convolutional neural network (CNN), although the performance of the classification stage is seldom influenced, the localization accuracies of the predicted bounding boxes in the detection stage are easily influenced. The dilemma between translation-invariance in the classification stage and translation-variance in the object detection stage has not been addressed for HSR remote sensing imagery, and causes position accuracy problems for multi-class geospatial object detection with region proposal generation and object detection. In order to further improve the performance of the region proposal generation and object detection integrated framework for HSR remote sensing imagery object detection, a position-sensitive balancing (PSB) framework is proposed in this paper for multi-class geospatial object detection from HSR remote sensing imagery. The proposed PSB framework takes full advantage of the fully convolutional network (FCN), on the basis of a residual network, and adopts the PSB framework to solve the dilemma between translation-invariance in the classification stage and translation-variance in the object detection stage. 
In addition, a pre-training mechanism is utilized to accelerate the training procedure.

  4. Likelihood ratio decisions in memory: three implied regularities.

    Science.gov (United States)

    Glanzer, Murray; Hilford, Andrew; Maloney, Laurence T

    2009-06-01

    We analyze four general signal detection models for recognition memory that differ in their distributional assumptions. Our analyses show that a basic assumption of signal detection theory, the likelihood ratio decision axis, implies three regularities in recognition memory: (1) the mirror effect, (2) the variance effect, and (3) the z-ROC length effect. For each model, we present the equations that produce the three regularities and show, in computed examples, how they do so. We then show that the regularities appear in data from a range of recognition studies. The analyses and data in our study support the following generalization: individuals make efficient recognition decisions on the basis of likelihood ratios.
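    For the equal-variance Gaussian case, the likelihood ratio decision axis can be written down directly. This is a textbook sketch of the signal detection setup the paper analyzes; the means and variance below are illustrative values, not the paper's fitted parameters.

```python
import math

def gauss_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def log_likelihood_ratio(x, mu_old=1.0, mu_new=0.0, sigma=1.0):
    """Log likelihood ratio of familiarity x under the 'old' vs 'new'
    item distributions; respond 'old' when this exceeds the log of the
    decision criterion (0 for an unbiased observer)."""
    return math.log(gauss_pdf(x, mu_old, sigma) / gauss_pdf(x, mu_new, sigma))
```

    For equal variances the log likelihood ratio is linear in x, so a criterion on the likelihood-ratio axis maps to a criterion on familiarity; the mirror and z-ROC regularities follow from how that mapping shifts with memory strength.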

  5. Implications of sensor design for coral reef detection: Upscaling ground hyperspectral imagery in spatial and spectral scales

    Science.gov (United States)

    Caras, Tamir; Hedley, John; Karnieli, Arnon

    2017-12-01

    Remote sensing offers a potential tool for large scale environmental surveying and monitoring. However, remote observations of coral reefs are difficult, especially due to the spatial and spectral complexity of the target compared to sensor specifications, as well as the environmental implications of the water medium above. The development of sensors is driven by technological advances and the desired products. Currently, spaceborne systems are technologically limited to a choice between high spectral resolution and high spatial resolution, but not both. The current study explores the dilemma of whether future sensor design for marine monitoring should prioritise improving spatial or spectral resolution. To address this question, a spatially and spectrally resampled ground-level hyperspectral image was used to test two classification elements: (1) how the tradeoff between spatial and spectral resolutions affects classification; and (2) how noise reduction by a majority filter might improve classification accuracy. The studied reef, in the Gulf of Aqaba (Eilat), Israel, is heterogeneous and complex, so the local substrate patches are generally finer than currently available imagery. Therefore, the tested spatial resolution was broadly divided into four scale categories from five millimeters to one meter. Spectral resolution resampling aimed to mimic currently available and forthcoming spaceborne sensors such as (1) the Environmental Mapping and Analysis Program (EnMAP), characterized by 25 bands of 6.5 nm width; (2) VENμS, with 12 narrow bands; and (3) the WorldView series, with broadband multispectral resolution. Results suggest that spatial resolution should generally be prioritized for coral reef classification, because the finest spatial scale tested (five-millimeter pixel size) performed best. With this in mind, while the focus in this study was on the technologically limited spaceborne design, aerial sensors may presently provide an opportunity to implement the suggested setup.

  6. Deconstruction of spatial integrity in visual stimulus detected by modulation of synchronized activity in cat visual cortex.

    Science.gov (United States)

    Zhou, Zhiyi; Bernard, Melanie R; Bonds, A B

    2008-04-02

    Spatiotemporal relationships among contour segments can influence synchronization of neural responses in the primary visual cortex. We performed a systematic study to dissociate the impact of spatial and temporal factors in the signaling of contour integration via synchrony. In addition, we characterized the temporal evolution of this process to clarify potential underlying mechanisms. With a 10 x 10 microelectrode array, we recorded the simultaneous activity of multiple cells in the cat primary visual cortex while stimulating with drifting sine-wave gratings. We preserved temporal integrity and systematically degraded spatial integrity of the sine-wave gratings by adding spatial noise. Neural synchronization was analyzed in the time and frequency domains by conducting cross-correlation and coherence analyses. The general association between neural spike trains depends strongly on spatial integrity, with coherence in the gamma band (35-70 Hz) showing greater sensitivity to the change of spatial structure than other frequency bands. Analysis of the temporal dynamics of synchronization in both time and frequency domains suggests that spike timing synchronization is triggered nearly instantaneously by coherent structure in the stimuli, whereas frequency-specific oscillatory components develop more slowly, presumably through network interactions. Our results suggest that, whereas temporal integrity is required for the generation of synchrony, spatial integrity is critical in triggering subsequent gamma band synchronization.

  7. Implementation of 3D spatial indexing and compression in a large-scale molecular dynamics simulation database for rapid atomic contact detection.

    Science.gov (United States)

    Toofanny, Rudesh D; Simms, Andrew M; Beck, David A C; Daggett, Valerie

    2011-08-10

    Molecular dynamics (MD) simulations offer the ability to observe the dynamics and interactions of both whole macromolecules and individual atoms as a function of time. Taken in context with experimental data, atomic interactions from simulation provide insight into the mechanics of protein folding, dynamics, and function. The calculation of atomic interactions or contacts from an MD trajectory is computationally demanding and the work required grows exponentially with the size of the simulation system. We describe the implementation of a spatial indexing algorithm in our multi-terabyte MD simulation database that significantly reduces the run-time required for discovery of contacts. The approach is applied to the Dynameomics project data. Spatial indexing, also known as spatial hashing, is a method that divides the simulation space into regular-sized bins and attributes an index to each bin. Since the calculation of contacts is widely employed in the simulation field, we also use this as the basis for testing compression of data tables. We investigate the effects of compression of the trajectory coordinate tables with different options of data and index compression within MS SQL SERVER 2008. Our implementation of spatial indexing speeds up the calculation of contacts over a 1 nanosecond (ns) simulation window by between 14% and 90% (i.e., 1.2 and 10.3 times faster). For a 'full' simulation trajectory (51 ns), spatial indexing reduces the calculation run-time by between 31 and 81% (between 1.4 and 5.3 times faster). Compression resulted in reduced table sizes but no significant difference in the total execution time for neighbour discovery. The greatest compression (~36%) was achieved using page level compression on both the data and indexes. The spatial indexing scheme significantly decreases the time taken to calculate atomic contacts and could be applied to other multidimensional neighbor discovery problems. The speed-up enables on-the-fly calculation
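    The binning idea can be sketched in a few lines: hash each atom into a cubic bin of side `cutoff`, then compare only atoms in the 27 neighboring bins. This is a generic illustration of spatial hashing, not the database's SQL implementation; the function names are made up for the example.

```python
from collections import defaultdict
from itertools import product

def build_grid(points, cell):
    """Hash each 3D point into a cubic bin of side `cell`."""
    grid = defaultdict(list)
    for idx, (x, y, z) in enumerate(points):
        grid[(int(x // cell), int(y // cell), int(z // cell))].append(idx)
    return grid

def contacts(points, cutoff):
    """All index pairs closer than `cutoff`, comparing only the 27
    neighboring bins of each occupied bin instead of all pairs."""
    grid = build_grid(points, cutoff)
    pairs = set()
    for (cx, cy, cz), members in grid.items():
        for dx, dy, dz in product((-1, 0, 1), repeat=3):
            for j in grid.get((cx + dx, cy + dy, cz + dz), ()):
                for i in members:
                    if i < j and sum((p - q) ** 2 for p, q in
                                     zip(points[i], points[j])) < cutoff ** 2:
                        pairs.add((i, j))
    return pairs
```

    Because any pair within `cutoff` must fall in the same or adjacent bins, the exhaustive all-pairs distance scan is replaced by a local one, which is the source of the speed-up reported in the abstract.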

  9. Built-Up Area Detection from High-Resolution Satellite Images Using Multi-Scale Wavelet Transform and Local Spatial Statistics

    Science.gov (United States)

    Chen, Y.; Zhang, Y.; Gao, J.; Yuan, Y.; Lv, Z.

    2018-04-01

    Recently, built-up area detection from high-resolution satellite images (HRSI) has attracted increasing attention because HRSI can provide more detailed object information. In this paper, the multi-resolution wavelet transform and a local spatial autocorrelation statistic are introduced to model the spatial patterns of built-up areas. First, the input image is decomposed into high- and low-frequency subbands by wavelet transform at three levels. Then the high-frequency detail information in three directions (horizontal, vertical and diagonal) is extracted, followed by a maximization operation to integrate the information from all directions. Afterward, a cross-scale operation is implemented to fuse different levels of information. Finally, the local spatial autocorrelation statistic is introduced to enhance the saliency of built-up features, and an adaptive threshold algorithm is used to achieve the detection of built-up areas. Experiments are conducted on ZY-3 and Quickbird panchromatic satellite images, and the results show that the proposed method is very effective for built-up area detection.
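    The detail-extraction and direction-maximization steps can be sketched with a one-level 2D Haar decomposition. The filter convention and the /4 normalization below are assumptions for illustration; the paper's three-level pipeline, cross-scale fusion, and autocorrelation statistic are not reproduced.

```python
def haar_details(img):
    """One-level 2D Haar detail bands (H, V, D) of a 2n x 2n image,
    using a simplified /4 normalization (an assumed convention)."""
    n = len(img) // 2
    H = [[0.0] * n for _ in range(n)]
    V = [[0.0] * n for _ in range(n)]
    D = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            a, b = img[2 * i][2 * j], img[2 * i][2 * j + 1]
            c, d = img[2 * i + 1][2 * j], img[2 * i + 1][2 * j + 1]
            H[i][j] = ((a + b) - (c + d)) / 4.0  # horizontal edges (row difference)
            V[i][j] = ((a + c) - (b + d)) / 4.0  # vertical edges (column difference)
            D[i][j] = (a - b - c + d) / 4.0      # diagonal detail
    return H, V, D

def fused_detail(img):
    """Directional maximization: pointwise max of |H|, |V|, |D|."""
    H, V, D = haar_details(img)
    n = len(H)
    return [[max(abs(H[i][j]), abs(V[i][j]), abs(D[i][j]))
             for j in range(n)] for i in range(n)]
```

    Built-up areas are rich in edges in all orientations, so the fused detail map responds strongly there while staying near zero over homogeneous terrain.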

  10. A scan statistic for binary outcome based on hypergeometric probability model, with an application to detecting spatial clusters of Japanese encephalitis.

    Science.gov (United States)

    Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong

    2013-01-01

    As a useful tool for geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for a binary outcome was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, the likelihood function under the null hypothesis is an alternative and indirect way to identify the potential cluster, and the test statistic is the extreme value of the likelihood function. As with Kulldorff's methods, we adopt a Monte Carlo test of significance. Both methods are applied to detect spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters are identical. A simulation on independent benchmark data indicates that the test statistic based on the hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise, Kulldorff's statistics are superior.
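    The hypergeometric null can be evaluated exactly with binomial coefficients. The sketch below scores each candidate region by the null probability of its observed case count; it is a toy single-region "scan" under assumed names, omitting Kulldorff-style scanning windows and the Monte Carlo significance test.

```python
from math import comb

def hypergeom_pmf(k, N, K, n):
    """P(k cases in a region | N total population, K total cases,
    region population n), under the hypergeometric null of no clustering."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

def most_unusual_region(cases, pops):
    """Index of the region whose observed case count is least probable
    under the null -- the extreme value that plays the role of the
    test statistic in the full method."""
    N, K = sum(pops), sum(cases)
    return min(range(len(cases)),
               key=lambda i: hypergeom_pmf(cases[i], N, K, pops[i]))
```

    In practice, significance is calibrated by re-running the same scoring on many random relabellings of the cases (the Monte Carlo test mentioned in the abstract).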

  11. New regular black hole solutions

    International Nuclear Information System (INIS)

    Lemos, Jose P. S.; Zanchin, Vilson T.

    2011-01-01

    In the present work we consider general relativity coupled to Maxwell's electromagnetism and charged matter. Under the assumption of spherical symmetry, there is a particular class of solutions that correspond to regular charged black holes whose interior region is de Sitter, the exterior region is Reissner-Nordstroem and there is a charged thin-layer in-between the two. The main physical and geometrical properties of such charged regular black holes are analyzed.

  12. Regular variation on measure chains

    Czech Academy of Sciences Publication Activity Database

    Řehák, Pavel; Vitovec, J.

    2010-01-01

    Roč. 72, č. 1 (2010), s. 439-448 ISSN 0362-546X R&D Projects: GA AV ČR KJB100190701 Institutional research plan: CEZ:AV0Z10190503 Keywords : regularly varying function * regularly varying sequence * measure chain * time scale * embedding theorem * representation theorem * second order dynamic equation * asymptotic properties Subject RIV: BA - General Mathematics Impact factor: 1.279, year: 2010 http://www.sciencedirect.com/science/article/pii/S0362546X09008475

  13. Manifold Regularized Correlation Object Tracking

    OpenAIRE

    Hu, Hongwei; Ma, Bo; Shen, Jianbing; Shao, Ling

    2017-01-01

    In this paper, we propose a manifold regularized correlation tracking method with augmented samples. To make better use of the unlabeled data and the manifold structure of the sample space, a manifold regularization-based correlation filter is introduced, which aims to assign similar labels to neighbor samples. Meanwhile, the regression model is learned by exploiting the block-circulant structure of matrices resulting from the augmented translated samples over multiple base samples cropped from...

  14. On geodesics in low regularity

    Science.gov (United States)

    Sämann, Clemens; Steinbauer, Roland

    2018-02-01

    We consider geodesics in both Riemannian and Lorentzian manifolds with metrics of low regularity. We discuss existence of extremal curves for continuous metrics and present several old and new examples that highlight their subtle interrelation with solutions of the geodesic equations. Then we turn to the initial value problem for geodesics for locally Lipschitz continuous metrics and generalize recent results on existence, regularity and uniqueness of solutions in the sense of Filippov.

  15. Regularized forecasting of chaotic dynamical systems

    International Nuclear Information System (INIS)

    Bollt, Erik M.

    2017-01-01

    While local models of dynamical systems have been highly successful in terms of using extensive data sets observing even a chaotic dynamical system to produce useful forecasts, there is a typical problem, as follows. Specifically, with the k-nearest neighbors (kNN) method, local observations occur due to recurrences in a chaotic system, and this allows local models to be built by regression to low-dimensional polynomial approximations of the underlying system, estimating a Taylor series. This has been a popular approach, particularly in the context of scalar data observations which have been represented by time-delay embedding methods. However, such local models can generally allow for spatial discontinuities of forecasts when considered globally, meaning jumps in predictions, because the collected near neighbors vary from point to point. The source of these discontinuities is generally that the set of near neighbors varies discontinuously with respect to the position of the sample point, and so therefore does the model built from the near neighbors. It is possible to utilize local information inferred from near neighbors as usual but at the same time to impose a degree of regularity on a global scale. We present here a new global perspective extending the general local modeling concept. In so doing, we proceed to show how this perspective allows us to impose prior presumed regularity into the model, by invoking Tikhonov regularization theory, since this classic perspective of optimization in ill-posed problems naturally balances fitting an objective with some prior assumed form of the result, such as continuity or derivative regularity, for example. This all reduces to matrix manipulations, which we demonstrate on a simple data set, with the implication that it may find much broader context.
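    The idea of combining a kNN local model with Tikhonov regularity can be sketched for a scalar linear local model. This is illustrative only: `lam`, the neighborhood size, and the function names are assumptions, and the paper's global regularization construction is richer than this per-point ridge fit.

```python
def ridge_slope(xs, ys, lam):
    """Tikhonov-regularized least squares for the scalar model y ~ w*x:
    minimizes sum((y - w*x)**2) + lam * w**2, which has a closed form."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)

def local_ridge_forecast(x0, xs, ys, k=3, lam=0.1):
    """Fit the regularized local model on the k nearest observations
    to x0 and evaluate it there. The penalty damps the jumps that occur
    when the neighbor set changes discontinuously with x0."""
    idx = sorted(range(len(xs)), key=lambda i: abs(xs[i] - x0))[:k]
    w = ridge_slope([xs[i] for i in idx], [ys[i] for i in idx], lam)
    return w * x0
```

    With `lam = 0` this is plain local regression; increasing `lam` trades a small fit bias for smoother variation of the fitted model as the neighbor set changes, which is the balance Tikhonov theory formalizes.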

  16. Condition Number Regularized Covariance Estimation.

    Science.gov (United States)

    Won, Joong-Ho; Lim, Johan; Kim, Seung-Jean; Rajaratnam, Bala

    2013-06-01

    Estimation of high-dimensional covariance matrices is known to be a difficult problem, has many applications, and is of current interest to the larger statistics community. In many applications, including the so-called "large p, small n" setting, the estimate of the covariance matrix is required to be not only invertible, but also well-conditioned. Although many regularization schemes attempt to do this, none of them address the ill-conditioning problem directly. In this paper, we propose a maximum likelihood approach, with the direct goal of obtaining a well-conditioned estimator. No sparsity assumptions on either the covariance matrix or its inverse are imposed, thus making our procedure more widely applicable. We demonstrate that the proposed regularization scheme is computationally efficient, yields a type of Steinian shrinkage estimator, and has a natural Bayesian interpretation. We investigate the theoretical properties of the regularized covariance estimator comprehensively, including its regularization path, and proceed to develop an approach that adaptively determines the level of regularization that is required. Finally, we demonstrate the performance of the regularized estimator in decision-theoretic comparisons and in the financial portfolio optimization setting. The proposed approach has desirable properties, and can serve as a competitive procedure, especially when the sample size is small and when a well-conditioned estimator is required.
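    The effect of constraining the condition number can be sketched for a 2×2 covariance by eigenvalue truncation. Note this is a simplified illustration: the floor is fixed here at `lmax/kappa`, whereas the paper derives the truncation level by maximum likelihood.

```python
import math

def cond_regularize(S, kappa):
    """Raise the small eigenvalue of a symmetric 2x2 matrix S so that
    cond(S') = lmax/lmin <= kappa (eigenvalue-truncation sketch)."""
    a, b, c = S[0][0], S[0][1], S[1][1]
    m = (a + c) / 2.0
    r = math.sqrt(((a - c) / 2.0) ** 2 + b * b)
    lmax, lmin = m + r, m - r
    lo = lmax / kappa                 # floor ensuring the condition bound
    l1, l2 = lmax, max(lmin, lo)
    if b == 0.0:
        # already diagonal: clip the diagonal entries directly
        return [[max(a, lo), 0.0], [0.0, max(c, lo)]]
    # eigenvectors of [[a, b], [b, c]]: direction (b, l - a) for eigenvalue l
    def unit(vx, vy):
        n = math.hypot(vx, vy)
        return vx / n, vy / n
    v1, v2 = unit(b, lmax - a), unit(b, lmin - a)
    # reconstruct S' = l1*v1*v1^T + l2*v2*v2^T
    off = l1 * v1[0] * v1[1] + l2 * v2[0] * v2[1]
    return [[l1 * v1[0] ** 2 + l2 * v2[0] ** 2, off],
            [off, l1 * v1[1] ** 2 + l2 * v2[1] ** 2]]
```

    Because only eigenvalues below `lmax/kappa` are modified, well-separated dominant directions of the sample covariance are preserved while the estimator becomes safely invertible, the core idea behind condition-number regularization.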

  18. A Third-Generation Adaptive Statistical Iterative Reconstruction Technique: Phantom Study of Image Noise, Spatial Resolution, Lesion Detectability, and Dose Reduction Potential.

    Science.gov (United States)

    Euler, André; Solomon, Justin; Marin, Daniele; Nelson, Rendon C; Samei, Ehsan

    2018-06-01

    The purpose of this study was to assess image noise, spatial resolution, lesion detectability, and the dose reduction potential of a proprietary third-generation adaptive statistical iterative reconstruction (ASIR-V) technique. A phantom representing five different body sizes (12-37 cm) and a contrast-detail phantom containing lesions of five low-contrast levels (5-20 HU) and three sizes (2-6 mm) were deployed. Both phantoms were scanned on a 256-MDCT scanner at six different radiation doses (1.25-10 mGy). Images were reconstructed with filtered back projection (FBP), ASIR-V with 50% blending with FBP (ASIR-V 50%), and ASIR-V without blending (ASIR-V 100%). In the first phantom, noise properties were assessed by noise power spectrum analysis. Spatial resolution properties were measured by use of task transfer functions for objects of different contrasts. Noise magnitude, noise texture, and resolution were compared between the three groups. In the second phantom, low-contrast detectability was assessed by nine human readers independently for each condition. The dose reduction potential of ASIR-V was estimated on the basis of a generalized linear statistical regression model. On average, image noise was reduced 37.3% with ASIR-V 50% and 71.5% with ASIR-V 100% compared with FBP. ASIR-V shifted the noise power spectrum toward lower frequencies compared with FBP. The spatial resolution of ASIR-V was equivalent or slightly superior to that of FBP, except for the low-contrast object, which had lower resolution. Lesion detection significantly increased with both ASIR-V levels (p = 0.001), with an estimated radiation dose reduction potential of 15% ± 5% (SD) for ASIR-V 50% and 31% ± 9% for ASIR-V 100%. ASIR-V reduced image noise and improved lesion detection compared with FBP and had potential for radiation dose reduction while preserving low-contrast detectability.

  19. Chimeric mitochondrial peptides from contiguous regular and swinger RNA.

    Science.gov (United States)

    Seligmann, Hervé

    2016-01-01

    Previous mass spectrometry analyses described human mitochondrial peptides entirely translated from swinger RNAs, RNAs where polymerization systematically exchanged nucleotides. Exchanges follow one of 23 bijective transformation rules, nine symmetric exchanges (X ↔ Y, e.g. A ↔ C) and fourteen asymmetric exchanges (X → Y → Z → X, e.g. A → C → G → A), multiplying DNA's protein coding potential by 24. Abrupt switches from regular to swinger polymerization produce chimeric RNAs. Here, human mitochondrial proteomic analyses assuming abrupt switches between regular and swinger transcription detect chimeric peptides, encoded by part regular, part swinger RNA. Contiguous regular- and swinger-encoded residues within single peptides are stronger evidence for translation of swinger RNA than the previously detected, entirely swinger-encoded peptides: regular parts are positive controls matched with contiguous swinger parts, increasing confidence in the results. Chimeric peptides are 200 × rarer than swinger peptides (3/100,000 versus 6/1,000). Among 186 peptides with >8 residues in each of their regular and swinger parts, the regular parts of eleven chimeric peptides correspond to six of the thirteen recognized mitochondrial protein-coding genes. Chimeric peptides matching partly regular proteins are rarer and less expressed than chimeric peptides matching non-coding sequences, suggesting targeted degradation of misfolded proteins. Present results strengthen the hypothesis that the short mitogenome encodes far more proteins than hitherto assumed. Entirely swinger-encoded proteins could exist.
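    The bijective exchange rules are easy to make concrete. The sketch below (a hypothetical helper, not from the paper) applies one symmetric and one asymmetric rule to a toy sequence, and builds a chimeric sequence whose prefix is regular and whose suffix is swinger-transformed.

```python
def swinger(seq, rule):
    """Apply one bijective nucleotide exchange rule to a DNA string."""
    return seq.translate(str.maketrans(rule))

SYM_AC = {"A": "C", "C": "A"}              # symmetric exchange A <-> C
ASYM_ACG = {"A": "C", "C": "G", "G": "A"}  # asymmetric exchange A -> C -> G -> A

seq = "AACGTG"
sym = swinger(seq, SYM_AC)       # A and C swapped, G and T untouched
asym = swinger(seq, ASYM_ACG)    # A -> C, C -> G, G -> A cyclically

# A chimeric sequence: regular prefix, swinger-transformed suffix,
# mimicking an abrupt mid-transcription switch after position 3.
chimeric = seq[:3] + swinger(seq[3:], ASYM_ACG)
```

With the nine symmetric rules, the fourteen asymmetric rules, and the identity, each DNA stretch admits 24 readings, which is the multiplication of coding potential the abstract refers to.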

  20. Reconstruction of signal in plastic scintillator of PET using Tikhonov regularization.

    Science.gov (United States)

    Raczynski, Lech

    2015-08-01

    The new concept of Time of Flight Positron Emission Tomography (TOF-PET) detection system, which allows for single bed imaging of the whole human body, is currently under development at the Jagiellonian University. The Jagiellonian-PET (J-PET) detector improves the TOF resolution due to the use of fast plastic scintillators. Since registration of the waveform of signals with duration times of a few nanoseconds is not feasible, a novel front-end electronics allowing for sampling in the voltage domain at four thresholds was developed. To take full advantage of these fast signals, a novel scheme for recovery of the signal waveform, based on ideas from the Tikhonov regularization method, is presented. From Bayes theory, the properties of the regularized solution, especially its covariance matrix, may be easily derived. This step is crucial to introduce and prove the formula for calculation of the signal recovery error. The method is tested using signals registered by means of a single detection module of the J-PET detector built from a 30 cm long plastic scintillator strip. It is shown that using the recovered waveform of the signals, instead of the samples at four voltage levels alone, improves the spatial resolution of the hit position reconstruction from 1.05 cm to 0.94 cm. Moreover, the obtained result is only slightly worse than the one evaluated using the original raw signal. The spatial resolution calculated under these conditions is equal to 0.93 cm.
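    The recovery step can be sketched as a Tikhonov-regularized least-squares problem. Everything below (sample positions, pulse shape, second-difference smoothness operator, regularization weight) is an assumption for illustration; the paper additionally derives the covariance of the regularized solution from Bayes' theorem, which is omitted here.

```python
import numpy as np

n = 100
t = np.linspace(0.0, 1.0, n)
pulse = np.exp(-((t - 0.3) / 0.05) ** 2)   # stand-in for a scintillator pulse

# Four samples, mimicking the four voltage-threshold crossings.
idx = np.array([20, 28, 32, 40])
A = np.zeros((idx.size, n))
A[np.arange(idx.size), idx] = 1.0          # sampling operator
b = pulse[idx]                             # measured values

# x* = argmin ||A x - b||^2 + lam ||D x||^2, with D the second difference.
D = np.diff(np.eye(n), 2, axis=0)
lam = 1e-3
x = np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ b)
```

The smoothness penalty makes the wildly underdetermined 4-of-100 inversion well posed: `x` passes close to the four samples and interpolates smoothly between them.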

  1. Geometric continuum regularization of quantum field theory

    International Nuclear Information System (INIS)

    Halpern, M.B.

    1989-01-01

    An overview of the continuum regularization program is given. The program is traced from its roots in stochastic quantization, with emphasis on the examples of regularized gauge theory, the regularized general nonlinear sigma model and regularized quantum gravity. In its coordinate-invariant form, the regularization is seen as entirely geometric: only the supermetric on field deformations is regularized, and the prescription provides universal nonperturbative invariant continuum regularization across all quantum field theory. 54 refs

  2. ST Spot Detector: a web-based application for automatic spot and tissue detection for Spatial Transcriptomics image data sets.

    Science.gov (United States)

    Wong, Kim; Fernández Navarro, José; Bergenstråhle, Ludvig; Ståhl, Patrik L; Lundeberg, Joakim

    2018-01-17

    Spatial transcriptomics (ST) is a method which combines high-resolution tissue imaging with high-throughput transcriptome sequencing. These data must be aligned with the images for correct visualisation, a process that involves several manual steps. Here we present ST Spot Detector, a web tool that automates and facilitates this alignment through a user-friendly interface. Open source under the MIT license, available from https://github.com/SpatialTranscriptomicsResearch/st_spot_detector. jose.fernandez.navarro@scilifelab.se. Supplementary data are available at Bioinformatics online.

  3. Three regularities of recognition memory: the role of bias.

    Science.gov (United States)

    Hilford, Andrew; Maloney, Laurence T; Glanzer, Murray; Kim, Kisok

    2015-12-01

    A basic assumption of Signal Detection Theory is that decisions are made on the basis of likelihood ratios. In a preceding paper, Glanzer, Hilford, and Maloney (Psychonomic Bulletin & Review, 16, 431-455, 2009) showed that the likelihood ratio assumption implies that three regularities will occur in recognition memory: (1) the Mirror Effect, (2) the Variance Effect, (3) the normalized Receiver Operating Characteristic (z-ROC) Length Effect. The paper offered formal proofs and computational demonstrations that decisions based on likelihood ratios produce the three regularities. A survey of data based on group ROCs from 36 studies validated the likelihood ratio assumption by showing that its three implied regularities are ubiquitous. The study noted, however, that bias, another basic factor in Signal Detection Theory, can obscure the Mirror Effect. In this paper we examine how bias affects the regularities at the theoretical level. The theoretical analysis shows: (1) how bias obscures the Mirror Effect, not the other two regularities, and (2) four ways to counter that obscuring. We then report the results of five experiments that support the theoretical analysis. The analyses and the experimental results also demonstrate: (1) that the three regularities govern individual, as well as group, performance, (2) alternative explanations of the regularities are ruled out, and (3) that Signal Detection Theory, correctly applied, gives a simple and unified explanation of recognition memory data.
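    The two key consequences of likelihood-ratio decisions are easy to verify numerically in the equal-variance Gaussian model. This is a sketch with assumed d' values, using only the Python standard library:

```python
from statistics import NormalDist

N = NormalDist()  # standard normal

def rates(dprime, criterion):
    """Hit and false-alarm rates for signal N(d', 1) vs. noise N(0, 1)."""
    return 1.0 - N.cdf(criterion - dprime), 1.0 - N.cdf(criterion)

# Mirror Effect: with the unbiased (likelihood ratio = 1) criterion at
# d'/2, strengthening memory raises hits AND lowers false alarms.
h_weak, fa_weak = rates(1.0, 0.5)
h_strong, fa_strong = rates(2.0, 1.0)

# z-ROC: z(hit) = z(fa) + d', a straight line whose slope is
# sigma_noise / sigma_signal (= 1 in the equal-variance case).
dprime = 1.5
crits = [c / 10.0 for c in range(-10, 21)]
zroc = [(N.inv_cdf(1.0 - N.cdf(c)), N.inv_cdf(1.0 - N.cdf(c - dprime)))
        for c in crits]
```

Shifting the criterion away from d'/2 (bias) moves hit and false-alarm rates in the same direction, which is how bias can mask the Mirror Effect while leaving the z-ROC regularities intact.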

  4. UAV-based detection and spatial analyses of periglacial landforms on Demay Point (King George Island, South Shetland Islands, Antarctica)

    Science.gov (United States)

    Dąbski, Maciej; Zmarz, Anna; Pabjanek, Piotr; Korczak-Abshire, Małgorzata; Karsznia, Izabela; Chwedorzewska, Katarzyna J.

    2017-08-01

    High-resolution aerial images allow detailed analyses of periglacial landforms, which is of particular importance in light of climate change and the resulting changes in active layer thickness. The aim of this study is to show the possibilities of using UAV-based photography to perform spatial analysis of periglacial landforms on the Demay Point peninsula, King George Island, and hence to supplement previous geomorphological studies of the South Shetland Islands. Photogrammetric flights were performed using a PW-ZOOM fixed-wing unmanned aerial vehicle. Digital elevation models (DEM) and maps of slope and contour lines were prepared in ESRI ArcGIS 10.3 with the Spatial Analyst extension, and three-dimensional visualizations in ESRI ArcScene 10.3 software. Careful interpretation of the orthophoto and DEM allowed us to vectorize polygons of landforms, such as (i) solifluction landforms (solifluction sheets, tongues, and lobes); (ii) scarps, taluses, and a protalus rampart; (iii) patterned ground (hummocks, sorted circles, stripes, nets and labyrinths, and nonsorted nets and stripes); (iv) coastal landforms (cliffs and beaches); (v) landslides and mud flows; and (vi) stone fields and bedrock outcrops. We conclude that geomorphological studies based on commonly accessible aerial and satellite images can underestimate the spatial extent of periglacial landforms and result in incomplete inventories. The PW-ZOOM UAV is well suited to gather detailed geomorphological data and can be used in spatial analysis of periglacial landforms in the Western Antarctic Peninsula region.

  5. Combination of panoramic and fluorescence endoscopic images to obtain tumor spatial distribution information useful for bladder cancer detection

    Science.gov (United States)

    Olijnyk, S.; Hernández Mier, Y.; Blondel, W. C. P. M.; Daul, C.; Wolf, D.; Bourg-Heckly, G.

    2007-07-01

    Bladder cancer is widespread. Moreover, carcinoma in situ can be difficult to diagnose, as it may be hard to see, and it becomes invasive in 50% of cases. Non-invasive diagnosis methods such as photodynamic or autofluorescence endoscopy enhance sensitivity and specificity. Besides, bladder tumors can be multifocal. Multifocality increases the probability of recurrence and infiltration into the bladder muscle. Analysis of the spatial distribution of tumors could be used to improve diagnosis. We explore the feasibility of combining fluorescence and spatial information on phantoms. We developed a system allowing the acquisition of consecutive images under white light or UV excitation, alternately and automatically, along the video sequence. We also developed an automatic image processing algorithm to build a partial panoramic image from a cystoscopic sequence of images. Fluorescence information is extracted from wavelength bandpass filtered images and superimposed over the cartography. Then, spatial distribution measures of the fluorescent spots can be computed. This cartography can be positioned on a 3D generic shape of the bladder by selecting some reference points. Our first results on phantoms show that it is possible to obtain a cartography with fluorescent spots and extract quantitative information on their spatial distribution on a "wide" field of view basis.

  6. Detection of spatial aggregation of cases of cancer from data on patients and health centres contained in the Minimum Basic Data Set

    Directory of Open Access Journals (Sweden)

    Pablo Fernández-Navarro

    2018-05-01

    The feasibility of the Minimum Basic Data Set (MBDS) as a tool in cancer research was explored by monitoring cancer incidence through the detection of spatial clusters. Case-control studies based on the MBDS and marked point processes were carried out, focused on the residences of patients from the Prince of Asturias University Hospital in Alcalá de Henares (Madrid, Spain). Patients older than 39 years with diagnoses of stomach, colorectal, lung, breast, prostate, bladder and kidney cancer, melanoma and haematological tumours were selected. Geocoding of the residence addresses of the cases was done by locating them in the continuous population roll provided by the Madrid Statistical Institute and extracting the coordinates. The geocoded control group was a random sample of 10 controls per case, matched by frequency of age and sex. To assess case clusters, differences in Ripley K functions between cases and controls were calculated. The spatial location of clusters was explored by investigating spatial intensity and its ratio between cases and controls. Results suggest the existence of an aggregation of cancers with a common risk factor such as tobacco smoking (lung, bladder and kidney cancers). These clusters were located in an urban area with high socioeconomic deprivation. The feasibility of designing and carrying out case-control studies from the MBDS is shown, and we conclude that the MBDS can be a useful epidemiological tool for cancer surveillance and the identification of risk factors through case-control spatial point process studies.
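    The case-control contrast of Ripley K functions can be sketched as follows. The estimator below is deliberately naive (no edge correction, unit study area, synthetic coordinates), unlike the adjusted estimates a real MBDS analysis would need:

```python
import numpy as np

def ripley_k(points, r, area):
    """Naive Ripley K estimate at radius r (no edge correction)."""
    n = len(points)
    d = np.sqrt(((points[:, None, :] - points[None, :, :]) ** 2).sum(-1))
    pairs = (d < r).sum() - n            # drop the n zero self-distances
    return pairs * area / n**2

rng = np.random.default_rng(1)
controls = rng.uniform(0.0, 1.0, (200, 2))             # spatially random
cases = np.vstack([rng.normal(0.5, 0.05, (100, 2)),    # one tight cluster
                   rng.uniform(0.0, 1.0, (100, 2))])

r = 0.15
k_diff = ripley_k(cases, r, 1.0) - ripley_k(controls, r, 1.0)
# k_diff > 0 at this scale flags excess aggregation of cases over controls
```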

  7. Metric regularity and subdifferential calculus

    International Nuclear Information System (INIS)

    Ioffe, A D

    2000-01-01

    The theory of metric regularity is an extension of two classical results: the Lyusternik tangent space theorem and the Graves surjection theorem. Developments in non-smooth analysis in the 1980s and 1990s paved the way for a number of far-reaching extensions of these results. It was also well understood that the phenomena behind the results are of metric origin, not connected with any linear structure. At the same time it became clear that some basic hypotheses of the subdifferential calculus are closely connected with the metric regularity of certain set-valued maps. The survey is devoted to the metric theory of metric regularity and its connection with subdifferential calculus in Banach spaces

  8. Manifold Regularized Correlation Object Tracking.

    Science.gov (United States)

    Hu, Hongwei; Ma, Bo; Shen, Jianbing; Shao, Ling

    2018-05-01

    In this paper, we propose a manifold regularized correlation tracking method with augmented samples. To make better use of the unlabeled data and the manifold structure of the sample space, a manifold regularization-based correlation filter is introduced, which aims to assign similar labels to neighbor samples. Meanwhile, the regression model is learned by exploiting the block-circulant structure of matrices resulting from the augmented translated samples over multiple base samples cropped from both target and nontarget regions. Thus, the final classifier in our method is trained with positive, negative, and unlabeled base samples, which is a semisupervised learning framework. A block optimization strategy is further introduced to learn a manifold regularization-based correlation filter for efficient online tracking. Experiments on two public tracking data sets demonstrate the superior performance of our tracker compared with the state-of-the-art tracking approaches.
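    The ridge-regression correlation filter that such trackers build on can be sketched in one dimension. The manifold regularization term, multiple base samples, and unlabeled data of the paper are all omitted here; this MOSSE-style closed form is only the standard starting point:

```python
import numpy as np

def train_filter(x, y, lam=1e-2):
    """Closed-form correlation filter (conjugate) in the Fourier domain."""
    X, Y = np.fft.fft(x), np.fft.fft(y)
    return Y * np.conj(X) / (X * np.conj(X) + lam)

def respond(h_conj, z):
    """Correlation response to a search signal z; the peak locates the target."""
    return np.real(np.fft.ifft(np.fft.fft(z) * h_conj))

x = np.zeros(64)
x[30:34] = 1.0                   # 1-D stand-in for the target appearance
n = np.arange(64)
y = np.exp(-0.5 * (np.minimum(n, 64 - n) / 2.0) ** 2)  # desired response, peak at 0

h = train_filter(x, y)
shift = int(np.argmax(respond(h, np.roll(x, 5))))      # recovers the 5-sample shift
```

Exploiting the circulant structure this way is what makes dense sampling of translated windows affordable in correlation trackers.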

  9. Detection of infarct lesions from single MRI modality using inconsistency between voxel intensity and spatial location--a 3-D automatic approach.

    Science.gov (United States)

    Shen, Shan; Szameitat, André J; Sterr, Annette

    2008-07-01

    Detection of infarct lesions using traditional segmentation methods is always problematic due to intensity similarity between lesions and normal tissues, so that multispectral MRI modalities were often employed for this purpose. However, the high costs of MRI scan and the severity of patient conditions restrict the collection of multiple images. Therefore, in this paper, a new 3-D automatic lesion detection approach was proposed, which required only a single type of anatomical MRI scan. It was developed on a theory that, when lesions were present, the voxel-intensity-based segmentation and the spatial-location-based tissue distribution should be inconsistent in the regions of lesions. The degree of this inconsistency was calculated, which indicated the likelihood of tissue abnormality. Lesions were identified when the inconsistency exceeded a defined threshold. In this approach, the intensity-based segmentation was implemented by the conventional fuzzy c-mean (FCM) algorithm, while the spatial location of tissues was provided by prior tissue probability maps. The use of simulated MRI lesions allowed us to quantitatively evaluate the performance of the proposed method, as the size and location of lesions were prespecified. The results showed that our method effectively detected lesions with 40-80% signal reduction compared to normal tissues (similarity index > 0.7). The capability of the proposed method in practice was also demonstrated on real infarct lesions from 15 stroke patients, where the lesions detected were in broad agreement with true lesions. Furthermore, a comparison to a statistical segmentation approach presented in the literature suggested that our 3-D lesion detection approach was more reliable. Future work will focus on adapting the current method to multiple sclerosis lesion detection.
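    The inconsistency idea reduces to comparing an intensity-based membership with a location-based prior, voxel by voxel. Below is a toy 1-D sketch with fixed FCM centers and invented intensities, priors and threshold; a real pipeline would iterate the full FCM updates on registered 3-D volumes:

```python
import numpy as np

def fcm_memberships(x, centers, m=2.0):
    """Fuzzy c-means memberships u_ik for fixed cluster centers."""
    d = np.abs(x[:, None] - centers[None, :]) + 1e-12
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum(axis=1, keepdims=True)

intensity = np.array([0.88, 0.85, 0.25, 0.90, 0.87])  # one row of "voxels"
prior = np.array([0.95, 0.90, 0.90, 0.92, 0.95])      # atlas P(tissue | location)

u = fcm_memberships(intensity, np.array([0.2, 0.9]))  # [dark, normal-tissue]
inconsistency = np.abs(u[:, 1] - prior)               # intensity vs. location
lesion = inconsistency > 0.5                          # flags the dark voxel only
```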

  10. Dimensional regularization in configuration space

    International Nuclear Information System (INIS)

    Bollini, C.G.; Giambiagi, J.J.

    1995-09-01

    Dimensional regularization is introduced in configuration space by Fourier transforming in D dimensions the perturbative momentum space Green functions. For this transformation, Bochner's theorem is used; no extra parameters, such as those of Feynman or Bogoliubov-Shirkov, are needed for convolutions. The regularized causal functions in x-space have ν-dependent moderated singularities at the origin. They can be multiplied together and Fourier transformed (Bochner) without divergence problems. The usual ultraviolet divergences appear as poles of the resultant functions of ν. Several examples are discussed. (author). 9 refs
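    A standard worked example (textbook material, not taken from the paper) of the D-dimensional Fourier transform on which this construction rests:

```latex
% Euclidean Fourier transform of a massless power-law propagator in D dimensions:
\int \frac{d^{D}p}{(2\pi)^{D}}\,\frac{e^{\,i p\cdot x}}{(p^{2})^{a}}
  \;=\; \frac{\Gamma\!\bigl(\tfrac{D}{2}-a\bigr)}{4^{a}\,\pi^{D/2}\,\Gamma(a)}\,
        \bigl(x^{2}\bigr)^{a-\frac{D}{2}} .
% For a = 1, D = 4 this reduces to the familiar 1/(4\pi^2 x^2).  Continuing D
% to a complex \nu, products of such x-space functions remain multipliable,
% and the ultraviolet divergences resurface as poles of the Gamma functions in \nu.
```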

  11. Regular algebra and finite machines

    CERN Document Server

    Conway, John Horton

    2012-01-01

    World-famous mathematician John H. Conway based this classic text on a 1966 course he taught at Cambridge University. Geared toward graduate students of mathematics, it will also prove a valuable guide to researchers and professional mathematicians. His topics cover Moore's theory of experiments, Kleene's theory of regular events and expressions, Kleene algebras, the differential calculus of events, factors and the factor matrix, and the theory of operators. Additional subjects include event classes and operator classes, some regular algebras, context-free languages, commutative regular alg

  12. Matrix regularization of 4-manifolds

    OpenAIRE

    Trzetrzelewski, M.

    2012-01-01

    We consider products of two 2-manifolds such as S^2 x S^2, embedded in Euclidean space and show that the corresponding 4-volume preserving diffeomorphism algebra can be approximated by a tensor product SU(N)xSU(N) i.e. functions on a manifold are approximated by the Kronecker product of two SU(N) matrices. A regularization of the 4-sphere is also performed by constructing N^2 x N^2 matrix representations of the 4-algebra (and as a byproduct of the 3-algebra which makes the regularization of S...

  13. An Object-Based Image Analysis Approach for Detecting Penguin Guano in very High Spatial Resolution Satellite Images

    OpenAIRE

    Chandi Witharana; Heather J. Lynch

    2016-01-01

    The logistical challenges of Antarctic field work and the increasing availability of very high resolution commercial imagery have driven an interest in more efficient search and classification of remotely sensed imagery. This exploratory study employed geographic object-based analysis (GEOBIA) methods to classify guano stains, indicative of chinstrap and Adélie penguin breeding areas, from very high spatial resolution (VHSR) satellite imagery and closely examined the transferability of knowle...

  14. Bamboo-dominated forests of the southwest Amazon: detection, spatial extent, life cycle length and flowering waves.

    Directory of Open Access Journals (Sweden)

    Anelena L de Carvalho

    We map the extent, infer the life-cycle length and describe spatial and temporal patterns of flowering of sarmentose bamboos (Guadua spp) in upland forests of the southwest Amazon. We first examine the spectra and the spectral separation of forests with different bamboo life stages. False-color composites from orbital sensors going back to 1975 are capable of distinguishing life stages. These woody bamboos flower, produce massive quantities of seeds, and then die. Life stage is synchronized, forming a single cohort within each population. Bamboo dominates at least 161,500 km(2) of forest, coincident with an area of recent or ongoing tectonic uplift, rapid mechanical erosion and poorly drained soils rich in exchangeable cations. Each bamboo population is confined to a single spatially continuous patch or to a core patch with small outliers. Using spatial congruence between pairs of mature-stage maps from different years, we estimate an average life cycle of 27-28 y. It is now possible to predict exactly where and approximately when new bamboo mortality events will occur. We also map 74 bamboo populations that flowered between 2001 and 2008 over the entire domain of bamboo-dominated forest. Population size averaged 330 km(2). Flowering events of these populations are temporally and/or spatially separated, restricting or preventing gene exchange. Nonetheless, adjacent populations flower closer in time than expected by chance, forming flowering waves. This may be a consequence of allochronic divergence from fewer ancestral populations and suggests a long history of widespread bamboo in the southwest Amazon.

  15. Bamboo-dominated forests of the southwest Amazon: detection, spatial extent, life cycle length and flowering waves.

    Science.gov (United States)

    de Carvalho, Anelena L; Nelson, Bruce W; Bianchini, Milton C; Plagnol, Daniela; Kuplich, Tatiana M; Daly, Douglas C

    2013-01-01

    We map the extent, infer the life-cycle length and describe spatial and temporal patterns of flowering of sarmentose bamboos (Guadua spp) in upland forests of the southwest Amazon. We first examine the spectra and the spectral separation of forests with different bamboo life stages. False-color composites from orbital sensors going back to 1975 are capable of distinguishing life stages. These woody bamboos flower, produce massive quantities of seeds, and then die. Life stage is synchronized, forming a single cohort within each population. Bamboo dominates at least 161,500 km(2) of forest, coincident with an area of recent or ongoing tectonic uplift, rapid mechanical erosion and poorly drained soils rich in exchangeable cations. Each bamboo population is confined to a single spatially continuous patch or to a core patch with small outliers. Using spatial congruence between pairs of mature-stage maps from different years, we estimate an average life cycle of 27-28 y. It is now possible to predict exactly where and approximately when new bamboo mortality events will occur. We also map 74 bamboo populations that flowered between 2001 and 2008 over the entire domain of bamboo-dominated forest. Population size averaged 330 km(2). Flowering events of these populations are temporally and/or spatially separated, restricting or preventing gene exchange. Nonetheless, adjacent populations flower closer in time than expected by chance, forming flowering waves. This may be a consequence of allochronic divergence from fewer ancestral populations and suggests a long history of widespread bamboo in the southwest Amazon.

  16. Correlation of propagation characteristics of solar cosmic rays detected onboard the spatially separated space probes Mars-7 and Prognoz-3

    International Nuclear Information System (INIS)

    Gombosi, T.; Somogyi, A.J.; Kolesov, G.Ya.; Kurt, V.G.; Kuzhevskii, B.M.; Logachev, Yu.I.; Savenko, I.A.

    1977-01-01

    Solar flare generated particle fluxes during the period 3-5 November 1973 are analysed using data from the Mars 7 and Prognoz-3 spacecraft. The intensity profiles registered onboard these satellites were quite similar, although the space probes were spatially separated by 0.3 AU. The general characteristics of the event can be well understood in terms of the effect of a corotating stream interaction region on the general behaviour of energetic charged particles. (author)

  17. Spatially adaptive mixture modeling for analysis of FMRI time series.

    Science.gov (United States)

    Vincent, Thomas; Risser, Laurent; Ciuciu, Philippe

    2010-04-01

    Within-subject analysis in fMRI essentially addresses two problems, the detection of brain regions eliciting evoked activity and the estimation of the underlying dynamics. In Makni et al. (2005) and Makni et al. (2008), a detection-estimation framework has been proposed to tackle these problems jointly, since they are connected to one another. In the Bayesian formalism, detection is achieved by modeling activating and nonactivating voxels through independent mixture models (IMM) within each region, while hemodynamic response estimation is performed at a regional scale in a nonparametric way. Instead of IMMs, in this paper we take advantage of spatial mixture models (SMM) for their nonlinear spatial regularizing properties. The proposed method is unsupervised and spatially adaptive in the sense that the amount of spatial correlation is automatically tuned from the data, and this setting automatically varies across brain regions. In addition, the level of regularization is specific to each experimental condition, since both the signal-to-noise ratio and the activation pattern may vary across stimulus types in a given brain region. These aspects require the precise estimation of multiple partition functions of underlying Ising fields. This is addressed efficiently by first using path sampling for a small subset of fields and then a recently developed fast extrapolation technique for the large remaining set. Simulation results emphasize that detection relying on supervised SMM outperforms its IMM counterpart, and that unsupervised spatial mixture models achieve similar results without any hand-tuning of the correlation parameter. On real datasets, the gain is illustrated in a localizer fMRI experiment: brain activations appear more spatially resolved using SMM in comparison with classical general linear model (GLM)-based approaches, while estimating a specific parcel-based HRF shape. Our approach therefore validates the treatment of unsmoothed fMRI data without fixed GLM

  18. Apparent spatial uniformity of the gamma-ray bursts detected by the Konus experiment on Venera 11 and Venera 12

    International Nuclear Information System (INIS)

    Higdon, J.C.; Schmidt, M.

    1990-01-01

    The V/Vmax test is applied to gamma-ray bursts of duration longer than 1 sec recorded by the Konus experiment, to examine quantitatively the uniformity of the burst source population. A sample of 123 bursts detected on Venera 11 and Venera 12 gives mean V/Vmax = 0.45 ± 0.03, consistent with 0.5, the value expected for a uniform distribution in space of the parent population of burst sources. It is argued that experimenters should give careful attention to the detection limit for each recorded gamma-ray burst, and that quantitative data for burst properties and detection limits should be published. 28 refs
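    The statistic itself is one line: for a standard-candle burst with peak count rate C and detection threshold C_min, V/Vmax = (C/C_min)^(-3/2). A quick simulation (assumed setup, with the sample size chosen to match the 123 Konus bursts) shows why a spatially uniform population gives a mean near 0.5:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 123                              # matches the Konus sample size
u = rng.uniform(size=n)
r = u ** (1.0 / 3.0)                 # radii uniform over a unit-radius volume
c_over_cmin = r ** -2.0              # inverse-square peak rate over threshold
v_vmax = c_over_cmin ** -1.5         # = (r / r_max)^3 with r_max = 1

mean = v_vmax.mean()                 # near 0.5, standard error 1/sqrt(12 n) ~ 0.026
```

A measured mean below 0.5 would indicate a deficit of faint (distant) bursts relative to a homogeneous population, which is why careful bookkeeping of per-burst detection limits matters so much.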

  19. Regularization of Nonmonotone Variational Inequalities

    International Nuclear Information System (INIS)

    Konnov, Igor V.; Ali, M.S.S.; Mazurkevich, E.O.

    2006-01-01

    In this paper we extend the Tikhonov-Browder regularization scheme from monotone to rather a general class of nonmonotone multivalued variational inequalities. We show that their convergence conditions hold for some classes of perfectly and nonperfectly competitive economic equilibrium problems

  20. Lattice regularized chiral perturbation theory

    International Nuclear Information System (INIS)

    Borasoy, Bugra; Lewis, Randy; Ouimet, Pierre-Philippe A.

    2004-01-01

    Chiral perturbation theory can be defined and regularized on a spacetime lattice. A few motivations are discussed here, and an explicit lattice Lagrangian is reviewed. A particular aspect of the connection between lattice chiral perturbation theory and lattice QCD is explored through a study of the Wess-Zumino-Witten term

  1. 76 FR 3629 - Regular Meeting

    Science.gov (United States)

    2011-01-20

    ... Meeting SUMMARY: Notice is hereby given of the regular meeting of the Farm Credit System Insurance Corporation Board (Board). Date and Time: The meeting of the Board will be held at the offices of the Farm... meeting of the Board will be open to the

  2. Forcing absoluteness and regularity properties

    NARCIS (Netherlands)

    Ikegami, D.

    2010-01-01

    For a large natural class of forcing notions, we prove general equivalence theorems between forcing absoluteness statements, regularity properties, and transcendence properties over L and the core model K. We use our results to answer open questions from set theory of the reals.

  3. Globals of Completely Regular Monoids

    Institute of Scientific and Technical Information of China (English)

    Wu Qian-qian; Gan Ai-ping; Du Xian-kun

    2015-01-01

    An element of a semigroup S is called irreducible if it cannot be expressed as a product of two elements in S both distinct from itself. In this paper we show that the class C of all completely regular monoids with irreducible identity elements satisfies the strong isomorphism property and so it is globally determined.

  4. Fluid queues and regular variation

    NARCIS (Netherlands)

    Boxma, O.J.

    1996-01-01

    This paper considers a fluid queueing system, fed by N independent sources that alternate between silence and activity periods. We assume that the distribution of the activity periods of one or more sources is a regularly varying function of index ζ. We show that its fat tail gives rise to an even

  5. Fluid queues and regular variation

    NARCIS (Netherlands)

    O.J. Boxma (Onno)

    1996-01-01

    This paper considers a fluid queueing system, fed by N independent sources that alternate between silence and activity periods. We assume that the distribution of the activity periods of one or more sources is a regularly varying function of index ζ. We show that its fat tail

  6. Empirical laws, regularity and necessity

    NARCIS (Netherlands)

    Koningsveld, H.

    1973-01-01

    In this book I have tried to develop an analysis of the concept of an empirical law, an analysis that differs in many ways from the alternative analyse's found in contemporary literature dealing with the subject.

    1 am referring especially to two well-known views, viz. the regularity and

  7. Interval matrices: Regularity generates singularity

    Czech Academy of Sciences Publication Activity Database

    Rohn, Jiří; Shary, S.P.

    2018-01-01

    Roč. 540, 1 March (2018), s. 149-159 ISSN 0024-3795 Institutional support: RVO:67985807 Keywords : interval matrix * regularity * singularity * P-matrix * absolute value equation * diagonally singularizable matrix Subject RIV: BA - General Mathematics Impact factor: 0.973, year: 2016

  8. Regularization in Matrix Relevance Learning

    NARCIS (Netherlands)

    Schneider, Petra; Bunte, Kerstin; Stiekema, Han; Hammer, Barbara; Villmann, Thomas; Biehl, Michael

    In this paper, we present a regularization technique to extend recently proposed matrix learning schemes in learning vector quantization (LVQ). These learning algorithms extend the concept of adaptive distance measures in LVQ to the use of relevance matrices. In general, metric learning can

  9. Detection of spatial hot spots and variation for the neon flying squid Ommastrephes bartramii resources in the northwest Pacific Ocean

    Science.gov (United States)

    Feng, Yongjiu; Chen, Xinjun; Liu, Yan

    2017-07-01

    With the increasing effects of global climate change and fishing activities, the spatial distribution of the neon flying squid (Ommastrephes bartramii) is changing in the traditional fishing ground of 150°-160°E and 38°-45°N in the northwest Pacific Ocean. This research aims to identify the spatial hot and cold spots (i.e. spatial clusters) of O. bartramii to reveal its spatial structure using commercial fishery data from 2007 to 2010 collected by Chinese mainland squid-jigging fleets. A relatively strongly-clustered distribution for O. bartramii was observed using an exploratory spatial data analysis (ESDA) method. The results show two hot spots and one cold spot in 2007, while only one hot spot and one cold spot were identified each year from 2008 to 2010. The hot and cold spots in 2007 occupied 8.2% and 5.6% of the study area, respectively; these percentages were 5.8% and 3.1% in 2008, 10.2% and 2.9% in 2009, and 16.4% and 11.9% in 2010. Nearly half (>45%) of the squid reported by Chinese fleets from 2007 to 2009 were caught in hot spot areas, and this percentage peaked at 68.8% in 2010, indicating that the hot spot areas are central fishing grounds. A further change analysis shows that the area centered at 156°E/43.5°N persisted as a hot spot over the whole period from 2007 to 2010. Furthermore, the hot spots were mainly identified in areas with sea surface temperature (SST) in the range of 15-20°C around the warm Kuroshio Current, as well as with chlorophyll-a (chl-a) concentrations above 0.3 mg/m3. The outcome of this research improves our understanding of the spatiotemporal hot spots of O. bartramii and their variation and is useful for sustainable exploitation, assessment, and management of this squid.

  10. Regularization destriping of remote sensing imagery

    Science.gov (United States)

    Basnayake, Ranil; Bollt, Erik; Tufillaro, Nicholas; Sun, Jie; Gierach, Michelle

    2017-07-01

    We illustrate the utility of variational destriping for ocean color images from both multispectral and hyperspectral sensors. In particular, we examine data from a filter spectrometer, the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi National Polar-orbiting Partnership (NPP) orbiter, and an airborne grating spectrometer, the Jet Propulsion Laboratory's (JPL) hyperspectral Portable Remote Imaging Spectrometer (PRISM) sensor. We solve the destriping problem using a variational regularization method by giving weights spatially to preserve the other features of the image during the destriping process. The target functional penalizes the neighborhood of stripes (strictly, directionally uniform features) while promoting data fidelity, and the functional is minimized by solving the Euler-Lagrange equations with an explicit finite-difference scheme. We show the accuracy of our method from a benchmark data set which represents the sea surface temperature off the coast of Oregon, USA. Technical details, such as how to impose continuity across data gaps using inpainting, are also described.
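As a toy version of the variational destriping idea (a simple quadratic directional penalty rather than the paper's exact functional, and with stripes assumed to run along image rows), the Euler-Lagrange equation can be relaxed with an explicit finite-difference scheme:

```python
import numpy as np

def destripe(f, lam=5.0, dt=0.04, iters=300):
    """Gradient descent on E(u) = 1/2||u - f||^2 + lam/2 ||D_y u||^2.
    The Euler-Lagrange equation (u - f) - lam * u_yy = 0 is relaxed with an
    explicit scheme that diffuses only across the rows, so row-aligned
    stripes are smoothed while data fidelity is preserved."""
    u = f.copy()
    for _ in range(iters):
        u_yy = np.zeros_like(u)
        u_yy[1:-1, :] = u[2:, :] - 2.0 * u[1:-1, :] + u[:-2, :]
        u += dt * (lam * u_yy - (u - f))
    return u
```

The step size `dt` must satisfy the usual explicit-diffusion stability bound; the defaults here keep the amplification factor below one for all Fourier modes.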

  11. Detecting small-scale spatial differences and temporal dynamics of soil organic carbon (SOC) stocks: a comparison between automatic chamber-derived C budgets and repeated soil inventories

    Science.gov (United States)

    Hoffmann, Mathias; Jurisch, Nicole; Garcia Alba, Juana; Albiac Borraz, Elisa; Schmidt, Marten; Huth, Vytas; Rogasik, Helmut; Rieckh, Helene; Verch, Gernot; Sommer, Michael; Augustin, Jürgen

    2017-04-01

    Carbon (C) sequestration in soils plays a key role in the global C cycle. It is therefore crucial to adequately monitor dynamics in soil organic carbon (ΔSOC) stocks when aiming to reveal underlying processes and potential drivers. However, small-scale spatial and temporal changes in SOC stocks, particularly pronounced on arable lands, are hard to assess. The main reasons for this are limitations of the well-established methods. On the one hand, repeated soil inventories, often used in long-term field trials, reveal spatial patterns and trends in ΔSOC but require a longer observation period and a sufficient number of repetitions. On the other hand, eddy covariance measurements of C fluxes towards a complete C budget of the soil-plant-atmosphere system may help to obtain temporal ΔSOC patterns but lack small-scale spatial resolution. To overcome these limitations, this study presents a reliable method to detect both short-term temporal as well as small-scale spatial dynamics of ΔSOC. Therefore, a combination of automatic chamber (AC) measurements of CO2 exchange and empirically modeled aboveground biomass development (NPPshoot) was used. To verify our method, results were compared with ΔSOC observed by soil resampling. AC measurements were performed from 2010 to 2014 under a silage maize/winter fodder rye/sorghum-Sudan grass hybrid/alfalfa crop rotation at a colluvial depression located in the hummocky ground moraine landscape of NE Germany. Widespread in large areas of the formerly glaciated Northern Hemisphere, this depression type is characterized by a variable groundwater level (GWL) and pronounced small-scale spatial heterogeneity in soil properties, such as SOC and nitrogen (Nt). After monitoring the initial stage during 2010, soil erosion was experimentally simulated by incorporating topsoil material from an eroded midslope soil into the plough layer of the colluvial depression. SOC stocks were quantified before and after soil manipulation and at the end

  12. Detecting small-scale spatial heterogeneity and temporal dynamics of soil organic carbon (SOC) stocks: a comparison between automatic chamber-derived C budgets and repeated soil inventories

    Science.gov (United States)

    Hoffmann, Mathias; Jurisch, Nicole; Garcia Alba, Juana; Albiac Borraz, Elisa; Schmidt, Marten; Huth, Vytas; Rogasik, Helmut; Rieckh, Helene; Verch, Gernot; Sommer, Michael; Augustin, Jürgen

    2017-03-01

    Carbon (C) sequestration in soils plays a key role in the global C cycle. It is therefore crucial to adequately monitor dynamics in soil organic carbon (ΔSOC) stocks when aiming to reveal underlying processes and potential drivers. However, small-scale spatial (10-30 m) and temporal changes in SOC stocks, particularly pronounced in arable lands, are hard to assess. The main reasons for this are limitations of the well-established methods. On the one hand, repeated soil inventories, often used in long-term field trials, reveal spatial patterns and trends in ΔSOC but require a longer observation period and a sufficient number of repetitions. On the other hand, eddy covariance measurements of C fluxes towards a complete C budget of the soil-plant-atmosphere system may help to obtain temporal ΔSOC patterns but lack small-scale spatial resolution. To overcome these limitations, this study presents a reliable method to detect both short-term temporal dynamics as well as small-scale spatial differences of ΔSOC using measurements of the net ecosystem carbon balance (NECB) as a proxy. To estimate the NECB, a combination of automatic chamber (AC) measurements of CO2 exchange and empirically modeled aboveground biomass development (NPPshoot) was used. To verify our method, results were compared with ΔSOC observed by soil resampling. Soil resampling and AC measurements were performed from 2010 to 2014 at a colluvial depression located in the hummocky ground moraine landscape of northeastern Germany. The measurement site is characterized by a variable groundwater level (GWL) and pronounced small-scale spatial heterogeneity regarding SOC and nitrogen (Nt) stocks. Tendencies and magnitude of ΔSOC values derived by AC measurements and repeated soil inventories corresponded well. The period of maximum plant growth was identified as being most important for the development of spatial differences in annual ΔSOC. Hence, we were able to confirm that AC-based C budgets are able

  13. Regular and conformal regular cores for static and rotating solutions

    Energy Technology Data Exchange (ETDEWEB)

    Azreg-Aïnou, Mustapha

    2014-03-07

    Using a new metric for generating rotating solutions, we derive in a general fashion the solution of an imperfect fluid and that of its conformal homolog. We discuss the conditions that the stress–energy tensors and invariant scalars be regular. On classical physical grounds, it is stressed that conformal fluids used as cores for static or rotating solutions are exempt from any malicious behavior in that they are finite and defined everywhere.

  14. Regular and conformal regular cores for static and rotating solutions

    International Nuclear Information System (INIS)

    Azreg-Aïnou, Mustapha

    2014-01-01

    Using a new metric for generating rotating solutions, we derive in a general fashion the solution of an imperfect fluid and that of its conformal homolog. We discuss the conditions that the stress–energy tensors and invariant scalars be regular. On classical physical grounds, it is stressed that conformal fluids used as cores for static or rotating solutions are exempt from any malicious behavior in that they are finite and defined everywhere.

  15. Near-Regular Structure Discovery Using Linear Programming

    KAUST Repository

    Huang, Qixing

    2014-06-02

    Near-regular structures are common in manmade and natural objects. Algorithmic detection of such regularity greatly facilitates our understanding of shape structures, leads to compact encoding of input geometries, and enables efficient generation and manipulation of complex patterns on both acquired and synthesized objects. Such regularity manifests itself both in the repetition of certain geometric elements, as well as in the structured arrangement of the elements. We cast the regularity detection problem as an optimization and efficiently solve it using linear programming techniques. Our optimization has a discrete aspect, that is, the connectivity relationships among the elements, as well as a continuous aspect, namely the locations of the elements of interest. Both these aspects are captured by our near-regular structure extraction framework, which alternates between discrete and continuous optimizations. We demonstrate the effectiveness of our framework on a variety of problems including near-regular structure extraction, structure-preserving pattern manipulation, and markerless correspondence detection. Robustness results with respect to geometric and topological noise are presented on synthesized, real-world, and also benchmark datasets. © 2014 ACM.
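The paper above alternates discrete and continuous optimization. As an illustrative fragment of the continuous step only, a regular 1-D spacing a + i·d can be fitted to noisy element positions in the L1 sense with a small linear program (SciPy assumed; `fit_regular_spacing` is a hypothetical name, not the authors' formulation):

```python
import numpy as np
from scipy.optimize import linprog

def fit_regular_spacing(x):
    """L1 fit of a regular 1-D spacing a + i*d to element positions x_i,
    via the standard LP reformulation: min sum_i t_i, t_i >= |x_i - a - i*d|."""
    n = len(x)
    # variable vector: [a, d, t_0, ..., t_{n-1}]
    c = np.concatenate([[0.0, 0.0], np.ones(n)])
    A_ub, b_ub = [], []
    for i, xi in enumerate(x):
        up = np.zeros(n + 2)
        up[0], up[1], up[2 + i] = 1.0, float(i), -1.0
        A_ub.append(up)               #  a + i*d - t_i <= x_i
        b_ub.append(xi)
        lo = np.zeros(n + 2)
        lo[0], lo[1], lo[2 + i] = -1.0, -float(i), -1.0
        A_ub.append(lo)               # -a - i*d - t_i <= -x_i
        b_ub.append(-xi)
    bounds = [(None, None), (None, None)] + [(0.0, None)] * n
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds)
    return res.x[0], res.x[1]
```

The L1 objective makes the fit robust to a few outlying elements, which is one reason LP formulations suit near-regular (rather than exactly regular) structure.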

  16. Determination of Optimal Imaging Mode for Ultrasonographic Detection of Subdermal Contraceptive Rods: Comparison of Spatial Compound, Conventional, and Tissue Harmonic Imaging Methods

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Jin; Seo, Kyung; Song, Ho Taek; Park, Ah Young; Kim, Yaena; Yoon, Choon Sik [Gangnam Severance Hospital, Yonsei University College of Medicine, Seoul (Korea, Republic of); Suh, Jin Suck; Kim, Ah Hyun [Dept. of Radiology and Research Institute of Radiological Science, Severance Hospital, Yonsei University College of Medicine, Seoul (Korea, Republic of); Ryu, Jeong Ah [Dept. of Radiology, Guri Hospital, Hanyang University College of Medicine, Guri (Korea, Republic of); Park, Jeong Seon [Dept. of Radiology, Hanyang University Hospital, Hanyang University College of Medicine, Seoul (Korea, Republic of)

    2012-09-15

    To determine which mode of ultrasonography (US), among the conventional, spatial compound, and tissue-harmonic methods, exhibits the best performance for the detection of Implanon with respect to generation of posterior acoustic shadowing (PAS). A total of 21 patients, referred for localization of impalpable Implanon, underwent US, using the three modes with default settings (i.e., wide focal zone). Representative transverse images of the rods, according to each mode for all patients, were obtained. The resulting 63 images were reviewed by four observers. The observers provided a confidence score for the presence of PAS, using a five-point scale ranging from 1 (definitely absent) to 5 (definitely present), with scores of 4 or 5 for PAS being considered as detection. The average scores of PAS, obtained from the three different modes for each observer, were compared using one-way repeated measures ANOVA. The detection rates were compared using a weighted least square method. Statistically, the tissue harmonic mode was significantly superior to the other two modes, when comparing the average scores of PAS for all observers (p < 0.001). The detection rate was also highest for the tissue harmonic mode (p < 0.001). Tissue harmonic mode in US appears to be the most suitable in detecting subdermal contraceptive implant rods.
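The comparison above uses one-way repeated measures ANOVA. A minimal NumPy sketch of the F statistic follows (illustrative data layout assumed: observers as rows, imaging modes as columns; not the study's statistical software):

```python
import numpy as np

def rm_anova_f(scores):
    """F statistic of a one-way repeated measures ANOVA.
    `scores` is (n_subjects, k_conditions), e.g. average PAS scores per
    observer (rows) under each imaging mode (columns)."""
    n, k = scores.shape
    grand = scores.mean()
    ss_cond = n * ((scores.mean(axis=0) - grand) ** 2).sum()   # between conditions
    ss_subj = k * ((scores.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_err = ((scores - grand) ** 2).sum() - ss_cond - ss_subj
    return (ss_cond / (k - 1)) / (ss_err / ((n - 1) * (k - 1)))
```

Removing the subject sum of squares is what distinguishes the repeated-measures design from an ordinary one-way ANOVA on the same table.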

  17. Determination of Optimal Imaging Mode for Ultrasonographic Detection of Subdermal Contraceptive Rods: Comparison of Spatial Compound, Conventional, and Tissue Harmonic Imaging Methods

    International Nuclear Information System (INIS)

    Kim, Sung Jin; Seo, Kyung; Song, Ho Taek; Park, Ah Young; Kim, Yaena; Yoon, Choon Sik; Suh, Jin Suck; Kim, Ah Hyun; Ryu, Jeong Ah; Park, Jeong Seon

    2012-01-01

    To determine which mode of ultrasonography (US), among the conventional, spatial compound, and tissue-harmonic methods, exhibits the best performance for the detection of Implanon with respect to generation of posterior acoustic shadowing (PAS). A total of 21 patients, referred for localization of impalpable Implanon, underwent US, using the three modes with default settings (i.e., wide focal zone). Representative transverse images of the rods, according to each mode for all patients, were obtained. The resulting 63 images were reviewed by four observers. The observers provided a confidence score for the presence of PAS, using a five-point scale ranging from 1 (definitely absent) to 5 (definitely present), with scores of 4 or 5 for PAS being considered as detection. The average scores of PAS, obtained from the three different modes for each observer, were compared using one-way repeated measures ANOVA. The detection rates were compared using a weighted least square method. Statistically, the tissue harmonic mode was significantly superior to the other two modes, when comparing the average scores of PAS for all observers (p < 0.001). The detection rate was also highest for the tissue harmonic mode (p < 0.001). Tissue harmonic mode in US appears to be the most suitable in detecting subdermal contraceptive implant rods.

  18. Detecting the Spatially Non-Stationary Relationships between Housing Price and Its Determinants in China: Guide for Housing Market Sustainability

    Directory of Open Access Journals (Sweden)

    Yanchuan Mou

    2017-10-01

    Full Text Available Given the rapidly developing processes in the housing market of China, the significant regional difference in housing prices has become a serious issue that requires a further understanding of the underlying mechanisms. Most of the extant regression models are standard global modeling techniques that do not take spatial non-stationarity into consideration, thereby making them unable to reflect the spatial nature of the data and introducing significant bias into the prediction results. In this study, the geographically weighted regression model (GWR) was applied to examine the local association between housing price and its potential determinants, which were selected in view of the housing supply and demand in 338 cities across mainland China. Non-stationary relationships were obtained, and such observations could be summarized as follows: (1) the associations between land price and housing price are all significant and positive, yet have different magnitudes; (2) the relationship between the supplied amount of residential land and housing price is not statistically significant for 272 of the 338 cities, thereby indicating that the adjustment of supplied land has a slight effect on housing price for most cities; and (3) the significance, direction, and magnitude of the relationships between the other three factors (i.e., urbanization rate, average wage of urban employees, and proportion of renters) and housing price vary across the 338 cities. Based on these findings, this paper discusses some key issues relating to the spatial variations, combined with local economic conditions, and suggests housing regulation policies that could facilitate the sustainable development of the Chinese housing market.
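GWR, as used above, fits a separate weighted least-squares regression at every location, with kernel weights that decay with distance from that location. A minimal sketch (Gaussian kernel, fixed bandwidth; illustrative, not the study's implementation):

```python
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """Geographically weighted regression: at each location i, solve a
    weighted least squares with Gaussian kernel weights on distance,
    returning one coefficient vector per location."""
    n = len(y)
    betas = np.empty((n, X.shape[1]))
    for i in range(n):
        d2 = ((coords - coords[i]) ** 2).sum(axis=1)
        w = np.exp(-d2 / (2.0 * bandwidth ** 2))
        W = np.diag(w)
        betas[i] = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return betas
```

The per-location coefficient surfaces are exactly what reveals the non-stationarity discussed in the abstract: a determinant can be strongly positive in one region and negligible in another.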

  19. Object-Based Change Detection in Urban Areas from High Spatial Resolution Images Based on Multiple Features and Ensemble Learning

    Directory of Open Access Journals (Sweden)

    Xin Wang

    2018-02-01

    Full Text Available To improve the accuracy of change detection in urban areas using bi-temporal high-resolution remote sensing images, a novel object-based change detection scheme combining multiple features and ensemble learning is proposed in this paper. Image segmentation is conducted to determine the objects in bi-temporal images separately. Subsequently, three kinds of object features, i.e., spectral, shape and texture, are extracted. Using the image differencing process, a difference image is generated and used as the input for nonlinear supervised classifiers, including k-nearest neighbor, support vector machine, extreme learning machine and random forest. Finally, the results of multiple classifiers are integrated using an ensemble rule called weighted voting to generate the final change detection result. Experimental results of two pairs of real high-resolution remote sensing datasets demonstrate that the proposed approach outperforms the traditional methods in terms of overall accuracy and generates change detection maps with a higher number of homogeneous regions in urban areas. Moreover, the influences of segmentation scale and the feature selection strategy on the change detection performance are also analyzed and discussed.
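The ensemble rule described above, weighted voting, can be sketched as follows (the weights would come from, e.g., each classifier's validation accuracy; illustrative only, not the paper's code):

```python
import numpy as np

def weighted_vote(predictions, weights):
    """Combine hard class predictions from several classifiers by weighted
    voting. `predictions` is (n_classifiers, n_samples); `weights` is one
    non-negative weight per classifier."""
    preds = np.asarray(predictions)
    w = np.asarray(weights, dtype=float)
    classes = np.unique(preds)
    # per-sample weighted score for each candidate class
    scores = np.array([[(w * (preds[:, j] == c)).sum() for c in classes]
                       for j in range(preds.shape[1])])
    return classes[scores.argmax(axis=1)]
```

With equal weights this reduces to majority voting; unequal weights let more reliable classifiers dominate disagreements.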

  20. Sharp spatially constrained inversion

    DEFF Research Database (Denmark)

    Vignoli, Giulio; Fiandaca, Gianluca; Christiansen, Anders Vest

    2013-01-01

    We present sharp reconstruction of multi-layer models using a spatially constrained inversion with minimum gradient support regularization. In particular, its application to airborne electromagnetic data is discussed. Airborne surveys produce extremely large datasets, traditionally inverted...... by using smoothly varying 1D models. Smoothness is a result of the regularization constraints applied to address the inversion ill-posedness. The standard Occam-type regularized multi-layer inversion produces results where boundaries between layers are smeared. The sharp regularization overcomes...... inversions are compared against classical smooth results and available boreholes. With the focusing approach, the obtained blocky results agree with the underlying geology and allow for easier interpretation by the end-user....
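The contrast drawn above between Occam-type smoothness and minimum gradient support (MGS) regularization can be made concrete: the MGS stabilizer saturates at nonzero gradients, so it effectively counts layer boundaries rather than penalizing their magnitude. A sketch for a 1-D multi-layer model vector, with `beta` the usual focusing parameter (illustrative, not the authors' inversion code):

```python
import numpy as np

def occam_stabilizer(model):
    # Smooth (Occam-type) stabilizer: penalizes gradient magnitude,
    # so sharp jumps are expensive and boundaries get smeared.
    return (np.diff(model) ** 2).sum()

def mgs_stabilizer(model, beta=1e-3):
    # Minimum gradient support: each term approaches 1 wherever the
    # gradient is nonzero (and 0 where it vanishes), so the penalty
    # approximates the number of boundaries and favors blocky models.
    dm = np.diff(model)
    return (dm ** 2 / (dm ** 2 + beta ** 2)).sum()
```

A blocky model with one sharp jump is cheaper than a smooth ramp under MGS, and more expensive under Occam, which is exactly why the focusing approach yields sharp layer interfaces.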

  1. Detection and Classification of Multiple Objects using an RGB-D Sensor and Linear Spatial Pyramid Matching

    OpenAIRE

    Dimitriou, Michalis; Kounalakis, Tsampikos; Vidakis, Nikolaos; Triantafyllidis, Georgios

    2013-01-01

    This paper presents a complete system for multiple object detection and classification in a 3D scene using an RGB-D sensor such as the Microsoft Kinect sensor. Successful multiple object detection and classification are crucial features in many 3D computer vision applications. The main goal is making machines see and understand objects like humans do. To this goal, the new RGB-D sensors can be utilized since they provide real-time depth map which can be used along with the RGB images for our ...

  2. Physical model of dimensional regularization

    Energy Technology Data Exchange (ETDEWEB)

    Schonfeld, Jonathan F.

    2016-12-15

    We explicitly construct fractals of dimension 4-ε on which dimensional regularization approximates scalar-field-only quantum-field theory amplitudes. The construction does not require fractals to be Lorentz-invariant in any sense, and we argue that there probably is no Lorentz-invariant fractal of dimension greater than 2. We derive dimensional regularization's power-law screening first for fractals obtained by removing voids from 3-dimensional Euclidean space. The derivation applies techniques from elementary dielectric theory. Surprisingly, fractal geometry by itself does not guarantee the appropriate power-law behavior; boundary conditions at fractal voids also play an important role. We then extend the derivation to 4-dimensional Minkowski space. We comment on generalization to non-scalar fields, and speculate about implications for quantum gravity. (orig.)

  3. Maximum mutual information regularized classification

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-09-07

    In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced by knowing its classification response as much as possible. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.

  4. Maximum mutual information regularized classification

    KAUST Repository

    Wang, Jim Jing-Yan; Wang, Yi; Zhao, Shiguang; Gao, Xin

    2014-01-01

    In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced by knowing its classification response as much as possible. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.
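The mutual information term that this approach maximizes can be estimated from the joint histogram of responses and labels. A plug-in sketch for the binary case follows (illustrative; the paper itself uses an entropy-estimation model rather than this histogram estimator):

```python
import numpy as np

def mutual_information(y_true, y_pred):
    """Plug-in estimate (in nats) of the mutual information between binary
    true labels and classification responses, from their joint histogram."""
    joint = np.zeros((2, 2))
    for t, p in zip(y_true, y_pred):
        joint[t, p] += 1.0
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)   # marginal of the true label
    py = joint.sum(axis=0, keepdims=True)   # marginal of the response
    mask = joint > 0
    return float((joint[mask] * np.log(joint[mask] / (px @ py)[mask])).sum())
```

A perfect classifier on balanced binary labels attains the maximum log 2 nats, while responses independent of the labels give zero, which is the quantity the regularizer pushes upward during training.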

  5. Regularized strings with extrinsic curvature

    International Nuclear Information System (INIS)

    Ambjoern, J.; Durhuus, B.

    1987-07-01

    We analyze models of discretized string theories, where the path integral over world sheet variables is regularized by summing over triangulated surfaces. The inclusion of curvature in the action is a necessity for the scaling of the string tension. We discuss the physical properties of models with extrinsic curvature terms in the action and show that the string tension vanishes at the critical point where the bare extrinsic curvature coupling tends to infinity. Similar results are derived for models with intrinsic curvature. (orig.)

  6. Circuit complexity of regular languages

    Czech Academy of Sciences Publication Activity Database

    Koucký, Michal

    2009-01-01

    Roč. 45, č. 4 (2009), s. 865-879 ISSN 1432-4350 R&D Projects: GA ČR GP201/07/P276; GA MŠk(CZ) 1M0545 Institutional research plan: CEZ:AV0Z10190503 Keywords : regular languages * circuit complexity * upper and lower bounds Subject RIV: BA - General Mathematics Impact factor: 0.726, year: 2009

  7. Common spatial pattern with polarity check for reducing delay latency in detection of MRCP based BCI system

    DEFF Research Database (Denmark)

    Yao, Lin; Chen, Mei Lin; Sheng, Xinjun

    2017-01-01

    Dual tasking refers to the simultaneous execution of two tasks with different demands. In this study, we aimed to investigate the effect of a second task on a main task of motor execution and on the ability to detect the cortical potential related to the main task from non-invasive electroencepha...

  8. Remote detection of fluid-related diagenetic mineralogical variations in the Wingate Sandstone at different spatial and spectral resolutions

    Science.gov (United States)

    Okyay, Unal; Khan, Shuhab D.

    2016-02-01

    Well-exposed eolian units of the Jurassic system on the Colorado Plateau, including the Wingate Sandstone, show prominent color variations throughout southeastern Utah due to diagenetic changes that include precipitation and/or removal of iron oxide, clay, and carbonate cement. Spatially variable characteristic diagenetic changes suggest fluid-rock interactions through the sandstone. Distinctive spectral signatures of diagenetic minerals can be used to map diagenetic mineral variability and possibly fluid-flow pathways. The main objective of this work was to identify characteristic diagenetic minerals, and map their spatial variability from regional to outcrop scale in Wingate Sandstone exposures of Lisbon Valley, Utah. Laboratory reflectance spectroscopy analysis of the samples facilitated identification of diagnostic spectral characteristics of the common diagenetic minerals and their relative abundances between altered and unaltered Wingate Sandstone. Comparison of reflectance spectroscopy with satellite, airborne, and ground-based imaging spectroscopy data provided a method for mapping and evaluating spatial variations of diagenetic minerals. The Feature-oriented Principal Component Selection method was used on Advanced Spaceborne Thermal Emission and Reflection Radiometer data so as to map common mineral groups throughout the broader Wingate Sandstone exposure in the area. The Minimum Noise Fraction and Spectral Angle Mapper methods were applied on airborne HyMap and ground-based hyperspectral imaging data to identify and map mineralogical changes. The satellite and airborne data showed that out of 25.55 km2 total exposure of Wingate Sandstone in Lisbon Valley, unaltered sandstone covers 12.55 km2, and altered sandstone covers 8.90 km2 in the northwest flank and 5.09 km2 in the southern flank of the anticline. The ground-based hyperspectral data demonstrated the ability to identify and map mineral assemblages with two-dimensional lateral continuity on near
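The Spectral Angle Mapper method applied above compares each pixel spectrum to a reference spectrum by the angle between them, which makes it insensitive to brightness (illumination) scaling. A minimal sketch:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral Angle Mapper: angle (radians) between a pixel spectrum and
    a reference (endmember) spectrum; small angles mean similar material,
    independent of any positive brightness scaling of either spectrum."""
    p = np.asarray(pixel, dtype=float)
    r = np.asarray(reference, dtype=float)
    cos = p @ r / (np.linalg.norm(p) * np.linalg.norm(r))
    return np.arccos(np.clip(cos, -1.0, 1.0))
```

Classification then assigns each pixel to the reference mineral with the smallest angle, typically subject to a maximum-angle threshold.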

  9. General inverse problems for regular variation

    DEFF Research Database (Denmark)

    Damek, Ewa; Mikosch, Thomas Valentin; Rosinski, Jan

    2014-01-01

    Regular variation of distributional tails is known to be preserved by various linear transformations of some random structures. An inverse problem for regular variation aims at understanding whether the regular variation of a transformed random object is caused by regular variation of components ...

  10. Hot spots, cluster detection and spatial outlier analysis of teen birth rates in the U.S., 2003–2012

    Science.gov (United States)

    Khan, Diba; Rossen, Lauren M.; Hamilton, Brady E.; He, Yulei; Wei, Rong; Dienes, Erin

    2017-01-01

    Teen birth rates have evidenced a significant decline in the United States over the past few decades. Most of the states in the US have mirrored this national decline, though some reports have illustrated substantial variation in the magnitude of these decreases across the U.S. Importantly, geographic variation at the county level has largely not been explored. We used National Vital Statistics Births data and Hierarchical Bayesian space-time interaction models to produce smoothed estimates of teen birth rates at the county level from 2003–2012. Results indicate that teen birth rates show evidence of clustering, where hot and cold spots occur, and identify spatial outliers. Findings from this analysis may help inform prevention efforts by illustrating how geographic patterns of teen birth rates have changed over the past decade and where clusters of high or low teen birth rates are evident. PMID:28552189

  11. Hot spots, cluster detection and spatial outlier analysis of teen birth rates in the U.S., 2003-2012.

    Science.gov (United States)

    Khan, Diba; Rossen, Lauren M; Hamilton, Brady E; He, Yulei; Wei, Rong; Dienes, Erin

    2017-06-01

    Teen birth rates have evidenced a significant decline in the United States over the past few decades. Most of the states in the US have mirrored this national decline, though some reports have illustrated substantial variation in the magnitude of these decreases across the U.S. Importantly, geographic variation at the county level has largely not been explored. We used National Vital Statistics Births data and Hierarchical Bayesian space-time interaction models to produce smoothed estimates of teen birth rates at the county level from 2003-2012. Results indicate that teen birth rates show evidence of clustering, where hot and cold spots occur, and identify spatial outliers. Findings from this analysis may help inform prevention efforts by illustrating how geographic patterns of teen birth rates have changed over the past decade and where clusters of high or low teen birth rates are evident. Published by Elsevier Ltd.

  12. Persistence, spatial distribution and implications for progression detection of blind parts of the visual field in glaucoma: a clinical cohort study.

    Directory of Open Access Journals (Sweden)

    Francisco G Junoy Montolio

    Full Text Available BACKGROUND: Visual field testing is an essential part of glaucoma care. It is hampered by variability related to the disease itself, response errors and fatigue. In glaucoma, blind parts of the visual field contribute to the diagnosis but--once established--not to progression detection; they only increase testing time. The aims of this study were to describe the persistence and spatial distribution of blind test locations in standard automated perimetry in glaucoma and to explore how the omission of presumed blind test locations would affect progression detection. METHODOLOGY/PRINCIPAL FINDINGS: Data from 221 eyes of 221 patients from a cohort study with the Humphrey Field Analyzer with 30-2 grid were used. Patients were stratified according to baseline mean deviation (MD) in six strata of 5 dB width each. For one, two, three and four consecutive 0.1 for all strata. Omitting test locations with three consecutive <0 dB sensitivities at baseline did not affect the performance of the MD-based Nonparametric Progression Analysis progression detection algorithm. CONCLUSIONS/SIGNIFICANCE: Test locations that have been shown to be reproducibly blind tend to display a reasonable blindness persistence and no longer contribute to progression detection. There is no clinically useful universal MD cut-off value beyond which testing can be limited to 10 degree eccentricity.

  13. Phantom experiments using soft-prior regularization EIT for breast cancer imaging.

    Science.gov (United States)

    Murphy, Ethan K; Mahara, Aditya; Wu, Xiaotian; Halter, Ryan J

    2017-06-01

    A soft-prior regularization (SR) electrical impedance tomography (EIT) technique for breast cancer imaging is described, which shows an ability to accurately reconstruct tumor/inclusion conductivity values within a dense breast model investigated using a cylindrical and a breast-shaped tank. The SR-EIT method relies on knowing the spatial location of a suspicious lesion initially detected from a second imaging modality. Standard approaches (using Laplace smoothing and total variation regularization) without prior structural information are unable to accurately reconstruct or detect the tumors. The soft-prior approach represents a very significant improvement to these standard approaches, and has the potential to improve conventional imaging techniques, such as automated whole breast ultrasound (AWB-US), by providing electrical property information of suspicious lesions to improve AWB-US's ability to discriminate benign from cancerous lesions. Specifically, the best soft-regularization technique found average absolute tumor/inclusion errors of 0.015 S/m for the cylindrical test and 0.055 S/m and 0.080 S/m for the breast-shaped tank for 1.8 cm and 2.5 cm inclusions, respectively. The standard approaches were statistically unable to distinguish the tumor from the mammary gland tissue. An analysis of false tumors (benign suspicious lesions) provides extra insight into the potential and challenges EIT has for providing clinically relevant information. The ability to obtain accurate conductivity values of a suspicious lesion (>1.8 cm) detected from another modality (e.g. AWB-US) could significantly reduce false positives and result in a clinically important technology.
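    The soft-prior idea (strong smoothing outside a lesion region supplied by another modality, relaxed smoothing inside it) can be sketched as a single linearized Tikhonov step with spatially varying penalty weights. The matrix sizes, weight values, and data below are illustrative assumptions, not the authors' implementation:

    ```python
    import numpy as np

    def soft_prior_step(J, dv, region_mask, beta_in=0.01, beta_out=10.0):
        """One linearized reconstruction step with 'soft prior' regularization:
        the penalty weight is small inside the suspicious region (so its
        conductivity can move) and large outside (so the background stays
        near zero). Solves (J^T J + diag(w)) ds = J^T dv.

        J           : (n_meas, n_pix) sensitivity (Jacobian) matrix
        dv          : (n_meas,) measured voltage perturbation
        region_mask : (n_pix,) bool, True inside the lesion region
        """
        w = np.where(region_mask, beta_in, beta_out)
        return np.linalg.solve(J.T @ J + np.diag(w), J.T @ dv)

    rng = np.random.default_rng(0)
    J = rng.normal(size=(40, 20))
    mask = np.zeros(20, dtype=bool)
    mask[5:8] = True
    truth = np.where(mask, 1.0, 0.0)       # conductivity change inside lesion only
    ds = soft_prior_step(J, J @ truth, mask)
    print(ds[mask].mean() > ds[~mask].mean())  # True
    ```

    With the prior region aligned to the true inclusion, the reconstruction recovers the inclusion amplitude while suppressing the background, mirroring the behaviour the abstract reports.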

  14. Comparison of spatial frequency domain features for the detection of side attack explosive ballistics in synthetic aperture acoustics

    Science.gov (United States)

    Dowdy, Josh; Anderson, Derek T.; Luke, Robert H.; Ball, John E.; Keller, James M.; Havens, Timothy C.

    2016-05-01

    Explosive hazards in current and former conflict zones are a threat to both military and civilian personnel. As a result, much effort has been dedicated to identifying automated algorithms and systems to detect these threats. However, robust detection is complicated due to factors like the varied composition and anatomy of such hazards. In order to solve this challenge, a number of platforms (vehicle-based, handheld, etc.) and sensors (infrared, ground penetrating radar, acoustics, etc.) are being explored. In this article, we investigate the detection of side attack explosive ballistics via a vehicle-mounted acoustic sensor. In particular, we explore three acoustic features, one in the time domain and two on synthetic aperture acoustic (SAA) beamformed imagery. The idea is to exploit the varying acoustic frequency profile of a target due to its unique geometry and material composition with respect to different viewing angles. The first two features build their angle specific frequency information using a highly constrained subset of the signal data and the last feature builds its frequency profile using all available signal data for a given region of interest (centered on the candidate target location). Performance is assessed in the context of receiver operating characteristic (ROC) curves on cross-validation experiments for data collected at a U.S. Army test site on different days with multiple target types and clutter. Our preliminary results are encouraging and indicate that the top performing feature is the unrolled two dimensional discrete Fourier transform (DFT) of SAA beamformed imagery.
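    The last feature, an unrolled 2-D DFT of a beamformed image chip, is straightforward to emulate: take the magnitude spectrum of a region of interest and flatten a low-frequency block into a fixed-length vector. The chip size, coefficient count, and normalization here are illustrative assumptions, not the fielded algorithm:

    ```python
    import numpy as np

    def unrolled_dft_feature(roi, keep=16):
        """Feature vector: magnitudes of the 2-D DFT of an image chip,
        unrolled ('flattened') into a 1-D vector.

        roi  : 2-D array (e.g. a beamformed image chip on a candidate target)
        keep : edge length of the retained low-frequency block
        """
        spectrum = np.fft.fftshift(np.fft.fft2(roi))   # low frequencies to centre
        c0, c1 = spectrum.shape[0] // 2, spectrum.shape[1] // 2
        h = keep // 2
        block = spectrum[c0 - h:c0 + h, c1 - h:c1 + h]
        feat = np.abs(block).ravel()                   # unroll to 1-D
        return feat / (np.linalg.norm(feat) + 1e-12)   # scale-invariant

    chip = np.random.default_rng(0).normal(size=(64, 64))
    f = unrolled_dft_feature(chip)
    print(f.shape)  # (256,)
    ```

    Such a vector can then feed any downstream classifier that produces the ROC curves discussed above.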

  15. Improved sensitivity and limit-of-detection of lateral flow devices using spatial constrictions of the flow-path.

    Science.gov (United States)

    Katis, Ioannis N; He, Peijun J W; Eason, Robert W; Sones, Collin L

    2018-05-03

    We report on the use of a laser-direct write (LDW) technique that allows the fabrication of lateral flow devices with enhanced sensitivity and limit of detection. This manufacturing technique comprises the dispensing of a liquid photopolymer at specific regions of a nitrocellulose membrane and its subsequent photopolymerisation to create impermeable walls inside the volume of the membrane. These polymerised structures are intentionally designed to create fluidic channels which are constricted over a specific length that spans the test zone within which the sample interacts with pre-deposited reagents. Experiments were conducted to show how these constrictions alter the fluid flow rate and the test zone area within the constricted channel geometries. The slower flow rate and smaller test zone area result in the increased sensitivity and lowered limit of detection for these devices. We have quantified these via the improved performance of a C-Reactive Protein (CRP) sandwich assay on our lateral flow devices with constricted flow paths which demonstrate an improvement in its sensitivity by 62x and in its limit of detection by 30x when compared to a standard lateral flow CRP device. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.

  16. Using the ionospheric response to the solar eclipse on 20 March 2015 to detect spatial structure in the solar corona

    Science.gov (United States)

    Bradford, J.; Bell, S. A.; Wilkinson, J.; Smith, D.; Tudor, S.

    2016-01-01

    The total solar eclipse that occurred over the Arctic region on 20 March 2015 was seen as a partial eclipse over much of Europe. Observations of this eclipse were used to investigate the high time resolution (1 min) decay and recovery of the Earth’s ionospheric E-region above the ionospheric monitoring station in Chilton, UK. At the altitude of this region (100 km), the maximum phase of the eclipse was 88.88% obscuration of the photosphere occurring at 9:29:41.5 UT. In comparison, the ionospheric response revealed a maximum obscuration of 66% (leaving a fraction, Φ, of uneclipsed radiation of 34±4%) occurring at 9:29 UT. The eclipse was re-created using data from the Solar Dynamics Observatory to estimate the fraction of radiation incident on the Earth’s atmosphere throughout the eclipse from nine different emission wavelengths in the extreme ultraviolet (EUV) and X-ray spectrum. These emissions, having varying spatial distributions, were each obscured differently during the eclipse. Those wavelengths associated with coronal emissions (94, 211 and 335 Å) most closely reproduced the time varying fraction of unobscured radiation observed in the ionosphere. These results could enable historic ionospheric eclipse measurements to be interpreted in terms of the distribution of EUV and X-ray emissions on the solar disc. This article is part of the themed issue ‘Atmospheric effects of solar eclipses stimulated by the 2015 UK eclipse’. PMID:27550766

  17. Defect Localization Capabilities of a Global Detection Scheme: Spatial Pattern Recognition Using Full-field Vibration Test Data in Plates

    Science.gov (United States)

    Saleeb, A. F.; Prabhu, M.; Arnold, S. M. (Technical Monitor)

    2002-01-01

    Recently, a conceptually simple approach, based on the notion of defect energy in material space, has been developed and extensively studied (from the theoretical and computational standpoints). The present study focuses on its evaluation from the viewpoint of damage localization capabilities in the case of two-dimensional plates; i.e., spatial pattern recognition on surfaces. To this end, two different experimental modal test results are utilized: (1) conventional modal testing using (white noise) excitation and accelerometer-type sensors and (2) pattern recognition using electronic speckle pattern interferometry (ESPI), a full-field method capable of analyzing the mechanical vibration of complex structures. Unlike the conventional modal testing technique (using contacting accelerometers), these emerging ESPI technologies operate in a non-contacting mode, can be used even under hazardous conditions with minimal or no presence of noise, and can simultaneously provide measurements for both translations and rotations. The results obtained have clearly demonstrated the robustness and versatility of the global NDE scheme developed. The vectorial character of the indices used, which enabled the extraction of distinct patterns for localizing damage, proved very useful. In the context of the targeted pattern-recognition paradigm, two algorithms were developed for the interrogation of test measurements: intensity contour maps for the damage index, and the associated defect energy vector field plots.

  18. Regularized Statistical Analysis of Anatomy

    DEFF Research Database (Denmark)

    Sjöstrand, Karl

    2007-01-01

    This thesis presents the application and development of regularized methods for the statistical analysis of anatomical structures. Focus is on structure-function relationships in the human brain, such as the connection between early onset of Alzheimer’s disease and shape changes of the corpus...... and mind. Statistics represents a quintessential part of such investigations as they are preluded by a clinical hypothesis that must be verified based on observed data. The massive amounts of image data produced in each examination pose an important and interesting statistical challenge...... efficient algorithms which make the analysis of large data sets feasible, and gives examples of applications....

  19. Regularization methods in Banach spaces

    CERN Document Server

    Schuster, Thomas; Hofmann, Bernd; Kazimierski, Kamil S

    2012-01-01

    Regularization methods aimed at finding stable approximate solutions are a necessary tool to tackle inverse and ill-posed problems. Usually the mathematical model of an inverse problem consists of an operator equation of the first kind, and often the associated forward operator acts between Hilbert spaces. However, for numerous problems the reasons for using a Hilbert space setting seem to be based rather on conventions than on an appropriate and realistic model choice, so often a Banach space setting would be closer to reality. Furthermore, sparsity constraints using general Lp-norms or the B

  20. Academic Training Lecture - Regular Programme

    CERN Multimedia

    PH Department

    2011-01-01

    Regular Lecture Programme 9 May 2011 ACT Lectures on Detectors - Inner Tracking Detectors by Pippa Wells (CERN) 10 May 2011 ACT Lectures on Detectors - Calorimeters (2/5) by Philippe Bloch (CERN) 11 May 2011 ACT Lectures on Detectors - Muon systems (3/5) by Kerstin Hoepfner (RWTH Aachen) 12 May 2011 ACT Lectures on Detectors - Particle Identification and Forward Detectors by Peter Krizan (University of Ljubljana and J. Stefan Institute, Ljubljana, Slovenia) 13 May 2011 ACT Lectures on Detectors - Trigger and Data Acquisition (5/5) by Dr. Brian Petersen (CERN) from 11:00 to 12:00 at CERN ( Bldg. 222-R-001 - Filtration Plant )

  1. Detection of major climatic and environmental predictors of liver fluke exposure risk in Ireland using spatial cluster analysis.

    Science.gov (United States)

    Selemetas, Nikolaos; de Waal, Theo

    2015-04-30

    Fasciolosis caused by Fasciola hepatica (liver fluke) can cause significant economic and production losses in dairy cow farms. The aim of the current study was to identify important weather and environmental predictors of the exposure risk to liver fluke by detecting clusters of fasciolosis in Ireland. During autumn 2012, bulk-tank milk (BTM) samples from 4365 dairy farms were collected throughout Ireland. Using an in-house antibody-detection ELISA, the analysis of BTM samples showed that 83% (n=3602) of dairy farms had been exposed to liver fluke. The Getis-Ord Gi* statistic identified 74 high-risk and 130 low-risk significant clusters. Climatic variables (monthly and seasonal mean rainfall and temperatures, total wet days and rain days) and environmental datasets (soil types, enhanced vegetation index and normalised difference vegetation index) were used to investigate dissimilarities in the exposure to liver fluke between clusters. Rainfall, total wet days and rain days, and soil type were the significant classes of climatic and environmental variables explaining the differences between significant clusters. A discriminant function analysis was used to predict the exposure risk to liver fluke, using 80% of the data for modelling and the remaining 20% for post hoc model validation. The most significant predictors of the model risk function were total rainfall in August and September and total wet days. The risk model showed 100% sensitivity, 91% specificity and an accuracy of 95% correctly classified cases. A risk map of exposure to liver fluke was constructed, with a higher probability of exposure in western and north-western regions. The results of this study identified differences between clusters of fasciolosis in Ireland regarding climatic and environmental variables and detected significant predictors of the exposure risk to liver fluke. Copyright © 2015 Elsevier B.V. All rights reserved.
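    The Getis-Ord Gi* statistic used here for hot/cold-spot detection has a standard closed form; a minimal dense-matrix sketch (the weights, neighbourhood radius, and synthetic data are illustrative, not the study's farm network) is:

    ```python
    import numpy as np

    def getis_ord_gi_star(x, w):
        """Getis-Ord Gi* z-scores for values x under spatial weights w.

        x : (n,) attribute values (e.g. farm-level ELISA positivity)
        w : (n, n) spatial weights with w[i, i] > 0 (the 'star' variant
            includes the site itself); large positive z => hot spot.
        """
        n = x.size
        xbar = x.mean()
        s = np.sqrt((x ** 2).mean() - xbar ** 2)
        wi = w.sum(axis=1)                      # sum of weights per site
        num = w @ x - xbar * wi
        den = s * np.sqrt((n * (w ** 2).sum(axis=1) - wi ** 2) / (n - 1))
        return num / den

    rng = np.random.default_rng(1)
    pts = rng.uniform(size=(50, 2))
    # binary contiguity: neighbours within radius 0.2, self included
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    w = (d < 0.2).astype(float)
    z = getis_ord_gi_star(rng.normal(size=50), w)
    print(z.shape)  # (50,)
    ```

    Sites whose z-score exceeds the chosen critical value form the significant high-risk (or, for large negative z, low-risk) clusters.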

  2. Detection and spatial characterization of carbon steel pitting corrosion in anaerobic sulpho-genic medium; Detection et caracterisation spatiale de la corrosion localisee des aciers au carbone en milieu anaerobie sulfurogene

    Energy Technology Data Exchange (ETDEWEB)

    Festy, D.; Forest, B. [Institut francais de Recherche pour l' Exploitation de la Mer - IFREMER, Centre de Brest, Service Materiaux et Structures, 29 - Plouzane (France); Keddam, M.; Monfort Moros, N.; Tribollet, B. [UPR 15 du CNRS Lab. de Physique des Liquides et Electrochimie, 75 - Paris (France); Marchal, R.; Monfort Moros, N. [Institut Francais du Petrole (IFP), Div. Chimie et Physico-Chimie Appliquee, Dept. Microbiologie, 92 - Rueil-Malmaison (France)

    2002-07-01

    The biofilm developing on carbon steel surfaces in anaerobic conditions may induce localised corrosion. To better understand this type of bio-corrosion, this paper presents a new electrochemical technique developed in collaboration between IFREMER and the Laboratory of Liquid Physics and Electrochemistry. Focussed on the local aspect of this phenomenon, the described technique enables surface current density mapping to be performed and anodic or cathodic zones to be identified. A double micro-electrode probe is placed close to the steel sample surface and the potential difference between the electrodes is measured. This value is directly related to the ohmic drop within the electrolyte and, consequently, to the local current. By scanning the substrate surface, the local current distribution is visualized, and localised corrosion attacks can be detected and characterised. After presenting the technique and the calibration procedure, a bio-corrosion phenomenon induced by stripping a biofilm from a carbon steel sample surface is analysed by successively drawing localised current maps, including an assessment of biocide efficiency. (authors)

  3. RES: Regularized Stochastic BFGS Algorithm

    Science.gov (United States)

    Mokhtari, Aryan; Ribeiro, Alejandro

    2014-12-01

    RES, a regularized stochastic version of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method, is proposed to solve convex optimization problems with stochastic objectives. The use of stochastic gradient descent algorithms is widespread, but the number of iterations required to approximate optimal arguments can be prohibitive in high-dimensional problems. Application of second-order methods, on the other hand, is impracticable because computation of objective function Hessian inverses incurs excessive computational cost. BFGS modifies gradient descent by introducing a Hessian approximation matrix computed from finite gradient differences. RES utilizes stochastic gradients in lieu of deterministic gradients for both the determination of descent directions and the approximation of the objective function's curvature. Since stochastic gradients can be computed at manageable computational cost, RES is realizable and retains the convergence rate advantages of its deterministic counterparts. Convergence results show that lower and upper bounds on the Hessian eigenvalues of the sample functions are sufficient to guarantee convergence to optimal arguments. Numerical experiments showcase reductions in convergence time relative to stochastic gradient descent algorithms and non-regularized stochastic versions of BFGS. An application of RES to the implementation of support vector machines is developed.

  4. Regularized Label Relaxation Linear Regression.

    Science.gov (United States)

    Fang, Xiaozhao; Xu, Yong; Li, Xuelong; Lai, Zhihui; Wong, Wai Keung; Fang, Bingwu

    2018-04-01

    Linear regression (LR) and some of its variants have been widely used for classification problems. Most of these methods assume that during the learning phase, the training samples can be exactly transformed into a strict binary label matrix, which has too little freedom to fit the labels adequately. To address this problem, in this paper, we propose a novel regularized label relaxation LR method, which has the following notable characteristics. First, the proposed method relaxes the strict binary label matrix into a slack variable matrix by introducing a nonnegative label relaxation matrix into LR, which provides more freedom to fit the labels and simultaneously enlarges the margins between different classes as much as possible. Second, the proposed method constructs the class compactness graph based on manifold learning and uses it as the regularization item to avoid the problem of overfitting. The class compactness graph is used to ensure that the samples sharing the same labels can be kept close after they are transformed. Two different algorithms, respectively based on two different norm loss functions, are devised. These two algorithms have compact closed-form solutions in each iteration, so they are easily implemented. Extensive experiments show that these two algorithms outperform state-of-the-art algorithms in terms of classification accuracy and running time.

  5. Near-real-time radiography detects 0.1% changes in areal density with 1-millimeter spatial resolution

    International Nuclear Information System (INIS)

    Stupin, D.M.

    1988-01-01

    The use of digital subtraction radiography for density and flaw detection was demonstrated using a phantom that duplicates x-ray absorption in an integrated circuit with a plastic case. The phantom consisted of stacked polyethylene sheets, aluminum and silicon films, and thin aluminum circuits etched onto the top of the silicon wafer. An aluminum step wedge was placed over the phantom. An x-ray image of the sample was cast onto a television camera, digitized, and sent to a PDP computer, where the 128-frame average of the phantom image was subtracted from the 128-frame average of the phantom-and-step image. The resulting image was then transferred to a video monitor. Two-micrometer tungsten wires were also imaged. The system and its performance and sensitivity are described and illustrated
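    The frame-averaging-and-subtraction step described above is simple to emulate: averaging n frames reduces uncorrelated camera noise by roughly 1/sqrt(n), and subtracting the reference cancels common structure so only the density change remains. The array sizes, noise level, and inserted "flaw" below are illustrative, not the original phantom geometry:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    base = np.full((64, 64), 100.0)
    flawed = base.copy()
    flawed[30:34, 30:34] += 1.0            # small local areal-density change

    # two 128-frame stacks with uncorrelated per-frame camera noise
    stack_a = base   + rng.normal(0.0, 1.0, size=(128, 64, 64))
    stack_b = flawed + rng.normal(0.0, 1.0, size=(128, 64, 64))

    # averaging cuts the noise to ~1/sqrt(128) per stack; subtracting the
    # averages cancels the shared structure and exposes the flaw
    diff = stack_b.mean(axis=0) - stack_a.mean(axis=0)
    print(diff[30:34, 30:34].mean())       # ~1.0, against ~0.125-sigma residual noise
    ```

    The same arithmetic explains why sub-percent density changes become visible once enough frames are accumulated.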

  6. Pilot study for supervised target detection applied to spatially registered multiparametric MRI in order to non-invasively score prostate cancer.

    Science.gov (United States)

    Mayer, Rulon; Simone, Charles B; Skinner, William; Turkbey, Baris; Choykey, Peter

    2018-03-01

    Gleason Score (GS) is a validated predictor of prostate cancer (PCa) disease progression and outcomes. GS from invasive needle biopsies suffers from significant inter-observer variability and possible sampling error, leading to underestimating disease severity ("underscoring") and possible complications. A robust non-invasive image-based approach is, therefore, needed. Use spatially registered multi-parametric MRI (MP-MRI), signatures, and supervised target detection algorithms (STDA) to non-invasively determine the GS of PCa at the voxel level. This study retrospectively analyzed 26 MP-MRI studies from The Cancer Imaging Archive. The MP-MRI (T2, Diffusion Weighted, Dynamic Contrast Enhanced) were spatially registered to each other, combined into stacks, and stitched together to form hypercubes. Multi-parametric (or multi-spectral) signatures derived from a training set of registered MP-MRI were transformed using statistics-based Whitening-Dewhitening (WD). The transformed signatures were inserted into STDA (having conical decision surfaces), which, applied to the registered MP-MRI, determined the tumor GS. The MRI-derived GS was quantitatively compared to the pathologist's assessment of the histology of sectioned whole-mount prostates from patients who underwent radical prostatectomy. In addition, a meta-analysis of 17 studies of needle-biopsy-determined GS with confusion matrices was compared to the MRI-determined GS. STDA- and histology-determined GS are highly correlated (R = 0.86, p < 0.02). STDA more accurately determined GS and reduced GS underscoring of PCa relative to needle biopsy as summarized by the meta-analysis (p < 0.05). This pilot study found that registered MP-MRI, STDA, and WD transforms of signatures show promise in non-invasively determining the GS of PCa and reducing underscoring with high spatial resolution. Copyright © 2018 Elsevier Ltd. All rights reserved.
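    A standard whitening-based matched-filter score is one common way to apply a signature to stacked voxel vectors; the data, dimensions, and signature below are illustrative stand-ins, not the authors' WD/STDA pipeline with conical decision surfaces:

    ```python
    import numpy as np

    def whitened_matched_filter(cube, signature):
        """Score each voxel vector against a target signature after
        whitening by the background covariance (higher = more signature-like).

        cube      : (n_voxels, n_params) stacked multi-parametric values
        signature : (n_params,) target signature
        """
        mu = cube.mean(axis=0)
        cov = np.cov(cube, rowvar=False) + 1e-6 * np.eye(cube.shape[1])
        ci = np.linalg.inv(cov)
        s = signature - mu
        return (cube - mu) @ ci @ s / np.sqrt(s @ ci @ s)

    rng = np.random.default_rng(0)
    background = rng.normal(size=(500, 4))              # normal-tissue voxels
    sig = np.array([3.0, -3.0, 3.0, -3.0])              # hypothetical signature
    cube = np.vstack([background, sig + 0.1 * rng.normal(size=(10, 4))])
    scores = whitened_matched_filter(cube, sig)
    print(scores[-10:].mean() > scores[:500].mean())    # True
    ```

    Thresholding such voxel-level scores (per signature, e.g. one per GS grade) is the generic template that supervised target detection schemes refine.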

  7. Automated Detection of Cloud and Cloud Shadow in Single-Date Landsat Imagery Using Neural Networks and Spatial Post-Processing

    Directory of Open Access Journals (Sweden)

    M. Joseph Hughes

    2014-05-01

    Full Text Available The use of Landsat data to answer ecological questions is greatly increased by the effective removal of cloud and cloud shadow from satellite images. We develop a novel algorithm to identify and classify clouds and cloud shadow, SPARCS: Spatial Procedures for Automated Removal of Cloud and Shadow. The method uses a neural network approach to determine cloud, cloud shadow, water, snow/ice and clear sky classification memberships of each pixel in a Landsat scene. It then applies a series of spatial procedures to resolve pixels with ambiguous membership by using information such as the membership values of neighboring pixels and an estimate of cloud shadow locations from cloud and solar geometry. In a comparison with FMask, a high-quality cloud and cloud shadow classification algorithm currently available, SPARCS performs favorably, with substantially lower omission errors for cloud shadow (8.0% and 3.2%), only slightly higher omission errors for clouds (0.9% and 1.3%, respectively), and fewer errors of commission (2.6% and 0.3%). Additionally, SPARCS provides a measure of uncertainty in its classification that can be exploited by other algorithms that require clear sky pixels. To illustrate this, we present an application that constructs obstruction-free composites of images acquired on different dates in support of a method for vegetation change detection.

  8. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    Science.gov (United States)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. 
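    The non-robust method-of-moments (Matheron) estimator that the study benchmarks is compact to write down; the coordinates, values, and lag bins below are illustrative, not the simulated throughfall fields:

    ```python
    import numpy as np

    def matheron_variogram(coords, values, bin_edges):
        """Classical method-of-moments variogram estimator:
        gamma(h) = mean of 0.5*(z_i - z_j)**2 over point pairs whose
        separation distance falls in each lag bin."""
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        sq = 0.5 * (values[:, None] - values[None, :]) ** 2
        iu = np.triu_indices(len(values), k=1)        # count each pair once
        d, sq = d[iu], sq[iu]
        return np.array([sq[(d >= lo) & (d < hi)].mean()
                         for lo, hi in zip(bin_edges[:-1], bin_edges[1:])])

    rng = np.random.default_rng(0)
    coords = rng.uniform(0, 25, size=(150, 2))        # 25 m plot, n = 150
    values = rng.normal(0.0, 2.0, size=150)           # spatially uncorrelated field
    gamma = matheron_variogram(coords, values, np.linspace(1, 12, 6))
    print(gamma.shape)  # (5,)
    ```

    For a pure-nugget (uncorrelated) field like this one, the estimate hovers around the sill (the variance, here 4) at every lag; the sensitivity of this estimator to outliers is exactly what motivates the robust and likelihood-based alternatives the study compares.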
Given that most previous

  9. Regularized Adaptive Notch Filters for Acoustic Howling Suppression

    DEFF Research Database (Denmark)

    Gil-Cacho, Pepe; van Waterschoot, Toon; Moonen, Marc

    2009-01-01

    In this paper, a method for the suppression of acoustic howling is developed, based on adaptive notch filters (ANF) with regularization (RANF). The method features three RANFs working in parallel to achieve frequency tracking, howling detection and suppression. The ANF-based approach to howling...

  10. From inactive to regular jogger

    DEFF Research Database (Denmark)

    Lund-Cramer, Pernille; Brinkmann Løite, Vibeke; Bredahl, Thomas Viskum Gjelstrup

    Title: From inactive to regular jogger - a qualitative study of achieved behavioral change among recreational joggers. Authors: Pernille Lund-Cramer & Vibeke Brinkmann Løite. Purpose: Despite extensive knowledge of barriers to physical activity, most interventions promoting physical activity have proven... A study was conducted using individual semi-structured interviews on how a successful long-term behavior change had been achieved. Ten informants were purposely selected from participants in the DANO-RUN research project (7 men, 3 women, average age 41.5). Interviews were performed on the basis of the Theory of Planned Behavior (TPB) and the Transtheoretical Model (TTM). Coding and analysis of interviews were performed using NVivo 10 software. Results (TPB): During the behavior change process, the intention to jog shifted from a focus on weight loss and improved fitness to both physical health, psychological...

  11. Detection of viruses and the spatial and temporal spread patterns of viral diseases of cucurbits (Cucurbitaceae spp.) in the coastal savannah zone of Ghana

    International Nuclear Information System (INIS)

    Gyamena, A. E

    2013-07-01

    Cucurbits are susceptible to over 35 plant viruses; each of these viruses is capable of causing total crop failure in a poorly managed virus pathosystem. The objectives of this study were to detect the viruses that infect six cucurbit species in the coastal savannah zone of Ghana and to describe the spatial and temporal spread patterns of virus epidemics in zucchini squash (Cucurbita pepo L.) by the use of mathematical and geostatistical models. Cucumber (Cucumis sativus L.), watermelon (Citrullus lanatus Thunb.), zucchini squash (Cucurbita pepo L.), butternut squash (Cucurbita moschata Duchesne), egushi (Citrullus colocynthis L. Schrad.) and melon (Cucumis melo L.) were grown on an experimental field in the coastal savannah zone of Ghana and were monitored for the expression of virus and virus-like symptoms. The observed symptoms were further confirmed by Double Antibody Sandwich Enzyme-Linked Immunosorbent Assay (DAS-ELISA) and mechanical inoculation of indicator plants. The temporal spread patterns of virus disease in zucchini squash were analyzed with the exponential, logistic, monomolecular and Gompertz mechanistic models. The spatial patterns of virus disease spread in the zucchini squash field were analyzed by semivariogram and inverse distance weighting (IDW) methods. Cucumber, zucchini squash, melon and butternut squash were infected by both Cucumber mosaic virus (CMV) and Papaya ringspot virus (PRSV-W). Egushi was infected by CMV but not PRSV-W. None of the six cucurbit species were infected by Watermelon mosaic virus (WMV) or Zucchini yellow mosaic virus (ZYMV). The temporal pattern of disease incidence in the zucchini squash field followed the Gompertz function with an average apparent infection rate of 0.026 per day. The temporal pattern of disease severity was best described by the exponential model, with a coefficient of determination of 94.38% and a rate of progress of disease severity of 0.114 per day. As at 49 days after planting (DAP), disease incidence and
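    Fitting a Gompertz disease-progress curve can be done by ordinary least squares on the linearizing "gompit" transform. The time grid and incidence values below are synthetic, generated so that the fit recovers a rate matching the reported 0.026 per day; they are not the study's field data:

    ```python
    import numpy as np

    def fit_gompertz(t, y):
        """Fit the Gompertz model y(t) = exp(-B*exp(-k*t)) by linear
        regression on the gompit transform ln(-ln y) = ln B - k*t."""
        slope, intercept = np.polyfit(t, np.log(-np.log(y)), 1)
        return -slope, np.exp(intercept)   # apparent infection rate k, and B

    t = np.arange(0.0, 50.0, 7.0)              # weekly assessments, in days
    k_true, B_true = 0.026, 4.0
    y = np.exp(-B_true * np.exp(-k_true * t))  # synthetic disease incidence
    k_hat, _ = fit_gompertz(t, y)
    print(round(k_hat, 3))  # 0.026
    ```

    The exponential, logistic, and monomolecular models have analogous linearizing transforms, which is how candidate models are usually compared via their coefficients of determination.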

  12. Tessellating the Sphere with Regular Polygons

    Science.gov (United States)

    Soto-Johnson, Hortensia; Bechthold, Dawn

    2004-01-01

    Tessellations in the Euclidean plane and regular polygons that tessellate the sphere are reviewed. The regular polygons that can possibly tessellate the sphere are spherical triangles, squares and pentagons.

  13. On the equivalence of different regularization methods

    International Nuclear Information System (INIS)

    Brzezowski, S.

    1985-01-01

    The R̂-operation preceded by the regularization procedure is discussed. Some arguments are given according to which the results may depend on the method of regularization introduced in order to avoid divergences in perturbation calculations. 10 refs. (author)

  14. The uniqueness of the regularization procedure

    International Nuclear Information System (INIS)

    Brzezowski, S.

    1981-01-01

    On the grounds of the BPHZ procedure, the criteria of correct regularization in perturbation calculations of QFT are given, together with the prescription for dividing the regularized formulas into the finite and infinite parts. (author)

  15. Application of Turchin's method of statistical regularization

    Science.gov (United States)

    Zelenyi, Mikhail; Poliakova, Mariia; Nozik, Alexander; Khudyakov, Alexey

    2018-04-01

    During analysis of experimental data, one usually needs to restore a signal after it has been convoluted with some kind of apparatus function. According to Hadamard's definition this problem is ill-posed and requires regularization to provide sensible results. In this article we describe an implementation of Turchin's method of statistical regularization based on the Bayesian approach to the regularization strategy.

  16. Regular extensions of some classes of grammars

    NARCIS (Netherlands)

    Nijholt, Antinus

    Culik and Cohen introduced the class of LR-regular grammars, an extension of the LR(k) grammars. In this report we consider the analogous extension of the LL(k) grammars, called the LL-regular grammars. The relations of this class of grammars to other classes of grammars are shown. Every LL-regular

  17. Wave dynamics of regular and chaotic rays

    International Nuclear Information System (INIS)

    McDonald, S.W.

    1983-09-01

    In order to investigate general relationships between waves and rays in chaotic systems, I study the eigenfunctions and spectrum of a simple model, the two-dimensional Helmholtz equation in a stadium boundary, for which the rays are ergodic. Statistical measurements are performed so that the apparent randomness of the stadium modes can be quantitatively contrasted with the familiar regularities observed for the modes in a circular boundary (with integrable rays). The local spatial autocorrelation of the eigenfunctions is constructed in order to indirectly test theoretical predictions for the nature of the Wigner distribution corresponding to chaotic waves. A portion of the large-eigenvalue spectrum is computed and reported in an appendix; the probability distribution of successive level spacings is analyzed and compared with theoretical predictions. The two principal conclusions are: 1) waves associated with chaotic rays may exhibit randomly situated localized regions of high intensity; 2) the Wigner function for these waves may depart significantly from being uniformly distributed over the surface of constant frequency in the ray phase space

  18. 4D PET iterative deconvolution with spatiotemporal regularization for quantitative dynamic PET imaging.

    Science.gov (United States)

    Reilhac, Anthonin; Charil, Arnaud; Wimberley, Catriona; Angelis, Georgios; Hamze, Hasar; Callaghan, Paul; Garcia, Marie-Paule; Boisson, Frederic; Ryder, Will; Meikle, Steven R; Gregoire, Marie-Claude

    2015-09-01

    Quantitative measurements in dynamic PET imaging are usually limited by the poor counting statistics, particularly in short dynamic frames, and by the low spatial resolution of the detection system, resulting in partial volume effects (PVEs). In this work, we present a fast and easy-to-implement method for the restoration of dynamic PET images that have suffered from both PVE and noise degradation. It is based on a weighted least-squares iterative deconvolution approach of the dynamic PET image with spatial and temporal regularization. Using simulated dynamic [(11)C]Raclopride PET data with controlled biological variations in the striata between scans, we showed that the restoration method provides images which exhibit less noise and better contrast between emitting structures than the original images. In addition, the method is able to recover the true time activity curve in the striata region with an error below 3%, while it was underestimated by more than 20% without correction. As a result, the method improves the accuracy and reduces the variability of the kinetic parameter estimates calculated from the corrected images. More importantly, it increases the accuracy (from less than 66% to more than 95%) of measured biological variations as well as their statistical detectivity. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.
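A toy sketch of the general idea (not the authors' exact weighted least-squares algorithm): gradient-descent deconvolution of a few synthetic 1-D "frames" with quadratic spatial and temporal penalties. The frame data, kernel, and penalty weights are invented for illustration.

```python
import numpy as np

def deconvolve_dynamic(Y, kernel, lam_s=0.01, lam_t=0.1, tau=0.4, iters=400):
    """Jointly deconvolve T dynamic frames (rows of Y): a least-squares data
    term per frame, a spatial smoothness penalty within each frame, and a
    temporal penalty tying consecutive frames together."""
    T, n = Y.shape
    half = len(kernel) // 2
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            k = i - j + half
            if 0 <= k < len(kernel):
                K[i, j] = kernel[k]
    Ds = np.diff(np.eye(n), axis=0)            # spatial first differences
    X = Y.copy()
    for _ in range(iters):
        grad = (X @ K.T - Y) @ K + lam_s * X @ (Ds.T @ Ds)
        grad[1:] += lam_t * (X[1:] - X[:-1])   # temporal coupling
        grad[:-1] += lam_t * (X[:-1] - X[1:])
        X = X - tau * grad
    return X

# Three frames of a slowly varying boxcar, blurred by a Gaussian kernel.
n = 60
x = np.zeros((3, n))
for t_idx, amp in enumerate([1.0, 1.2, 1.4]):
    x[t_idx, 25:35] = amp
g = np.exp(-0.5 * np.arange(-3, 4) ** 2)
g /= g.sum()
Y = np.vstack([np.convolve(row, g, mode="same") for row in x])
X_hat = deconvolve_dynamic(Y, g)
```

The temporal term exploits the fact that neighboring frames resemble each other, which is what lets short, noisy frames borrow statistical strength from the rest of the sequence.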

  19. Detection and Segmentation of Vine Canopy in Ultra-High Spatial Resolution RGB Imagery Obtained from Unmanned Aerial Vehicle (UAV): A Case Study in a Commercial Vineyard

    Directory of Open Access Journals (Sweden)

    Carlos Poblete-Echeverría

    2017-03-01

    Full Text Available The use of Unmanned Aerial Vehicles (UAVs) in viticulture permits the capture of aerial Red-Green-Blue (RGB) images with an ultra-high spatial resolution. Recent studies have demonstrated that RGB images can be used to monitor spatial variability of vine biophysical parameters. However, estimating these parameters requires accurate and automated segmentation methods to extract relevant information from RGB images. Manual segmentation of aerial images is a laborious and time-consuming process. Traditional classification methods have shown satisfactory results in the segmentation of RGB images for diverse applications and surfaces; however, in the case of commercial vineyards, it is necessary to consider some particularities inherent to canopy size in vertical trellis systems (VSP), such as the shadow effect and different soil conditions in inter-rows (mixed information of soil and weeds). Therefore, the objective of this study was to compare the performance of four classification methods (K-means, Artificial Neural Networks (ANN), Random Forest (RForest) and Spectral Indices (SI)) to detect canopy in a vineyard trained on VSP. Six flights were carried out from post-flowering to harvest in a commercial vineyard cv. Carménère using a low-cost UAV equipped with a conventional RGB camera. The results show that the ANN and the simple SI method, complemented with the Otsu method for thresholding, presented the best performance for the detection of the vine canopy, with high overall accuracy values for all study days. Spectral indices presented the best performance in the detection of the Plant class (vine canopy), with an overall accuracy of around 0.99. However, considering the performance pixel by pixel, the spectral indices are not able to discriminate between the Soil and Shadow classes. The best performance in the classification of three classes (Plant, Soil, and Shadow) of vineyard RGB images was obtained when the SI values were used as input data in trained
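The Otsu thresholding step mentioned above is simple to sketch. Below, a plain NumPy implementation is applied to a synthetic vegetation-index sample; the index values, class means, and pixel counts are invented and are not from the study.

```python
import numpy as np

def otsu_threshold(values, nbins=256):
    """Otsu's method: choose the threshold that maximizes the between-class
    variance of a 1-D sample (here, vegetation-index values)."""
    hist, edges = np.histogram(values, bins=nbins)
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                 # probability of the "below cut" class
    mu = np.cumsum(p * centers)       # cumulative mean
    mu_total = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_total * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    sigma_b = np.nan_to_num(sigma_b)  # empty classes contribute nothing
    return centers[np.argmax(sigma_b)]

# Synthetic excess-green-style index: soil pixels low, canopy pixels high.
rng = np.random.default_rng(0)
soil = rng.normal(-0.2, 0.05, 5000)
vine = rng.normal(0.4, 0.05, 2000)
index = np.concatenate([soil, vine])
t_opt = otsu_threshold(index)
canopy_mask = index > t_opt           # True = canopy pixel
```

As the abstract notes, a single index threshold like this separates plant from background well, but cannot by itself distinguish soil from shadow.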

  20. Postural instability detection: aging and the complexity of spatial-temporal distributional patterns for virtually contacting the stability boundary in human stance.

    Directory of Open Access Journals (Sweden)

    Melissa C Kilby

    Full Text Available Falls among the older population can severely restrict their functional mobility and even cause death. Therefore, it is crucial to understand the mechanisms and conditions that cause falls, for which it is important to develop a predictive model of falls. One critical quantity for postural instability detection and prediction is the instantaneous stability of quiet upright stance based on motion data. However, well-established measures in the field of motor control that quantify overall postural stability using center-of-pressure (COP) or center-of-mass (COM) fluctuations are inadequate predictors of instantaneous stability. For this reason, 2D COP/COM virtual time-to-contact (VTC) is investigated to detect the postural stability deficits of healthy older people compared to young adults. VTC predicts the temporal safety margin to the functional stability boundary (i.e., the limits of the region of feasible COP or COM displacement) and, therefore, provides an index of the risk of losing postural stability. The spatial directions with increased instability were also determined using quantities of VTC that have not previously been considered. Further, Lempel-Ziv complexity (LZC), a measure suitable for on-line monitoring of stability/instability, was applied to explore the temporal structure or complexity of VTC and the predictability of future postural instability based on previous behavior. These features were examined as a function of age, vision and different load weighting on the legs. The primary findings showed that for older adults the stability boundary was contracted and VTC was reduced. Furthermore, the complexity decreased with aging, and the direction of highest postural instability also changed with aging compared to the young adults. The findings reveal the sensitivity of the time-dependent properties of 2D VTC to the detection of postural instability in aging, availability of visual information and postural stance, and potential applicability as a
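The Lempel-Ziv complexity (LZC) measure used in this record can be illustrated compactly. The sketch below implements an LZ76-style phrase count on binary strings; the logistic-map series is an invented stand-in for a binarized VTC time series.

```python
def lempel_ziv_complexity(s):
    """LZ76-style complexity: number of phrases in a left-to-right parsing,
    where each new phrase is the shortest substring starting at the current
    position that has not yet occurred in the preceding text."""
    i, count, n = 0, 0, len(s)
    while i < n:
        length = 1
        while i + length <= n and s[i:i + length] in s[:i + length - 1]:
            length += 1
        count += 1
        i += length
    return count

# Regular series have low complexity...
c_const = lempel_ziv_complexity("0" * 100)    # constant
c_period = lempel_ziv_complexity("01" * 50)   # strictly periodic

# ...while an erratic series (logistic map, binarized at 0.5) does not.
x, bits = 0.3, []
for _ in range(200):
    x = 3.99 * x * (1.0 - x)
    bits.append("1" if x > 0.5 else "0")
c_chaos = lempel_ziv_complexity("".join(bits))
```

A real VTC series would first be binarized (e.g., above/below its median) before the phrase count is applied; the measure then tracks how predictable the stability margin is from its own past.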

  1. Research for obtaining a detection system with high spatial and temporal resolution for a tomograph with positron emission (PET-Tomography)

    International Nuclear Information System (INIS)

    Cruceru, Ilie; Bartos, Daniel; Stanescu, Daniela

    2002-01-01

    This report describes a new type of detector for a positron emission tomography system. The detector has a new design with detection characteristics better than those of other detectors currently used in tomographic systems, such as NaI(Tl), CsI(Tl), BGO and others. The new detector is based on discharge in gases and on the interaction of gamma radiation - generated in the annihilation of positrons - with the gas mixture inside the detector. The main novelty is the structure of electrodes with a central readout microstrip plate. This structure is composed of two identical chambers. Each of these chambers has two resistive glass electrodes and one metallic electrode (cathode). One of the glass electrodes is separated from the metallic electrode while the other one is in contact with the central readout microstrip plate. In this way two gaps of 0.3 mm are generated, through which the gas mixture flows. The electric charges generated in this gas are collected on the strips under the influence of the electric field applied between the cathode and the anode of the detector. The arrangement of electrodes is shown. The structure of electrodes is mounted into a metallic box of special construction which allows the gas to flow through the detector and collects the electric charges generated in the detector. At present the detector is at the laboratory-model stage, and the tests carried out led to the following detection parameters: detection efficiency, 95%; spatial resolution, 3 mm; time resolution, 82 ps. The measurements were performed in coincidence using two similar detectors, with the positron source located between them. In the next stage of research, the final design of the experimental model will be defined, built and tested for this positron source. The gas mixture used for the tests contained 85% C2H2F4 + 10% SF6 + 5% C4H10 (isobutane). (authors)

  2. Class of regular bouncing cosmologies

    Science.gov (United States)

    Vasilić, Milovan

    2017-06-01

    In this paper, I construct a class of everywhere regular geometric sigma models that possess bouncing solutions. Precisely, I show that every bouncing metric can be made a solution of such a model. My previous attempt to do so by employing one scalar field failed due to the appearance of harmful singularities near the bounce. In this work, I use four scalar fields to construct a class of geometric sigma models which are free of singularities. The models within the class are parametrized by their background geometries. I prove that, whatever background is chosen, the dynamics of its small perturbations is classically stable on the whole time axis. Contrary to what one expects from the structure of the initial Lagrangian, the physics of background fluctuations is found to carry two tensor, two vector, and two scalar degrees of freedom. The graviton mass, which naturally appears in these models, is shown to be several orders of magnitude smaller than its experimental bound. I provide three simple examples to demonstrate how this is done in practice. In particular, I show that the graviton mass can be made arbitrarily small.

  3. Land drainage system detection using IR and visual imagery taken from autonomous mapping airship and evaluation of physical and spatial parameters of suggested method

    Science.gov (United States)

    Koska, Bronislav; Křemen, Tomáš; Štroner, Martin; Pospíšil, Jiří; Jirka, Vladimír.

    2014-10-01

    An experimental approach to land drainage system detection and to the evaluation of its physical and spatial parameters, in the form of a pilot project, is presented in this paper. The novelty of the approach lies partly in the use of a unique unmanned aerial vehicle, an airship with some specific properties, the most important of which are its carrying capacity (15 kg) and long flight time (3 hours). Special instrumentation was also installed in the locality for testing physical characteristics. The most important item is a 30 m high mast with a 3 m bracket at the top, carrying sensors that record absolute and comparative temperature, humidity, and wind speed and direction at several heights on the mast. Several measuring units recording local conditions were also installed in the area. The recorded data were compared with IR images taken from the airship platform. The locality is situated around the village of Domanín in the Czech Republic and covers an area of about 1.8 x 1.5 km. A land drainage system was built there during the 1970s; it is made from burnt ceramic blocks placed about 70 cm below the surface. Project documentation of the land drainage system exists, but a survey of its real state has never been carried out. The aim of the project was to survey the land drainage system on the basis of infrared, visual, and combined high-resolution orthophotos (10 cm for VIS and 30 cm for IR) and to evaluate the spatial and physical parameters of the presented procedure. The orthophotos in the VIS and IR spectra, and their combination, appear to be suitable for the task.

  4. Identifying differences in brain activities and an accurate detection of autism spectrum disorder using resting state functional magnetic resonance imaging: A spatial filtering approach.

    Science.gov (United States)

    Subbaraju, Vigneshwaran; Suresh, Mahanand Belathur; Sundaram, Suresh; Narasimhan, Sundararajan

    2017-01-01

    This paper presents a new approach for detecting major differences in brain activities between Autism Spectrum Disorder (ASD) patients and neurotypical subjects using resting state fMRI. Further, the method extracts discriminative features for an accurate diagnosis of ASD. The proposed approach determines a spatial filter that projects the covariance matrices of the Blood Oxygen Level Dependent (BOLD) time-series signals from both the ASD patients and neurotypical subjects in orthogonal directions such that they are highly separable. The inverse of this filter also provides a spatial pattern map within the brain that highlights those regions responsible for the distinguishable activities between the ASD patients and neurotypical subjects. For better classification, highly discriminative log-variance features providing the maximum separation between the two classes are extracted from the projected BOLD time-series data. A detailed study has been carried out using the publicly available data from the Autism Brain Imaging Data Exchange (ABIDE) consortium for different gender and age groups. The study results indicate that for all the above categories, the regional differences in resting state activities are more commonly found in the right hemisphere than in the left hemisphere of the brain. Among males, a clear shift in activities to the prefrontal cortex is observed for ASD patients, while other parts of the brain show diminished activities compared to neurotypical subjects. Among females, such a clear shift is not evident; however, several regions, especially in the posterior and medial portions of the brain, show diminished activities due to ASD. Finally, the classification performance obtained using the log-variance features is found to be better than in earlier studies in the literature. Copyright © 2016 Elsevier B.V. All rights reserved.
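The spatial filter described here, with log-variance features from projected time series, closely resembles the common spatial patterns (CSP) construction. The following is a hedged sketch on invented 3-channel data, not the authors' exact pipeline.

```python
import numpy as np

rng = np.random.default_rng(42)

def make_trials(scale, n_trials=30, n_samples=200):
    # Synthetic 3-channel "time series"; `scale` boosts one channel.
    return [np.diag(scale) @ rng.standard_normal((3, n_samples))
            for _ in range(n_trials)]

X_a = make_trials([2.0, 1.0, 1.0])   # class A: channel 0 dominant
X_b = make_trials([1.0, 2.0, 1.0])   # class B: channel 1 dominant

def csp_filters(X_a, X_b):
    """Spatial filters from a generalized eigenproblem of the two class
    covariances: rows of W maximize variance for one class while
    minimizing it for the other (sorted by eigenvalue, ascending)."""
    Ca = sum(x @ x.T / x.shape[1] for x in X_a) / len(X_a)
    Cb = sum(x @ x.T / x.shape[1] for x in X_b) / len(X_b)
    evals, evecs = np.linalg.eigh(Ca + Cb)
    P = evecs @ np.diag(evals ** -0.5) @ evecs.T   # whiten Ca + Cb
    _, U = np.linalg.eigh(P @ Ca @ P.T)
    return U.T @ P

W = csp_filters(X_a, X_b)

def log_var_feature(x):
    v = np.var(W @ x, axis=1)
    return np.log(v[-1]) - np.log(v[0])   # most vs. least class-A-like

f_a = np.mean([log_var_feature(x) for x in X_a])
f_b = np.mean([log_var_feature(x) for x in X_b])
```

The inverse of `W` plays the role of the spatial pattern map mentioned in the abstract: its columns show how each discriminative component projects back onto the channels.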

  5. A visibility-based approach using regularization for imaging-spectroscopy in solar X-ray astronomy

    Energy Technology Data Exchange (ETDEWEB)

    Prato, M; Massone, A M; Piana, M [CNR - INFM LAMIA, Via Dodecaneso 33 1-16146 Genova (Italy); Emslie, A G [Department of Physics, Oklahoma State University, Stillwater, OK 74078 (United States); Hurford, G J [Space Sciences Laboratory, University of California at Berkeley, 8 Gauss Way, Berkeley, CA 94720-7450 (United States); Kontar, E P [Department of Physics and Astronomy, The University, Glasgow G12 8QQ, Scotland (United Kingdom); Schwartz, R A [CUA - Catholic University and LSSP at NASA Goddard Space Flight Center, code 671.1 Greenbelt, MD 20771 (United States)], E-mail: massone@ge.infm.it

    2008-11-01

    The Reuven Ramaty High-Energy Solar Spectroscopic Imager (RHESSI) is a nine-collimator satellite detecting X-rays and γ-rays emitted by the Sun during flares. As the spacecraft rotates, imaging information is encoded as rapid time variations of the detected flux. We recently proposed a method for the construction of electron flux maps at different electron energies from sets of count visibilities (i.e., direct, calibrated measurements of specific Fourier components of the source spatial structure) measured by RHESSI. The method requires the application of regularized inversion for the synthesis of electron visibility spectra and of imaging techniques for the reconstruction of two-dimensional electron flux maps. The method, already tested on real events registered by RHESSI, is validated in this paper by means of simulated realistic data.

  6. Statistical regularities in art: Relations with visual coding and perception.

    Science.gov (United States)

    Graham, Daniel J; Redies, Christoph

    2010-07-21

    Since at least 1935, vision researchers have used art stimuli to test human response to complex scenes. This is sensible given the "inherent interestingness" of art and its relation to the natural visual world. The use of art stimuli has remained popular, especially in eye tracking studies. Moreover, stimuli in common use by vision scientists are inspired by the work of famous artists (e.g., Mondrians). Artworks are also popular in vision science as illustrations of a host of visual phenomena, such as depth cues and surface properties. However, until recently, there has been scant consideration of the spatial, luminance, and color statistics of artwork, and even less study of ways that regularities in such statistics could affect visual processing. Furthermore, the relationship between regularities in art images and those in natural scenes has received little or no attention. In the past few years, there has been a concerted effort to study statistical regularities in art as they relate to neural coding and visual perception, and art stimuli have begun to be studied in rigorous ways, as natural scenes have been. In this minireview, we summarize quantitative studies of links between regular statistics in artwork and processing in the visual stream. The results of these studies suggest that art is especially germane to understanding human visual coding and perception, and it therefore warrants wider study. Copyright 2010 Elsevier Ltd. All rights reserved.

  7. Development of a spatial analysis method using ground-based repeat photography to detect changes in the alpine treeline ecotone, Glacier National Park, Montana, U.S.A.

    Science.gov (United States)

    Roush, W.; Munroe, Jeffrey S.; Fagre, D.B.

    2007-01-01

    Repeat photography is a powerful tool for detection of landscape change over decadal timescales. Here a novel method is presented that applies spatial analysis software to digital photo-pairs, allowing vegetation change to be categorized and quantified. This method is applied to 12 sites within the alpine treeline ecotone of Glacier National Park, Montana, and is used to examine vegetation changes over timescales ranging from 71 to 93 years. Tree cover at the treeline ecotone increased in 10 out of the 12 photo-pairs (mean increase of 60%). Establishment occurred at all sites; infilling occurred at 11 sites. To demonstrate the utility of this method, patterns of tree establishment at treeline are described and the possible causes of changes within the treeline ecotone are discussed. Local factors undoubtedly affect the magnitude and type of the observed changes; however, the ubiquity of the increase in tree cover implies a common forcing mechanism. Mean minimum summer temperatures have increased by 1.5 °C over the past century and, coupled with variations in the amount of early spring snow water equivalent, likely account for much of the increase in tree cover at the treeline ecotone. Lastly, shortcomings of this method are presented along with possible solutions and areas for future research. © 2007 Regents of the University of Colorado.

  8. Adaptive regularization of noisy linear inverse problems

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Madsen, Kristoffer Hougaard; Lehn-Schiøler, Tue

    2006-01-01

    In the Bayesian modeling framework there is a close relation between regularization and the prior distribution over parameters. For prior distributions in the exponential family, we show that the optimal hyper-parameter, i.e., the optimal strength of regularization, satisfies a simple relation: the expectation of the regularization function takes the same value in the posterior and prior distributions. We present three examples: two simulations, and an application in fMRI neuroimaging.

  9. Higher derivative regularization and chiral anomaly

    International Nuclear Information System (INIS)

    Nagahama, Yoshinori.

    1985-02-01

    A higher derivative regularization which automatically leads to the consistent chiral anomaly is analyzed in detail. It explicitly breaks all the local gauge symmetry but preserves global chiral symmetry and leads to the chirally symmetric consistent anomaly. This regularization thus clarifies the physics content contained in the consistent anomaly. We also briefly comment on the application of this higher derivative regularization to massless QED. (author)

  10. Regularity effect in prospective memory during aging

    Directory of Open Access Journals (Sweden)

    Geoffrey Blondelle

    2016-10-01

    Full Text Available Background: Regularity effect can affect performance in prospective memory (PM), but little is known about the cognitive processes linked to this effect. Moreover, its impacts with regard to aging remain unknown. To our knowledge, this study is the first to examine the regularity effect in PM in a lifespan perspective, with a sample of young, intermediate, and older adults. Objective and design: Our study examined the regularity effect in PM in three groups of participants: 28 young adults (18–30), 16 intermediate adults (40–55), and 25 older adults (65–80). The task, adapted from the Virtual Week, was designed to manipulate the regularity of the various activities of daily life that were to be recalled (regular repeated activities vs. irregular non-repeated activities). We examined the role of several cognitive functions, including certain dimensions of executive functions (planning, inhibition, shifting, and binding), short-term memory, and retrospective episodic memory, to identify those involved in PM, according to regularity and age. Results: A mixed-design ANOVA showed a main effect of task regularity and an interaction between age and regularity: an age-related difference in PM performance was found for irregular activities (older < young), but not for regular activities. All participants recalled more regular activities than irregular ones, with no age effect. It appeared that recalling regular activities only involved planning for both intermediate and older adults, while recalling irregular ones was linked to planning, inhibition, short-term memory, binding, and retrospective episodic memory. Conclusion: Taken together, our data suggest that planning capacities seem to play a major role in remembering to perform intended actions with advancing age. Furthermore, the age-PM paradox may be attenuated when the experimental design is adapted by implementing a familiar context through the use of activities of daily living. The clinical

  11. Regularity effect in prospective memory during aging

    OpenAIRE

    Blondelle, Geoffrey; Hainselin, Mathieu; Gounden, Yannick; Heurley, Laurent; Voisin, Hélène; Megalakaki, Olga; Bressous, Estelle; Quaglino, Véronique

    2016-01-01

    Background: Regularity effect can affect performance in prospective memory (PM), but little is known on the cognitive processes linked to this effect. Moreover, its impacts with regard to aging remain unknown. To our knowledge, this study is the first to examine regularity effect in PM in a lifespan perspective, with a sample of young, intermediate, and older adults.Objective and design: Our study examined the regularity effect in PM in three groups of participants: 28 young adults (18–30), 1...

  12. TRANSIENT LUNAR PHENOMENA: REGULARITY AND REALITY

    International Nuclear Information System (INIS)

    Crotts, Arlin P. S.

    2009-01-01

    Transient lunar phenomena (TLPs) have been reported for centuries, but their nature is largely unsettled, and even their existence as a coherent phenomenon is controversial. Nonetheless, TLP data show regularities in the observations; a key question is whether this structure is imposed by processes tied to the lunar surface, or by terrestrial atmospheric or human observer effects. I interrogate an extensive catalog of TLPs to gauge how human factors determine the distribution of TLP reports. The sample is grouped according to variables which should produce differing results if determining factors involve humans, and not reflecting phenomena tied to the lunar surface. Features dependent on human factors can then be excluded. Regardless of how the sample is split, the results are similar: ∼50% of reports originate from near Aristarchus, ∼16% from Plato, ∼6% from recent, major impacts (Copernicus, Kepler, Tycho, and Aristarchus), plus several at Grimaldi. Mare Crisium produces a robust signal in some cases (however, Crisium is too large for a 'feature' as defined). TLP count consistency for these features indicates that ∼80% of these may be real. Some commonly reported sites disappear from the robust averages, including Alphonsus, Ross D, and Gassendi. These reports begin almost exclusively after 1955, when TLPs became widely known and many more (and inexperienced) observers searched for TLPs. In a companion paper, we compare the spatial distribution of robust TLP sites to transient outgassing (seen by Apollo and Lunar Prospector instruments). To a high confidence, robust TLP sites and those of lunar outgassing correlate strongly, further arguing for the reality of TLPs.

  13. Regularization and error assignment to unfolded distributions

    CERN Document Server

    Zech, Gunter

    2011-01-01

    The commonly used approach of presenting unfolded data only in graphical form, with the diagonal error depending on the regularization strength, is unsatisfactory. It does not permit the adjustment of parameters of theories or the exclusion of theories that are admitted by the observed data, and it does not allow the combination of data from different experiments. We propose fixing the regularization strength by a p-value criterion, indicating the experimental uncertainties independent of the regularization, and publishing the unfolded data in addition without regularization. These considerations are illustrated with three different unfolding and smoothing approaches applied to a toy example.

  14. Iterative Regularization with Minimum-Residual Methods

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg; Hansen, Per Christian

    2007-01-01

    We study the regularization properties of iterative minimum-residual methods applied to discrete ill-posed problems. In these methods, the projection onto the underlying Krylov subspace acts as a regularizer, and the emphasis of this work is on the role played by the basis vectors of these Krylov subspaces. We provide a combination of theory and numerical examples, and our analysis confirms the experience that MINRES and MR-II can work as general regularization methods. We also demonstrate theoretically and experimentally that the same is not true, in general, for GMRES and RRGMRES: their success as regularization methods is highly problem dependent.

  15. Iterative regularization with minimum-residual methods

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg; Hansen, Per Christian

    2006-01-01

    We study the regularization properties of iterative minimum-residual methods applied to discrete ill-posed problems. In these methods, the projection onto the underlying Krylov subspace acts as a regularizer, and the emphasis of this work is on the role played by the basis vectors of these Krylov subspaces. We provide a combination of theory and numerical examples, and our analysis confirms the experience that MINRES and MR-II can work as general regularization methods. We also demonstrate theoretically and experimentally that the same is not true, in general, for GMRES and RRGMRES: their success as regularization methods is highly problem dependent.

  16. The perception of regularity in an isochronous stimulus in zebra finches (Taeniopygia guttata) and humans

    NARCIS (Netherlands)

    van der Aa, J.; Honing, H.; ten Cate, C.

    2015-01-01

    Perceiving temporal regularity in an auditory stimulus is considered one of the basic features of musicality. Here we examine whether zebra finches can detect regularity in an isochronous stimulus. Using a go/no go paradigm we show that zebra finches are able to distinguish between an isochronous

  17. Chord length distributions between hard disks and spheres in regular, semi-regular, and quasi-random structures

    International Nuclear Information System (INIS)

    Olson, Gordon L.

    2008-01-01

    In binary stochastic media in two- and three-dimensions consisting of randomly placed impenetrable disks or spheres, the chord lengths in the background material between disks and spheres closely follow exponential distributions if the disks and spheres occupy less than 10% of the medium. This work demonstrates that for regular spatial structures of disks and spheres, the tails of the chord length distributions (CLDs) follow power laws rather than exponentials. In dilute media, when the disks and spheres are widely spaced, the slope of the power law seems to be independent of the details of the structure. When approaching a close-packed arrangement, the exact placement of the spheres can make a significant difference. When regular structures are perturbed by small random displacements, the CLDs become power laws with steeper slopes. An example CLD from a quasi-random distribution of spheres in clusters shows a modified exponential distribution.
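A quick Monte Carlo sketch of the comparison described above (equal disks in a unit box, random versus square-lattice placement). The sampling scheme and parameters are invented, and for brevity only horizontal chords are sampled rather than all orientations.

```python
import numpy as np

rng = np.random.default_rng(0)

def background_chords(centers, r, n_lines=2000, box=1.0):
    """Chord lengths of the background material along random horizontal
    lines through a box of equal disks of radius r."""
    chords = []
    for _ in range(n_lines):
        y = rng.uniform(0.0, box)
        covered = []
        for cx, cy in centers:
            d2 = r * r - (y - cy) ** 2
            if d2 > 0.0:
                h = np.sqrt(d2)
                a, b = max(0.0, cx - h), min(box, cx + h)
                if b > a:                  # keep only the part inside the box
                    covered.append((a, b))
        covered.sort()
        pos = 0.0
        for a, b in covered:
            if a > pos:
                chords.append(a - pos)     # background gap before this disk
            pos = max(pos, b)
        if pos < box:
            chords.append(box - pos)
    return np.asarray(chords)

# Same disk radius and count (~9.6% packing), random vs. square lattice.
n_disks, r = 25, 0.035
random_centers = rng.uniform(0.0, 1.0, (n_disks, 2))
g = (np.arange(5) + 0.5) / 5.0
lattice_centers = np.array([(gx, gy) for gx in g for gy in g])

cld_random = background_chords(random_centers, r)
cld_lattice = background_chords(lattice_centers, r)
```

Histogramming the two samples and inspecting the tails would reproduce the qualitative contrast the abstract reports: near-exponential tails for the random medium, heavier tails for the regular one.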

  18. Chord length distributions between hard disks and spheres in regular, semi-regular, and quasi-random structures

    Energy Technology Data Exchange (ETDEWEB)

    Olson, Gordon L. [Computer and Computational Sciences Division (CCS-2), Los Alamos National Laboratory, 5 Foxglove Circle, Madison, WI 53717 (United States)], E-mail: olson99@tds.net

    2008-11-15

    In binary stochastic media in two- and three-dimensions consisting of randomly placed impenetrable disks or spheres, the chord lengths in the background material between disks and spheres closely follow exponential distributions if the disks and spheres occupy less than 10% of the medium. This work demonstrates that for regular spatial structures of disks and spheres, the tails of the chord length distributions (CLDs) follow power laws rather than exponentials. In dilute media, when the disks and spheres are widely spaced, the slope of the power law seems to be independent of the details of the structure. When approaching a close-packed arrangement, the exact placement of the spheres can make a significant difference. When regular structures are perturbed by small random displacements, the CLDs become power laws with steeper slopes. An example CLD from a quasi-random distribution of spheres in clusters shows a modified exponential distribution.

  19. A regularized stationary mean-field game

    KAUST Repository

    Yang, Xianjin

    2016-01-01

    In the thesis, we discuss the existence and numerical approximations of solutions of a regularized mean-field game with a low-order regularization. In the first part, we prove a priori estimates and use the continuation method to obtain the existence of a solution with a positive density. Finally, we introduce the monotone flow method and solve the system numerically.

  20. A regularized stationary mean-field game

    KAUST Repository

    Yang, Xianjin

    2016-04-19

    In the thesis, we discuss the existence and numerical approximations of solutions of a regularized mean-field game with a low-order regularization. In the first part, we prove a priori estimates and use the continuation method to obtain the existence of a solution with a positive density. Finally, we introduce the monotone flow method and solve the system numerically.

  1. On infinite regular and chiral maps

    OpenAIRE

    Arredondo, John A.; Valdez, Camilo Ramírez y Ferrán

    2015-01-01

    We prove that infinite regular and chiral maps take place on surfaces with at most one end. Moreover, we prove that an infinite regular or chiral map on an orientable surface with genus can only be realized on the Loch Ness monster, that is, the topological surface of infinite genus with one end.

  2. From recreational to regular drug use

    DEFF Research Database (Denmark)

    Järvinen, Margaretha; Ravn, Signe

    2011-01-01

    This article analyses the process of going from recreational use to regular and problematic use of illegal drugs. We present a model containing six career contingencies relevant for young people’s progress from recreational to regular drug use: the closing of social networks, changes in forms...

  3. Automating InDesign with Regular Expressions

    CERN Document Server

    Kahrel, Peter

    2006-01-01

    If you need to make automated changes to InDesign documents beyond what basic search and replace can handle, you need regular expressions, and a bit of scripting to make them work. This Short Cut explains both how to write regular expressions, so you can find and replace the right things, and how to use them in InDesign specifically.

  4. Regularization modeling for large-eddy simulation

    NARCIS (Netherlands)

    Geurts, Bernardus J.; Holm, D.D.

    2003-01-01

    A new modeling approach for large-eddy simulation (LES) is obtained by combining a "regularization principle" with an explicit filter and its inversion. This regularization approach allows a systematic derivation of the implied subgrid model, which resolves the closure problem. The central role of

  5. 29 CFR 779.18 - Regular rate.

    Science.gov (United States)

    2010-07-01

    ... employee under subsection (a) or in excess of the employee's normal working hours or regular working hours... Relating to Labor (Continued) WAGE AND HOUR DIVISION, DEPARTMENT OF LABOR STATEMENTS OF GENERAL POLICY OR... not less than one and one-half times their regular rates of pay. Section 7(e) of the Act defines...

  6. Response-only method for damage detection of beam-like structures using high accuracy frequencies with auxiliary mass spatial probing

    Science.gov (United States)

    Zhong, Shuncong; Oyadiji, S. Olutunde; Ding, Kang

    2008-04-01

    This paper proposes a new approach based on auxiliary mass spatial probing using the spectral centre correction method (SCCM), to provide a simple solution for damage detection using only the response time history of beam-like structures. The natural frequencies of a damaged beam with a traversing auxiliary mass change because of the change in the inertia of the beam as the auxiliary mass is traversed along it, as well as the point-to-point variations in the flexibility of the beam. The auxiliary mass thus enhances the effects of the crack on the dynamics of the beam and thereby facilitates the identification and location of damage; that is, the auxiliary mass can be used to probe the dynamic characteristics of the beam by traversing it from one end to the other. However, accurate modal frequencies cannot be obtained by direct application of the fast Fourier transform (FFT) to the response data, because the frequency spectrum is calculated from a limited sample of time data, which results in the well-known leakage effect. SCCM is identical to the energy centrobaric correction method (ECCM), a practical and effective method used in rotating-machinery fault diagnosis, which overcomes this shortcoming of the FFT and provides high-accuracy estimates of frequency, amplitude and phase. In the present work, the modal responses of damaged simply supported beams with an auxiliary mass are computed using the finite element method (FEM), and the natural frequencies calculated by SCCM are plotted against the axial location of the auxiliary mass. Because it is difficult to locate the crack directly from the natural-frequency curve, a simple and fast method, the derivative of the natural-frequency curve, is proposed, which can provide crack information for damage detection of beam-like structures. The efficiency and practicability of the proposed method are illustrated via numerical
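    The derivative-of-frequency-curve idea from the record above can be illustrated on synthetic data (everything here is invented for the sketch: the dip shape, the hypothetical crack position, and the detrending by a low-order polynomial; it is not the paper's FEM computation):

```python
import numpy as np

# Synthetic first-natural-frequency curve versus auxiliary-mass position,
# with a small localized dip around a hypothetical crack at x = 0.35.
x = np.linspace(0.0, 1.0, 201)
crack_at = 0.35
baseline = 50.0 - 8.0 * np.sin(np.pi * x) ** 2        # smooth mass-position trend
dip = -0.4 * np.exp(-(((x - crack_at) / 0.02) ** 2))  # local effect of the crack
freq = baseline + dip

# Derivative of the natural-frequency curve: the crack shows up as a
# sharp local feature riding on a slowly varying trend, removed here
# with a low-order polynomial fit.
dfreq = np.gradient(freq, x)
trend = np.polyval(np.polyfit(x, dfreq, 6), x)
anomaly = np.abs(dfreq - trend)
estimate = x[np.argmax(anomaly)]
print(f"estimated crack location: {estimate:.3f}")
```

    The smooth trend caused by the traversing mass dominates the raw curve, but its derivative amplifies the localized irregularity, so the anomaly peak lands near the crack.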

  7. An Idle-State Detection Algorithm for SSVEP-Based Brain-Computer Interfaces Using a Maximum Evoked Response Spatial Filter.

    Science.gov (United States)

    Zhang, Dan; Huang, Bisheng; Wu, Wei; Li, Siliang

    2015-11-01

    Although accurate recognition of the idle state is essential for the application of brain-computer interfaces (BCIs) in real-world situations, it remains a challenging task due to the variability of the idle state. In this study, a novel algorithm was proposed for idle state detection in a steady-state visual evoked potential (SSVEP)-based BCI. The proposed algorithm aims to solve the idle state detection problem by constructing a better model of the control states. For feature extraction, a maximum evoked response (MER) spatial filter was developed to extract neurophysiologically plausible SSVEP responses, by finding the combination of multi-channel electroencephalogram (EEG) signals that maximized the evoked responses while suppressing the unrelated background EEG. The extracted SSVEP responses at the frequencies of both the attended and the unattended stimuli were then used to form feature vectors, and a series of binary classifiers for recognition of each control state and the idle state were constructed. EEG data from nine subjects in a three-target SSVEP BCI experiment with a variety of idle state conditions were used to evaluate the proposed algorithm. Compared to the most popular canonical correlation analysis-based algorithm and the conventional power spectrum-based algorithm, the proposed algorithm outperformed both, achieving an offline control state classification accuracy of 88.0 ± 11.1% and idle state false positive rates (FPRs) ranging from 7.4 ± 5.6% to 14.2 ± 10.1%, depending on the specific idle state conditions. Moreover, the online simulation reported BCI performance close to practical use: 22.0 ± 2.9 of the 24 control commands were correctly recognized, and FPRs as low as approximately 0.5 events/min in the idle state conditions with eyes open and 0.05 events/min in the idle state condition with eyes closed were achieved. These results demonstrate the potential of the proposed algorithm for implementing practical SSVEP BCI systems.
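    The abstract does not give the MER filter's equations. Under one common reading, maximizing the power explained by sinusoidal references at the stimulus frequency relative to total power, the filter reduces to a generalized Rayleigh-quotient problem. A minimal sketch on synthetic data (the channel count, mixing weights, and noise level are invented for illustration and are not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
fs, T, f0 = 250, 2.0, 12.0            # sampling rate, duration, stimulus freq
t = np.arange(int(fs * T)) / fs

# Hypothetical 4-channel EEG: one 12 Hz SSVEP source mixed into all
# channels plus independent background noise (illustrative data only).
source = np.sin(2 * np.pi * f0 * t)
mixing = np.array([0.8, 0.5, 0.2, 0.1])
X = np.outer(mixing, source) + 1.0 * rng.standard_normal((4, t.size))

# Reference sinusoids at the stimulus frequency and its second harmonic.
Y = np.vstack([np.sin(2 * np.pi * k * f0 * t) for k in (1, 2)] +
              [np.cos(2 * np.pi * k * f0 * t) for k in (1, 2)])

# Evoked covariance: project X onto the span of the references.
P = Y.T @ np.linalg.solve(Y @ Y.T, Y)
S = X @ P @ X.T           # power explained by the stimulus references
N = X @ X.T               # total power

# Maximize w'Sw / w'Nw via whitening + symmetric eigendecomposition.
Lc = np.linalg.cholesky(N)
Li = np.linalg.inv(Lc)
vals, vecs = np.linalg.eigh(Li @ S @ Li.T)
w = Li.T @ vecs[:, -1]    # filter with maximal evoked-to-total power ratio

snr = lambda x: (x @ P @ x) / (x @ x)
print("best single-channel ratio:", max(snr(ch) for ch in X))
print("filtered ratio:           ", snr(w @ X))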

  8. Improved resolution and reliability in dynamic PET using Bayesian regularization of MRTM2

    DEFF Research Database (Denmark)

    Agn, Mikael; Svarer, Claus; Frokjaer, Vibe G.

    2014-01-01

    This paper presents a mathematical model that regularizes dynamic PET data by using a Bayesian framework. We base the model on the well known two-parameter multilinear reference tissue method MRTM2 and regularize on the assumption that spatially close regions have similar parameters. The developed...... model is compared to the conventional approach of improving the low signal-to-noise ratio of PET data, i.e., spatial filtering of each time frame independently by a Gaussian kernel. We show that the model handles high levels of noise better than the conventional approach, while at the same time...

  9. An iterative method for Tikhonov regularization with a general linear regularization operator

    NARCIS (Netherlands)

    Hochstenbach, M.E.; Reichel, L.

    2010-01-01

    Tikhonov regularization is one of the most popular approaches to solve discrete ill-posed problems with error-contaminated data. A regularization operator and a suitable value of a regularization parameter have to be chosen. This paper describes an iterative method, based on Golub-Kahan
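    The truncated abstract above concerns Tikhonov regularization with a general linear regularization operator L. The underlying problem, min ||Ax - b||^2 + lam^2 ||Lx||^2, can be illustrated directly via a stacked least-squares solve (this is the plain direct solution, not the paper's Golub-Kahan-based iterative method; the test kernel, noise level, and lam are arbitrary choices for the sketch):

```python
import numpy as np

rng = np.random.default_rng(2)

# Ill-posed toy problem: a severely smoothing (near-singular) kernel.
n = 50
s = np.linspace(0.0, 1.0, n)
A = np.exp(-30.0 * (s[:, None] - s[None, :]) ** 2)
x_true = np.sin(2 * np.pi * s)
b = A @ x_true + 1e-3 * rng.standard_normal(n)

# General-form Tikhonov with L = discrete first-derivative operator.
L = np.diff(np.eye(n), axis=0)

def tikhonov(A, b, L, lam):
    """Solve via the stacked least-squares system [A; lam*L] x = [b; 0]."""
    K = np.vstack([A, lam * L])
    rhs = np.concatenate([b, np.zeros(L.shape[0])])
    return np.linalg.lstsq(K, rhs, rcond=None)[0]

x_naive = np.linalg.lstsq(A, b, rcond=None)[0]
x_reg = tikhonov(A, b, L, lam=1e-2)
print("naive error:      ", np.linalg.norm(x_naive - x_true))
print("regularized error:", np.linalg.norm(x_reg - x_true))
```

    The unregularized solve amplifies the noise through the tiny singular values of A, while the derivative penalty keeps the reconstruction close to the smooth true solution; choosing lam well is exactly the parameter-selection problem the surrounding records discuss.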

  10. Multiple graph regularized protein domain ranking

    KAUST Repository

    Wang, Jim Jing-Yan

    2012-11-19

    Background: Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods.Results: To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods.Conclusion: The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications. 2012 Wang et al; licensee BioMed Central Ltd.
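    A minimal sketch of the alternating idea behind MultiG-Rank, with invented details (the paper's exact objective and weight-update rule differ; here the ranking scores solve a graph-regularized least-squares problem and the graph weights are reweighted by a softmax on the per-graph smoothness terms):

```python
import numpy as np

rng = np.random.default_rng(3)

def knn_laplacian(X, k):
    """Unnormalized graph Laplacian of a symmetrized kNN graph."""
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros_like(D2)
    for i in range(len(X)):
        for j in np.argsort(D2[i])[1:k + 1]:   # skip self (distance 0)
            W[i, j] = W[j, i] = np.exp(-D2[i, j])
    return np.diag(W.sum(1)) - W

# Toy "protein domain" feature vectors and a query relevance vector y.
X = rng.standard_normal((30, 5))
y = np.zeros(30)
y[0] = 1.0                             # item 0 is the query

# Multiple candidate graphs (different k); a convex combination is
# learned instead of committing to a single graph model.
Ls = [knn_laplacian(X, k) for k in (3, 5, 10)]
mu = np.full(len(Ls), 1.0 / len(Ls))

for _ in range(10):                    # alternating minimization (sketch)
    Lbar = sum(m * L for m, L in zip(mu, Ls))
    f = np.linalg.solve(np.eye(30) + Lbar, y)       # ranking scores
    smooth = np.array([f @ L @ f for L in Ls])      # per-graph smoothness
    mu = np.exp(-5.0 * smooth)
    mu /= mu.sum()                                  # softmax reweighting

ranking = np.argsort(-f)
print("top-5 ranked items:", ranking[:5])
```

    Graphs on which the current scores vary smoothly receive larger weights, so the combination adapts to the data rather than to a single hand-picked graph model.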

  11. Multiple graph regularized protein domain ranking

    KAUST Repository

    Wang, Jim Jing-Yan; Bensmail, Halima; Gao, Xin

    2012-01-01

    Background: Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods.Results: To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods.Conclusion: The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications. 2012 Wang et al; licensee BioMed Central Ltd.

  12. Hierarchical regular small-world networks

    International Nuclear Information System (INIS)

    Boettcher, Stefan; Goncalves, Bruno; Guclu, Hasan

    2008-01-01

    Two new networks are introduced that resemble small-world properties. These networks are recursively constructed but retain a fixed, regular degree. They possess a unique one-dimensional lattice backbone overlaid by a hierarchical sequence of long-distance links, mixing real-space and small-world features. Both networks, one 3-regular and the other 4-regular, lead to distinct behaviors, as revealed by renormalization group studies. The 3-regular network is planar, has a diameter growing as √N with system size N, and leads to super-diffusion with an exact, anomalous exponent d_w = 1.306..., but possesses only a trivial fixed point T_c = 0 for the Ising ferromagnet. In turn, the 4-regular network is non-planar, has a diameter growing as ∼2^√(log₂ N²), exhibits 'ballistic' diffusion (d_w = 1), and a non-trivial ferromagnetic transition, T_c > 0. It suggests that the 3-regular network is still quite 'geometric', while the 4-regular network qualifies as a true small world with mean-field properties. As an engineering application we discuss synchronization of processors on these networks. (fast track communication)

  13. Multiple graph regularized protein domain ranking.

    Science.gov (United States)

    Wang, Jim Jing-Yan; Bensmail, Halima; Gao, Xin

    2012-11-19

    Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods. To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods. The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications.

  14. Coupling regularizes individual units in noisy populations

    International Nuclear Information System (INIS)

    Ly Cheng; Ermentrout, G. Bard

    2010-01-01

    The regularity of a noisy system can be modulated in various ways. It is well known that coupling in a population can lower the variability of the entire network; the collective activity is more regular. Here, we show that diffusive (reciprocal) coupling of two simple Ornstein-Uhlenbeck (O-U) processes can regularize the individual, even when it is coupled to a noisier process. In cellular networks, the regularity of individual cells is important when a select few play a significant role. The regularizing effect of coupling surprisingly applies also to general nonlinear noisy oscillators. However, unlike with the O-U process, coupling-induced regularity is robust to different kinds of coupling. With two coupled noisy oscillators, we derive an asymptotic formula assuming weak noise and coupling for the variance of the period (i.e., spike times) that accurately captures this effect. Moreover, we find that reciprocal coupling can regularize the individual period of higher dimensional oscillators such as the Morris-Lecar and Brusselator models, even when coupled to noisier oscillators. Coupling can have a counterintuitive and beneficial effect on noisy systems. These results have implications for the role of connectivity with noisy oscillators and the modulation of variability of individual oscillators.
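    The O-U claim above can be checked numerically. In this sketch (parameters invented; Euler-Maruyama discretization over an ensemble of independent pairs), X is diffusively coupled to a noisier process Y, yet its stationary variance drops below the uncoupled value:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two diffusively coupled Ornstein-Uhlenbeck processes:
#   dX = (-X + k(Y - X)) dt + s1 dW1
#   dY = (-Y + k(X - Y)) dt + s2 dW2,   with s2 > s1 (Y is noisier).
s1, s2, k = 1.0, 1.5, 10.0
dt, n_steps, n_ens = 0.01, 1000, 5000   # ensemble of independent pairs

def simulate(coupling):
    """Euler-Maruyama to t = 10; returns the ensemble of X values."""
    X = np.zeros(n_ens)
    Y = np.zeros(n_ens)
    for _ in range(n_steps):
        dW1 = rng.standard_normal(n_ens) * np.sqrt(dt)
        dW2 = rng.standard_normal(n_ens) * np.sqrt(dt)
        Xn = X + (-X + coupling * (Y - X)) * dt + s1 * dW1
        Y = Y + (-Y + coupling * (X - Y)) * dt + s2 * dW2
        X = Xn
    return X

var_uncoupled = simulate(0.0).var()   # theory: s1^2 / 2 = 0.5
var_coupled = simulate(k).var()       # theory: (s1^2+s2^2)/8 * (1 + 1/(1+2k))
print(f"Var(X) uncoupled: {var_uncoupled:.3f}")
print(f"Var(X) coupled:   {var_coupled:.3f}")
```

    Solving the stationary Lyapunov equation for this linear pair gives Var(X) = (s1² + s2²)/8 · (1 + 1/(1 + 2k)), which is below s1²/2 whenever s2² < 3s1² for strong coupling, so even a noisier partner can regularize the individual.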

  15. Multiple graph regularized protein domain ranking

    Directory of Open Access Journals (Sweden)

    Wang Jim

    2012-11-01

    Background: Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods. Results: To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods. Conclusion: The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications.

  16. Diagrammatic methods in phase-space regularization

    International Nuclear Information System (INIS)

    Bern, Z.; Halpern, M.B.; California Univ., Berkeley

    1987-11-01

    Using the scalar prototype and gauge theory as the simplest possible examples, diagrammatic methods are developed for the recently proposed phase-space form of continuum regularization. A number of one-loop and all-order applications are given, including general diagrammatic discussions of the nogrowth theorem and the uniqueness of the phase-space stochastic calculus. The approach also generates an alternate derivation of the equivalence of the large-β phase-space regularization to the more conventional coordinate-space regularization. (orig.)

  17. J-regular rings with injectivities

    OpenAIRE

    Shen, Liang

    2010-01-01

    A ring $R$ is called a J-regular ring if R/J(R) is von Neumann regular, where J(R) is the Jacobson radical of R. It is proved that if R is J-regular, then (i) R is right n-injective if and only if every homomorphism from an $n$-generated small right ideal of $R$ to $R_{R}$ can be extended to one from $R_{R}$ to $R_{R}$; (ii) R is right FP-injective if and only if R is right (J, R)-FP-injective. Some known results are improved.

  18. Infants use temporal regularities to chunk objects in memory.

    Science.gov (United States)

    Kibbe, Melissa M; Feigenson, Lisa

    2016-01-01

    Infants, like adults, can maintain only a few items in working memory, but can overcome this limit by creating more efficient representations, or "chunks." Previous research shows that infants can form chunks using shared features or spatial proximity between objects. Here we asked whether infants also can create chunked representations using regularities that unfold over time. Thirteen-month-old infants first were familiarized with four objects of different shapes and colors, presented in successive pairs. For some infants, the identities of objects in each pair varied randomly across familiarization (Experiment 1). For others, the objects within a pair always co-occurred, either in consistent relative spatial positions (Experiment 2a) or varying spatial positions (Experiment 2b). Following familiarization, infants saw all four objects hidden behind a screen and then saw the screen lifted to reveal either four objects or only three. Infants in Experiment 1, who had been familiarized with random object pairings, failed to look longer at the unexpected 3-object outcome; they showed the same inability to concurrently represent four objects as in other studies of infant working memory. In contrast, infants in Experiments 2a and 2b, who had been familiarized with regularly co-occurring pairs, looked longer at the unexpected outcome. These infants apparently used the co-occurrence between individual objects during familiarization to form chunked representations that were later deployed to track the objects as they were hidden at test. In Experiment 3, we confirmed that the familiarization affected infants' ability to remember the occluded objects rather than merely establishing longer-term memory for object pairs. Following familiarization to consistent pairs, infants who were not shown a hiding event (but merely saw the same test outcomes as in Experiments 2a and b) showed no preference for arrays of three versus four objects. Finally, in Experiments 4 and 5, we asked

  19. Regularized quasinormal modes for plasmonic resonators and open cavities

    Science.gov (United States)

    Kamandar Dezfouli, Mohsen; Hughes, Stephen

    2018-03-01

    Optical mode theory and analysis of open cavities and plasmonic particles is an essential component of optical resonator physics, offering considerable insight and efficiency for connecting to classical and quantum optical properties such as the Purcell effect. However, obtaining the dissipative modes in normalized form for arbitrarily shaped open-cavity systems is notoriously difficult, often involving complex spatial integrations, even after performing the necessary full space solutions to Maxwell's equations. The formal solutions are termed quasinormal modes, which are known to diverge in space, and additional techniques are frequently required to obtain more accurate field representations in the far field. In this work, we introduce a finite-difference time-domain technique that can be used to obtain normalized quasinormal modes using a simple dipole-excitation source, and an inverse Green function technique, in real frequency space, without having to perform any spatial integrations. Moreover, we show how these modes are naturally regularized to ensure the correct field decay behavior in the far field, and thus can be used at any position within and outside the resonator. We term these modes "regularized quasinormal modes" and show the reliability and generality of the theory by studying the generalized Purcell factor of dipole emitters near metallic nanoresonators, hybrid devices with metal nanoparticles coupled to dielectric waveguides, as well as coupled cavity-waveguides in photonic crystals slabs. We also directly compare our results with full-dipole simulations of Maxwell's equations without any approximations, and show excellent agreement.

  20. Regular pattern formation in real ecosystems

    NARCIS (Netherlands)

    Rietkerk, Max; Koppel, Johan van de

    2008-01-01

    Localized ecological interactions can generate striking large-scale spatial patterns in ecosystems through spatial self-organization. Possible mechanisms include oscillating consumer–resource interactions, localized disturbance-recovery processes and scale-dependent feedback. Despite abundant

  1. Generalized regular genus for manifolds with boundary

    Directory of Open Access Journals (Sweden)

    Paola Cristofori

    2003-05-01

    We introduce a generalization of the regular genus, a combinatorial invariant of PL manifolds ([10]), which is proved to be strictly related, in dimension three, to generalized Heegaard splittings defined in [12].

  2. Geometric regularizations and dual conifold transitions

    International Nuclear Information System (INIS)

    Landsteiner, Karl; Lazaroiu, Calin I.

    2003-01-01

    We consider a geometric regularization for the class of conifold transitions relating D-brane systems on noncompact Calabi-Yau spaces to certain flux backgrounds. This regularization respects the SL(2,Z) invariance of the flux superpotential, and allows for computation of the relevant periods through the method of Picard-Fuchs equations. The regularized geometry is a noncompact Calabi-Yau which can be viewed as a monodromic fibration, with the nontrivial monodromy being induced by the regulator. It reduces to the original, non-monodromic background when the regulator is removed. Using this regularization, we discuss the simple case of the local conifold, and show how the relevant field-theoretic information can be extracted in this approach. (author)

  3. Fast and compact regular expression matching

    DEFF Research Database (Denmark)

    Bille, Philip; Farach-Colton, Martin

    2008-01-01

    We study 4 problems in string matching, namely, regular expression matching, approximate regular expression matching, string edit distance, and subsequence indexing, on a standard word RAM model of computation that allows logarithmic-sized words to be manipulated in constant time. We show how... to improve the space and/or remove a dependency on the alphabet size for each problem using either an improved tabulation technique of an existing algorithm or by combining known algorithms in a new way...

  4. Regular-fat dairy and human health

    DEFF Research Database (Denmark)

    Astrup, Arne; Bradley, Beth H Rice; Brenna, J Thomas

    2016-01-01

    In recent history, some dietary recommendations have treated dairy fat as an unnecessary source of calories and saturated fat in the human diet. These assumptions, however, have recently been brought into question by current research on regular fat dairy products and human health. In an effort to..., cheese and yogurt, can be important components of an overall healthy dietary pattern. Systematic examination of the effects of dietary patterns that include regular-fat milk, cheese and yogurt on human health is warranted.

  5. Deterministic automata for extended regular expressions

    Directory of Open Access Journals (Sweden)

    Syzdykov Mirzakhmet

    2017-12-01

    In this work we present algorithms to produce a deterministic finite automaton (DFA) for extended operators in regular expressions, such as intersection, subtraction and complement. A method of “overriding” the source NFA (an NFA not defined by the subset-construction rules) is used. Past work described only the algorithm for the AND operator (the intersection of regular languages); in this paper the construction for the MINUS operator (and the complement) is shown.
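    The operators named in the abstract can be sketched directly on DFAs via the textbook product and complement constructions (not necessarily the paper's NFA-“overriding” method):

```python
from itertools import product

class DFA:
    def __init__(self, states, alphabet, delta, start, accept):
        self.states, self.alphabet = states, alphabet
        self.delta, self.start, self.accept = delta, start, accept

    def accepts(self, word):
        q = self.start
        for c in word:
            q = self.delta[(q, c)]
        return q in self.accept

def complement(d):
    """Complement: same automaton with accepting states inverted (total DFA)."""
    return DFA(d.states, d.alphabet, d.delta, d.start, d.states - d.accept)

def intersect(a, b):
    """Product construction: run both DFAs in lockstep; accept iff both accept."""
    states = set(product(a.states, b.states))
    delta = {((p, q), c): (a.delta[(p, c)], b.delta[(q, c)])
             for (p, q), c in product(states, a.alphabet)}
    return DFA(states, a.alphabet, delta, (a.start, b.start),
               {(p, q) for p, q in states if p in a.accept and q in b.accept})

def subtract(a, b):
    """MINUS operator: L(a) minus L(b) = L(a) intersected with complement(L(b))."""
    return intersect(a, complement(b))

# Example over {a, b}: "even number of a's" and "contains at least one b".
even_a = DFA({0, 1}, {'a', 'b'},
             {(0, 'a'): 1, (1, 'a'): 0, (0, 'b'): 0, (1, 'b'): 1}, 0, {0})
has_b = DFA({0, 1}, {'a', 'b'},
            {(0, 'a'): 0, (0, 'b'): 1, (1, 'a'): 1, (1, 'b'): 1}, 0, {1})

both = intersect(even_a, has_b)
diff = subtract(even_a, has_b)
print(both.accepts("aab"), both.accepts("ab"))   # True False
print(diff.accepts("aa"), diff.accepts("aab"))   # True False
```

    Complementation requires a total DFA (every state has a transition for every symbol), which is why NFAs are usually determinized first; the product construction is the standard route to AND and MINUS.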

  6. Regularities of intermediate adsorption complex relaxation

    International Nuclear Information System (INIS)

    Manukova, L.A.

    1982-01-01

    Experimental data are given characterizing the regularities of intermediate adsorption complex relaxation in the polycrystalline Mo-N2 system at 77 K. The molecular beam method has been used in the investigation. Analytical expressions are obtained for the regularities of change, during relaxation, of the full and specific rates of transition from the intermediate state into the 'non-reversible' state, of desorption into the gas phase, and of accumulation of particles in the intermediate state.

  7. Online Manifold Regularization by Dual Ascending Procedure

    OpenAIRE

    Sun, Boliang; Li, Guohui; Jia, Li; Zhang, Hui

    2013-01-01

    We propose a novel online manifold regularization framework based on the notion of duality in constrained optimization. The Fenchel conjugate of hinge functions is a key to transfer manifold regularization from offline to online in this paper. Our algorithms are derived by gradient ascent in the dual function. For practical purpose, we propose two buffering strategies and two sparse approximations to reduce the computational complexity. Detailed experiments verify the utility of our approache...

  8. Spatially modulated imaging system

    International Nuclear Information System (INIS)

    Barrett, H.H.

    1975-01-01

    Noncoherent radiation, such as x-rays, is spatially coded, directed through an object and spatially detected to form a spatially coded pattern, from which an image of the object may be reconstructed. The x-ray source may be formed by x-ray fluorescence, and subtraction of the holographic images formed by two sources having energy levels predominantly above and below the maximum absorption range of an agent in the object may be used to enhance contrast in the reproduced image. (Patent Office Record)

  9. Use of regularized algebraic methods in tomographic reconstruction

    International Nuclear Information System (INIS)

    Koulibaly, P.M.; Darcourt, J.; Blanc-Ferraud, L.; Migneco, O.; Barlaud, M.

    1997-01-01

    Algebraic methods are used in emission tomography to facilitate the compensation of attenuation and of Compton scattering. We have tested on a phantom the use of regularization (the a priori introduction of information), as well as the taking into account of the spatial resolution variation with depth (SRVD). We have compared the performances of two back-projection filtering (BPF) methods and two algebraic methods (AM) in terms of FWHM (by means of a point source), of the reduction of background noise (σ/m) on the homogeneous part of Jaszczak's phantom, and of reconstruction speed (time unit = BPF). The BPF methods make use of a ramp filter (maximal resolution, no noise treatment), alone or combined with a Hann low-pass filter (f_c = 0.4), as well as an attenuation correction. The AM, which embody attenuation and scattering corrections, are, on one side, OS EM (Ordered Subsets, a partitioning and rearranging of the projection matrix; Expectation Maximization) without regularization or SRVD correction and, on the other side, OS MAP EM (Maximum A Posteriori), regularized and embodying the SRVD correction. A table is given containing, for each method used (ramp, Hann, OS EM and OS MAP EM), the values of FWHM, σ/m and time, respectively. One can observe that the OS MAP EM algebraic method improves both the resolution, by taking the SRVD into account in the reconstruction process, and the noise treatment, by regularization. In addition, owing to the OS technique, the reconstruction times remain acceptable
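    The OS EM update mentioned above can be sketched on a toy system (without the regularization or SRVD modelling of OS MAP EM; the system matrix, phantom, and count levels are invented for illustration, not a real scanner model):

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy emission-tomography system: n_det detector bins, n_pix pixels.
# An identity-dominant nonnegative matrix stands in for the geometry.
n_det, n_pix = 64, 16
A = np.kron(np.ones((4, 1)), np.eye(n_pix)) * 10.0 \
    + rng.uniform(0.0, 1.0, (n_det, n_pix))
f_true = rng.uniform(0.5, 2.0, n_pix)          # phantom activity
y = rng.poisson(A @ f_true).astype(float)      # noisy projection counts

def osem(A, y, n_subsets=4, n_iter=20):
    """OS-EM: the multiplicative EM update applied once per ordered subset."""
    f = np.ones(A.shape[1])
    subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
    for _ in range(n_iter):
        for idx in subsets:
            As, ys = A[idx], y[idx]
            ratio = ys / np.maximum(As @ f, 1e-12)
            f = f * (As.T @ ratio) / np.maximum(As.T @ np.ones(len(idx)), 1e-12)
    return f

f_hat = osem(A, y)
rel_err = np.linalg.norm(f_hat - f_true) / np.linalg.norm(f_true)
print("relative reconstruction error:", rel_err)
```

    Each subset pass is a cheap partial EM step, which is why ordered subsets accelerate convergence; the multiplicative form also keeps the activity estimate nonnegative throughout.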

  10. Optimal Design of the Adaptive Normalized Matched Filter Detector using Regularized Tyler Estimators

    KAUST Repository

    Kammoun, Abla; Couillet, Romain; Pascal, Frederic; Alouini, Mohamed-Slim

    2017-01-01

    This article addresses improvements on the design of the adaptive normalized matched filter (ANMF) for radar detection. It is well-acknowledged that the estimation of the noise-clutter covariance matrix is a fundamental step in adaptive radar detection. In this paper, we consider regularized estimation methods which force by construction the eigenvalues of the covariance estimates to be greater than a positive regularization parameter ρ. This makes them more suitable for high dimensional problems with a limited number of secondary data samples than traditional sample covariance estimates. The motivation behind this work is to understand the effect of ρ and properly set its value so as to improve estimate conditioning while maintaining a low estimation bias. More specifically, we consider the design of the ANMF detector for two kinds of regularized estimators, namely the regularized sample covariance matrix (RSCM) and the regularized Tyler estimator (RTE). The rationale behind this choice is that the RTE is efficient in mitigating the degradation caused by the presence of impulsive noises while inducing little loss when the noise is Gaussian. Based on asymptotic results brought by recent tools from random matrix theory, we propose a design for the regularization parameter that maximizes the asymptotic detection probability under constant asymptotic false alarm rates. The provided simulations support the efficiency of the proposed method, illustrating its gain over conventional settings of the regularization parameter.
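    A common fixed-point iteration for the regularized Tyler estimator (RTE) mentioned above can be sketched as follows (the exact normalization convention varies across papers; the heavy-tailed test data and the value of ρ are illustrative choices, not the article's experimental setup):

```python
import numpy as np

rng = np.random.default_rng(6)

def regularized_tyler(X, rho, n_iter=100, tol=1e-8):
    """Fixed-point iteration:
    Sigma <- (1-rho) * (p/n) * sum_i x_i x_i' / (x_i' Sigma^{-1} x_i) + rho*I,
    followed by trace normalization (one common convention).
    """
    n, p = X.shape
    Sigma = np.eye(p)
    for _ in range(n_iter):
        # Quadratic forms x_i' Sigma^{-1} x_i for all samples at once.
        Q = np.einsum('ij,jk,ik->i', X, np.linalg.inv(Sigma), X)
        S_new = (1 - rho) * (p / n) * (X.T / Q) @ X + rho * np.eye(p)
        S_new *= p / np.trace(S_new)
        if np.linalg.norm(S_new - Sigma) < tol:
            return S_new
        Sigma = S_new
    return Sigma

# Heavy-tailed (t_3) samples with a known Toeplitz scatter structure.
p, n = 10, 40
true = 0.5 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
g = rng.standard_normal((n, p)) @ np.linalg.cholesky(true).T
tau = np.sqrt(rng.chisquare(3, n) / 3)
X = g / tau[:, None]

Sigma = regularized_tyler(X, rho=0.3)
print("trace (normalized to p):", np.trace(Sigma))
```

    The ρI term bounds the eigenvalues away from zero by construction, which is exactly the conditioning property the article exploits; its random-matrix analysis is about choosing ρ optimally, which this sketch does not attempt.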

  11. Optimal Design of the Adaptive Normalized Matched Filter Detector using Regularized Tyler Estimators

    KAUST Repository

    Kammoun, Abla

    2017-10-25

    This article addresses improvements on the design of the adaptive normalized matched filter (ANMF) for radar detection. It is well-acknowledged that the estimation of the noise-clutter covariance matrix is a fundamental step in adaptive radar detection. In this paper, we consider regularized estimation methods which force by construction the eigenvalues of the covariance estimates to be greater than a positive regularization parameter ρ. This makes them more suitable for high dimensional problems with a limited number of secondary data samples than traditional sample covariance estimates. The motivation behind this work is to understand the effect of ρ and properly set its value so as to improve estimate conditioning while maintaining a low estimation bias. More specifically, we consider the design of the ANMF detector for two kinds of regularized estimators, namely the regularized sample covariance matrix (RSCM) and the regularized Tyler estimator (RTE). The rationale behind this choice is that the RTE is efficient in mitigating the degradation caused by the presence of impulsive noises while inducing little loss when the noise is Gaussian. Based on asymptotic results brought by recent tools from random matrix theory, we propose a design for the regularization parameter that maximizes the asymptotic detection probability under constant asymptotic false alarm rates. The provided simulations support the efficiency of the proposed method, illustrating its gain over conventional settings of the regularization parameter.

  12. Regular Expression Matching and Operational Semantics

    Directory of Open Access Journals (Sweden)

    Asiri Rathnayake

    2011-08-01

    Full Text Available Many programming languages and tools, ranging from grep to the Java String library, contain regular expression matchers. Rather than first translating a regular expression into a deterministic finite automaton, such implementations typically match the regular expression on the fly. Thus they can be seen as virtual machines interpreting the regular expression much as if it were a program with some non-deterministic constructs such as the Kleene star. We formalize this implementation technique for regular expression matching using operational semantics. Specifically, we derive a series of abstract machines, moving from the abstract definition of matching to increasingly realistic machines. First a continuation is added to the operational semantics to describe what remains to be matched after the current expression. Next, we represent the expression as a data structure using pointers, which enables redundant searches to be eliminated via testing for pointer equality. From there, we arrive both at Thompson's lockstep construction and a machine that performs some operations in parallel, suitable for implementation on a large number of cores, such as a GPU. We formalize the parallel machine using process algebra and report some preliminary experiments with an implementation on a graphics processor using CUDA.
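    The "matching on the fly" idea can be made concrete. Rather than reproduce the paper's abstract machines, here is a compact sketch of a closely related on-the-fly technique, Brzozowski derivatives: after each input character the regular expression is rewritten into the expression matching the remaining suffixes, and acceptance is a nullability check at the end. The AST encoding is invented for this example.

```python
# Regex AST: a one-character string, ('cat', r, s), ('alt', r, s),
# ('star', r); EPS matches the empty string; None is the empty language.
EPS = ('eps',)

def nullable(r):
    """Does r accept the empty string?"""
    if r is None or isinstance(r, str):
        return False
    if r == EPS or r[0] == 'star':
        return True
    if r[0] == 'cat':
        return nullable(r[1]) and nullable(r[2])
    return nullable(r[1]) or nullable(r[2])  # 'alt'

def cat(r, s):
    """Smart constructor for concatenation (simplifies trivial cases)."""
    if r is None or s is None:
        return None
    if r == EPS:
        return s
    if s == EPS:
        return r
    return ('cat', r, s)

def alt(r, s):
    """Smart constructor for alternation."""
    if r is None:
        return s
    if s is None:
        return r
    return ('alt', r, s)

def deriv(r, c):
    """Brzozowski derivative: the language of suffixes of r after reading c."""
    if r is None or r == EPS:
        return None
    if isinstance(r, str):
        return EPS if r == c else None
    if r[0] == 'cat':
        d = cat(deriv(r[1], c), r[2])
        return alt(d, deriv(r[2], c)) if nullable(r[1]) else d
    if r[0] == 'alt':
        return alt(deriv(r[1], c), deriv(r[2], c))
    return cat(deriv(r[1], c), r)  # 'star'

def matches(r, s):
    for c in s:
        r = deriv(r, c)
    return nullable(r)

# Example pattern: (a|b)*c
regex = ('cat', ('star', ('alt', 'a', 'b')), 'c')
```

    For the example pattern, `matches(regex, "abbac")` is True and `matches(regex, "abca")` is False; like the virtual machines in the abstract, no DFA is built ahead of time.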

  13. Regularities, Natural Patterns and Laws of Nature

    Directory of Open Access Journals (Sweden)

    Stathis Psillos

    2014-02-01

    Full Text Available  The goal of this paper is to sketch an empiricist metaphysics of laws of nature. The key idea is that there are regularities without regularity-enforcers. Differently put, there are natural laws without law-makers of a distinct metaphysical kind. This sketch will rely on the concept of a natural pattern and more significantly on the existence of a network of natural patterns in nature. The relation between a regularity and a pattern will be analysed in terms of mereology.  Here is the road map. In section 2, I will briefly discuss the relation between empiricism and metaphysics, aiming to show that an empiricist metaphysics is possible. In section 3, I will offer arguments against stronger metaphysical views of laws. Then, in section 4 I will motivate nomic objectivism. In section 5, I will address the question ‘what is a regularity?’ and will develop a novel answer to it, based on the notion of a natural pattern. In section 6, I will raise the question: ‘what is a law of nature?’, the answer to which will be: a law of nature is a regularity that is characterised by the unity of a natural pattern.

  14. Regular non-twisting S-branes

    International Nuclear Information System (INIS)

    Obregon, Octavio; Quevedo, Hernando; Ryan, Michael P.

    2004-01-01

    We construct a family of time and angular dependent, regular S-brane solutions which corresponds to a simple analytical continuation of the Zipoy-Voorhees 4-dimensional vacuum spacetime. The solutions are asymptotically flat and turn out to be free of singularities without requiring a twist in space. They can be considered as the simplest non-singular generalization of the singular S0-brane solution. We analyze the properties of a representative of this family of solutions and show that it resembles to some extent the asymptotic properties of the regular Kerr S-brane. The R-symmetry corresponds, however, to the general Lorentzian symmetry. Several generalizations of this regular solution are derived which include a charged S-brane and an additional dilatonic field. (author)

  15. Online Manifold Regularization by Dual Ascending Procedure

    Directory of Open Access Journals (Sweden)

    Boliang Sun

    2013-01-01

    Full Text Available We propose a novel online manifold regularization framework based on the notion of duality in constrained optimization. The Fenchel conjugate of hinge functions is a key to transfer manifold regularization from offline to online in this paper. Our algorithms are derived by gradient ascent in the dual function. For practical purposes, we propose two buffering strategies and two sparse approximations to reduce the computational complexity. Detailed experiments verify the utility of our approaches. An important conclusion is that our online MR algorithms can handle the settings where the target hypothesis is not fixed but drifts with the sequence of examples. We also recap and draw connections to earlier works. This paper paves the way to the design and analysis of online manifold regularization algorithms.
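    The offline counterpart of the manifold regularization that this paper moves online can be sketched in a few lines: a Laplacian-regularized least-squares fit over labeled and unlabeled points. This is a batch illustration only, not the paper's dual-ascent online algorithm, and the data, kernel, and parameter values are invented for the example.

```python
import numpy as np

# Two well-separated clusters; only one labeled example per cluster.
rng = np.random.default_rng(0)
pts = np.vstack([rng.normal(0.0, 0.5, (20, 2)),
                 rng.normal(5.0, 0.5, (20, 2))])
X = np.hstack([pts, np.ones((40, 1))])   # linear model with a bias feature
labeled = [0, 20]
y = np.array([-1.0, 1.0])

# Graph Laplacian from RBF affinities over labeled AND unlabeled points.
D2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
W = np.exp(-D2 / 2.0)
Lap = np.diag(W.sum(axis=1)) - W

# Closed-form minimizer of:
#   squared loss on labeled points
#   + lam_a * ||w||^2            (ambient regularizer)
#   + lam_i * f^T Lap f, f = Xw  (manifold regularizer)
lam_a, lam_i = 1e-3, 1e-2
Xl = X[labeled]
w = np.linalg.solve(Xl.T @ Xl + lam_a * np.eye(3) + lam_i * X.T @ Lap @ X,
                    Xl.T @ y)
f = X @ w   # scores for all 40 points; sign gives the predicted class
```

    The manifold term propagates the two labels across each cluster through the graph, which is exactly the effect the online algorithms in the paper have to maintain incrementally.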

  16. Task-based detectability in CT image reconstruction by filtered backprojection and penalized likelihood estimation

    Energy Technology Data Exchange (ETDEWEB)

    Gang, Grace J. [Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Ontario M5G 2M9, Canada and Department of Biomedical Engineering, Johns Hopkins University, Baltimore Maryland 21205 (Canada); Stayman, J. Webster; Zbijewski, Wojciech [Department of Biomedical Engineering, Johns Hopkins University, Baltimore Maryland 21205 (United States); Siewerdsen, Jeffrey H., E-mail: jeff.siewerdsen@jhu.edu [Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Ontario M5G 2M9, Canada and Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Maryland 21205 (United States)

    2014-08-15

    Purpose: Nonstationarity is an important aspect of imaging performance in CT and cone-beam CT (CBCT), especially for systems employing iterative reconstruction. This work presents a theoretical framework for both filtered-backprojection (FBP) and penalized-likelihood (PL) reconstruction that includes explicit descriptions of nonstationary noise, spatial resolution, and task-based detectability index. Potential utility of the model was demonstrated in the optimal selection of regularization parameters in PL reconstruction. Methods: Analytical models for local modulation transfer function (MTF) and noise-power spectrum (NPS) were investigated for both FBP and PL reconstruction, including explicit dependence on the object and spatial location. For FBP, a cascaded systems analysis framework was adapted to account for nonstationarity by separately calculating fluence and system gains for each ray passing through any given voxel. For PL, the point-spread function and covariance were derived using the implicit function theorem and first-order Taylor expansion according to Fessler [“Mean and variance of implicitly defined biased estimators (such as penalized maximum likelihood): Applications to tomography,” IEEE Trans. Image Process. 5(3), 493–506 (1996)]. Detectability index was calculated for a variety of simple tasks. The model for PL was used in selecting the regularization strength parameter to optimize task-based performance, with both a constant and a spatially varying regularization map. Results: Theoretical models of FBP and PL were validated in 2D simulated fan-beam data and found to yield accurate predictions of local MTF and NPS as a function of the object and the spatial location. The NPS for both FBP and PL exhibit similar anisotropic nature depending on the pathlength (and therefore, the object and spatial location within the object) traversed by each ray, with the PL NPS experiencing greater smoothing along directions with higher noise. The MTF of FBP
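    A task-based detectability index of the kind used above can be computed from the local MTF and NPS under a model observer. The following is a hedged numpy sketch for the non-prewhitening (NPW) observer on a toy frequency grid; the Gaussian MTF, power-law NPS, and low-pass task function are invented for illustration and are not the paper's models.

```python
import numpy as np

def dprime_npw(mtf, nps, w_task, df):
    """Non-prewhitening observer detectability index.

    mtf, nps, w_task: 2-D arrays sampled on the same spatial-frequency
    grid; df: area of one frequency bin. Implements
    d'^2 = [sum (MTF*W)^2 df]^2 / [sum NPS*(MTF*W)^2 df].
    """
    num = (np.sum((mtf * w_task) ** 2) * df) ** 2
    den = np.sum(nps * (mtf * w_task) ** 2) * df
    return np.sqrt(num / den)

# Toy example: Gaussian MTF, rising NPS, simple low-frequency task.
f = np.fft.fftfreq(64, d=0.5)          # cycles/mm for 0.5 mm sampling
fx, fy = np.meshgrid(f, f)
fr = np.hypot(fx, fy)
mtf = np.exp(-(fr / 0.8) ** 2)
nps = 1e-6 * (0.1 + fr)                # arbitrary units
w_task = (fr < 0.3).astype(float)      # illustrative task function
d = dprime_npw(mtf, nps, w_task, df=(f[1] - f[0]) ** 2)
```

    Doubling the NPS lowers d' by exactly a factor of sqrt(2), which is a quick sanity check on the implementation.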

  17. Spatial Operations

    Directory of Open Access Journals (Sweden)

    Anda VELICANU

    2010-09-01

    Full Text Available This paper contains a brief description of the most important operations that can be performed on spatial data such as spatial queries, create, update, insert, delete operations, conversions, operations on the map or analysis on grid cells. Each operation has a graphical example and some of them have code examples in Oracle and PostgreSQL.

  18. Spatializing Time

    DEFF Research Database (Denmark)

    Thomsen, Bodil Marie Stavning

    2011-01-01

    The article analyses some of artist Søren Lose's photographic installations in which time, history and narration are reflected in the creation of allegoric, spatial relations.

  19. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today’s microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels...Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized...for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  20. Stochastic dynamic modeling of regular and slow earthquakes

    Science.gov (United States)

    Aso, N.; Ando, R.; Ide, S.

    2017-12-01

    Both regular and slow earthquakes are slip phenomena on plate boundaries and are simulated by (quasi-)dynamic modeling [Liu and Rice, 2005]. In these numerical simulations, spatial heterogeneity is usually considered not only to explain real physical properties but also to evaluate the stability of the calculations or the sensitivity of the results to the conditions. However, even if we discretize the model space with small grids, heterogeneity at scales smaller than the grid size is not considered in models with deterministic governing equations. To evaluate the effect of heterogeneity at these smaller scales, we need to consider stochastic interactions between slip and stress in a dynamic model. Tidal stress is known to trigger or affect both regular and slow earthquakes [Yabe et al., 2015; Ide et al., 2016], and such a fluctuating external force can also be treated as a stochastic external force. A healing process of faults may also be stochastic, so we introduce a stochastic friction law. In the present study, we propose a stochastic dynamic model to explain both regular and slow earthquakes. We solve the mode III problem, which corresponds to rupture propagation along the strike direction. We use a BIEM (boundary integral equation method) scheme to simulate slip evolution, but we add stochastic perturbations to the governing equations, which are usually written in a deterministic manner. As the simplest type of perturbation, we adopt Gaussian deviations in the formulation of the slip-stress kernel, the external force, and the friction. By increasing the amplitude of the perturbations of the slip-stress kernel, we reproduce the complicated rupture processes of regular earthquakes, including unilateral and bilateral ruptures. By perturbing the external force, we reproduce slow rupture propagation at a scale of km/day. The slow propagation generated by a combination of fast interactions at S-wave velocity is analogous to the kinetic theory of gases: thermal

  1. A simple homogeneous model for regular and irregular metallic wire media samples

    Science.gov (United States)

    Kosulnikov, S. Y.; Mirmoosa, M. S.; Simovski, C. R.

    2018-02-01

    To simplify the solution of electromagnetic problems with wire media samples, it is reasonable to treat them as the samples of a homogeneous material without spatial dispersion. The account of spatial dispersion implies additional boundary conditions and makes the solution of boundary problems difficult especially if the sample is not an infinitely extended layer. Moreover, for a novel type of wire media - arrays of randomly tilted wires - a spatially dispersive model has not been developed. Here, we introduce a simplistic heuristic model of wire media samples shaped as bricks. Our model covers WM of both regularly and irregularly stretched wires.

  2. Regular transport dynamics produce chaotic travel times.

    Science.gov (United States)

    Villalobos, Jorge; Muñoz, Víctor; Rogan, José; Zarama, Roberto; Johnson, Neil F; Toledo, Benjamín; Valdivia, Juan Alejandro

    2014-06-01

    In the hope of making passenger travel times shorter and more reliable, many cities are introducing dedicated bus lanes (e.g., Bogota, London, Miami). Here we show that chaotic travel times are actually a natural consequence of individual bus function, and hence of public transport systems more generally, i.e., chaotic dynamics emerge even when the route is empty and straight, stops and lights are equidistant and regular, and loading times are negligible. More generally, our findings provide a novel example of chaotic dynamics emerging from a single object following Newton's laws of motion in a regularized one-dimensional system.

  3. Regularity of difference equations on Banach spaces

    CERN Document Server

    Agarwal, Ravi P; Lizama, Carlos

    2014-01-01

    This work introduces readers to the topic of maximal regularity for difference equations. The authors systematically present the method of maximal regularity, outlining basic linear difference equations along with relevant results. They address recent advances in the field, as well as basic semigroup and cosine operator theories in the discrete setting. The authors also identify some open problems that readers may wish to take up for further research. This book is intended for graduate students and researchers in the area of difference equations, particularly those with advanced knowledge of and interest in functional analysis.

  4. PET regularization by envelope guided conjugate gradients

    International Nuclear Information System (INIS)

    Kaufman, L.; Neumaier, A.

    1996-01-01

    The authors propose a new way to iteratively solve large scale ill-posed problems and in particular the image reconstruction problem in positron emission tomography by exploiting the relation between Tikhonov regularization and multiobjective optimization to obtain iteratively approximations to the Tikhonov L-curve and its corner. Monitoring the change of the approximate L-curves allows us to adjust the regularization parameter adaptively during a preconditioned conjugate gradient iteration, so that the desired solution can be reconstructed with a small number of iterations
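    The Tikhonov L-curve that the authors approximate during the conjugate-gradient iteration can be illustrated directly: sweep the regularization parameter, plot residual norm against solution norm, and look for the corner. A small numpy sketch on a synthetic ill-conditioned problem; the matrix construction, noise level, and parameter grid are all illustrative assumptions.

```python
import numpy as np

# Tikhonov regularization: minimize ||Ax - b||^2 + lam^2 ||x||^2.
rng = np.random.default_rng(1)
n = 40
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 10.0 ** np.linspace(0, -6, n)      # rapidly decaying (ill-posed) spectrum
A = U * s @ U.T                        # symmetric matrix with spectrum s
x_true = np.ones(n)
b = A @ x_true + 1e-4 * rng.standard_normal(n)

lams = 10.0 ** np.linspace(-8, 1, 40)
res_norms, sol_norms = [], []
for lam in lams:
    # Closed-form Tikhonov solution via the normal equations.
    x = np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)
    res_norms.append(np.linalg.norm(A @ x - b))
    sol_norms.append(np.linalg.norm(x))
# Plotting log(res_norms) vs log(sol_norms) traces out the L-curve;
# the corner balances data fidelity against solution size.
```

    Under-regularized solutions sit on the steep branch (huge solution norm, tiny residual); over-regularized ones sit on the flat branch. The paper's contribution is obtaining this curve cheaply inside a preconditioned CG iteration rather than by solving for every λ as done here.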

  5. Matrix regularization of embedded 4-manifolds

    International Nuclear Information System (INIS)

    Trzetrzelewski, Maciej

    2012-01-01

    We consider products of two 2-manifolds such as S²×S², embedded in Euclidean space, and show that the corresponding 4-volume preserving diffeomorphism algebra can be approximated by a tensor product SU(N)⊗SU(N), i.e. functions on a manifold are approximated by the Kronecker product of two SU(N) matrices. A regularization of the 4-sphere is also performed by constructing N²×N² matrix representations of the 4-algebra (and, as a byproduct, of the 3-algebra, which makes the regularization of S³ also possible).

  6. Remote Sensing-Based Detection and Spatial Pattern Analysis for Geo-Ecological Niche Modeling of Tillandsia SPP. In the Atacama, Chile

    Science.gov (United States)

    Wolf, N.; Siegmund, A.; del Río, C.; Osses, P.; García, J. L.

    2016-06-01

    In the coastal Atacama Desert in Northern Chile, plant growth is constrained to so-called `fog oases' dominated by monospecific stands of the genus Tillandsia. Adapted to the hyperarid environmental conditions, these plants specialize in the foliar uptake of fog as their main water and nutrient source. It is this characteristic that leads to distinctive macro- and micro-scale distribution patterns, reflecting complex geo-ecological gradients mainly shaped by the spatiotemporal occurrence of coastal fog, i.e. the South Pacific Stratocumulus clouds reaching inland. The current work employs remote sensing, machine learning and spatial pattern/GIS analysis techniques to acquire detailed information on the presence and state of Tillandsia spp. in the Tarapacá region as a basis to better understand the bioclimatic and topographic constraints determining the distribution patterns of Tillandsia spp. Spatial and spectral predictors extracted from WorldView-3 satellite data are used to map present Tillandsia vegetation in the Tarapacá region. Regression models of Vegetation Cover Fraction (VCF) are generated combining satellite-based as well as topographic variables and using aggregated high-spatial-resolution information on vegetation cover derived from UAV flight campaigns as a reference. The results are a first step towards mapping and modelling the topographic as well as bioclimatic factors explaining the spatial distribution patterns of Tillandsia fog oases in the Atacama, Chile.

  7. REMOTE SENSING-BASED DETECTION AND SPATIAL PATTERN ANALYSIS FOR GEO-ECOLOGICAL NICHE MODELING OF TILLANDSIA SPP. IN THE ATACAMA, CHILE

    Directory of Open Access Journals (Sweden)

    N. Wolf

    2016-06-01

    Full Text Available In the coastal Atacama Desert in Northern Chile, plant growth is constrained to so-called ‘fog oases’ dominated by monospecific stands of the genus Tillandsia. Adapted to the hyperarid environmental conditions, these plants specialize in the foliar uptake of fog as their main water and nutrient source. It is this characteristic that leads to distinctive macro- and micro-scale distribution patterns, reflecting complex geo-ecological gradients mainly shaped by the spatiotemporal occurrence of coastal fog, i.e. the South Pacific Stratocumulus clouds reaching inland. The current work employs remote sensing, machine learning and spatial pattern/GIS analysis techniques to acquire detailed information on the presence and state of Tillandsia spp. in the Tarapacá region as a basis to better understand the bioclimatic and topographic constraints determining the distribution patterns of Tillandsia spp. Spatial and spectral predictors extracted from WorldView-3 satellite data are used to map present Tillandsia vegetation in the Tarapacá region. Regression models of Vegetation Cover Fraction (VCF) are generated combining satellite-based as well as topographic variables and using aggregated high-spatial-resolution information on vegetation cover derived from UAV flight campaigns as a reference. The results are a first step towards mapping and modelling the topographic as well as bioclimatic factors explaining the spatial distribution patterns of Tillandsia fog oases in the Atacama, Chile.

  8. Interactive facades analysis and synthesis of semi-regular facades

    KAUST Repository

    AlHalawani, Sawsan; Yang, Yongliang; Liu, Han; Mitra, Niloy J.

    2013-01-01

    Urban facades regularly contain interesting variations due to allowed deformations of repeated elements (e.g., windows in different open or close positions) posing challenges to state-of-the-art facade analysis algorithms. We propose a semi-automatic framework to recover both repetition patterns of the elements and their individual deformation parameters to produce a factored facade representation. Such a representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. © 2013 The Author(s) Computer Graphics Forum © 2013 The Eurographics Association and Blackwell Publishing Ltd.

  9. Interactive facades analysis and synthesis of semi-regular facades

    KAUST Repository

    AlHalawani, Sawsan

    2013-05-01

    Urban facades regularly contain interesting variations due to allowed deformations of repeated elements (e.g., windows in different open or close positions) posing challenges to state-of-the-art facade analysis algorithms. We propose a semi-automatic framework to recover both repetition patterns of the elements and their individual deformation parameters to produce a factored facade representation. Such a representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. © 2013 The Author(s) Computer Graphics Forum © 2013 The Eurographics Association and Blackwell Publishing Ltd.

  10. Salient Object Detection via Structured Matrix Decomposition.

    Science.gov (United States)

    Peng, Houwen; Li, Bing; Ling, Haibin; Hu, Weiming; Xiong, Weihua; Maybank, Stephen J

    2016-05-04

    Low-rank recovery models have shown potential for salient object detection, where a matrix is decomposed into a low-rank matrix representing image background and a sparse matrix identifying salient objects. Two deficiencies, however, still exist. First, previous work typically assumes the elements in the sparse matrix are mutually independent, ignoring the spatial and pattern relations of image regions. Second, when the low-rank and sparse matrices are relatively coherent, e.g., when there are similarities between the salient objects and background or when the background is complicated, it is difficult for previous models to disentangle them. To address these problems, we propose a novel structured matrix decomposition model with two structural regularizations: (1) a tree-structured sparsity-inducing regularization that captures the image structure and enforces patches from the same object to have similar saliency values, and (2) a Laplacian regularization that enlarges the gaps between salient objects and the background in feature space. Furthermore, high-level priors are integrated to guide the matrix decomposition and boost the detection. We evaluate our model for salient object detection on five challenging datasets including single object, multiple objects and complex scene images, and show competitive results as compared with 24 state-of-the-art methods in terms of seven performance metrics.
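    The low-rank-plus-sparse idea underlying this model can be illustrated with a much simpler GoDec-style alternation than the authors' structured decomposition (no tree-structured sparsity or Laplacian terms): alternately project onto a fixed-rank background and soft-threshold the residual to get the sparse "salient" part. The matrix sizes, rank, and λ below are toy assumptions.

```python
import numpy as np

def soft(X, t):
    """Elementwise soft-thresholding (proximal operator of the l1 norm)."""
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def lowrank_sparse(D, rank, lam, n_iter=50):
    """Split D into a rank-`rank` background L and a sparse part S by
    alternating a truncated SVD with soft-thresholding of the residual."""
    S = np.zeros_like(D)
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(D - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]   # rank-truncated fit
        S = soft(D - L, lam)                        # sparse residual
    return L, S

# Toy data: smooth rank-1 "background" plus a few large "salient" spikes.
L_true = np.outer(np.linspace(1.0, 2.0, 20), np.linspace(1.0, 2.0, 30))
S_true = np.zeros((20, 30))
spikes = [7, 123, 256, 341, 470, 555]
S_true.flat[spikes] = 10.0
L_hat, S_hat = lowrank_sparse(L_true + S_true, rank=1, lam=1.0)
```

    On this toy problem the alternation separates the spikes from the smooth background; the paper's point is precisely that such plain element-wise sparsity fails on real images, motivating the structured regularizers.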

  11. On a correspondence between regular and non-regular operator monotone functions

    DEFF Research Database (Denmark)

    Gibilisco, P.; Hansen, Frank; Isola, T.

    2009-01-01

    We prove the existence of a bijection between the regular and the non-regular operator monotone functions satisfying a certain functional equation. As an application we give a new proof of the operator monotonicity of certain functions related to the Wigner-Yanase-Dyson skew information....

  12. Spatial capture-recapture: a promising method for analyzing data collected using artificial cover objects

    Science.gov (United States)

    Sutherland, Chris; Munoz, David; Miller, David A.W.; Grant, Evan H. Campbell

    2016-01-01

    Spatial capture–recapture (SCR) is a relatively recent development in ecological statistics that provides a spatial context for estimating abundance and space use patterns, and improves inference about absolute population density. SCR has been applied to individual encounter data collected noninvasively using methods such as camera traps, hair snares, and scat surveys. Despite the widespread use of capture-based surveys to monitor amphibians and reptiles, there are few applications of SCR in the herpetological literature. We demonstrate the utility of the application of SCR for studies of reptiles and amphibians by analyzing capture–recapture data from Red-Backed Salamanders, Plethodon cinereus, collected using artificial cover boards. Using SCR to analyze spatial encounter histories of marked individuals, we found evidence that density differed little among four sites within the same forest (on average, 1.59 salamanders/m2) and that salamander detection probability peaked in early October (Julian day 278) reflecting expected surface activity patterns of the species. The spatial scale of detectability, a measure of space use, indicates that the home range size for this population of Red-Backed Salamanders in autumn was 16.89 m2. Surveying reptiles and amphibians using artificial cover boards regularly generates spatial encounter history data of known individuals, which can readily be analyzed using SCR methods, providing estimates of absolute density and inference about the spatial scale of habitat use.
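    The "spatial scale of detectability" above maps to a home-range area under the half-normal encounter model commonly used in SCR. A small sketch, assuming (hypothetically, since the abstract does not specify its conventions) a bivariate-normal space-use model and a 95% home-range definition:

```python
import math

def halfnormal_detection(d, p0, sigma):
    """Half-normal SCR encounter model: detection probability declines
    with distance d between an activity centre and a trap (cover board)."""
    return p0 * math.exp(-d**2 / (2 * sigma**2))

def home_range_area(sigma, quantile=0.95):
    """Area of the circle containing `quantile` of a bivariate normal
    space-use distribution with scale sigma. For 2 degrees of freedom the
    chi-square quantile has the closed form -2*ln(1 - q)."""
    r2 = -2 * math.log(1 - quantile)
    return math.pi * sigma**2 * r2
```

    Under these assumptions, a spatial scale of roughly sigma = 0.95 m would correspond to a 95% home range of about 16.9 m², close to the area reported in the abstract.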

  13. Regularity and irreversibility of weekly travel behavior

    NARCIS (Netherlands)

    Kitamura, R.; van der Hoorn, A.I.J.M.

    1987-01-01

    Dynamic characteristics of travel behavior are analyzed in this paper using weekly travel diaries from two waves of panel surveys conducted six months apart. An analysis of activity engagement indicates the presence of significant regularity in weekly activity participation between the two waves.

  14. Regular and context-free nominal traces

    DEFF Research Database (Denmark)

    Degano, Pierpaolo; Ferrari, Gian-Luigi; Mezzetti, Gianluca

    2017-01-01

    Two kinds of automata are presented, for recognising new classes of regular and context-free nominal languages. We compare their expressive power with analogous proposals in the literature, showing that they express novel classes of languages. Although many properties of classical languages hold ...

  15. Faster 2-regular information-set decoding

    NARCIS (Netherlands)

    Bernstein, D.J.; Lange, T.; Peters, C.P.; Schwabe, P.; Chee, Y.M.

    2011-01-01

    Fix positive integers B and w. Let C be a linear code over F 2 of length Bw. The 2-regular-decoding problem is to find a nonzero codeword consisting of w length-B blocks, each of which has Hamming weight 0 or 2. This problem appears in attacks on the FSB (fast syndrome-based) hash function and

  16. Complexity in union-free regular languages

    Czech Academy of Sciences Publication Activity Database

    Jirásková, G.; Masopust, Tomáš

    2011-01-01

    Vol. 22, No. 7 (2011), pp. 1639-1653. ISSN 0129-0541. Institutional research plan: CEZ:AV0Z10190503. Keywords: union-free regular language; one-cycle-free-path automaton; descriptional complexity. Subject RIV: BA - General Mathematics. Impact factor: 0.379, year: 2011. http://www.worldscinet.com/ijfcs/22/2207/S0129054111008933.html

  17. Regular Gleason Measures and Generalized Effect Algebras

    Science.gov (United States)

    Dvurečenskij, Anatolij; Janda, Jiří

    2015-12-01

    We study measures, finitely additive measures, regular measures, and σ-additive measures that can attain even infinite values on the quantum logic of a Hilbert space. We show when particular classes of non-negative measures can be studied in the frame of generalized effect algebras.

  18. Regularization of finite temperature string theories

    International Nuclear Information System (INIS)

    Leblanc, Y.; Knecht, M.; Wallet, J.C.

    1990-01-01

    The tachyonic divergences occurring in the free energy of various string theories at finite temperature are eliminated through the use of regularization schemes and analytic continuations. For closed strings, we obtain finite expressions which, however, develop an imaginary part above the Hagedorn temperature, whereas open string theories are still plagued with dilatonic divergences. (orig.)

  19. A Sim(2) invariant dimensional regularization

    Directory of Open Access Journals (Sweden)

    J. Alfaro

    2017-09-01

    Full Text Available We introduce a Sim(2) invariant dimensional regularization of loop integrals. Then we can compute the one loop quantum corrections to the photon self energy, electron self energy and vertex in the Electrodynamics sector of the Very Special Relativity Standard Model (VSRSM).

  20. Continuum regularized Yang-Mills theory

    International Nuclear Information System (INIS)

    Sadun, L.A.

    1987-01-01

    Using the machinery of stochastic quantization, Z. Bern, M. B. Halpern, C. Taubes and I recently proposed a continuum regularization technique for quantum field theory. This regularization may be implemented by applying a regulator to either the (d + 1)-dimensional Parisi-Wu Langevin equation or, equivalently, to the d-dimensional second order Schwinger-Dyson (SD) equations. This technique is non-perturbative, respects all gauge and Lorentz symmetries, and is consistent with a ghost-free gauge fixing (Zwanziger's). This thesis is a detailed study of this regulator, and of regularized Yang-Mills theory, using both perturbative and non-perturbative techniques. The perturbative analysis comes first. The mechanism of stochastic quantization is reviewed, and a perturbative expansion based on second-order SD equations is developed. A diagrammatic method (SD diagrams) for evaluating terms of this expansion is developed. We apply the continuum regulator to a scalar field theory. Using SD diagrams, we show that all Green functions can be rendered finite to all orders in perturbation theory. Even non-renormalizable theories can be regularized. The continuum regulator is then applied to Yang-Mills theory, in conjunction with Zwanziger's gauge fixing. A perturbative expansion of the regulator is incorporated into the diagrammatic method. It is hoped that the techniques discussed in this thesis will contribute to the construction of a renormalized Yang-Mills theory in 3 and 4 dimensions

  1. Gravitational lensing by a regular black hole

    International Nuclear Information System (INIS)

    Eiroa, Ernesto F; Sendra, Carlos M

    2011-01-01

    In this paper, we study a regular Bardeen black hole as a gravitational lens. We find the strong deflection limit for the deflection angle, from which we obtain the positions and magnifications of the relativistic images. As an example, we apply the results to the particular case of the supermassive black hole at the center of our galaxy.

  2. Gravitational lensing by a regular black hole

    Energy Technology Data Exchange (ETDEWEB)

    Eiroa, Ernesto F; Sendra, Carlos M, E-mail: eiroa@iafe.uba.ar, E-mail: cmsendra@iafe.uba.ar [Instituto de Astronomia y Fisica del Espacio, CC 67, Suc. 28, 1428, Buenos Aires (Argentina)

    2011-04-21

    In this paper, we study a regular Bardeen black hole as a gravitational lens. We find the strong deflection limit for the deflection angle, from which we obtain the positions and magnifications of the relativistic images. As an example, we apply the results to the particular case of the supermassive black hole at the center of our galaxy.

  3. Analytic stochastic regularization and gauge invariance

    International Nuclear Information System (INIS)

    Abdalla, E.; Gomes, M.; Lima-Santos, A.

    1986-05-01

    A proof that analytic stochastic regularization breaks gauge invariance is presented. This is done by an explicit one loop calculation of the vacuum polarization tensor in scalar electrodynamics, which turns out not to be transversal. The counterterm structure, Langevin equations and the construction of composite operators in the general framework of stochastic quantization, are also analysed. (Author) [pt

  4. Annotation of regular polysemy and underspecification

    DEFF Research Database (Denmark)

    Martínez Alonso, Héctor; Pedersen, Bolette Sandford; Bel, Núria

    2013-01-01

    We present the result of an annotation task on regular polysemy for a series of semantic classes or dot types in English, Danish and Spanish. This article describes the annotation process, the results in terms of inter-encoder agreement, and the sense distributions obtained with two methods

  5. Stabilization, pole placement, and regular implementability

    NARCIS (Netherlands)

    Belur, MN; Trentelman, HL

    In this paper, we study control by interconnection of linear differential systems. We give necessary and sufficient conditions for regular implementability of a-given linear, differential system. We formulate the problems of stabilization and pole placement as problems of finding a suitable,

  6. 12 CFR 725.3 - Regular membership.

    Science.gov (United States)

    2010-01-01

    ... UNION ADMINISTRATION CENTRAL LIQUIDITY FACILITY § 725.3 Regular membership. (a) A natural person credit....5(b) of this part, and forwarding with its completed application funds equal to one-half of this... 1, 1979, is not required to forward these funds to the Facility until October 1, 1979. (3...

  7. Supervised scale-regularized linear convolutionary filters

    DEFF Research Database (Denmark)

    Loog, Marco; Lauze, Francois Bernard

    2017-01-01

    also be solved relatively efficiently. All in all, the idea is to properly control the scale of a trained filter, which we solve by introducing a specific regularization term into the overall objective function. We demonstrate, on an artificial filter learning problem, the capabilities of our basic...

  8. On regular riesz operators | Raubenheimer | Quaestiones ...

    African Journals Online (AJOL)

    The r-asymptotically quasi finite rank operators on Banach lattices are examples of regular Riesz operators. We characterise Riesz elements in a subalgebra of a Banach algebra in terms of Riesz elements in the Banach algebra. This enables us to characterise r-asymptotically quasi finite rank operators in terms of adjoint ...

  9. Regularized Discriminant Analysis: A Large Dimensional Study

    KAUST Repository

    Yang, Xiaoke

    2018-04-28

    In this thesis, we focus on studying the performance of general regularized discriminant analysis (RDA) classifiers. The data used for analysis is assumed to follow a Gaussian mixture model with different means and covariances. RDA offers a rich class of regularization options, covering as special cases the regularized linear discriminant analysis (RLDA) and the regularized quadratic discriminant analysis (RQDA) classifiers. We analyze RDA under the double asymptotic regime where the data dimension and the training size both increase in a proportional way. This double asymptotic regime allows for the application of fundamental results from random matrix theory. Under the double asymptotic regime and some mild assumptions, we show that the asymptotic classification error converges to a deterministic quantity that only depends on the data statistical parameters and dimensions. This result not only reveals some mathematical relations between the misclassification error and the class statistics, but can also be leveraged to select the optimal parameters that minimize the classification error, thus yielding the optimal classifier. Validation results on synthetic data show good accuracy of our theoretical findings. We also construct a general consistent estimator to approximate the true classification error when the true statistics are unknown. We benchmark the performance of our proposed consistent estimator against the classical estimator on synthetic data. The observations demonstrate that the general estimator outperforms the others in terms of mean squared error (MSE).
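    The RDA regularization family can be illustrated with a Friedman-style parameterization of the class covariance. This is a minimal sketch for illustration only; the function name and the exact convention used in the thesis are assumptions.

```python
import numpy as np

def rda_covariance(S_class, S_pooled, lam, gamma):
    """Regularized class covariance in the spirit of RDA: interpolate
    between the class covariance (QDA limit, lam=0) and the pooled
    covariance (LDA limit, lam=1), then shrink toward a scaled
    identity by the amount gamma."""
    S = (1.0 - lam) * S_class + lam * S_pooled
    p = S.shape[0]
    return (1.0 - gamma) * S + gamma * (np.trace(S) / p) * np.eye(p)

# gamma = 1 gives a spherical covariance regardless of the inputs:
S_class = np.array([[4.0, 1.0], [1.0, 2.0]])
S_pooled = np.eye(2)
S_reg = rda_covariance(S_class, S_pooled, lam=0.0, gamma=1.0)
print(S_reg)  # 3.0 * identity (trace 6 divided by p = 2)
```

Sweeping (lam, gamma) over [0, 1] x [0, 1] traces out the whole family between RQDA and RLDA, which is the parameter space the asymptotic analysis above optimizes over.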

  10. Complexity in union-free regular languages

    Czech Academy of Sciences Publication Activity Database

    Jirásková, G.; Masopust, Tomáš

    2011-01-01

    Roč. 22, č. 7 (2011), s. 1639-1653 ISSN 0129-0541 Institutional research plan: CEZ:AV0Z10190503 Keywords : Union-free regular language * one-cycle-free-path automaton * descriptional complexity Subject RIV: BA - General Mathematics Impact factor: 0.379, year: 2011 http://www.worldscinet.com/ijfcs/22/2207/S0129054111008933.html

  11. Bit-coded regular expression parsing

    DEFF Research Database (Denmark)

    Nielsen, Lasse; Henglein, Fritz

    2011-01-01

    the DFA-based parsing algorithm due to Dubé and Feeley to emit the bits of the bit representation without explicitly materializing the parse tree itself. We furthermore show that Frisch and Cardelli’s greedy regular expression parsing algorithm can be straightforwardly modified to produce bit codings...

  12. Bias correction for magnetic resonance images via joint entropy regularization.

    Science.gov (United States)

    Wang, Shanshan; Xia, Yong; Dong, Pei; Luo, Jianhua; Huang, Qiu; Feng, Dagan; Li, Yuanxiang

    2014-01-01

    Due to the imperfections of the radio frequency (RF) coil or object-dependent electrodynamic interactions, magnetic resonance (MR) images often suffer from a smooth and biologically meaningless bias field, which causes severe problems for subsequent processing and quantitative analysis. To effectively restore the original signal, this paper simultaneously exploits the spatial and gradient features of the corrupted MR images for bias correction via joint entropy regularization. With both isotropic and anisotropic total variation (TV) considered, two nonparametric bias correction algorithms are proposed, namely IsoTVBiasC and AniTVBiasC. These two methods have been applied to simulated images under various noise levels and bias field corruption and also tested on real MR data. The test results show that the proposed methods can effectively remove the bias field and perform comparably to state-of-the-art methods.

  13. Efficient regularization with wavelet sparsity constraints in photoacoustic tomography

    Science.gov (United States)

    Frikel, Jürgen; Haltmeier, Markus

    2018-02-01

    In this paper, we consider the reconstruction problem of photoacoustic tomography (PAT) with a flat observation surface. We develop a direct reconstruction method that employs regularization with wavelet sparsity constraints. To that end, we derive a wavelet-vaguelette decomposition (WVD) for the PAT forward operator and a corresponding explicit reconstruction formula in the case of exact data. In the case of noisy data, we combine the WVD reconstruction formula with soft-thresholding, which yields a spatially adaptive estimation method. We demonstrate that our method is statistically optimal for white random noise if the unknown function is assumed to lie in any Besov-ball. We present generalizations of this approach and, in particular, we discuss the combination of PAT-vaguelette soft-thresholding with a total variation (TV) prior. We also provide an efficient implementation of the PAT-vaguelette transform that leads to fast image reconstruction algorithms supported by numerical results.
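    The soft-thresholding step combined with the WVD reconstruction above admits a one-line definition. The following minimal sketch (the threshold value is an arbitrary example, not taken from the paper) illustrates the operator applied to a few coefficients.

```python
def soft_threshold(x, t):
    """Soft-thresholding operator S_t(x) = sign(x) * max(|x| - t, 0).

    Shrinks a (wavelet) coefficient x toward zero by the threshold t,
    setting it exactly to zero when |x| <= t.
    """
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0


# Applying the operator to a few noisy coefficients with threshold 1.0:
coeffs = [3.5, -0.4, 0.9, -2.0]
denoised = [soft_threshold(c, 1.0) for c in coeffs]
print(denoised)  # [2.5, 0.0, 0.0, -1.0]
```

Small coefficients, which are dominated by noise, are zeroed out, while large ones are kept with a bias of t; this is what makes the estimator spatially adaptive.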

  14. Group-regularized individual prediction: theory and application to pain.

    Science.gov (United States)

    Lindquist, Martin A; Krishnan, Anjali; López-Solà, Marina; Jepma, Marieke; Woo, Choong-Wan; Koban, Leonie; Roy, Mathieu; Atlas, Lauren Y; Schmidt, Liane; Chang, Luke J; Reynolds Losin, Elizabeth A; Eisenbarth, Hedwig; Ashar, Yoni K; Delk, Elizabeth; Wager, Tor D

    2017-01-15

    Multivariate pattern analysis (MVPA) has become an important tool for identifying brain representations of psychological processes and clinical outcomes using fMRI and related methods. Such methods can be used to predict or 'decode' psychological states in individual subjects. Single-subject MVPA approaches, however, are limited by the amount and quality of individual-subject data. In spite of higher spatial resolution, predictive accuracy from single-subject data often does not exceed what can be accomplished using coarser, group-level maps, because single-subject patterns are trained on limited amounts of often-noisy data. Here, we present a method that combines population-level priors, in the form of biomarker patterns developed on prior samples, with single-subject MVPA maps to improve single-subject prediction. Theoretical results and simulations motivate a weighting based on the relative variances of biomarker-based prediction (based on population-level predictive maps from prior groups) and individual-subject, cross-validated prediction. Empirical results predicting pain using brain activity on a trial-by-trial basis (single-trial prediction) across 6 studies (N=180 participants) confirm the theoretical predictions. Regularization based on a population-level biomarker, in this case the Neurologic Pain Signature (NPS), improved single-subject prediction accuracy compared with idiographic maps based on the individuals' data alone. The regularization scheme that we propose, which we term group-regularized individual prediction (GRIP), can be applied broadly to within-person MVPA-based prediction. We also show how GRIP can be used to evaluate data quality and provide benchmarks for the appropriateness of population-level maps like the NPS for a given individual or study. Copyright © 2015 Elsevier Inc. All rights reserved.
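    The variance-based weighting that motivates GRIP can be sketched as a precision-weighted average of the two predictions. The function name and the specific inverse-variance rule below are illustrative assumptions, not the authors' exact estimator.

```python
def grip_combine(group_pred, indiv_pred, var_group, var_indiv):
    """Combine a population-level (biomarker) prediction with an
    individual-subject prediction, weighting each by the inverse of
    its estimated error variance (precision weighting)."""
    w_group = (1.0 / var_group) / (1.0 / var_group + 1.0 / var_indiv)
    return w_group * group_pred + (1.0 - w_group) * indiv_pred


# When the individual map is noisier (higher variance), the combined
# prediction leans toward the population-level biomarker:
print(grip_combine(2.0, 4.0, var_group=1.0, var_indiv=3.0))
```

With equal variances the rule reduces to a plain average; as individual-subject data quality improves, the weight shifts toward the idiographic map.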

  15. Tetravalent one-regular graphs of order 4p2

    DEFF Research Database (Denmark)

    Feng, Yan-Quan; Kutnar, Klavdija; Marusic, Dragan

    2014-01-01

    A graph is one-regular if its automorphism group acts regularly on the set of its arcs. In this paper tetravalent one-regular graphs of order 4p2, where p is a prime, are classified.

  16. The Evolution of Reputation-Based Cooperation in Regular Networks

    Directory of Open Access Journals (Sweden)

    Tatsuya Sasaki

    2017-01-01

    Despite recent advances in reputation technologies, it is not clear how reputation systems can affect human cooperation in social networks. Although it is known that two of the major mechanisms in the evolution of cooperation are spatial selection and reputation-based reciprocity, theoretical study of the interplay between both mechanisms remains almost uncharted. Here, we present a new individual-based model for the evolution of reciprocal cooperation between reputation and networks. We comparatively analyze four of the leading moral assessment rules (shunning, image scoring, stern judging, and simple standing) and base the model on the giving game in regular networks for Cooperators, Defectors, and Discriminators. Discriminators rely on a proper moral assessment rule. By using individual-based models, we show that the four assessment rules are differently characterized in terms of how cooperation evolves, depending on the benefit-to-cost ratio, the network-node degree, and the observation and error conditions. Our findings show that the most tolerant rule, simple standing, is the most robust among the four assessment rules in promoting cooperation in regular networks.

  17. Spatial Theography

    OpenAIRE

    van Noppen, Jean Pierre

    1995-01-01

    Descriptive theology («theography») frequently resorts to metaphorical modes of meaning. Among these metaphors, the spatial language of localization and orientation plays an important role to delineate tentative insights into the relationship between the human and the divine. These spatial metaphors are presumably based on the universal human experience of interaction between the body and its environment. It is dangerous, however, to postulate universal agreement on meanings associated with s...

  18. Spiking Regularity and Coherence in Complex Hodgkin–Huxley Neuron Networks

    International Nuclear Information System (INIS)

    Zhi-Qiang, Sun; Ping, Xie; Wei, Li; Peng-Ye, Wang

    2010-01-01

    We study the effects of the strength of coupling between neurons on the spiking regularity and coherence in a complex network with randomly connected Hodgkin–Huxley neurons driven by colored noise. It is found that for the given topology realization and colored noise correlation time, there exists an optimal strength of coupling, at which the spiking regularity of the network reaches the best level. Moreover, when the temporal regularity reaches the best level, the spatial coherence of the system has already increased to a relatively high level. In addition, for the given number of neurons and noise correlation time, the values of average regularity and spatial coherence at the optimal strength of coupling are nearly independent of the topology realization. Furthermore, there exists an optimal value of colored noise correlation time at which the spiking regularity can reach its best level. These results may be helpful for understanding the real neuron world. (cross-disciplinary physics and related areas of science and technology)

  19. Regular Topographic Patterning of Karst Depressions Suggests Landscape Self-Organization

    Science.gov (United States)

    Quintero, C.; Cohen, M. J.

    2017-12-01

    Thousands of wetland depressions that are commonly host to cypress domes dot the sub-tropical limestone landscape of South Florida. The origin of these depression features has been the topic of debate. Here we build upon the work of previous surveyors of this landscape to analyze the morphology and spatial distribution of depressions on the Big Cypress landscape. We took advantage of the emergence and availability of high resolution Light Detection and Ranging (LiDAR) technology and ArcMap GIS software to analyze the structure and regularity of landscape features with methods unavailable to past surveyors. Six 2.25 km² LiDAR plots within the preserve were selected for remote analysis and one depression feature within each plot was selected for more intensive sediment and water depth surveying. Depression features on the Big Cypress landscape were found to show strong evidence of regular spatial patterning. Periodicity, a feature of regularly patterned landscapes, is apparent in both Variograms and Radial Spectrum Analyses. Size class distributions of the identified features indicate constrained feature sizes while Average Nearest Neighbor analyses support the inference of dispersed features with non-random spacing. The presence of regular patterning on this landscape strongly implies biotic reinforcement of spatial structure by way of a scale-dependent feedback. In characterizing the structure of this wetland landscape we add to the growing body of work dedicated to documenting how water, life and geology may interact to shape the natural landscapes we see today.

  20. Effective action for scalar fields and generalized zeta-function regularization

    International Nuclear Information System (INIS)

    Cognola, Guido; Zerbini, Sergio

    2004-01-01

    Motivated by the study of quantum fields in a Friedmann-Robertson-Walker space-time, the one-loop effective action for a scalar field defined in the ultrastatic manifold R×H³/Γ, with H³/Γ being the finite volume, noncompact, hyperbolic spatial section, is investigated by a generalization of zeta-function regularization. It is shown that additional divergences may appear at the one-loop level. The one-loop renormalizability of the model is discussed and, making use of a generalization of zeta-function regularization, the one-loop renormalization group equations are derived
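    As background for the generalization discussed above, the standard zeta-function prescription assigns the one-loop effective action a finite value through the analytic continuation of the operator zeta function. In the textbook form (sign conventions vary), with A the fluctuation operator with eigenvalues λₙ and μ the renormalization scale:

```latex
% Zeta function of the fluctuation operator A (eigenvalues \lambda_n):
\zeta_A(s) = \sum_n \lambda_n^{-s}
% One-loop effective action via the regularized determinant:
\Gamma^{(1)} = \tfrac{1}{2}\ln\det\!\left(A/\mu^2\right)
             = -\tfrac{1}{2}\left[\zeta_A'(0) + \zeta_A(0)\ln\mu^2\right]
```

The paper's generalization modifies this construction to handle the additional one-loop divergences that arise on the noncompact hyperbolic section.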

  1. Extreme values, regular variation and point processes

    CERN Document Server

    Resnick, Sidney I

    1987-01-01

    Extreme Values, Regular Variation and Point Processes is a readable and efficient account of the fundamental mathematical and stochastic process techniques needed to study the behavior of extreme values of phenomena based on independent and identically distributed random variables and vectors. It presents a coherent treatment of the distributional and sample path fundamental properties of extremes and records. It emphasizes the core primacy of three topics necessary for understanding extremes: the analytical theory of regularly varying functions; the probabilistic theory of point processes and random measures; and the link to asymptotic distribution approximations provided by the theory of weak convergence of probability measures in metric spaces. The book is self-contained and requires an introductory measure-theoretic course in probability as a prerequisite. Almost all sections have an extensive list of exercises which extend developments in the text, offer alternate approaches, test mastery and provide for enj...

  2. Stream Processing Using Grammars and Regular Expressions

    DEFF Research Database (Denmark)

    Rasmussen, Ulrik Terp

    disambiguation. The first algorithm operates in two passes in a semi-streaming fashion, using a constant amount of working memory and an auxiliary tape storage which is written in the first pass and consumed by the second. The second algorithm is a single-pass and optimally streaming algorithm which outputs...... as much of the parse tree as is semantically possible based on the input prefix read so far, and resorts to buffering as many symbols as is required to resolve the next choice. Optimality is obtained by performing a PSPACE-complete pre-analysis on the regular expression. In the second part we present...... Kleenex, a language for expressing high-performance streaming string processing programs as regular grammars with embedded semantic actions, and its compilation to streaming string transducers with worst-case linear-time performance. Its underlying theory is based on transducer decomposition into oracle...

  3. Describing chaotic attractors: Regular and perpetual points

    Science.gov (United States)

    Dudkowski, Dawid; Prasad, Awadhesh; Kapitaniak, Tomasz

    2018-03-01

    We study the concepts of regular and perpetual points for describing the behavior of chaotic attractors in dynamical systems. The idea of these points, which have been recently introduced to theoretical investigations, is thoroughly discussed and extended into new types of models. We analyze the correlation between regular and perpetual points, as well as their relation with phase space, showing the potential usefulness of both types of points in the qualitative description of co-existing states. The ability of perpetual points in finding attractors is indicated, along with its potential cause. The location of chaotic trajectories and sets of considered points is investigated and the study on the stability of systems is shown. A statistical analysis of observing the desired states is performed. We focus on various types of dynamical systems, i.e., chaotic flows with self-excited and hidden attractors, forced mechanical models, and semiconductor superlattices, exhibiting the universality of appearance of the observed patterns and relations.

  4. Chaos regularization of quantum tunneling rates

    International Nuclear Information System (INIS)

    Pecora, Louis M.; Wu Dongho; Lee, Hoshik; Antonsen, Thomas; Lee, Ming-Jer; Ott, Edward

    2011-01-01

    Quantum tunneling rates through a barrier separating two-dimensional, symmetric, double-well potentials are shown to depend on the classical dynamics of the billiard trajectories in each well and, hence, on the shape of the wells. For shapes that lead to regular (integrable) classical dynamics the tunneling rates fluctuate greatly with the eigenenergies of the states, sometimes by over two orders of magnitude. In contrast, shapes that lead to completely chaotic trajectories lead to tunneling rates whose fluctuations are greatly reduced, a phenomenon we call regularization of tunneling rates. We show that a random-plane-wave theory of tunneling accounts for the mean tunneling rates and the small fluctuation variances for the chaotic systems.

  5. Least square regularized regression in sum space.

    Science.gov (United States)

    Xu, Yong-Li; Chen, Di-Rong; Li, Han-Xiong; Liu, Lu

    2013-04-01

    This paper proposes a least square regularized regression algorithm in the sum space of reproducing kernel Hilbert spaces (RKHSs) for nonflat function approximation, and obtains the solution of the algorithm by solving a system of linear equations. This algorithm can approximate the low- and high-frequency components of the target function with large and small scale kernels, respectively. The convergence and learning rate are analyzed. We measure the complexity of the sum space by its covering number and demonstrate that the covering number can be bounded by the product of the covering numbers of the basic RKHSs. For the sum space of RKHSs with Gaussian kernels, by choosing appropriate parameters, we trade off the sample error and regularization error, and obtain a polynomial learning rate, which is better than that in any single RKHS. The utility of this method is illustrated with two simulated data sets and five real-life databases.
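    The idea of approximating a nonflat function with large- and small-scale Gaussian kernels can be sketched as a regularized least squares fit using the sum of the two Gram matrices. This toy version solves the linear system directly; the parameter choices and the target function below are invented for illustration, and it is not the paper's exact algorithm.

```python
import numpy as np

def gaussian_gram(x, sigma):
    """Gram matrix of the Gaussian kernel with scale sigma on points x."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_sum_space(x, y, sigmas, lam):
    """Regularized least squares in a sum of RKHSs: form the sum of the
    individual Gram matrices and solve (K + lam * I) alpha = y."""
    K = sum(gaussian_gram(x, s) for s in sigmas)
    alpha = np.linalg.solve(K + lam * np.eye(len(x)), y)
    return alpha, K

# A nonflat target: a smooth trend plus a sharp local bump.
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x) + np.exp(-((x - 0.5) ** 2) / 0.002)
alpha, K = fit_sum_space(x, y, sigmas=(0.3, 0.02), lam=1e-6)
residual = float(np.max(np.abs(K @ alpha - y)))
print(residual)  # small: the two scales jointly capture trend and bump
```

The wide kernel alone would oversmooth the bump and the narrow kernel alone would fit the trend poorly between samples; summing the kernels lets each frequency component be absorbed by the appropriate scale, which is the intuition behind the sum-space construction.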

  6. Thin accretion disk around regular black hole

    Directory of Open Access Journals (Sweden)

    QIU Tianqi

    2014-08-01

    Penrose's cosmic censorship conjecture says that naked singularities do not exist in nature. So, it seems reasonable to further conjecture that not even a singularity exists in nature. In this paper, a regular black hole without a singularity is studied in detail, especially its thin accretion disk, energy flux, radiation temperature and accretion efficiency. It is found that the interaction of the regular black hole is stronger than that of the Schwarzschild black hole. Furthermore, the thin accretion disk loses energy more efficiently as the mass of the black hole decreases. These particular properties may be used to distinguish between black holes.

  7. Convex nonnegative matrix factorization with manifold regularization.

    Science.gov (United States)

    Hu, Wenjun; Choi, Kup-Sze; Wang, Peiliang; Jiang, Yunliang; Wang, Shitong

    2015-03-01

    Nonnegative Matrix Factorization (NMF) has been extensively applied in many areas, including computer vision, pattern recognition, text mining, and signal processing. However, nonnegative entries are usually required for the data matrix in NMF, which limits its application. Besides, while the basis and encoding vectors obtained by NMF can represent the original data in low dimension, the representations do not always reflect the intrinsic geometric structure embedded in the data. Motivated by manifold learning and Convex NMF (CNMF), we propose a novel matrix factorization method called Graph Regularized and Convex Nonnegative Matrix Factorization (GCNMF) by introducing a graph regularized term into CNMF. The proposed matrix factorization technique not only inherits the intrinsic low-dimensional manifold structure, but also allows the processing of mixed-sign data matrix. Clustering experiments on nonnegative and mixed-sign real-world data sets are conducted to demonstrate the effectiveness of the proposed method. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. A short proof of increased parabolic regularity

    Directory of Open Access Journals (Sweden)

    Stephen Pankavich

    2015-08-01

    We present a short proof of the increased regularity obtained by solutions to uniformly parabolic partial differential equations. Though this setting is fairly introductory, our new method of proof, which uses a priori estimates and an inductive method, can be extended to prove analogous results for problems with time-dependent coefficients, advection-diffusion or reaction-diffusion equations, and nonlinear PDEs even when other tools, such as semigroup methods or the use of explicit fundamental solutions, are unavailable.

  9. Regular black hole in three dimensions

    OpenAIRE

    Myung, Yun Soo; Yoon, Myungseok

    2008-01-01

    We find a new black hole in three dimensional anti-de Sitter space by introducing an anisotropic perfect fluid inspired by the noncommutative black hole. This is a regular black hole with two horizons. We compare thermodynamics of this black hole with that of non-rotating BTZ black hole. The first-law of thermodynamics is not compatible with the Bekenstein-Hawking entropy.

  10. Sparse regularization for force identification using dictionaries

    Science.gov (United States)

    Qiao, Baijie; Zhang, Xingwu; Wang, Chenxi; Zhang, Hang; Chen, Xuefeng

    2016-04-01

    The classical function expansion method based on minimizing the l2-norm of the response residual employs various basis functions to represent the unknown force. Its difficulty lies in determining the optimum number of basis functions. Considering the sparsity of the force in the time domain or in another basis space, we develop a general sparse regularization method based on minimizing the l1-norm of the coefficient vector of basis functions. The number of basis functions is adaptively determined by minimizing the number of nonzero components in the coefficient vector during the sparse regularization process. First, according to the profile of the unknown force, the dictionary composed of basis functions is determined. Second, a sparsity convex optimization model for force identification is constructed. Third, given the transfer function and the operational response, sparse reconstruction by separable approximation (SpaRSA) is developed to solve the sparse regularization problem of force identification. Finally, experiments including identification of impact and harmonic forces are conducted on a cantilever thin plate structure to illustrate the effectiveness and applicability of SpaRSA. Besides the Dirac dictionary, three other sparse dictionaries, including Db6 wavelets, Sym4 wavelets and cubic B-spline functions, can also accurately identify both single and double impact forces from highly noisy responses in a sparse representation frame. The discrete cosine functions can also successfully reconstruct harmonic forces, including sinusoidal, square and triangular forces. Conversely, the traditional Tikhonov regularization method with the L-curve criterion fails to identify both the impact and harmonic forces in these cases.
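    The l1-regularized least squares problem that SpaRSA solves can be sketched with plain iterative soft-thresholding (ISTA); SpaRSA itself uses adaptive step sizes, and the transfer matrix, dimensions, and regularization weight below are invented toy values, not data from the paper.

```python
import numpy as np

def ista(A, b, lam, n_iter=2000):
    """Iterative soft-thresholding for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    A fixed-step variant of the sparse regularization problem described
    above; SpaRSA solves the same problem more efficiently."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - b) / L          # gradient step on the smooth term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrinkage
    return x

# Identify a sparse "impact force" through a random transfer matrix.
rng = np.random.default_rng(1)
A = rng.standard_normal((60, 100))             # transfer function (toy)
x_true = np.zeros(100)
x_true[10], x_true[40] = 2.0, -1.5             # two impact events
b = A @ x_true                                 # operational response
x_hat = ista(A, b, lam=0.05)
print(np.flatnonzero(np.abs(x_hat) > 0.5))     # indices of the identified impacts
```

The shrinkage step drives most coefficients exactly to zero, so the number of active basis functions is determined by the optimization itself rather than chosen in advance, which is the key contrast with the l2-based expansion method.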

  11. Analytic stochastic regularization and gauge theories

    International Nuclear Information System (INIS)

    Abdalla, E.; Gomes, M.; Lima-Santos, A.

    1987-04-01

    We prove that analytic stochastic regularization breaks gauge invariance. This is done by an explicit one-loop calculation of the two-, three- and four-point vertex functions of the gluon field in scalar chromodynamics, which turn out not to be gauge invariant. We analyse the counterterm structure, Langevin equations and the construction of composite operators in the general framework of stochastic quantization. (author) [pt

  12. Preconditioners for regularized saddle point matrices

    Czech Academy of Sciences Publication Activity Database

    Axelsson, Owe

    2011-01-01

    Roč. 19, č. 2 (2011), s. 91-112 ISSN 1570-2820 Institutional research plan: CEZ:AV0Z30860518 Keywords : saddle point matrices * preconditioning * regularization * eigenvalue clustering Subject RIV: BA - General Mathematics Impact factor: 0.533, year: 2011 http://www.degruyter.com/view/j/jnma.2011.19.issue-2/jnum.2011.005/jnum.2011.005.xml

  13. Analytic stochastic regularization: gauge and supersymmetry theories

    International Nuclear Information System (INIS)

    Abdalla, M.C.B.

    1988-01-01

    Analytic stochastic regularization for gauge and supersymmetric theories is considered. Gauge invariance in spinor and scalar QCD is verified to break down by an explicit one-loop computation of the two-, three- and four-point vertex functions of the gluon field. As a result, non-gauge-invariant counterterms must be added. However, in the supersymmetric multiplets there is a cancellation rendering the counterterms gauge invariant. The calculation is considered at one-loop order. (author) [pt

  14. Minimal length uncertainty relation and ultraviolet regularization

    Science.gov (United States)

    Kempf, Achim; Mangano, Gianpiero

    1997-06-01

    Studies in string theory and quantum gravity suggest the existence of a finite lower limit Δx₀ to the possible resolution of distances, at the latest on the scale of the Planck length of 10⁻³⁵ m. Within the framework of the Euclidean path integral we explicitly show ultraviolet regularization in field theory through this short distance structure. Both rotation and translation invariance can be preserved. An example is studied in detail.

  15. Regularity and chaos in cavity QED

    International Nuclear Information System (INIS)

    Bastarrachea-Magnani, Miguel Angel; López-del-Carpio, Baldemar; Chávez-Carlos, Jorge; Lerma-Hernández, Sergio; Hirsch, Jorge G

    2017-01-01

    The interaction of a quantized electromagnetic field in a cavity with a set of two-level atoms inside it can be described with algebraic Hamiltonians of increasing complexity, from the Rabi to the Dicke models. Their algebraic character allows, through the use of coherent states, a semiclassical description in phase space, where the non-integrable Dicke model has regions associated with regular and chaotic motion. The appearance of classical chaos can be quantified calculating the largest Lyapunov exponent over the whole available phase space for a given energy. In the quantum regime, employing efficient diagonalization techniques, we are able to perform a detailed quantitative study of the regular and chaotic regions, where the quantum participation ratio (PR) of coherent states on the eigenenergy basis plays a role equivalent to the Lyapunov exponent. It is noted that, in the thermodynamic limit, dividing the participation ratio by the number of atoms leads to a positive value in chaotic regions, while it tends to zero in the regular ones. (paper)

  16. Solution path for manifold regularized semisupervised classification.

    Science.gov (United States)

    Wang, Gang; Wang, Fei; Chen, Tao; Yeung, Dit-Yan; Lochovsky, Frederick H

    2012-04-01

    Traditional learning algorithms use only labeled data for training. However, labeled examples are often difficult or time consuming to obtain since they require substantial human labeling efforts. On the other hand, unlabeled data are often relatively easy to collect. Semisupervised learning addresses this problem by using large quantities of unlabeled data with labeled data to build better learning algorithms. In this paper, we use the manifold regularization approach to formulate the semisupervised learning problem where a regularization framework which balances a tradeoff between loss and penalty is established. We investigate different implementations of the loss function and identify the methods which have the least computational expense. The regularization hyperparameter, which determines the balance between loss and penalty, is crucial to model selection. Accordingly, we derive an algorithm that can fit the entire path of solutions for every value of the hyperparameter. Its computational complexity after preprocessing is quadratic only in the number of labeled examples rather than the total number of labeled and unlabeled examples.

  17. Regularizations: different recipes for identical situations

    International Nuclear Information System (INIS)

    Gambin, E.; Lobo, C.O.; Battistel, O.A.

    2004-03-01

    We present a discussion where the choice of the regularization procedure and the routing for the internal lines momenta are put at the same level of arbitrariness in the analysis of Ward identities involving simple and well-known problems in QFT. They are the complex self-interacting scalar field and two simple models where the SVV and AVV processes are pertinent. We show that, in all these problems, the conditions for the preservation of symmetry relations are put in terms of the same combination of divergent Feynman integrals, which are evaluated in the context of a very general calculational strategy, concerning the manipulations and calculations involving divergences. Within the adopted strategy, all the arbitrariness intrinsic to the problem are still maintained in the final results and, consequently, a perfect map can be obtained with the corresponding results of the traditional regularization techniques. We show that, when we require a universal interpretation for the arbitrariness involved, in order to get consistency with all stated physical constraints, a strong condition is imposed for regularizations which automatically eliminates the ambiguities associated to the routing of the internal lines momenta of loops. The conclusion is clean and sound: the association between ambiguities and unavoidable symmetry violations in Ward identities cannot be maintained if a unique recipe is required for identical situations in the evaluation of divergent physical amplitudes. (author)

  18. Beamforming Through Regularized Inverse Problems in Ultrasound Medical Imaging.

    Science.gov (United States)

    Szasz, Teodora; Basarab, Adrian; Kouame, Denis

    2016-12-01

    Beamforming (BF) in ultrasound (US) imaging has a significant impact on the quality of the final image, controlling its resolution and contrast. Despite its low spatial resolution and contrast, delay-and-sum (DAS) is still used extensively in clinical applications owing to its real-time capabilities. The most common alternatives are the minimum variance (MV) method and its variants, which overcome the drawbacks of DAS at the cost of a higher computational complexity that limits their use in real-time applications. In this paper, we propose to perform BF in US imaging through a regularized inverse problem based on a linear model relating the reflected echoes to the signal to be recovered. Our approach presents two major advantages: 1) flexibility in the choice of statistical assumptions on the signal to be beamformed (Laplacian and Gaussian statistics are tested herein) and 2) robustness to a reduced number of pulse emissions. The proposed framework allows the right tradeoff to be chosen between noise suppression and sharpness of the resulting image. We illustrate the performance of our approach on both simulated and experimental data, with in vivo examples of carotid and thyroid. Compared with DAS, MV, and two other recently published BF techniques, our method offers better spatial resolution when using the Laplacian prior and better contrast when using the Gaussian prior.
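The inverse-problem view of beamforming can be sketched on a toy linear model. The matrix H below is a random stand-in for the true propagation/delay operator (an assumption for illustration); a Gaussian prior on the signal gives the closed-form Tikhonov estimate, while a Laplacian prior would lead to an l1 problem solved iteratively instead.

```python
import numpy as np

# Minimal sketch of beamforming as a regularized inverse problem:
# the recorded channel data y are modeled as y = H x + n, where x is the
# signal to recover. A Gaussian prior on x yields the Tikhonov (l2)
# estimate in closed form.
rng = np.random.default_rng(1)
m, n = 120, 80
H = rng.normal(size=(m, n)) / np.sqrt(m)   # stand-in for the echo model
x_true = rng.normal(size=n)
y = H @ x_true + 0.05 * rng.normal(size=m)

lam = 0.1
# Tikhonov estimate: argmin ||H x - y||^2 + lam * ||x||^2
x_hat = np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ y)
```

The point of the formulation is that swapping the prior changes only the penalty term, not the data model, which is the flexibility the abstract emphasizes.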

  19. Mixed Total Variation and L1 Regularization Method for Optical Tomography Based on Radiative Transfer Equation

    Directory of Open Access Journals (Sweden)

    Jinping Tang

    2017-01-01

    Optical tomography is an emerging and important molecular imaging modality whose aim is to reconstruct the optical properties of human tissues. In this paper, we focus on reconstructing the absorption coefficient based on the radiative transfer equation (RTE), an ill-posed parameter identification problem. Regularization methods such as total variation (TV) and L1 regularization have been broadly applied to reconstruct the optical coefficients. In order to better reconstruct piecewise constant and sparse coefficient distributions, the TV and L1 norms are combined as the regularization term. The forward problem is discretized with the discontinuous Galerkin method in the spatial variable and the finite element method in the angular variable. The minimization problem is solved by a Jacobian-based Levenberg-Marquardt type method equipped with a split Bregman algorithm for the L1 regularization. We use the adjoint method to compute the Jacobian matrix, which dramatically improves the computational efficiency. Comparison with other imaging reconstruction methods based on TV and L1 regularizations shows the validity and efficiency of the proposed method.

  20. Intrinsic spatial resolution limitations due to differences between positron emission position and annihilation detection localization; Limitacoes da resolucao espacial intrinseca devido as diferencas entre a posicao da emissao do positron e a deteccao da localizacao de aniquilacao

    Energy Technology Data Exchange (ETDEWEB)

    Perez, Pedro; Malano, Francisco; Valente, Mauro, E-mail: valente@famaf.unc.edu.ar [Universidad Nacional de Cordoba, Cordoba (Argentina). Fac. de Matematica, Astronomia y Fisica (FaMAF)

    2012-07-01

    Since its successful introduction into clinical diagnostics, positron emission tomography (PET) has been among the most promising medical imaging techniques. The recent major growth of PET imaging is mainly due to its ability to trace the biologic pathways of different compounds in the patient's body, provided the compound can be labeled with a PET isotope. Regardless of the isotope, the PET imaging method is based on the detection of two 511-keV gamma photons emitted in almost opposite directions (nearly 180 deg apart) as a consequence of electron-positron annihilation. The method is therefore intrinsically limited by random uncertainties in spatial resolution, related to the difference between the actual position of positron emission and the location of the detected annihilation. This study presents a Monte Carlo approach to analyze the influence of this effect for different isotopes of potential use in PET. (author)

  1. Sparsity regularization for parameter identification problems

    International Nuclear Information System (INIS)

    Jin, Bangti; Maass, Peter

    2012-01-01

    The investigation of regularization schemes with sparsity promoting penalty terms has been one of the dominant topics in the field of inverse problems over the last years, and Tikhonov functionals with ℓ p -penalty terms for 1 ⩽ p ⩽ 2 have been studied extensively. The first investigations focused on regularization properties of the minimizers of such functionals with linear operators and on iteration schemes for approximating the minimizers. These results were quickly transferred to nonlinear operator equations, including nonsmooth operators and more general function space settings. The latest results on regularization properties additionally assume a sparse representation of the true solution as well as generalized source conditions, which yield some surprising and optimal convergence rates. The regularization theory with ℓ p sparsity constraints is relatively complete in this setting; see the first part of this review. In contrast, the development of efficient numerical schemes for approximating minimizers of Tikhonov functionals with sparsity constraints for nonlinear operators is still ongoing. The basic iterated soft shrinkage approach has been extended in several directions and semi-smooth Newton methods are becoming applicable in this field. In particular, the extension to more general non-convex, non-differentiable functionals by variational principles leads to a variety of generalized iteration schemes. We focus on such iteration schemes in the second part of this review. A major part of this survey is devoted to applying sparsity constrained regularization techniques to parameter identification problems for partial differential equations, which we regard as the prototypical setting for nonlinear inverse problems. Parameter identification problems exhibit different levels of complexity and we aim at characterizing a hierarchy of such problems. The operator defining these inverse problems is the parameter-to-state mapping. We first summarize some
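The basic iterated soft shrinkage approach mentioned above can be sketched for a linear operator. The toy sparse-recovery problem below (matrix size, sparsity level, penalty weight) is an illustrative assumption, not an example from the review.

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding, the proximal map of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter=1000):
    """Iterated soft shrinkage for min_x 0.5*||A x - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2     # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft(x - A.T @ (A @ x - b) / L, lam / L)
    return x

# Sparse recovery toy problem: 3 active coefficients, 40 measurements.
rng = np.random.default_rng(2)
A = rng.normal(size=(40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [2.0, -3.0, 1.5]
b = A @ x_true
x_hat = ista(A, b, lam=0.05)
```

The shrinkage step produces exact zeros, which is why the minimizers of such Tikhonov functionals with l1 penalties are sparse; the semi-smooth Newton and generalized schemes in the review accelerate this basic iteration.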

  2. Complex sparse spatial filter for decoding mixed frequency and phase coded steady-state visually evoked potentials.

    Science.gov (United States)

    Morikawa, Naoki; Tanaka, Toshihisa; Islam, Md Rabiul

    2018-07-01

    Mixed frequency and phase coding (FPC) can significantly increase the number of commands in a steady-state visual evoked potential-based brain-computer interface (SSVEP-BCI). However, inconsistent SSVEP phases across channels within a trial, and the presence of non-contributing channels due to noise, can degrade accurate detection of the stimulus frequency. We propose a novel command detection method based on a complex sparse spatial filter (CSSF), obtained by solving ℓ1- and ℓ2,1-regularization problems, for a mixed-coded SSVEP-BCI. In particular, ℓ2,1-regularization (also known as group sparsification) can reject electrodes that do not contribute to SSVEP detection. A calibration-data-based canonical correlation analysis (CCA) and the CSSF with ℓ1- and ℓ2,1-regularization were demonstrated on 16-target stimuli with eleven subjects. Statistical testing suggests that the proposed method with ℓ1- and ℓ2,1-regularization achieved the highest ITR. The proposed approaches need no reference signals, automatically select prominent channels, and reduce the computational cost compared with other mixed frequency-phase coding (FPC)-based BCIs. The experimental results suggest that the proposed method can be used to implement an effective BCI with reduced visual fatigue. Copyright © 2018 Elsevier B.V. All rights reserved.
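Why an ℓ2,1 penalty rejects whole channels can be seen from its proximal operator, sketched below on a hypothetical two-channel filter matrix (the matrix values are illustrative, not from the paper).

```python
import numpy as np

# The l_{2,1} (group) penalty acts on each row (channel) of the
# spatial-filter matrix through its l2 norm: the proximal operator
# shrinks every row and zeroes rows whose norm is below the threshold,
# which is exactly the channel-rejection behavior described above.
def prox_l21(W, t):
    """Row-wise group soft-thresholding: prox of t * sum_i ||W[i, :]||_2."""
    out = np.zeros_like(W)
    for i, row in enumerate(W):
        nrm = np.linalg.norm(row)
        if nrm > t:
            out[i] = (1.0 - t / nrm) * row   # shrink the row toward zero
    return out

W = np.array([[3.0, 4.0],     # strong channel, norm 5
              [0.1, 0.1]])    # weak channel,  norm ~0.14
W_sparse = prox_l21(W, t=1.0)
```

The weak channel is removed entirely while the strong channel is only shrunk, so channel selection falls out of the optimization automatically.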

  3. Persistence, Spatial Distribution and Implications for Progression Detection of Blind Parts of the Visual Field in Glaucoma : A Clinical Cohort Study

    NARCIS (Netherlands)

    Montolio, Francisco G. Junoy; Wesselink, Christiaan; Jansonius, Nomdo M.

    2012-01-01

    Background: Visual field testing is an essential part of glaucoma care. It is hampered by variability related to the disease itself, response errors and fatigue. In glaucoma, blind parts of the visual field contribute to the diagnosis but - once established - not to progression detection; they only

  4. Learning Sparse Visual Representations with Leaky Capped Norm Regularizers

    OpenAIRE

    Wangni, Jianqiao; Lin, Dahua

    2017-01-01

    Sparsity inducing regularization is an important part of learning over-complete visual representations. Despite the popularity of $\ell_1$ regularization, in this paper we investigate the usage of non-convex regularizations in this problem. Our contribution consists of three parts. First, we propose the leaky capped norm regularization (LCNR), which allows model weights below a certain threshold to be regularized more strongly than those above, and therefore imposes strong sparsity and...

  5. Temporal regularity of the environment drives time perception

    OpenAIRE

    van Rijn, H; Rhodes, D; Di Luca, M

    2016-01-01

    It’s reasonable to assume that a regularly paced sequence should be perceived as regular, but here we show that perceived regularity depends on the context in which the sequence is embedded. We presented one group of participants with perceptually regularly paced sequences, and another group of participants with mostly irregularly paced sequences (75% irregular, 25% regular). The timing of the final stimulus in each sequence could be varied. In one experiment, we asked whether the last stim...

  6. Regular-, irregular-, and pseudo-character processing in Chinese: The regularity effect in normal adult readers

    Directory of Open Access Journals (Sweden)

    Dustin Kai Yan Lau

    2014-03-01

    Background Unlike alphabetic languages, Chinese uses a logographic script. However, the phonetic radical of many characters has the same pronunciation as the character as a whole. These are considered regular characters and can be read through a lexical non-semantic route (Weekes & Chen, 1999). Pseudocharacters are another way to study this non-semantic route. A pseudocharacter is the combination of existing semantic and phonetic radicals in their legal positions resulting in a non-existing character (Ho, Chan, Chung, Lee, & Tsang, 2007). Pseudocharacters can be pronounced by direct derivation from the sound of the phonetic radical. Conversely, if the pronunciation of a character does not follow that of its phonetic radical, it is considered irregular and can only be read correctly through the lexical-semantic route. The aim of the current investigation was to examine reading aloud in normal adults. We hypothesized that the regularity effect, previously described for alphabetical scripts and acquired dyslexic patients of Chinese (Weekes & Chen, 1999; Wu, Liu, Sun, Chromik, & Zhang, 2014), would also be present in normal adult Chinese readers. Method Participants. Thirty (50% female) native Hong Kong Cantonese speakers with a mean age of 19.6 years and a mean education of 12.9 years. Stimuli. Sixty regular-, 60 irregular-, and 60 pseudo-characters (with at least 75% name agreement) in Chinese were matched by initial phoneme, number of strokes and family size. Additionally, regular- and irregular-characters were matched by frequency (low) and consistency. Procedure. Each participant was asked to read aloud the stimuli presented on a laptop using the DMDX software. The order of stimuli presentation was randomized. Data analysis. ANOVAs were carried out by participants and items with RTs and errors as dependent variables and type of stimuli (regular-, irregular- and pseudo-character) as repeated measures (F1) or between-subject...

  7. Spatial networks

    Science.gov (United States)

    Barthélemy, Marc

    2011-02-01

    Complex systems are very often organized in the form of networks where nodes and edges are embedded in space. Transportation and mobility networks, the Internet, mobile phone networks, power grids, social and contact networks, and neural networks are all examples where space is relevant and where topology alone does not contain all the information. Characterizing and understanding the structure and evolution of spatial networks is thus crucial for many different fields, ranging from urbanism to epidemiology. An important consequence of space is that there is a cost associated with the length of edges, which in turn has dramatic effects on the topological structure of these networks. We thoroughly explain the current state of our understanding of how spatial constraints affect the structure and properties of these networks. We review the most recent empirical observations and the most important models of spatial networks, and discuss various processes which take place on these spatial networks, such as phase transitions, random walks, synchronization, navigation, resilience, and disease spread.
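The simplest model in which the "cost of edge length" appears is the random geometric graph, sketched below with illustrative parameters (node count and radius are assumptions, not values from the review).

```python
import numpy as np

# Random geometric graph: nodes are random points in the unit square and
# an edge exists only when two nodes are closer than a radius r. Long
# edges are impossible by construction, the crudest form of a cost on
# edge length shaping the topology.
rng = np.random.default_rng(3)
n, r = 200, 0.15
pts = rng.random((n, 2))                        # node positions
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
adj = (d < r) & ~np.eye(n, dtype=bool)          # connect nearby pairs only
degrees = adj.sum(axis=1)
edge_lengths = d[np.triu(adj)]                  # lengths of realized edges
```

Unlike in a purely topological random graph, every structural statistic here (degree, clustering, path lengths) inherits the geometry through the radius constraint.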

  8. Spatial interpolation

    NARCIS (Netherlands)

    Stein, A.

    1991-01-01

    The theory and practical application of techniques of statistical interpolation are studied in this thesis, and new developments in multivariate spatial interpolation and the design of sampling plans are discussed. Several applications to studies in soil science are

  9. Particle detector spatial resolution

    International Nuclear Information System (INIS)

    Perez-Mendez, V.

    1992-01-01

    Method and apparatus for producing separated columns of scintillation layer material, for use in detection of X-rays and high energy charged particles with improved spatial resolution is disclosed. A pattern of ridges or projections is formed on one surface of a substrate layer or in a thin polyimide layer, and the scintillation layer is grown at controlled temperature and growth rate on the ridge-containing material. The scintillation material preferentially forms cylinders or columns, separated by gaps conforming to the pattern of ridges, and these columns direct most of the light produced in the scintillation layer along individual columns for subsequent detection in a photodiode layer. The gaps may be filled with a light-absorbing material to further enhance the spatial resolution of the particle detector. 12 figs

  10. A multiresolution method for solving the Poisson equation using high order regularization

    DEFF Research Database (Denmark)

    Hejlesen, Mads Mølholm; Walther, Jens Honore

    2016-01-01

    We present a novel high order multiresolution Poisson solver based on regularized Green's function solutions to obtain exact free-space boundary conditions while using fast Fourier transforms for computational efficiency. Multiresolution is achieved through local refinement patches and regularized Green's functions corresponding to the difference in the spatial resolution between the patches. The full solution is obtained utilizing the linearity of the Poisson equation, enabling superposition of solutions. We show that the multiresolution Poisson solver produces convergence rates...
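The FFT machinery underlying such solvers can be shown in a minimal periodic setting. Note this sketch is periodic, not the paper's free-space regularized Green's function method; the grid size and test function are illustrative choices.

```python
import numpy as np

# Minimal spectral Poisson solver on a periodic box:
# solve laplacian(u) = f with f = sin(x) sin(y) on [0, 2*pi)^2,
# whose exact solution is u = -sin(x) sin(y) / 2.
N = 64
x = 2 * np.pi * np.arange(N) / N
X, Y = np.meshgrid(x, x, indexing="ij")
f = np.sin(X) * np.sin(Y)

k = np.fft.fftfreq(N, d=1.0 / N)        # integer wavenumbers
KX, KY = np.meshgrid(k, k, indexing="ij")
k2 = KX**2 + KY**2
k2[0, 0] = 1.0                           # avoid division by zero
u_hat = -np.fft.fft2(f) / k2             # laplacian eigenvalue is -(k^2)
u_hat[0, 0] = 0.0                        # fix the mean to zero
u = np.real(np.fft.ifft2(u_hat))

u_exact = -f / 2
```

For smooth data the error is at machine precision, which is the spectral accuracy that makes FFT-based Poisson solvers attractive as building blocks.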

  11. Efficient L1 regularization-based reconstruction for fluorescent molecular tomography using restarted nonlinear conjugate gradient.

    Science.gov (United States)

    Shi, Junwei; Zhang, Bin; Liu, Fei; Luo, Jianwen; Bai, Jing

    2013-09-15

    For the ill-posed fluorescent molecular tomography (FMT) inverse problem, the L1 regularization can protect the high-frequency information like edges while effectively reduce the image noise. However, the state-of-the-art L1 regularization-based algorithms for FMT reconstruction are expensive in memory, especially for large-scale problems. An efficient L1 regularization-based reconstruction algorithm based on nonlinear conjugate gradient with restarted strategy is proposed to increase the computational speed with low memory consumption. The reconstruction results from phantom experiments demonstrate that the proposed algorithm can obtain high spatial resolution and high signal-to-noise ratio, as well as high localization accuracy for fluorescence targets.
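The "restarted" ingredient can be illustrated generically: a Fletcher-Reeves nonlinear conjugate gradient that resets its direction to steepest descent every few iterations, keeping memory use at a handful of vectors. The quadratic test problem below is an assumption for illustration; the paper applies the idea to the (nonsmooth, L1-regularized) FMT objective.

```python
import numpy as np

def cg_fr_restart(A, b, x0, n_iter=100, restart=10):
    """Fletcher-Reeves CG with periodic restarts, minimizing the quadratic
    f(x) = 0.5 x^T A x - b^T x (gradient A x - b) with exact line search."""
    x = x0.copy()
    g = A @ x - b
    d = -g
    for it in range(1, n_iter + 1):
        alpha = -(g @ d) / (d @ A @ d)        # exact step for a quadratic
        x = x + alpha * d
        g_new = A @ x - b
        if np.linalg.norm(g_new) < 1e-12:
            return x
        if it % restart == 0:
            d = -g_new                         # restart: steepest descent
        else:
            beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
            d = -g_new + beta * d
        g = g_new
    return x

rng = np.random.default_rng(4)
M = rng.normal(size=(20, 20))
A = M.T @ M / 20 + np.eye(20)                  # symmetric positive definite
b = rng.normal(size=20)
x_hat = cg_fr_restart(A, b, np.zeros(20))
```

Only the current iterate, gradient, and direction are stored, which is the low-memory property the abstract highlights relative to other L1 reconstruction algorithms.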

  12. Convergence and fluctuations of Regularized Tyler estimators

    KAUST Repository

    Kammoun, Abla; Couillet, Romain; Pascal, Frederic; Alouini, Mohamed-Slim

    2015-01-01

    This article studies the behavior of regularized Tyler estimators (RTEs) of scatter matrices. The key advantages of these estimators are twofold: they guarantee by construction a good conditioning of the estimate and, being derived from robust Tyler estimators, they inherit their robustness properties, notably their resilience to the presence of outliers. Nevertheless, one major problem posed by the use of RTEs in practice is the setting of the regularization parameter p. While a high value of p is likely to push all the eigenvalues away from zero, it comes at the cost of a larger bias with respect to the population covariance matrix. A deep understanding of the statistics of RTEs is essential for making appropriate choices of the regularization parameter. This is not an easy task and might be out of reach, unless one considers asymptotic regimes wherein the number of observations n and/or their size N increase together. First asymptotic results have recently been obtained under the assumption that N and n are large and commensurable. Interestingly, no results exist concerning the regime of n going to infinity with N fixed, even though the investigation of this assumption has usually predated the analysis of the more difficult case of N and n both large. This motivates our work. In particular, we prove in the present paper that the RTEs converge to a deterministic matrix when n → ∞ with N fixed, which is expressed as a function of the theoretical covariance matrix. We also derive the fluctuations of the RTEs around this deterministic matrix and establish that these fluctuations converge in distribution to a multivariate Gaussian distribution with zero mean and a covariance depending on the population covariance and the regularization parameter.
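A common way to compute an RTE is a fixed-point iteration; the sketch below uses the standard form from the robust estimation literature (the data model, dimensions, and the value of the regularization parameter are illustrative assumptions).

```python
import numpy as np

# Fixed-point iteration for a regularized Tyler estimator (RTE):
#   C <- (1 - rho) * (N / n) * sum_i x_i x_i^T / (x_i^T C^{-1} x_i) + rho * I
# The rho * I term keeps the estimate well conditioned even for few
# samples; rho in (0, 1] is the regularization parameter.
def rte(X, rho, n_iter=100):
    n, N = X.shape                              # n samples of dimension N
    C = np.eye(N)
    for _ in range(n_iter):
        Cinv = np.linalg.inv(C)
        q = np.einsum("ij,jk,ik->i", X, Cinv, X)   # x_i^T C^{-1} x_i
        S = (X / q[:, None]).T @ X / n             # weighted scatter
        C = (1 - rho) * N * S + rho * np.eye(N)
    return C

rng = np.random.default_rng(5)
N, n = 5, 200
A = rng.normal(size=(N, N))
Sigma = A @ A.T + np.eye(N)                     # population covariance
X = rng.normal(size=(n, N)) @ np.linalg.cholesky(Sigma).T
C = rte(X, rho=0.1)
```

By construction the estimate satisfies C ⪰ rho * I, which is the guaranteed conditioning the abstract refers to.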

  14. Outlier detection using autoencoders

    CERN Document Server

    Lyudchik, Olga

    2016-01-01

    Outlier detection is a crucial part of many data analysis applications. Its goal is to separate a core of regular observations from polluting ones, called “outliers”. We propose an outlier detection method using a deep autoencoder. The method was applied to detect outlier points in the MNIST dataset of handwritten digits. The experimental results show that the proposed method has potential for use in anomaly detection.
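The core recipe (reconstruct the regular data, flag points with large reconstruction error) can be sketched without a deep network. Below, a rank-1 PCA projection plays the role of a linear autoencoder; the dataset, threshold rule, and dimensions are illustrative assumptions, not the paper's MNIST setup.

```python
import numpy as np

# Reconstruction-error outlier detection with a linear "autoencoder":
# project onto the top principal component (encode) and back (decode),
# then flag points whose reconstruction error is far above the regular core.
rng = np.random.default_rng(6)
t = rng.normal(size=(100, 1))
inliers = t @ np.array([[1.0, 1.0]]) + 0.01 * rng.normal(size=(100, 2))
outliers = np.array([[4.0, -4.0], [-3.0, 3.5]])   # far from the subspace
X = np.vstack([inliers, outliers])

mu = inliers.mean(axis=0)
_, _, Vt = np.linalg.svd(inliers - mu, full_matrices=False)
code = (X - mu) @ Vt[:1].T           # encode: 2D -> 1D
recon = code @ Vt[:1] + mu           # decode: 1D -> 2D
errors = np.linalg.norm(X - recon, axis=1)

threshold = errors[:100].max() * 3   # calibrated on the regular core
flagged = np.where(errors > threshold)[0]
```

A deep autoencoder generalizes this by learning a nonlinear manifold instead of a linear subspace, but the detection criterion is the same reconstruction error.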

  15. The use of regularization in inferential measurements

    International Nuclear Information System (INIS)

    Hines, J. Wesley; Gribok, Andrei V.; Attieh, Ibrahim; Uhrig, Robert E.

    1999-01-01

    Inferential sensing is the prediction of a plant variable through the use of correlated plant variables. A correct prediction of the variable can be used to monitor sensors for drift or other failures, making periodic instrument calibrations unnecessary. This move from periodic to condition-based maintenance can reduce costs and increase the reliability of the instrument. Having accurate, reliable measurements is important for signals that may impact safety or profitability. This paper investigates how collinearity adversely affects inferential sensing by making the results inconsistent and unrepeatable, and presents regularization as a potential solution. (author)
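The collinearity problem and the regularized fix can be sketched with two nearly identical input channels (the data and penalty weight below are illustrative assumptions, not plant data).

```python
import numpy as np

# Why regularization helps inferential sensing under collinearity: with
# two nearly identical sensor channels, ordinary least squares is ill
# conditioned and its weights are unstable, while a small ridge penalty
# restores a stable, repeatable solution.
rng = np.random.default_rng(7)
x1 = rng.normal(size=200)
x2 = x1 + 1e-4 * rng.normal(size=200)      # nearly collinear channel
X = np.column_stack([x1, x2])
y = x1 + 0.1 * rng.normal(size=200)        # plant variable to infer

gram = X.T @ X
lam = 1.0
w_ols = np.linalg.solve(gram, X.T @ y)
w_ridge = np.linalg.solve(gram + lam * np.eye(2), X.T @ y)

cond_ols = np.linalg.cond(gram)
cond_ridge = np.linalg.cond(gram + lam * np.eye(2))
```

Adding lam to every eigenvalue of the Gram matrix provably lowers its condition number, so the regularized weights change little when the data are resampled, which is exactly the repeatability inferential sensing needs.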

  16. Regularization ambiguities in loop quantum gravity

    International Nuclear Information System (INIS)

    Perez, Alejandro

    2006-01-01

    One of the main achievements of loop quantum gravity is the consistent quantization of the analog of the Wheeler-DeWitt equation, which is free of ultraviolet divergences. However, ambiguities associated with the intermediate regularization procedure lead to an apparently infinite set of possible theories. The absence of a UV problem (the existence of well-behaved regularizations of the constraints) is intimately linked with the ambiguities arising in the quantum theory. Among these is the ambiguity associated with the SU(2) unitary representation used in the diffeomorphism covariant 'point-splitting' regularization of the nonlinear functionals of the connection. This ambiguity is labeled by a half-integer m and is referred to here as the m ambiguity. The aim of this paper is to investigate its important implications. We first study 2+1 gravity (and more generally BF theory) quantized in the canonical formulation of loop quantum gravity. Only when the regularization of the quantum constraints is performed in terms of the fundamental representation of the gauge group does one obtain the usual topological quantum field theory as a result. In all other cases unphysical local degrees of freedom arise at the level of the regulated theory that conspire against the existence of the continuum limit. This shows that there is a clear-cut choice in the quantization of the constraints in 2+1 loop quantum gravity. We then analyze the effects of the ambiguity in 3+1 gravity, exhibiting the existence of spurious solutions for higher representation quantizations of the Hamiltonian constraint. Although the analysis is not complete in 3+1 dimensions, due to the difficulties associated with the definition of the physical inner product, it provides evidence supporting the definition of the quantum dynamics of loop quantum gravity in terms of the fundamental representation of the gauge group as the only consistent possibility. If the gauge group is SO(3) we find...

  17. New regularities in mass spectra of hadrons

    International Nuclear Information System (INIS)

    Kajdalov, A.B.

    1989-01-01

    The properties of bosonic and baryonic Regge trajectories for hadrons composed of light quarks are considered. Experimental data agree with the existence of daughter trajectories consistent with string models. It is pointed out that the parity doubling observed experimentally for baryonic trajectories is not understood in the existing quark models. The mass spectrum of bosons and baryons indicates an approximate supersymmetry in the mass region M > 1 GeV. These regularities point to a high degree of symmetry in the dynamics of the confinement region. 8 refs.; 5 figs

  18. Total-variation regularization with bound constraints

    International Nuclear Information System (INIS)

    Chartrand, Rick; Wohlberg, Brendt

    2009-01-01

    We present a new algorithm for bound-constrained total-variation (TV) regularization that in comparison with its predecessors is simple, fast, and flexible. We use a splitting approach to decouple TV minimization from enforcing the constraints. Consequently, existing TV solvers can be employed with minimal alteration. This also makes the approach straightforward to generalize to any situation where TV can be applied. We consider deblurring of images with Gaussian or salt-and-pepper noise, as well as Abel inversion of radiographs with Poisson noise. We incorporate previous iterative reweighting algorithms to solve the TV portion.
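The splitting idea can be shown in one dimension: introduce an auxiliary variable for the finite differences so that the TV term decouples from the quadratic, then alternate a soft-threshold update with a linear solve. This is a minimal quadratic-splitting sketch for denoising only (the signal, penalty weight, and coupling parameter are illustrative assumptions); deblurring and Abel inversion add an extra linear operator.

```python
import numpy as np

# Splitting for 1D TV denoising:
#   min_u 0.5*||u - f||^2 + lam*TV(u),  TV(u) = sum_i |u[i+1] - u[i]|.
# With d ~ D u, alternate: d = soft(D u, lam/mu) and
# (I + mu D^T D) u = f + mu D^T d. The TV part is handled by an existing
# l1 shrinkage step, exactly the decoupling described above.
def tv_denoise_1d(f, lam, mu=1.0, n_iter=200):
    n = len(f)
    D = np.eye(n, k=1)[: n - 1] - np.eye(n)[: n - 1]   # difference matrix
    M = np.eye(n) + mu * D.T @ D                       # u-update system
    u = f.copy()
    for _ in range(n_iter):
        Du = D @ u
        d = np.sign(Du) * np.maximum(np.abs(Du) - lam / mu, 0.0)
        u = np.linalg.solve(M, f + mu * D.T @ d)
    return u

rng = np.random.default_rng(8)
clean = np.concatenate([np.zeros(40), np.ones(40), 0.3 * np.ones(40)])
noisy = clean + 0.1 * rng.normal(size=120)
denoised = tv_denoise_1d(noisy, lam=0.2)
```

Because the d-update is a plain soft-threshold and the u-update is a fixed linear system, any existing solver for either piece can be reused with minimal alteration, which is the flexibility the abstract claims.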

  19. Bayesian regularization of diffusion tensor images

    DEFF Research Database (Denmark)

    Frandsen, Jesper; Hobolth, Asger; Østergaard, Leif

    2007-01-01

    Diffusion tensor imaging (DTI) is a powerful tool in the study of the course of nerve fibre bundles in the human brain. Using DTI, the local fibre orientation in each image voxel can be described by a diffusion tensor which is constructed from local measurements of diffusion coefficients along several directions. The measured diffusion coefficients, and thereby the diffusion tensors, are subject to noise, leading to possibly flawed representations of the three-dimensional fibre bundles. In this paper we develop a Bayesian procedure for regularizing the diffusion tensor field, fully utilizing...

  20. Indefinite metric and regularization of electrodynamics

    International Nuclear Information System (INIS)

    Gaudin, M.

    1984-06-01

    The invariant regularization of Pauli and Villars in quantum electrodynamics can be considered as deriving from a local and causal Lagrangian theory for spin-1/2 bosons, by introducing an indefinite metric and a condition on the allowed states similar to the Lorentz condition. The consequence is the asymptotic freedom of the photon's propagator. We present a calculation of the effective charge to fourth order in the coupling as a function of the auxiliary masses, the theory avoiding all mass divergences to this order.