WorldWideScience

Sample records for verifying temporal regular

  1. Split Dimensional Regularization for the Temporal Gauge

    OpenAIRE

    Chen, Yaw-Hwang; Hsieh, Ron-Jou; Lin, Chilong

    1996-01-01

A split dimensional regularization, which was introduced for the Coulomb gauge by Leibbrandt and Williams, is used to regularize the spurious singularities of Yang-Mills theory in the temporal gauge. Typical one-loop split dimensionally regularized temporal gauge integrals, and hence the renormalization structure of the theory, are shown to be the same as those calculated with some nonprincipal-value prescriptions.

  2. Temporal regularity in speech perception: Is regularity beneficial or deleterious?

    Science.gov (United States)

    Geiser, Eveline; Shattuck-Hufnagel, Stefanie

    2012-04-01

Speech rhythm has been proposed to be of crucial importance for correct speech perception and language learning. This study investigated the influence of speech rhythm in second language processing. German pseudo-sentences were presented to participants in two conditions: a 'naturally regular speech rhythm' and an 'emphasized regular rhythm'. Nine expert English speakers with 3.5±1.6 years of German training repeated each sentence after hearing it once over headphones. Responses were transcribed using the International Phonetic Alphabet and analyzed for the number of correct, false and missing consonants as well as for consonant additions. The overall number of correct reproductions of consonants did not differ between the two experimental conditions. However, speech rhythmicization significantly affected the serial position curve of correctly reproduced syllables. The results of this pilot study are consistent with the view that speech rhythm is important for speech perception.

  3. Partial Correlation between Spatial and Temporal Regularities of Human Mobility.

    Science.gov (United States)

    Geng, Wei; Yang, Guang

    2017-07-24

    The regularity of human mobility has been extensively studied because of its prominent applications in a considerable number of important areas. Entropy, in addition to many other measures, has long been used to quantify the regularity of human mobility. We adopt the commonly used spatial entropy and develop an analogical temporal entropy to separately investigate the spatial and temporal regularities of human mobility. The underlying data are from an automated transit fare collection system operated by a metropolitan public transit authority in China. The distributions of both spatial and temporal entropies and their dependences on several widely used statistics are examined. The spatial and temporal entropies present a statistically significant correlation, which has not previously been reported to the best of our knowledge.
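
    A minimal sketch of the entropy computation this record describes, assuming hypothetical transit records of (station, hour-of-day) pairs per trip: the Shannon entropy over visited stations stands in for spatial regularity, and the entropy over tap-in hours for temporal regularity. Names and data are illustrative only.

    ```python
    import numpy as np
    from collections import Counter

    def shannon_entropy(labels):
        """Shannon entropy (bits) of a sequence of discrete labels."""
        counts = np.array(list(Counter(labels).values()), dtype=float)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    # Hypothetical transit records: one (station, hour-of-day) pair per trip.
    trips = [("A", 8), ("B", 18), ("A", 8), ("A", 9), ("C", 18), ("A", 8)]

    spatial_H = shannon_entropy([station for station, _ in trips])
    temporal_H = shannon_entropy([hour for _, hour in trips])
    print(f"spatial entropy:  {spatial_H:.3f} bits")
    print(f"temporal entropy: {temporal_H:.3f} bits")

    # Across many travelers, one per-traveler pair of entropies each, the
    # correlation the authors examine could be estimated with e.g.
    # scipy.stats.pearsonr(spatial_entropies, temporal_entropies).
    ```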

  4. Boosting Maintenance in Working Memory With Temporal Regularities.

    Science.gov (United States)

    Plancher, Gaën; Lévêque, Yohana; Fanuel, Lison; Piquandet, Gaëlle; Tillmann, Barbara

    2017-11-02

Music cognition research has provided evidence for the benefit of temporally regular structures guiding attention over time. The present study investigated whether maintenance in working memory can benefit from an isochronous rhythm. Participants were asked to remember series of 6 letters for serial recall. In the rhythm condition of Experiment 1A, a wood block sound was presented 6 times with a regular stimulus-onset asynchrony during the delay between encoding and recall. In the silent condition, no sound was presented. The presence of the regular rhythm resulted in improved memory performance (Experiment 1A), an effect also observed under articulatory suppression (Experiment 2), suggesting that temporal regularities can enhance maintenance mechanisms in working memory, including attentional refreshing. Experiment 1B confirmed this interpretation by showing that the presentation of a nonisochronous rhythm did not result in improved memory performance in comparison to a silent condition. The findings are discussed in relation to current working memory models and the theoretical framework of dynamic attending. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. Infants use temporal regularities to chunk objects in memory.

    Science.gov (United States)

    Kibbe, Melissa M; Feigenson, Lisa

    2016-01-01

…whether infants also remembered the specific identities of the objects in each chunk. In Experiment 4, we confirmed that infants remembered objects' identities in smaller arrays that did not require chunking. Next, in Experiment 5, we asked whether infants also remembered objects' identities in larger arrays that had been chunked on the basis of temporal regularities. Following a familiarization phase identical to that in Experiment 2a, we hid all four objects and then revealed either these same four objects, or four objects of which two had unexpectedly changed shape and color. Surprisingly, infants failed to look longer at the identity change outcome. Taken together, our results suggest that infants can use temporal regularities between objects to increase memory for objects' existence, but not necessarily for objects' identities. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Attention increases the temporal precision of conscious perception: verifying the Neural-ST Model.

    Directory of Open Access Journals (Sweden)

    Srivas Chennu

    2009-11-01

What role does attention play in ensuring the temporal precision of visual perception? Behavioural studies have investigated feature selection and binding in time using fleeting sequences of stimuli in the Rapid Serial Visual Presentation (RSVP) paradigm, and found that temporal accuracy is reduced when attentional control is diminished. To reduce the efficacy of attentional deployment, these studies have employed the Attentional Blink (AB) phenomenon. In this article, we use electroencephalography (EEG) to directly investigate the temporal dynamics of conscious perception. Specifically, employing a combination of experimental analysis and neural network modelling, we test the hypothesis that the availability of attention reduces temporal jitter in the latency between a target's visual onset and its consolidation into working memory. We perform time-frequency analysis on data from an AB study to compare the EEG trials underlying the P3 ERPs (event-related potentials) evoked by targets seen outside vs. inside the AB time window. We find visual differences in phase-sorted ERP images and statistical differences in the variance of the P3 phase distributions. These results argue for increased variation in the latency of conscious perception during the AB. This experimental analysis is complemented by a theoretical exploration of temporal attention and target processing. Using activation traces from the Neural-ST2 model, we generate virtual ERPs and virtual ERP images. These are compared to their human counterparts to propose an explanation of how target consolidation in the context of the AB influences the temporal variability of selective attention. The AB provides us with a suitable phenomenon with which to investigate the interplay between attention and perception. The combination of experimental and theoretical elucidation in this article contributes to converging evidence for the notion that the AB reflects a reduction in the temporal acuity of conscious perception.
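
    The phase-variance comparison described above can be illustrated with a small simulation: band-limited 'P3-like' trials with different amounts of latency jitter, instantaneous phase from the Hilbert transform, and circular variance as the spread measure. This is a sketch of the statistical idea, not the authors' pipeline; the 2 Hz component, latencies and jitter values are invented for illustration.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def circular_variance(phases):
        """1 - |mean resultant vector|: 0 = tightly aligned phases, 1 = uniform."""
        return 1.0 - np.abs(np.exp(1j * phases).mean())

    rng = np.random.default_rng(0)
    fs = 250
    t = np.arange(0, 1.0, 1 / fs)                  # 1-s epochs at 250 Hz

    def simulate_trials(jitter_sd, n=100):
        """Slow 'P3-like' oscillatory component with trial-to-trial latency jitter."""
        latencies = rng.normal(0.4, jitter_sd, n)
        return np.array([np.sin(2 * np.pi * 2.0 * (t - lat)) for lat in latencies])

    for label, jitter in [("outside AB", 0.01), ("inside AB", 0.05)]:
        trials = simulate_trials(jitter)
        # Instantaneous phase of each trial at the nominal component latency.
        phase = np.angle(hilbert(trials, axis=1))[:, int(0.4 * fs)]
        print(f"{label}: circular variance of phase = {circular_variance(phase):.3f}")
    ```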

  7. Verifying a medical protocol with temporal graphs: the case of a nosocomial disease.

    Science.gov (United States)

    Kamsu-Foguem, Bernard; Tchuenté-Foguem, Germaine; Foguem, Clovis

    2014-08-01

Our contribution focuses on the implementation of a formal verification approach for medical protocols with graphical temporal reasoning paths to facilitate the understanding of verification steps. Formal medical guideline specifications and background knowledge are represented through conceptual graphs, and reasoning is based on graph homomorphism. These materials explain the underlying principles or rationale that guide the functioning of the verifications. The proposal is illustrated using a medical protocol defining guidelines for the monitoring and prevention of nosocomial infections. Such infections, which are acquired in hospital, increase morbidity and mortality and add noticeably to the economic burden. An evaluation found that graphical verification helps improve both clinical knowledge and the quality of clinical actions. Representations based on conceptual graphs can be translated into computational tree logic, but diagrams are much more natural and explicitly human, emphasizing theoretical and practical consistency. The proposed approach allows for the visual modeling of temporal reasoning and a formalization of knowledge that can assist in the diagnosis and treatment of nosocomial infections and some clinical problems. To our knowledge, this is the first work to emphasize temporal situation modeling in conceptual graphs; it also delivers a formal verification method for clinical guideline analysis. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Auditory temporal-regularity processing correlates with language and literacy skill in early adulthood.

    Science.gov (United States)

    Grube, Manon; Cooper, Freya E; Griffiths, Timothy D

    2013-01-01

    This work tests the hypothesis that language skill depends on the ability to incorporate streams of sound into an accurate temporal framework. We tested the ability of young English-speaking adults to process single time intervals and rhythmic sequences of such intervals, hypothesized to be relevant to the analysis of the temporal structure of language. The data implicate a specific role for the ability to process beat-based temporal regularities in phonological language and literacy skill.

  9. Social origins of rhythm? Synchrony and temporal regularity in human vocalization.

    Directory of Open Access Journals (Sweden)

    Daniel L Bowling

Humans have a capacity to perceive and synchronize with rhythms. This is unusual in that only a minority of other species exhibit similar behavior. Study of synchronizing species (particularly anurans and insects) suggests that simultaneous signal production by different individuals may play a critical role in the development of regular temporal signaling. Accordingly, we investigated the link between simultaneous signal production and temporal regularity in our own species. Specifically, we asked whether inter-individual synchronization of a behavior that is typically irregular in time, speech, could lead to evenly-paced or "isochronous" temporal patterns. Participants read nonsense phrases aloud with and without partners, and we found that synchronous reading resulted in greater regularity of durational intervals between words. Comparison of same-gender pairings showed that males and females were able to synchronize their temporal speech patterns with equal skill. These results demonstrate that the shared goal of synchronization can lead to the development of temporal regularity in vocalizations, suggesting that the origins of musical rhythm may lie in cooperative social interaction rather than in sexual selection.

  10. The encoding of temporally irregular and regular visual patterns in the human brain.

    Directory of Open Access Journals (Sweden)

    Semir Zeki

    2008-05-01

In the work reported here, we set out to study the neural systems that detect predictable temporal patterns and departures from them. We used functional magnetic resonance imaging (fMRI) to locate activity in the brains of subjects when they viewed temporally regular and irregular patterns produced by letters, numbers, colors and luminance. Activity induced by irregular sequences was located within dorsolateral prefrontal cortex, including an area that was responsive to irregular patterns regardless of the type of visual stimuli producing them. Conversely, temporally regular arrangements resulted in activity in the right frontal lobe (medial frontal gyrus), in the left orbito-frontal cortex and in the left pallidum. The results show that there is an abstractive system in the brain for detecting temporal irregularity, regardless of the source producing it.

  11. A Novel Method to Verify Multilevel Computational Models of Biological Systems Using Multiscale Spatio-Temporal Meta Model Checking.

    Science.gov (United States)

    Pârvu, Ovidiu; Gilbert, David

    2016-01-01

Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the model correctness has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small scale systems (e.g. intracellular networks). However, for gaining a systems-level understanding of how biological organisms function it is essential to consider more complex large scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing both how numeric values and properties of (emergent) spatial structures (e.g. area of multicellular population) change over time and across multiple levels of organization, which are not considered by existing model checking approaches. To address this limitation, we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models relative to specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms because it is defined relative to time series data and not the models used to generate it. In addition, the methodology can be automatically adapted to case study specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). Its applicability is illustrated against four systems biology computational models previously published in the literature encoding the rat cardiovascular system dynamics, the uterine contractions of labour
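
    In the spirit of the approximate probabilistic checking described here, the sketch below estimates, from simulated time-series traces, the probability that a spatial measure (e.g. population area) eventually exceeds a threshold, and compares it to a required bound. It is a toy stand-in for the far richer spatio-temporal logic supported by Mule; all names and numbers are illustrative.

    ```python
    import numpy as np

    def holds_eventually(trace, threshold):
        """True if the observed measure exceeds `threshold` at some time point
        (an 'eventually'-style property checked over one trace)."""
        return bool(np.any(trace >= threshold))

    def check_probabilistic(traces, threshold, p_min):
        """Approximate probabilistic check: estimate P(eventually >= threshold)
        from simulation traces and test it against the required lower bound."""
        freq = float(np.mean([holds_eventually(tr, threshold) for tr in traces]))
        return freq, freq >= p_min

    rng = np.random.default_rng(1)
    # Hypothetical time series of a spatial measure (e.g. population area), 500 runs.
    traces = [np.cumsum(rng.normal(0.5, 1.0, 200)) for _ in range(500)]

    freq, ok = check_probabilistic(traces, threshold=80.0, p_min=0.9)
    print(f"estimated probability = {freq:.2f}; specification satisfied: {ok}")
    ```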

  12. Temporally Regular Musical Primes Facilitate Subsequent Syntax Processing in Children with Specific Language Impairment.

    Science.gov (United States)

    Bedoin, Nathalie; Brisseau, Lucie; Molinier, Pauline; Roch, Didier; Tillmann, Barbara

    2016-01-01

Children with developmental language disorders have been shown to also be impaired in rhythm and meter perception. Temporal processing and its link to language processing can be understood within the dynamic attending theory. An external stimulus can stimulate internal oscillators, which orient attention over time and drive speech signal segmentation to provide benefits for syntax processing, which is impaired in various patient populations. For children with Specific Language Impairment (SLI) and dyslexia, previous research has shown the influence of an external rhythmic stimulation on subsequent language processing by comparing the influence of a temporally regular musical prime to that of a temporally irregular prime. Here we tested whether the observed rhythmic stimulation effect is indeed due to a benefit provided by the regular musical prime (rather than a cost subsequent to the temporally irregular prime). Sixteen children with SLI and 16 age-matched controls listened to either a regular musical prime sequence or an environmental sound scene (without temporal regularities in event occurrence; i.e., referred to as "baseline condition") followed by grammatically correct and incorrect sentences. They were required to perform grammaticality judgments for each auditorily presented sentence. Results revealed that performance for the grammaticality judgments was better after the regular prime sequences than after the baseline sequences. Our findings are interpreted in the theoretical framework of the dynamic attending theory (Jones, 1976) and the temporal sampling (oscillatory) framework for developmental language disorders (Goswami, 2011). Furthermore, they encourage the use of rhythmic structures (even in non-verbal materials) to boost linguistic structure processing and outline perspectives for rehabilitation.

  13. Spatio-Temporal Regularization for Longitudinal Registration to Subject-Specific 3D Template.

    Directory of Open Access Journals (Sweden)

    Nicolas Guizard

Neurodegenerative diseases such as Alzheimer's disease present subtle anatomical brain changes before the appearance of clinical symptoms. Manual structure segmentation is long and tedious and although automatic methods exist, they are often performed in a cross-sectional manner where each time-point is analyzed independently. With such analysis methods, bias, error and longitudinal noise may be introduced. Noise due to MR scanners and other physiological effects may also introduce variability in the measurement. We propose to use 4D non-linear registration with spatio-temporal regularization to correct for potential longitudinal inconsistencies in the context of structure segmentation. The major contribution of this article is the use of individual template creation with spatio-temporal regularization of the deformation fields for each subject. We validate our method with different sets of real MRI data, compare it to available longitudinal methods such as FreeSurfer, SPM12, QUARC, TBM, and KNBSI, and demonstrate that spatially local temporal regularization yields more consistent rates of change of global structures resulting in better statistical power to detect significant changes over time and between populations.
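
    A toy illustration of temporal regularization of deformation fields, assuming synthetic 4D displacement data: each voxel's displacement trajectory is smoothed along the time axis with a 1D Gaussian filter. The published method uses 4D non-linear registration with subject-specific template creation; this sketch shows only the temporal-smoothing ingredient.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    rng = np.random.default_rng(2)

    # Hypothetical 4D displacement fields: (time, x, y, z, component), in mm.
    # A smooth atrophy-like trend plus independent per-timepoint noise.
    T, shape = 6, (16, 16, 16)
    trend = np.linspace(0.0, 1.0, T)[:, None, None, None, None]
    pattern = rng.normal(0.0, 1.0, (1, *shape, 3))
    fields = trend * pattern + rng.normal(0.0, 0.3, (T, *shape, 3))

    # Temporal regularization: smooth every voxel's displacement trajectory
    # along the time axis.
    smoothed = gaussian_filter1d(fields, sigma=1.0, axis=0)

    noise_before = np.abs(np.diff(fields - trend * pattern, axis=0)).mean()
    noise_after = np.abs(np.diff(smoothed - trend * pattern, axis=0)).mean()
    print(f"mean |temporal noise increment|: {noise_before:.3f} -> {noise_after:.3f} mm")
    ```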

  14. Spatio-Temporal Regularization for Longitudinal Registration to Subject-Specific 3D Template.

    Science.gov (United States)

    Guizard, Nicolas; Fonov, Vladimir S; García-Lorenzo, Daniel; Nakamura, Kunio; Aubert-Broche, Bérengère; Collins, D Louis

    2015-01-01

    Neurodegenerative diseases such as Alzheimer's disease present subtle anatomical brain changes before the appearance of clinical symptoms. Manual structure segmentation is long and tedious and although automatic methods exist, they are often performed in a cross-sectional manner where each time-point is analyzed independently. With such analysis methods, bias, error and longitudinal noise may be introduced. Noise due to MR scanners and other physiological effects may also introduce variability in the measurement. We propose to use 4D non-linear registration with spatio-temporal regularization to correct for potential longitudinal inconsistencies in the context of structure segmentation. The major contribution of this article is the use of individual template creation with spatio-temporal regularization of the deformation fields for each subject. We validate our method with different sets of real MRI data, compare it to available longitudinal methods such as FreeSurfer, SPM12, QUARC, TBM, and KNBSI, and demonstrate that spatially local temporal regularization yields more consistent rates of change of global structures resulting in better statistical power to detect significant changes over time and between populations.

  15. Pronunciation Difficulty, Temporal Regularity, and the Speech-to-Song Illusion

    Directory of Open Access Journals (Sweden)

    Elizabeth Hellmuth Margulis

    2015-01-01

The speech-to-song illusion (Deutsch, Henthorn & Lapidis, 2011) tracks the perceptual transformation from speech to song across repetitions of a brief spoken utterance. Because it involves no change in the stimulus itself, but a dramatic change in its perceived affiliation to speech or to music, it presents a unique opportunity to comparatively investigate the processing of language and music. In this study, native English-speaking participants were presented with brief spoken utterances that were subsequently repeated ten times. The utterances were drawn either from languages that are relatively difficult for a native English speaker to pronounce, or from languages that are relatively easy for a native English speaker to pronounce. Moreover, the repetition could occur at regular temporal intervals, allowing the emergence of a sort of meter, or at irregular temporal intervals, making the emergence of meter impossible. Participants rated the utterances before and after the repetitions on a 5-point Likert-like scale ranging from "sounds exactly like speech" to "sounds exactly like singing". The difference in ratings before and after was taken as a measure of the strength of the speech-to-song illusion in each case. The speech-to-song illusion occurred regardless of whether the repetitions were spaced at regular temporal intervals or not; however, it occurred more readily if the utterance was spoken in a language difficult for a native English speaker to pronounce.

  16. Quantitative Evaluation of Temporal Regularizers in Compressed Sensing Dynamic Contrast Enhanced MRI of the Breast

    Directory of Open Access Journals (Sweden)

    Dong Wang

    2017-01-01

Purpose. Dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) is used in cancer imaging to probe tumor vascular properties. Compressed sensing (CS) theory makes it possible to recover MR images from randomly undersampled k-space data using nonlinear recovery schemes. The purpose of this paper is to quantitatively evaluate common temporal sparsity-promoting regularizers for CS DCE-MRI of the breast. Methods. We considered five ubiquitous temporal regularizers on 4.5x retrospectively undersampled Cartesian in vivo breast DCE-MRI data: Fourier transform (FT), Haar wavelet transform (WT), total variation (TV), second-order total generalized variation (TGVα2), and nuclear norm (NN). We measured the signal-to-error ratio (SER) of the reconstructed images, the error in tumor mean, and concordance correlation coefficients (CCCs) of the derived pharmacokinetic parameters Ktrans (volume transfer constant) and ve (extravascular-extracellular volume fraction) across a population of random sampling schemes. Results. NN produced the lowest image error (SER: 29.1), while TV/TGVα2 produced the most accurate Ktrans (CCC: 0.974/0.974) and ve (CCC: 0.916/0.917). WT produced the highest image error (SER: 21.8), while FT produced the least accurate Ktrans (CCC: 0.842) and ve (CCC: 0.799). Conclusion. TV/TGVα2 should be used as temporal constraints for CS DCE-MRI of the breast.

  17. Quantitative Evaluation of Temporal Regularizers in Compressed Sensing Dynamic Contrast Enhanced MRI of the Breast.

    Science.gov (United States)

    Wang, Dong; Arlinghaus, Lori R; Yankeelov, Thomas E; Yang, Xiaoping; Smith, David S

    2017-01-01

Dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) is used in cancer imaging to probe tumor vascular properties. Compressed sensing (CS) theory makes it possible to recover MR images from randomly undersampled k-space data using nonlinear recovery schemes. The purpose of this paper is to quantitatively evaluate common temporal sparsity-promoting regularizers for CS DCE-MRI of the breast. We considered five ubiquitous temporal regularizers on 4.5x retrospectively undersampled Cartesian in vivo breast DCE-MRI data: Fourier transform (FT), Haar wavelet transform (WT), total variation (TV), second-order total generalized variation (TGVα2), and nuclear norm (NN). We measured the signal-to-error ratio (SER) of the reconstructed images, the error in tumor mean, and concordance correlation coefficients (CCCs) of the derived pharmacokinetic parameters Ktrans (volume transfer constant) and ve (extravascular-extracellular volume fraction) across a population of random sampling schemes. NN produced the lowest image error (SER: 29.1), while TV/TGVα2 produced the most accurate Ktrans (CCC: 0.974/0.974) and ve (CCC: 0.916/0.917). WT produced the highest image error (SER: 21.8), while FT produced the least accurate Ktrans (CCC: 0.842) and ve (CCC: 0.799). TV/TGVα2 should be used as temporal constraints for CS DCE-MRI of the breast.
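
    The temporal penalties compared in this record can be written down compactly. The sketch below evaluates three of them (temporal TV, nuclear norm, and Fourier-domain l1) on a synthetic Casorati matrix (voxels x time frames); in a CS reconstruction these terms would be minimized jointly with a k-space data-fit term, which is omitted here. All dimensions and curves are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Casorati matrix of a dynamic series: rows = voxels, columns = time frames.
    n_vox, n_t = 500, 40
    baseline = rng.random((n_vox, 1))
    enhancement = np.outer(rng.random(n_vox), 1 - np.exp(-np.linspace(0, 4, n_t)))
    X = baseline + enhancement + 0.01 * rng.normal(size=(n_vox, n_t))

    # Temporal total variation: sum of absolute frame-to-frame differences.
    tv = np.abs(np.diff(X, axis=1)).sum()

    # Nuclear norm: sum of singular values, promoting low-rank (temporally
    # correlated) dynamics across the whole image series.
    nn = np.linalg.svd(X, compute_uv=False).sum()

    # Temporal Fourier sparsity: l1 norm of the FFT along the time axis.
    ft = np.abs(np.fft.fft(X, axis=1)).sum()

    print(f"TV = {tv:.1f}, nuclear norm = {nn:.1f}, Fourier l1 = {ft:.1f}")
    ```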

  18. EEG/MEG Source Reconstruction with Spatial-Temporal Two-Way Regularized Regression

    KAUST Repository

    Tian, Tian Siva

    2013-07-11

In this work, we propose a spatial-temporal two-way regularized regression method for reconstructing neural source signals from EEG/MEG time course measurements. The proposed method estimates the dipole locations and amplitudes simultaneously through minimizing a single penalized least squares criterion. The novelty of our methodology is the simultaneous consideration of three desirable properties of the reconstructed source signals, that is, spatial focality, spatial smoothness, and temporal smoothness. The desirable properties are achieved by using three separate penalty functions in the penalized regression framework. Specifically, we impose a roughness penalty in the temporal domain for temporal smoothness, and a sparsity-inducing penalty and a graph Laplacian penalty in the spatial domain for spatial focality and smoothness. We develop a computationally efficient multilevel block coordinate descent algorithm to implement the method. Using a simulation study with several settings of different spatial complexity and two real MEG examples, we show that the proposed method outperforms existing methods that use only a subset of the three penalty functions. © 2013 Springer Science+Business Media New York.
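
    A compact rendering of the three-penalty criterion described above, assuming small synthetic dimensions and an identity placeholder for the graph Laplacian. The paper minimizes such a criterion with a multilevel block coordinate descent; this sketch only evaluates the objective.

    ```python
    import numpy as np

    def objective(S, L, Y, lam_t, lam_1, lam_g, G):
        """Penalized least-squares criterion in the spirit of the record above.
        S: sources (dipoles x times), L: lead field (sensors x dipoles),
        Y: measurements (sensors x times), G: spatial graph Laplacian."""
        fit = np.linalg.norm(Y - L @ S, "fro") ** 2
        rough = np.linalg.norm(np.diff(S, n=2, axis=1), "fro") ** 2  # temporal smoothness
        sparse = np.abs(S).sum()                                     # spatial focality
        smooth = float(np.trace(S.T @ G @ S))                        # spatial smoothness
        return fit + lam_t * rough + lam_1 * sparse + lam_g * smooth

    rng = np.random.default_rng(4)
    n_sens, n_dip, n_t = 32, 100, 50
    L_mat = rng.normal(size=(n_sens, n_dip))
    S_true = np.zeros((n_dip, n_t))
    S_true[3] = np.sin(np.linspace(0, 3 * np.pi, n_t))   # one active dipole
    Y = L_mat @ S_true + 0.1 * rng.normal(size=(n_sens, n_t))
    G = np.eye(n_dip)   # placeholder; a real Laplacian encodes dipole adjacency

    print(f"objective at the true sources: {objective(S_true, L_mat, Y, 1.0, 0.1, 0.1, G):.1f}")
    ```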

  19. Kernel Recursive Least-Squares Temporal Difference Algorithms with Sparsification and Regularization

    Directory of Open Access Journals (Sweden)

    Chunyuan Zhang

    2016-01-01

By combining with sparse kernel methods, least-squares temporal difference (LSTD) algorithms can construct the feature dictionary automatically and obtain better generalization ability. However, previous kernel-based LSTD algorithms do not consider regularization, and their sparsification processes are batch or offline, which hinders their widespread application in online learning problems. In this paper, we combine the following five techniques and propose two novel kernel recursive LSTD algorithms: (i) online sparsification, which can cope with unknown state regions and be used for online learning; (ii) L2 and L1 regularization, which can avoid overfitting and eliminate the influence of noise; (iii) recursive least squares, which can eliminate matrix-inversion operations and reduce computational complexity; (iv) a sliding-window approach, which avoids caching all history samples and reduces computational cost; and (v) fixed-point subiteration and online pruning, which make L1 regularization easy to implement. Finally, simulation results on two 50-state chain problems demonstrate the effectiveness of our algorithms.
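
    For orientation, here is a much-reduced batch kernel LSTD with L2 regularization on a small chain problem. It omits the paper's actual contributions (online sparsification, recursion, sliding window, and the L1 step); the kernel dictionary is simply every state.

    ```python
    import numpy as np

    def rbf(x, y, width=1.0):
        """Gaussian (RBF) kernel."""
        return np.exp(-((x - y) ** 2) / (2 * width ** 2))

    # Tiny chain MDP: states 0..9, deterministic step right, reward on reaching 9.
    states = np.arange(10.0)
    transitions = [(s, min(s + 1, 9), 1.0 if s == 8 else 0.0) for s in states[:-1]]

    gamma, lam_l2 = 0.95, 1e-3
    D = states[:, None]          # dictionary = all states (no sparsification here)
    K = rbf(D, D.T)              # K[i, j] = k(d_i, state_j)

    s_idx = np.array([int(s) for s, _, _ in transitions])
    sp_idx = np.array([int(sp) for _, sp, _ in transitions])
    r = np.array([rw for _, _, rw in transitions])

    # Batch, L2-regularized kernel LSTD: solve
    # (Phi (Phi - gamma Phi')^T + lam I) w = Phi r, with Phi's columns the
    # kernel feature vectors of the visited states.
    Phi, Phi_next = K[:, s_idx], K[:, sp_idx]
    A = Phi @ (Phi - gamma * Phi_next).T + lam_l2 * np.eye(len(states))
    b = Phi @ r
    w = np.linalg.solve(A, b)

    V = K.T @ w                  # estimated values at the dictionary states
    print(np.round(V, 2))        # rises toward state 8, where the reward lies
    ```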

  20. Variability in Regularity: Mining Temporal Mobility Patterns in London, Singapore and Beijing Using Smart-Card Data.

    Directory of Open Access Journals (Sweden)

    Chen Zhong

To discover regularities in human mobility is of fundamental importance to our understanding of urban dynamics, and essential to city and transport planning, urban management and policymaking. Previous research has revealed universal regularities at mainly aggregated spatio-temporal scales but when we zoom into finer scales, considerable heterogeneity and diversity is observed instead. The fundamental question we address in this paper is at what scales are the regularities we detect stable, explicable, and sustainable. This paper thus proposes a basic measure of variability to assess the stability of such regularities focusing mainly on changes over a range of temporal scales. We demonstrate this by comparing regularities in the urban mobility patterns in three world cities, namely London, Singapore and Beijing using one-week of smart-card data. The results show that variations in regularity scale as non-linear functions of the temporal resolution, which we measure over a scale from 1 minute to 24 hours thus reflecting the diurnal cycle of human mobility. A particularly dramatic increase in variability occurs up to the temporal scale of about 15 minutes in all three cities and this implies that limits exist when we look forward or backward with respect to making short-term predictions. The degree of regularity varies in fact from city to city with Beijing and Singapore showing higher regularity in comparison to London across all temporal scales. A detailed discussion is provided, which relates the analysis to various characteristics of the three cities. In summary, this work contributes to a deeper understanding of regularities in patterns of transit use from variations in volumes of travellers entering subway stations, it establishes a generic analytical framework for comparative studies using urban mobility data, and it provides key points for the management of variability by policy-makers intent on making the travel experience more amenable.

  1. Variability in Regularity: Mining Temporal Mobility Patterns in London, Singapore and Beijing Using Smart-Card Data

    Science.gov (United States)

    Zhong, Chen; Batty, Michael; Manley, Ed; Wang, Jiaqiu; Wang, Zijia; Chen, Feng; Schmitt, Gerhard

    2016-01-01

To discover regularities in human mobility is of fundamental importance to our understanding of urban dynamics, and essential to city and transport planning, urban management and policymaking. Previous research has revealed universal regularities at mainly aggregated spatio-temporal scales but when we zoom into finer scales, considerable heterogeneity and diversity is observed instead. The fundamental question we address in this paper is at what scales are the regularities we detect stable, explicable, and sustainable. This paper thus proposes a basic measure of variability to assess the stability of such regularities focusing mainly on changes over a range of temporal scales. We demonstrate this by comparing regularities in the urban mobility patterns in three world cities, namely London, Singapore and Beijing using one-week of smart-card data. The results show that variations in regularity scale as non-linear functions of the temporal resolution, which we measure over a scale from 1 minute to 24 hours thus reflecting the diurnal cycle of human mobility. A particularly dramatic increase in variability occurs up to the temporal scale of about 15 minutes in all three cities and this implies that limits exist when we look forward or backward with respect to making short-term predictions. The degree of regularity varies in fact from city to city with Beijing and Singapore showing higher regularity in comparison to London across all temporal scales. A detailed discussion is provided, which relates the analysis to various characteristics of the three cities. In summary, this work contributes to a deeper understanding of regularities in patterns of transit use from variations in volumes of travellers entering subway stations, it establishes a generic analytical framework for comparative studies using urban mobility data, and it provides key points for the management of variability by policy-makers intent on making the travel experience more amenable. PMID:26872333
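
    A simplified stand-in for the variability-versus-resolution analysis: bin synthetic tap-in times at several temporal resolutions and measure day-to-day variability of the resulting profiles, here with a coefficient of variation rather than the paper's exact measure. As the record describes, variability grows sharply at fine temporal bins.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Hypothetical tap-in times (seconds since midnight) at one station over
    # 7 days: two rush-hour peaks plus day-to-day randomness.
    days = [np.concatenate([rng.normal(8.5 * 3600, 1800, 400),
                            rng.normal(18.0 * 3600, 2400, 500)]) for _ in range(7)]

    for minutes in (1, 5, 15, 60):
        edges = np.arange(0, 24 * 3600 + 1, minutes * 60)
        profiles = np.array([np.histogram(d, bins=edges)[0] for d in days])
        mean, sd = profiles.mean(axis=0), profiles.std(axis=0)
        valid = mean > 0
        cv = float((sd[valid] / mean[valid]).mean())   # day-to-day variability
        print(f"{minutes:>2}-min bins: mean day-to-day CV = {cv:.2f}")
    ```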

  2. Why Movement Is Captured by Music, but Less by Speech: Role of Temporal Regularity

    Science.gov (United States)

    Dalla Bella, Simone; Białuńska, Anita; Sowiński, Jakub

    2013-01-01

    Music has a pervasive tendency to rhythmically engage our body. In contrast, synchronization with speech is rare. Music’s superiority over speech in driving movement probably results from isochrony of musical beats, as opposed to irregular speech stresses. Moreover, the presence of regular patterns of embedded periodicities (i.e., meter) may be critical in making music particularly conducive to movement. We investigated these possibilities by asking participants to synchronize with isochronous auditory stimuli (target), while music and speech distractors were presented at one of various phase relationships with respect to the target. In Exp. 1, familiar musical excerpts and fragments of children poetry were used as distractors. The stimuli were manipulated in terms of beat/stress isochrony and average pitch to achieve maximum comparability. In Exp. 2, the distractors were well-known songs performed with lyrics, on a reiterated syllable, and spoken lyrics, all having the same meter. Music perturbed synchronization with the target stimuli more than speech fragments. However, music superiority over speech disappeared when distractors shared isochrony and the same meter. Music’s peculiar and regular temporal structure is likely to be the main factor fostering tight coupling between sound and movement. PMID:23936534

  3. Why movement is captured by music, but less by speech: role of temporal regularity.

    Directory of Open Access Journals (Sweden)

    Simone Dalla Bella

Music has a pervasive tendency to rhythmically engage our body. In contrast, synchronization with speech is rare. Music's superiority over speech in driving movement probably results from isochrony of musical beats, as opposed to irregular speech stresses. Moreover, the presence of regular patterns of embedded periodicities (i.e., meter) may be critical in making music particularly conducive to movement. We investigated these possibilities by asking participants to synchronize with isochronous auditory stimuli (target), while music and speech distractors were presented at one of various phase relationships with respect to the target. In Exp. 1, familiar musical excerpts and fragments of children poetry were used as distractors. The stimuli were manipulated in terms of beat/stress isochrony and average pitch to achieve maximum comparability. In Exp. 2, the distractors were well-known songs performed with lyrics, on a reiterated syllable, and spoken lyrics, all having the same meter. Music perturbed synchronization with the target stimuli more than speech fragments. However, music superiority over speech disappeared when distractors shared isochrony and the same meter. Music's peculiar and regular temporal structure is likely to be the main factor fostering tight coupling between sound and movement.

  4. Why movement is captured by music, but less by speech: role of temporal regularity.

    Science.gov (United States)

    Dalla Bella, Simone; Białuńska, Anita; Sowiński, Jakub

    2013-01-01

    Music has a pervasive tendency to rhythmically engage our body. In contrast, synchronization with speech is rare. Music's superiority over speech in driving movement probably results from isochrony of musical beats, as opposed to irregular speech stresses. Moreover, the presence of regular patterns of embedded periodicities (i.e., meter) may be critical in making music particularly conducive to movement. We investigated these possibilities by asking participants to synchronize with isochronous auditory stimuli (target), while music and speech distractors were presented at one of various phase relationships with respect to the target. In Exp. 1, familiar musical excerpts and fragments of children poetry were used as distractors. The stimuli were manipulated in terms of beat/stress isochrony and average pitch to achieve maximum comparability. In Exp. 2, the distractors were well-known songs performed with lyrics, on a reiterated syllable, and spoken lyrics, all having the same meter. Music perturbed synchronization with the target stimuli more than speech fragments. However, music superiority over speech disappeared when distractors shared isochrony and the same meter. Music's peculiar and regular temporal structure is likely to be the main factor fostering tight coupling between sound and movement.

  5. Boosting syntax training with temporally regular musical primes in children with cochlear implants.

    Science.gov (United States)

    Bedoin, N; Besombes, A-M; Escande, E; Dumont, A; Lalitte, P; Tillmann, B

    2017-05-11

Previous research has suggested the use of rhythmic structures (implemented in musical material) to improve linguistic structure processing (i.e., syntax processing), in particular for populations showing deficits in syntax and temporal processing (e.g., children with developmental language disorders). The present study proposes a long-term training program to improve syntax processing in children with cochlear implants, a population showing syntax processing deficits in perception and production. The training program consisted of morphosyntactic training exercises (based on speech processing) that were primed by regular musical primes (8 sessions) or neutral baseline primes (environmental sounds) (8 sessions). A crossover design was used to train 10 deaf children with cochlear implants. Performance in grammatical processing, non-word repetition, attention and memory was assessed before and after training. Training increased performance for syntax comprehension after both prime types, but for grammaticality judgements and non-word repetition only when musical primes were used during training. For the far-transfer tests, some effects were also observed for attention tasks, especially if fast and precise sequential analysis (sequencing) was required, but not for memory tasks. The findings extend the previously observed beneficial short-term effects of regular musical primes in the laboratory to long-term training effects. Results suggest that the musical primes improved the processing of the syntactic training material, thus enhancing the training effects on grammatical processing as well as phonological processing and sequencing of speech signals. The findings can be interpreted within the dynamic attending theory (postulating the modulation of attention over time) and associated oscillatory brain activity. Furthermore, the findings encourage the use of rhythmic structures (even in non-verbal materials) in language training programs and outline perspectives for rehabilitation.

  6. Temporal regularization of ultrasound-based liver motion estimation for image-guided radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    O’Shea, Tuathan P., E-mail: tuathan.oshea@icr.ac.uk; Bamber, Jeffrey C.; Harris, Emma J. [Joint Department of Physics, The Institute of Cancer Research and The Royal Marsden NHS Foundation Trust, Sutton, London SM2 5PT (United Kingdom)]

    2016-01-15

Purpose: Ultrasound-based motion estimation is an expanding subfield of image-guided radiation therapy. Although ultrasound can detect tissue motion that is a fraction of a millimeter, its accuracy is variable. For controlling linear accelerator tracking and gating, ultrasound motion estimates must remain highly accurate throughout the imaging sequence. This study presents a temporal regularization method for correlation-based template matching which aims to improve the accuracy of motion estimates. Methods: Liver ultrasound sequences (15–23 Hz imaging rate, 2.5–5.5 min length) from ten healthy volunteers under free breathing were used. Anatomical features (blood vessels) in each sequence were manually annotated for comparison with normalized cross-correlation based template matching. Five sequences from a Siemens Acuson™ scanner were used for algorithm development (training set). Results from incremental tracking (IT) were compared with a temporal regularization method, which included a highly specific similarity metric and state observer, known as the α–β filter/similarity threshold (ABST). A further five sequences from an Elekta Clarity™ system were used for validation, without alteration of the tracking algorithm (validation set). Results: Overall, the ABST method produced marked improvements in vessel tracking accuracy. For the training set, the mean and 95th percentile (95%) errors (defined as the difference from manual annotations) were 1.6 and 1.4 mm, respectively (compared to 6.2 and 9.1 mm, respectively, for IT). For each sequence, the use of the state observer leads to improvement in the 95% error. For the validation set, the mean and 95% errors for the ABST method were 0.8 and 1.5 mm, respectively. Conclusions: Ultrasound-based motion estimation has potential to monitor liver translation over long time periods with high accuracy. Nonrigid motion (strain) and the quality of the ultrasound data are likely to have an impact on tracking accuracy.
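
    The regularization idea can be sketched as a 1D alpha-beta filter that smooths noisy template-matching positions and coasts through frames whose best normalized cross-correlation falls below a similarity threshold. The parameters, the simulated breathing trace, and the similarity model are all invented; the published ABST method is considerably more elaborate.

    ```python
    import numpy as np

    def alpha_beta_track(measurements, similarities, dt=1.0,
                         alpha=0.5, beta=0.1, sim_threshold=0.8):
        """State-observer tracking in the spirit of the ABST method: an
        alpha-beta filter smooths NCC position estimates, and frames whose
        best correlation falls below the similarity threshold are coasted
        (prediction only, measurement ignored)."""
        x, v = float(measurements[0]), 0.0   # position and per-frame velocity
        out = []
        for z, sim in zip(measurements, similarities):
            x_pred = x + v * dt              # predict
            if sim >= sim_threshold:         # trust the match only if similar enough
                resid = z - x_pred           # innovation
                x = x_pred + alpha * resid
                v = v + (beta / dt) * resid
            else:
                x = x_pred                   # coast through low-quality frames
            out.append(x)
        return np.array(out)

    rng = np.random.default_rng(7)
    t = np.arange(300) / 20.0                    # 20 Hz imaging, 15 s
    truth = 8.0 * np.sin(2 * np.pi * t / 4.0)    # ~4 s breathing cycle, mm
    meas = truth + rng.normal(0, 0.8, t.size)    # noisy NCC displacement estimates
    sims = np.clip(rng.normal(0.9, 0.08, t.size), 0, 1)

    est = alpha_beta_track(meas, sims)
    rms = lambda a: float(np.sqrt(((a - truth) ** 2).mean()))
    print(f"RMS error, raw NCC: {rms(meas):.2f} mm; filtered: {rms(est):.2f} mm")
    ```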

  7. Learning About Time Within the Spinal Cord II: Evidence that Temporal Regularity is Encoded by a Spinal Oscillator

    OpenAIRE

    Lee, Kuan Hsien; Huang, Yung-Jen; Grau, James W.

    2016-01-01

    How a stimulus impacts spinal cord function depends upon temporal relations. When intermittent noxious stimulation (shock) is applied and the interval between shock pulses is varied (unpredictable), it induces a lasting alteration that inhibits adaptive learning. If the same stimulus is applied in a temporally regular (predictable) manner, the capacity to learn is preserved and a protective/restorative effect is engaged that counters the adverse effect of variable stimulation. Sensitivity to ...

  8. Long-period seismic events with strikingly regular temporal patterns on Katla volcano's south flank (Iceland)

    CERN Document Server

    Sgattoni, Giulia; Guðmundsson, Ólafur; Einarsson, Páll; Tryggvason, Ari; Lund, Björn; Lucchi, Federico

    2015-01-01

Katla is a threatening volcano in Iceland, partly covered by the Mýrdalsjökull ice cap. The volcano has a large caldera with several active geothermal areas. A peculiar cluster of long-period seismic events started on Katla's south flank in July 2011, during an unrest episode in the caldera that culminated in a glacier outburst. The seismic events were tightly clustered at shallow depth in the Gvendarfell area, 4 km south of the caldera, under a small glacier stream on the southern margin of Mýrdalsjökull. No seismic events were known to have occurred in this area before. The most striking feature of this seismic cluster is its temporal pattern, characterized by regular intervals between repeating seismic events, modulated by a seasonal variation. Remarkable is also the stability of both the time and waveform features over a long time period, around 3.5 years. No comparable examples have been found in the literature. Both volcanic and glacial processes can produce similar waveforms and therefore have ...

  9. HYBRID APPROACHES TO THE FORMALISATION OF EXPERT KNOWLEDGE CONCERNING TEMPORAL REGULARITIES IN THE TIME SERIES GROUP OF A SYSTEM MONITORING DATABASE

    Directory of Open Access Journals (Sweden)

    E. S. Staricov

    2016-01-01

Objectives. The research problem concerns the formalisation of expert knowledge about regularities in an unspecified group of time series, integrated into a decision-making mechanism. Method. A context-free grammar, a modification of a universal temporal grammar, is used to describe the regularities. Using the rules of the developed grammar, an expert can describe patterns in a group of time series. A multi-dimensional matrix pattern of the behaviour of a group of time series is used in the real-time decision-making regime of the expert system, implementing a universal approach to describing the dynamics of these changes. The multi-dimensional matrix pattern is intended specifically for decision-making in an expert system; the modified temporal grammar is used to identify patterns in the data. Results. It is proposed to use the temporal relations of the series and to fix observation values in the time interval as "From–To", "Before", "After", "Simultaneously" and "Duration". A syntactically oriented converter of descriptions is developed, and a schema for the creation and application of matrix patterns in expert systems is drawn up. Conclusion. The advantage of the proposed hybrid approach is a reduction in the time taken to identify temporal patterns, and the automation of the matrix-pattern decision-making system based on expert descriptions verified against live monitoring data.

  10. The Temporal Dynamics of Regularity Extraction in Non-Human Primates

    Science.gov (United States)

    Minier, Laure; Fagot, Joël; Rey, Arnaud

    2016-01-01

Extracting the regularities of our environment is one of our core cognitive abilities. To study the fine-grained dynamics of the extraction of embedded regularities, a method combining the advantages of the artificial language paradigm (Saffran, Aslin, & Newport, 1996) and the serial response time task (Nissen & Bullemer,…

  11. Learning about Time within the Spinal Cord II: Evidence that Temporal Regularity Is Encoded by a Spinal Oscillator

    Science.gov (United States)

    Lee, Kuan H.; Huang, Yung-Jen; Grau, James W.

    2016-01-01

How a stimulus impacts spinal cord function depends upon temporal relations. When intermittent noxious stimulation (shock) is applied and the interval between shock pulses is varied (unpredictable), it induces a lasting alteration that inhibits adaptive learning. If the same stimulus is applied in a temporally regular (predictable) manner, the capacity to learn is preserved and a protective/restorative effect is engaged that counters the adverse effect of variable stimulation. Sensitivity to temporal relations implies a capacity to encode time. This study explores how spinal neurons discriminate variable and fixed spaced stimulation. Communication with the brain was blocked by means of a spinal transection and adaptive capacity was tested using an instrumental learning task. In this task, subjects must learn to maintain a hind limb in a flexed position to minimize shock exposure. To evaluate the possibility that a distinct class of afferent fibers provides a sensory cue for regularity, we manipulated the temporal relation between shocks given to two dermatomes (leg and tail). Evidence for timing emerged when the stimuli were applied in a coherent manner across dermatomes, implying that a central (spinal) process detects regularity. Next, we show that fixed spaced stimulation has a restorative effect when half the physical stimuli are randomly omitted, as long as the stimuli remain in phase, suggesting that stimulus regularity is encoded by an internal oscillator. Research suggests that the oscillator that drives the tempo of stepping depends upon neurons within the rostral lumbar (L1-L2) region. Disrupting communication with the L1-L2 tissue by means of an L3 transection eliminated the restorative effect of fixed spaced stimulation. Implications of the results for step training and rehabilitation after injury are discussed. PMID:26903830

  12. Learning about Time within the Spinal Cord II: Evidence that Temporal Regularity Is Encoded by a Spinal Oscillator.

    Science.gov (United States)

    Lee, Kuan H; Huang, Yung-Jen; Grau, James W

    2016-01-01

How a stimulus impacts spinal cord function depends upon temporal relations. When intermittent noxious stimulation (shock) is applied and the interval between shock pulses is varied (unpredictable), it induces a lasting alteration that inhibits adaptive learning. If the same stimulus is applied in a temporally regular (predictable) manner, the capacity to learn is preserved and a protective/restorative effect is engaged that counters the adverse effect of variable stimulation. Sensitivity to temporal relations implies a capacity to encode time. This study explores how spinal neurons discriminate variable and fixed spaced stimulation. Communication with the brain was blocked by means of a spinal transection and adaptive capacity was tested using an instrumental learning task. In this task, subjects must learn to maintain a hind limb in a flexed position to minimize shock exposure. To evaluate the possibility that a distinct class of afferent fibers provides a sensory cue for regularity, we manipulated the temporal relation between shocks given to two dermatomes (leg and tail). Evidence for timing emerged when the stimuli were applied in a coherent manner across dermatomes, implying that a central (spinal) process detects regularity. Next, we show that fixed spaced stimulation has a restorative effect when half the physical stimuli are randomly omitted, as long as the stimuli remain in phase, suggesting that stimulus regularity is encoded by an internal oscillator. Research suggests that the oscillator that drives the tempo of stepping depends upon neurons within the rostral lumbar (L1-L2) region. Disrupting communication with the L1-L2 tissue by means of an L3 transection eliminated the restorative effect of fixed spaced stimulation. Implications of the results for step training and rehabilitation after injury are discussed.

  13. Learning About Time Within the Spinal Cord II: Evidence that Temporal Regularity is Encoded by a Spinal Oscillator

    Directory of Open Access Journals (Sweden)

    Kuan Hsien Lee

    2016-02-01

How a stimulus impacts spinal cord function depends upon temporal relations. When intermittent noxious stimulation (shock) is applied and the interval between shock pulses is varied (unpredictable), it induces a lasting alteration that inhibits adaptive learning. If the same stimulus is applied in a temporally regular (predictable) manner, the capacity to learn is preserved and a protective/restorative effect is engaged that counters the adverse effect of variable stimulation. Sensitivity to temporal relations implies a capacity to encode time. This study explores how spinal neurons discriminate variable and fixed spaced stimulation. Communication with the brain was blocked by means of a spinal transection and adaptive capacity was tested using an instrumental learning task. In this task, subjects must learn to maintain a hind limb in a flexed position to minimize shock exposure. To evaluate the possibility that a distinct class of afferent fibers provides a sensory cue for regularity, we manipulated the temporal relation between shocks given to two dermatomes (leg and tail). Evidence for timing emerged when the stimuli were applied in a coherent manner across dermatomes, implying that a central (spinal) process detects regularity. Next, we show that fixed spaced stimulation has a restorative effect when half the physical stimuli are randomly omitted, as long as the stimuli remain in phase, suggesting that stimulus regularity is encoded by an internal oscillator. Research suggests that the oscillator that drives the tempo of stepping depends upon neurons within the rostral lumbar (L1-L2) region. Disrupting communication with the L1-L2 tissue by means of an L3 transection eliminated the restorative effect of fixed spaced stimulation. Implications of the results for step training and rehabilitation after injury are discussed.

  14. High-resolution imaging-guided electroencephalography source localization: temporal effect regularization incorporation in LORETA inverse solution

    Science.gov (United States)

    Boughariou, Jihene; Zouch, Wassim; Slima, Mohamed Ben; Kammoun, Ines; Hamida, Ahmed Ben

    2015-11-01

Electroencephalography (EEG) and magnetic resonance imaging (MRI) are noninvasive neuroimaging modalities. They are widely used and complementary, and their fusion may enhance emerging research fields targeting better exploration of brain activity. Such research has attracted many scientific investigators, particularly toward providing a convenient, advanced clinical-aid tool for neurological exploration. The present work addresses the resolution of the EEG inverse problem and investigates an advanced estimation methodology for localizing cerebral activity. Our focus is the integration of temporal priors into the low-resolution brain electromagnetic tomography (LORETA) formalism for solving the EEG inverse problem. The main idea of the proposed method is the integration of a temporal projection matrix within the LORETA weighting matrix. A hyperparameter controls this temporal integration, and its importance becomes apparent in obtaining a regularized, smooth solution. Our experimental results confirmed the impact of optimizing the temporal regularization parameter, in comparison to the standard LORETA method.
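
    A linear-algebra sketch of temporal-prior integration in a LORETA-like inverse solution, assuming a toy 1D chain of dipoles: a spatial Laplacian penalty (the LORETA ingredient) is combined with a first-order temporal-difference penalty whose weight plays the role of the temporal hyperparameter discussed above. Dimensions, operators and weights are illustrative, not the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    n_sens, n_dip, n_t = 16, 20, 15
    L = rng.normal(size=(n_sens, n_dip))                 # lead-field matrix

    # Spatial Laplacian B (1D chain of dipoles as a stand-in for LORETA's 3D
    # grid) and first-order temporal difference operator D.
    B = -np.eye(n_dip) + 0.5 * (np.eye(n_dip, k=1) + np.eye(n_dip, k=-1))
    D = np.diff(np.eye(n_t), axis=0)

    S_true = np.zeros((n_dip, n_t))
    S_true[9] = np.hanning(n_t)                          # one smooth active source
    Y = L @ S_true + 0.05 * rng.normal(size=(n_sens, n_t))

    # min_S ||Y - L S||^2 + lam_s ||B S||^2 + lam_t ||S D^T||^2,
    # solved in closed form on vec(S); lam_t is the temporal hyperparameter.
    lam_s, lam_t = 1.0, 5.0
    A = (np.kron(np.eye(n_t), L.T @ L + lam_s * B.T @ B)
         + lam_t * np.kron(D.T @ D, np.eye(n_dip)))
    S_hat = np.linalg.solve(A, (L.T @ Y).flatten("F")).reshape((n_dip, n_t), order="F")

    print(f"strongest recovered dipole: {np.abs(S_hat).sum(axis=1).argmax()} (truth: 9)")
    ```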

  15. Cardiac C-arm computed tomography using a 3D + time ROI reconstruction method with spatial and temporal regularization

    Energy Technology Data Exchange (ETDEWEB)

    Mory, Cyril, E-mail: cyril.mory@philips.com [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Lyon 1, F-69621 Villeurbanne Cedex (France); Philips Research Medisys, 33 rue de Verdun, 92156 Suresnes (France); Auvray, Vincent; Zhang, Bo [Philips Research Medisys, 33 rue de Verdun, 92156 Suresnes (France); Grass, Michael; Schäfer, Dirk [Philips Research, Röntgenstrasse 24–26, D-22335 Hamburg (Germany); Chen, S. James; Carroll, John D. [Department of Medicine, Division of Cardiology, University of Colorado Denver, 12605 East 16th Avenue, Aurora, Colorado 80045 (United States); Rit, Simon [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon (France); Université Lyon 1 (France); Centre Léon Bérard, 28 rue Laënnec, F-69373 Lyon (France); Peyrin, Françoise [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Lyon 1, F-69621 Villeurbanne Cedex (France); X-ray Imaging Group, European Synchrotron, Radiation Facility, BP 220, F-38043 Grenoble Cedex (France); Douek, Philippe; Boussel, Loïc [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon (France); Université Lyon 1 (France); Hospices Civils de Lyon, 28 Avenue du Doyen Jean Lépine, 69500 Bron (France)

    2014-02-15

    Purpose: Reconstruction of the beating heart in 3D + time in the catheter laboratory using only the available C-arm system would improve diagnosis, guidance, device sizing, and outcome control for intracardiac interventions, e.g., electrophysiology, valvular disease treatment, structural or congenital heart disease. To obtain such a reconstruction, the patient's electrocardiogram (ECG) must be recorded during the acquisition and used in the reconstruction. In this paper, the authors present a 4D reconstruction method aiming to reconstruct the heart from a single sweep 10 s acquisition. Methods: The authors introduce the 4D RecOnstructiOn using Spatial and TEmporal Regularization (short 4D ROOSTER) method, which reconstructs all cardiac phases at once, as a 3D + time volume. The algorithm alternates between a reconstruction step based on conjugate gradient and four regularization steps: enforcing positivity, averaging along time outside a motion mask that contains the heart and vessels, 3D spatial total variation minimization, and 1D temporal total variation minimization. Results: 4D ROOSTER recovers the different temporal representations of a moving Shepp and Logan phantom, and outperforms both ECG-gated simultaneous algebraic reconstruction technique and prior image constrained compressed sensing on a clinical case. It generates 3D + time reconstructions with sharp edges which can be used, for example, to estimate the patient's left ventricular ejection fraction. Conclusions: 4D ROOSTER can be applied for human cardiac C-arm CT, and potentially in other dynamic tomography areas. It can easily be adapted to other problems as regularization is decoupled from projection and back projection.
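
    The alternation at the heart of 4D ROOSTER can be miniaturized to a 1D-plus-time toy problem: a data-fit gradient step (standing in for conjugate gradient), positivity, temporal averaging outside a motion mask, and crude spatial and temporal TV-reducing steps. Operators, weights and the smoothing surrogate are illustrative only, not the published algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    n_t, n_vox = 8, 60                                   # cardiac phases x voxels (1D toy)
    P = [rng.normal(size=(12, n_vox)) for _ in range(n_t)]   # per-phase projections

    truth = np.ones((n_t, n_vox))
    truth[:, 20:30] += np.sin(np.linspace(0, 2 * np.pi, n_t))[:, None]  # "beating" region
    motion_mask = np.zeros(n_vox, dtype=bool)
    motion_mask[18:32] = True
    y = [P[t] @ truth[t] for t in range(n_t)]

    def tv_step(u, w, axis):
        """Crude smoothing step that reduces total variation along `axis`
        (a stand-in for the TV-minimization sub-problems of the paper)."""
        d = np.diff(u, axis=axis, append=np.take(u, [-1], axis=axis))
        return u + w * (np.roll(d, 1, axis=axis) - d)

    x = np.zeros((n_t, n_vox))
    for _ in range(200):
        for t in range(n_t):                             # data-fit gradient step
            x[t] -= 0.002 * P[t].T @ (P[t] @ x[t] - y[t])    # (in place of CG)
        x = np.maximum(x, 0)                             # positivity
        x[:, ~motion_mask] = x[:, ~motion_mask].mean(axis=0)  # temporal average outside mask
        x = tv_step(x, 0.1, axis=1)                      # spatial TV-reducing step
        x = tv_step(x, 0.1, axis=0)                      # temporal TV-reducing step

    err = np.linalg.norm(x - truth) / np.linalg.norm(truth)
    print(f"relative reconstruction error: {err:.3f}")
    ```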

  16. Analysis of Regularly and Irregularly Sampled Spatial, Multivariate, and Multi-temporal Data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    1994-01-01

    This thesis describes different methods that are useful in the analysis of multivariate data. Some methods focus on spatial data (sampled regularly or irregularly), others focus on multitemporal data or data from multiple sources. The thesis covers selected and not all aspects of relevant data...... maximize the variance represented by each component, MAFs maximize the spatial autocorrelation represented by each component, and MNFs maximize a measure of signal-to-noise ratio represented by each component. In the literature MAF/MNF analysis is described for regularly gridded data only. Here...... filtering of MAF/MNFs is suggested. One case study successfully shows the effect of the MNF Fourier restoration. Another case shows the superiority of the MAF/MNF analysis over ordinary non-spatial factor analysis of geochemical data in South Greenland (with a geologist's comment). Also, two examples of MAF...
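
    An MNF transform in miniature, assuming a three-band toy image on a regular grid: the noise covariance is estimated from neighbour differences (as the abstract notes for regularly gridded data) and the components come from a generalized eigenproblem, ordered by noise fraction. All data are synthetic.

    ```python
    import numpy as np
    from scipy.linalg import eigh

    rng = np.random.default_rng(10)

    # Toy 3-band image on a 50x50 grid: shared smooth spatial signal plus
    # band-specific white noise of increasing strength.
    h, w = 50, 50
    yy, xx = np.mgrid[0:h, 0:w]
    signal = np.sin(xx / 8.0) * np.cos(yy / 11.0)
    bands = np.stack([a * signal + s * rng.normal(size=(h, w))
                      for a, s in [(1.0, 0.2), (0.8, 0.5), (0.5, 1.0)]], axis=-1)

    X = bands.reshape(-1, 3)
    X = X - X.mean(axis=0)

    # Noise covariance from horizontal neighbour differences.
    dX = (bands[:, 1:, :] - bands[:, :-1, :]).reshape(-1, 3)
    Sigma_noise = np.cov(dX.T) / 2.0
    Sigma = np.cov(X.T)

    # Generalized eigenproblem: components ordered by increasing noise fraction.
    noise_fractions, vecs = eigh(Sigma_noise, Sigma)
    mnf_components = X @ vecs
    print("noise fractions (ascending):", np.round(noise_fractions, 3))
    print("best component variance:", round(float(mnf_components[:, 0].var()), 3))
    ```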

  17. Temporal regularity increases with repertoire complexity in the Australian pied butcherbird's song

    OpenAIRE

    Janney, Eathan; Taylor, Hollis; Scharff, Constance; Rothenberg, David; Parra, Lucas C.; Tchernichovski, Ofer

    2016-01-01

    Music maintains a characteristic balance between repetition and novelty. Here, we report a similar balance in singing performances of free-living Australian pied butcherbirds. Their songs include many phrase types. The more phrase types in a bird's repertoire, the more diverse the singing performance can be. However, without sufficient temporal organization, avian listeners may find diverse singing performances difficult to perceive and memorize. We tested for a correlation between the comple...

  18. Regular icosahedron

    OpenAIRE

    Mihelak, Veronika

    2016-01-01

    This work collects properties of the regular icosahedron which are useful for students of mathematics or for mathematics teachers preparing exercises for talented students in elementary or middle school. The initial section describes the basic properties of the regular polyhedra: the tetrahedron, cube, dodecahedron, octahedron and, of course, the icosahedron. We have proven that there are only five regular or Platonic solids and have verified Euler's polyhedron formula for them. Then we focused on selected p...

  19. Temporal regularity determines the impact of electrical stimulation on tactile reactivity and response to capsaicin in spinally transected rats.

    Science.gov (United States)

    Baumbauer, K M; Lee, K H; Puga, D A; Woller, S A; Hughes, A J; Grau, J W

    2012-12-27

    Nociceptive plasticity and central sensitization within the spinal cord depend on neurobiological mechanisms implicated in learning and memory in higher neural systems, suggesting that the factors that impact brain-mediated learning and memory could modulate how stimulation affects spinal systems. One such factor is temporal regularity (predictability). The present paper shows that intermittent hindleg shock has opposing effects in spinally transected rats depending upon whether shock is presented in a regular or irregular (variable) manner. Variable intermittent legshock (900 shocks) enhanced mechanical reactivity to von Frey stimuli (hyperreactivity), whereas 900 fixed-spaced legshocks produced hyporeactivity. The impact of fixed-spaced shock depended upon the duration of exposure; a brief exposure (36 shocks) induced hyperreactivity whereas an extended exposure (900 shocks) produced hyporeactivity. The enhanced reactivity observed after variable shock was most evident 60-180 min after treatment. Fixed and variable intermittent stimulation applied to the sciatic nerve, or the tail, yielded a similar pattern of results. Stimulation had no effect on thermal reactivity. Exposure to fixed-spaced shock, but not variable shock, attenuated the enhanced mechanical reactivity (EMR) produced by treatment with hindpaw capsaicin. The effect of fixed-spaced stimulation lasted 24h. Treatment with fixed-spaced shock also attenuated the maintenance of capsaicin-induced EMR. The results show that variable intermittent shock enhances mechanical reactivity, while an extended exposure to fixed-spaced shock has the opposite effect on mechanical reactivity and attenuates capsaicin-induced EMR. Copyright © 2012 IBRO. Published by Elsevier Ltd. All rights reserved.

  20. Application of a regularized model inversion system (REGFLEC) to multi-temporal RapidEye imagery for retrieving vegetation characteristics

    Science.gov (United States)

    Houborg, Rasmus; McCabe, Matthew F.

    2015-10-01

    Accurate retrieval of canopy biophysical and leaf biochemical constituents from space observations is critical to diagnosing the functioning and condition of vegetation canopies across spatio-temporal scales. Retrieved vegetation characteristics may serve as important inputs to precision farming applications and as constraints in spatially and temporally distributed model simulations of water and carbon exchange processes. However, significant challenges remain in the translation of composite remote sensing signals into useful biochemical, physiological or structural quantities and in the treatment of confounding factors in spectrum-trait relations. Bands in the red-edge spectrum have particular potential for improving the robustness of retrieved vegetation properties. The development of observationally based vegetation retrieval capacities, effectively constrained by the enhanced information content afforded by bands in the red-edge, is a needed investment towards optimizing the benefit of current and future satellite sensor systems. In this study, a REGularized canopy reFLECtance model (REGFLEC) for joint leaf chlorophyll (Chll) and leaf area index (LAI) retrieval is extended to sensor systems with a band in the red-edge region for the first time. Application to time-series of 5 m resolution multi-spectral RapidEye data is demonstrated over an irrigated agricultural region in central Saudi Arabia, showcasing the value of satellite-derived crop information at this fine scale for precision management. Validation against in-situ measurements in fields of alfalfa, Rhodes grass, carrot and maize indicate improved accuracy of retrieved vegetation properties when exploiting red-edge information in the model inversion process.
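
    The flavor of such a model inversion can be sketched with an invented two-parameter "canopy model" and a lookup-grid search; REGFLEC's actual physical model, band set, and regularization are not reproduced here, and every number below is made up for illustration.

```python
import numpy as np

bands = np.array([0.55, 0.67, 0.71, 0.80])   # green, red, red-edge, NIR (um)

def toy_reflectance(chl, lai, bands):
    # invented response: chlorophyll absorbs near the red band,
    # LAI raises overall canopy reflectance
    absorb = chl / 80.0 * np.exp(-((bands - 0.67) / 0.05) ** 2)
    canopy = 1.0 - np.exp(-0.5 * lai)
    return 0.08 + 0.4 * canopy * (1.0 - absorb)

rng = np.random.default_rng(7)
true_chl, true_lai = 45.0, 3.0
obs = toy_reflectance(true_chl, true_lai, bands) + 0.005 * rng.normal(size=4)

# invert by brute-force search over a lookup grid of (Chl, LAI) pairs
chl_grid = np.linspace(10, 80, 71)
lai_grid = np.linspace(0.5, 6, 56)
best = min(((np.sum((toy_reflectance(c, l, bands) - obs) ** 2), c, l)
            for c in chl_grid for l in lai_grid))
print(f"retrieved Chl = {best[1]:.1f}, LAI = {best[2]:.1f}")
```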

  1. Application of a regularized model inversion system (REGFLEC) to multi-temporal RapidEye imagery for retrieving vegetation characteristics

    KAUST Repository

    Houborg, Rasmus

    2015-10-14

    Accurate retrieval of canopy biophysical and leaf biochemical constituents from space observations is critical to diagnosing the functioning and condition of vegetation canopies across spatio-temporal scales. Retrieved vegetation characteristics may serve as important inputs to precision farming applications and as constraints in spatially and temporally distributed model simulations of water and carbon exchange processes. However, significant challenges remain in the translation of composite remote sensing signals into useful biochemical, physiological or structural quantities and in the treatment of confounding factors in spectrum-trait relations. Bands in the red-edge spectrum have particular potential for improving the robustness of retrieved vegetation properties. The development of observationally based vegetation retrieval capacities, effectively constrained by the enhanced information content afforded by bands in the red-edge, is a needed investment towards optimizing the benefit of current and future satellite sensor systems. In this study, a REGularized canopy reFLECtance model (REGFLEC) for joint leaf chlorophyll (Chll) and leaf area index (LAI) retrieval is extended to sensor systems with a band in the red-edge region for the first time. Application to time-series of 5 m resolution multi-spectral RapidEye data is demonstrated over an irrigated agricultural region in central Saudi Arabia, showcasing the value of satellite-derived crop information at this fine scale for precision management. Validation against in-situ measurements in fields of alfalfa, Rhodes grass, carrot and maize indicate improved accuracy of retrieved vegetation properties when exploiting red-edge information in the model inversion process. © (2015) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  2. Externally Verifiable Oblivious RAM

    Directory of Open Access Journals (Sweden)

    Gancher Joshua

    2017-04-01

    We present the idea of externally verifiable oblivious RAM (ORAM). Our goal is to allow a client and server carrying out an ORAM protocol to have disputes adjudicated by a third party, allowing for the enforcement of penalties against an unreliable or malicious server. We give a security definition that guarantees protection not only against a malicious server but also against a client making false accusations. We then give modifications of the Path ORAM [15] and Ring ORAM [9] protocols that meet this security definition. These protocols both have the same asymptotic runtimes as the semi-honest original versions and require the external verifier to be involved only when the client or server deviates from the protocol. Finally, we implement externally verified ORAM, along with an automated cryptocurrency contract to use as the external verifier.

  3. Verifiable postal voting

    OpenAIRE

    Ryan, Peter; Benaloh, Josh; Teague, Vanessa

    2013-01-01

    This proposal aims to combine the best properties of paper-based and end-to-end verifiable remote voting systems. Ballots are delivered electronically to voters, who return their votes on paper together with some cryptographic information that allows them to verify later that their votes were correctly included and counted. We emphasise the ease of the voter's experience, which is not much harder than basic electronic delivery and postal returns. A typical voter needs only to perform a simple...

  4. Verifiably Truthful Mechanisms

    DEFF Research Database (Denmark)

    Branzei, Simina; Procaccia, Ariel D.

    2015-01-01

    the computational sense). Our approach involves three steps: (i) specifying the structure of mechanisms, (ii) constructing a verification algorithm, and (iii) measuring the quality of verifiably truthful mechanisms. We demonstrate this approach using a case study: approximate mechanism design without money...

  5. Regular figures

    CERN Document Server

    Tóth, L Fejes; Ulam, S; Stark, M

    1964-01-01

    Regular Figures concerns the systematology and genetics of regular figures. The first part of the book deals with the classical theory of the regular figures. This topic includes description of plane ornaments, spherical arrangements, hyperbolic tessellations, polyhedral, and regular polytopes. The problem of geometry of the sphere and the two-dimensional hyperbolic space are considered. Classical theory is explained as describing all possible symmetrical groupings in different spaces of constant curvature. The second part deals with the genetics of the regular figures and the inequalities fo

  6. Maximal γ-regularity

    NARCIS (Netherlands)

    Van Neerven, J.M.A.M.; Veraar, M.C.; Weis, L.

    2015-01-01

    In this paper, we prove maximal regularity estimates in “square function spaces” which are commonly used in harmonic analysis, spectral theory, and stochastic analysis. In particular, they lead to a new class of maximal regularity results for both deterministic and stochastic equations in L^p

  7. Definition of Verifiable School IPM

    Science.gov (United States)

    EPA is promoting use of verifiable school IPM. This is an activity that includes several elements with documentation, including pest identification, action thresholds, monitoring, and effective pest control.

  8. Regular Boardgames

    OpenAIRE

    Kowalski, Jakub; Sutowicz, Jakub; Szykuła, Marek

    2017-01-01

    We present an initial version of Regular Boardgames general game description language. This stands as an extension of Simplified Boardgames language. Our language is designed to be able to express the rules of a majority of popular boardgames including the complex rules such as promotions, castling, en passant, jump captures, liberty captures, and obligatory moves. The language describes all the above through one consistent general mechanism based on regular expressions, without using excepti...

  9. Adaptive regularization

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Rasmussen, Carl Edward; Svarer, C.

    1994-01-01

    Regularization, e.g., in the form of weight decay, is important for training and optimization of neural network architectures. In this work the authors provide a tool based on asymptotic sampling theory, for iterative estimation of weight decay parameters. The basic idea is to do a gradient descent...... in the estimated generalization error with respect to the regularization parameters. The scheme is implemented in the authors' Designer Net framework for network training and pruning, i.e., is based on the diagonal Hessian approximation. The scheme does not require essential computational overhead in addition...
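
    The idea of descending the estimated generalization error with respect to the regularization parameters can be miniaturized with ridge regression and a finite-difference gradient. The sketch below illustrates the principle only, not the authors' Hessian-based Designer Net implementation; data, split, and step size are invented.

```python
import numpy as np

# Ridge regression with a weight-decay parameter adapted by descending the
# validation error (finite-difference gradient taken in log(lambda)).
rng = np.random.default_rng(1)
X = rng.normal(size=(80, 10)); w_true = rng.normal(size=10)
y = X @ w_true + 0.5 * rng.normal(size=80)
Xtr, ytr, Xval, yval = X[:60], y[:60], X[60:], y[60:]

def val_error(lam):
    # closed-form ridge solution on the training split,
    # squared error on the validation split
    w = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(10), Xtr.T @ ytr)
    return np.mean((Xval @ w - yval) ** 2)

log_lam, eps, lr = 0.0, 1e-4, 0.5
for _ in range(100):
    g = (val_error(np.exp(log_lam + eps))
         - val_error(np.exp(log_lam - eps))) / (2 * eps)  # dE/d log(lambda)
    log_lam -= lr * g                                     # descend validation error

print(f"adapted weight decay: {np.exp(log_lam):.3f}, "
      f"validation error: {val_error(np.exp(log_lam)):.3f}")
```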

  10. End-to-end verifiability

    OpenAIRE

    Ryan, Peter; Benaloh, Josh; Rivest, Ronald; Stark, Philip; Teague, Vanessa; Vora, Poorvi

    2016-01-01

    This pamphlet describes end-to-end election verifiability (E2E-V) for a nontechnical audience: election officials, public policymakers, and anyone else interested in secure, transparent, evidence-based electronic elections. This work is part of the Overseas Vote Foundation's End-to-End Verifiable Internet Voting: Specification and Feasibility Assessment Study (E2E VIV Project), funded by the Democracy Fund.

  11. Verifying the Hanging Chain Model

    Science.gov (United States)

    Karls, Michael A.

    2013-01-01

    The wave equation with variable tension is a classic partial differential equation that can be used to describe the horizontal displacements of a vertical hanging chain with one end fixed and the other end free to move. Using a web camera and TRACKER software to record displacement data from a vibrating hanging chain, we verify a modified version…

  12. Regular polytopes

    CERN Document Server

    Coxeter, H S M

    1973-01-01

    Polytopes are geometrical figures bounded by portions of lines, planes, or hyperplanes. In plane (two dimensional) geometry, they are known as polygons and comprise such figures as triangles, squares, pentagons, etc. In solid (three dimensional) geometry they are known as polyhedra and include such figures as tetrahedra (a type of pyramid), cubes, icosahedra, and many more; the possibilities, in fact, are infinite! H. S. M. Coxeter's book is the foremost book available on regular polyhedra, incorporating not only the ancient Greek work on the subject, but also the vast amount of information

  13. Testing Library Specifications by Verifying Conformance Tests

    DEFF Research Database (Denmark)

    Kiniry, Joseph Roland; Zimmerman, Daniel; Hyland, Ralph

    2012-01-01

    Formal specifications of standard libraries are necessary when statically verifying software that uses those libraries. Library specifications must be both correct, accurately reflecting library behavior, and useful, describing library behavior in sufficient detail to allow static verification...... of client programs. Specification and verification researchers regularly face the question of whether the library specifications we use are correct and useful, and we have collectively provided no good answers. Over the past few years we have created and refined a software engineering process, which we call...... the Formal CTD Process (FCTD), to address this problem. Although FCTD is primarily targeted toward those who write Java libraries (or specifications for existing Java libraries) using the Java Modeling Language (JML), its techniques are broadly applicable. The key to FCTD is its novel usage of library...

  14. Verified OS Interface Code Synthesis

    Science.gov (United States)

    2016-12-01

    AFRL-AFOSR-JP-TR-2017-0015. Verified OS Interface Code Synthesis, Gerwin Klein, National ICT Australia Limited, Final Report, 02/14/2017. Distribution A: approved for public release, distribution unlimited. Introduction: The central question of this project was how to ensure the correctness of Operating System (OS

  15. Unconditionally verifiable blind quantum computation

    Science.gov (United States)

    Fitzsimons, Joseph F.; Kashefi, Elham

    2017-07-01

    Blind quantum computing (BQC) allows a client to have a server carry out a quantum computation for them such that the client's input, output, and computation remain private. A desirable property for any BQC protocol is verification, whereby the client can verify with high probability whether the server has followed the instructions of the protocol or if there has been some deviation resulting in a corrupted output state. A verifiable BQC protocol can be viewed as an interactive proof system leading to consequences for complexity theory. We previously proposed [A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science, Atlanta, 2009 (IEEE, Piscataway, 2009), p. 517] a universal and unconditionally secure BQC scheme where the client only needs to be able to prepare single qubits in separable states randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. In this paper we extend that protocol with additional functionality allowing blind computational basis measurements, which we use to construct another verifiable BQC protocol based on a different class of resource states. We rigorously prove that the probability of failing to detect an incorrect output is exponentially small in a security parameter, while resource overhead remains polynomial in this parameter. This resource state allows entangling gates to be performed between arbitrary pairs of logical qubits with only constant overhead. This is a significant improvement on the original scheme, which required that all computations to be performed must first be put into a nearest-neighbor form, incurring linear overhead in the number of qubits. Such an improvement has important consequences for efficiency and fault-tolerance thresholds.

  16. Verifying RoboCup Teams

    OpenAIRE

    Benac Earle, Clara; Fredlund, Lars-Ake; Iglesias Martínez, José Antonio; Ledezma Espino, Agapito Ismael

    2009-01-01

    Proceedings of: 5th International Workshop on Model Checking and Artificial Intelligence, MOCHART-2008, Patras, Greece, July 21st, 2008. Verification of multi-agent systems is a challenging task due to their dynamic nature, and the complex interactions between agents. An example of such a system is the RoboCup Soccer Simulator, where two teams of eleven independent agents play a game of football against each other. In the present article we attempt to verify a number of properties of RoboC...

  17. UNFOLDED REGULAR AND SEMI-REGULAR POLYHEDRA

    Directory of Open Access Journals (Sweden)

    IONIŢĂ Elena

    2015-06-01

    This paper proposes a presentation of unfolded regular and semi-regular polyhedra. Regular polyhedra are convex polyhedra whose faces are regular and equal polygons, with the same number of sides, and whose polyhedral angles are also regular and equal. Semi-regular polyhedra are convex polyhedra with regular polygon faces of several types and equal solid angles of the same type. A net of a polyhedron is a collection of edges in the plane which are the unfolded edges of the solid. The modeling and unfolding of Platonic and Archimedean polyhedra is done using the 3ds Max program. This paper is intended as an example of descriptive geometry applications.

  18. Verifying design patterns in Hoare Type Theory

    DEFF Research Database (Denmark)

    Svendsen, Kasper; Buisse, Alexandre; Birkedal, Lars

    In this technical report we document our experiments formally verifying three design patterns in Hoare Type Theory.

  19. USCIS E-Verify Program Reports

    Data.gov (United States)

    Department of Homeland Security — The report builds on the last comprehensive evaluation of the E-Verify Program and demonstrates that E-Verify produces accurate results and that accuracy rates have...

  20. How to quantify student's regularity?

    OpenAIRE

    Shirvani Boroujeni, Mina; Sharma, Kshitij; Kidzinski, Lukasz; Lucignano, Lorenzo; Dillenbourg, Pierre

    2016-01-01

    Studies carried out in classroom-based learning contexts have consistently shown a positive relation between students' conscientiousness and their academic success. We hypothesize that time management and regularity are the main building blocks of students' conscientiousness in the context of online education. In online education, despite intuitive arguments supporting on-demand courses as more flexible delivery of knowledge, completion rate is higher in the courses with rigid temporal constr...

  1. Verifying FreeRTOS; a feasibility study

    OpenAIRE

    Pronk, C.

    2010-01-01

    This paper presents a study on modeling and verifying the kernel of Real-Time Operating Systems (RTOS). The study will show advances in formally verifying such an RTOS both by refinement and by model checking approaches. This work fits in the context of Hoare’s verification challenge. Several real-time operating systems will be discussed including some commercial ones. The focus of the latter part of the paper will be on verifying FreeRTOS. The paper investigates a number of ways to verify th...

  2. Software Model Checking for Verifying Distributed Algorithms

    Science.gov (United States)

    2014-10-28

    2014 Carnegie Mellon University. Software Model Checking for Verifying Distributed Algorithms, Sagar Chaki, James Edmondson, October 28, 2014. Software model checking (CBMC, BLAST, etc.) takes a program, here written in a domain-specific language, and reports failure or success: an automatic verification technique for finite

  3. Verifying FreeRTOS; a feasibility study

    NARCIS (Netherlands)

    Pronk, C.

    2010-01-01

    This paper presents a study on modeling and verifying the kernel of Real-Time Operating Systems (RTOS). The study will show advances in formally verifying such an RTOS both by refinement and by model checking approaches. This work fits in the context of Hoare’s verification challenge. Several

  4. On Verified Numerical Computations in Convex Programming

    OpenAIRE

    Jansson, Christian

    2009-01-01

    This survey contains recent developments for computing verified results of convex constrained optimization problems, with emphasis on applications. In particular, we consider the computation of verified error bounds for non-smooth convex conic optimization in the framework of functional analysis, for linear programming, and for semidefinite programming. A discussion of important problem transformations to special types of convex problems and convex relaxations is included...

  5. 37 CFR 2.33 - Verified statement.

    Science.gov (United States)

    2010-07-01

    37 CFR 2.33 (2010-07-01): Verified statement. Patents, Trademarks, and Copyrights; United States Patent and Trademark Office, Department of Commerce; Rules of Practice in Trademark Cases; The Written Application. § 2.33 Verified statement. (a) The...

  6. On the Complexity of Verifying Regular Properties on Flat Counter Systems

    OpenAIRE

    Demri, Stéphane; Dhar, Amit Kumar; Sangnier, Arnaud

    2013-01-01

    Among the approximation methods for the verification of counter systems, one of them consists in model-checking their flat unfoldings. Unfortunately, the complexity characterization of model-checking problems for such operational models is not always well studied except for reachability queries or for Past LTL. In this paper, we characterize the complexity of model-checking problems on flat counter systems for the specification languages including first-order logic, linear mu-calculus, infini...

  7. USCIS E-Verify Self-Check

    Data.gov (United States)

    Department of Homeland Security — E-Verify is an internet-based system that contains datasets to compare information from an employee's Form I-9, Employment Eligibility Verification, to data from the...

  8. Regular Expression Pocket Reference

    CERN Document Server

    Stubblebine, Tony

    2007-01-01

    This handy little book offers programmers a complete overview of the syntax and semantics of regular expressions that are at the heart of every text-processing application. Ideal as a quick reference, Regular Expression Pocket Reference covers the regular expression APIs for Perl 5.8, Ruby (including some upcoming 1.9 features), Java, PHP, .NET and C#, Python, vi, JavaScript, and the PCRE regular expression libraries. This concise and easy-to-use reference puts a very powerful tool for manipulating text and data right at your fingertips. Composed of a mixture of symbols and text, regular exp
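
    The kind of everyday task such a reference covers can be illustrated with Python's built-in re module; the patterns below are our own examples, not taken from the book.

```python
import re

log = "2015-06-01 ERROR disk full; 2016-12-01 INFO ok"

# find all ISO dates
print(re.findall(r"\d{4}-\d{2}-\d{2}", log))         # ['2015-06-01', '2016-12-01']

# reorder the date parts with backreferences
print(re.sub(r"(\d{4})-(\d{2})-(\d{2})", r"\3/\2/\1", log))

# anchored match of a whole token
print(bool(re.fullmatch(r"[A-Z]+", "ERROR")))        # True
```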

  9. Regular simplex refinement by regular simplices

    NARCIS (Netherlands)

    Casado, L.G.; Tóth, B.G.; Hendrix, E.M.T.; García, I.

    2014-01-01

    A natural way to define branching in Branch-and-Bound for the blending problem is to do bisection. The disadvantage of bisectioning is that partition sets are in general irregular. A regular simplex with fixed orientation can be determined by its center and size, allowing storage savings in a Branch-and-

  10. An IBM 370 assembly language program verifier

    Science.gov (United States)

    Maurer, W. D.

    1977-01-01

    The paper describes a program written in SNOBOL which verifies the correctness of programs written in assembly language for the IBM 360 and 370 series of computers. The motivation for using assembly language as a source language for a program verifier was the realization that many errors in programs are caused by misunderstanding or ignorance of the characteristics of specific computers. The proof of correctness of a program written in assembly language must take these characteristics into account. The program has been compiled and is currently running at the Center for Academic and Administrative Computing of The George Washington University.

  11. Regular expressions cookbook

    CERN Document Server

    Goyvaerts, Jan

    2009-01-01

    This cookbook provides more than 100 recipes to help you crunch data and manipulate text with regular expressions. Every programmer can find uses for regular expressions, but their power doesn't come worry-free. Even seasoned users often suffer from poor performance, false positives, false negatives, or perplexing bugs. Regular Expressions Cookbook offers step-by-step instructions for some of the most common tasks involving this tool, with recipes for C#, Java, JavaScript, Perl, PHP, Python, Ruby, and VB.NET. With this book, you will: Understand the basics of regular expressions through a

  12. Regular expression containment

    DEFF Research Database (Denmark)

    Henglein, Fritz; Nielsen, Lasse

    2011-01-01

    * for Kleene-star, and a general coinduction rule as the only additional rule. Our axiomatization gives rise to a natural computational interpretation of regular expressions as simple types that represent parse trees, and of containment proofs as coercions. This gives the axiomatization a Curry...... to be undecidable. We discuss application of regular expressions as types to bit coding of strings and hint at other applications to the wide-spread use of regular expressions for substring matching, where classical automata-theoretic techniques are a priori inapplicable. Neither regular expressions as types nor...
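
    As a counterpoint to the proof-theoretic treatment in the record, containment L(r1) ⊆ L(r2) can be probed, though never proved, by brute force: enumerate every string over a small alphabet up to a length bound and compare memberships. The sketch below is our own naive illustration, not the paper's coinductive method.

```python
import re
from itertools import product

def contained_up_to(r1, r2, alphabet="ab", max_len=6):
    """Bounded check of L(r1) <= L(r2); can refute but never prove containment."""
    p1, p2 = re.compile(r1), re.compile(r2)
    for n in range(max_len + 1):
        for chars in product(alphabet, repeat=n):
            s = "".join(chars)
            if p1.fullmatch(s) and not p2.fullmatch(s):
                return False, s          # counterexample found
    return True, None                    # no counterexample up to the bound

print(contained_up_to(r"(ab)*", r"(a|b)*"))   # (True, None)
print(contained_up_to(r"(a|b)*", r"(ab)*"))   # (False, 'a')
```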

  13. Firms Verify Online IDs Via Schools

    Science.gov (United States)

    Davis, Michelle R.

    2008-01-01

    Companies selling services to protect children and teenagers from sexual predators on the Internet have enlisted the help of schools and teachers to verify students' personal information. Those companies are also sharing some of the information with Web sites, which can pass it along to businesses for use in targeting advertising to young…

  14. Unary self-verifying symmetric difference automata

    CSIR Research Space (South Africa)

    Marais, Laurette

    2016-07-01

    We investigate self-verifying nondeterministic finite automata, in the case of unary symmetric difference nondeterministic finite automata (SV-XNFA). We show that there is a family of languages Ln≥2 which can always be represented non...

  15. A Framework for Multi-Robot Motion Planning from Temporal Logic Specifications

    DEFF Research Database (Denmark)

    Koo, T. John; Li, Rongqing; Quottrup, Michael Melholt

    2012-01-01

    We propose a framework for the coordination of a network of robots with respect to formal requirement specifications expressed in temporal logics. A regular tessellation is used to partition the space of interest into a union of disjoint regular and equal cells with finite facets, and each cell can......-time Temporal Logic, Computation Tree Logic, and μ-calculus can be preserved. Motion planning can then be performed at a discrete level by considering the parallel composition of discrete abstractions of the robots with a requirement specification given in a suitable temporal logic. The bisimilarity ensures...... that the discrete planning solutions are executable by the robots. For demonstration purposes, a finite automaton is used as the abstraction and the requirement specification is expressed in Computation Tree Logic. The model checker Cadence SMV is used to generate coordinated verified motion planning solutions. Two...
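
    A toy version of the discrete layer of such a framework, under our own simplifying assumptions: square cells, facet adjacency, a single robot, and a reach-avoid query answered by breadth-first search. The paper itself discharges full temporal-logic specifications with the Cadence SMV model checker; BFS only handles the "eventually reach the goal while avoiding obstacles" fragment.

```python
from collections import deque

W, H = 5, 4                                 # tessellation: W x H square cells
obstacles = {(2, 1), (2, 2)}
start, goal = (0, 0), (4, 3)

def neighbours(c):
    """Facet-adjacent, obstacle-free cells of cell c."""
    x, y = c
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        n = (x + dx, y + dy)
        if 0 <= n[0] < W and 0 <= n[1] < H and n not in obstacles:
            yield n

def plan(start, goal):
    parent, frontier = {start: None}, deque([start])
    while frontier:
        c = frontier.popleft()
        if c == goal:                       # reconstruct the cell sequence
            path = []
            while c is not None:
                path.append(c); c = parent[c]
            return path[::-1]
        for n in neighbours(c):
            if n not in parent:
                parent[n] = c; frontier.append(n)
    return None                             # requirement unsatisfiable

print(plan(start, goal))
```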

  16. Logical empiricism and the principle of verifiability

    Directory of Open Access Journals (Sweden)

    Zečević Svetlana D.

    2014-01-01

    This paper represents an encounter and dialogue between the philosophy of language and analytic philosophy. The main aim is to present logical empiricism in the milieu of its creation, the Vienna Circle. We present the significant positions of well-known members of the Vienna Circle, not only on logical empiricism as a theory and movement, but also on its close link with the principle of verifiability; to this end we list and explain the variants of the definition of verificationism as the basis of logical empiricism. Within the Vienna Circle we distinguish several streams with respect to the definition of the principle of verification. One of them proposed formulating this principle as a theory of meaning that requires complete verification; the second leaned toward formulating it as a criterion for determining meaning, putting its focus on incomplete verification. With both positions in mind, we can distinguish between criteria of adequacy and criteria of utility of the principle of verifiability. This implies that it is vital to determine the necessary conditions of adequacy of the principle of verifiability, which are primarily reflected in the preservation of empiricism, where the main terms used in the formulation must be clear, non-ambiguous and operational.

  17. Regularization by External Variables

    DEFF Research Database (Denmark)

    Bossolini, Elena; Edwards, R.; Glendinning, P. A.

    2016-01-01

    Regularization was a big topic at the 2016 CRM Intensive Research Program on Advances in Nonsmooth Dynamics. There are many open questions concerning well known kinds of regularization (e.g., by smoothing or hysteresis). Here, we propose a framework for an alternative and important kind of regula...... of regularization, by external variables that shadow either the state or the switch of the original system. The shadow systems are derived from and inspired by various applications in electronic control, predator-prey preference, time delay, and genetic regulation....

  18. Detection of Oil Pollution Hotspots and Leak Sources Through the Quantitative Assessment of the Persistence and Temporal Repetition of Regular Oil Spills in the Caspian Sea Using Remote Sensing and GIS

    OpenAIRE

    Bayramov, E. R.; Buchroithner, M. F.; Bayramov, R. V.

    2015-01-01

    The main goal of this research was to detect oil spills, to determine the oil spill frequencies and to approximate oil leak sources around the Oil Rocks Settlement, the Chilov and Pirallahi Islands in the Caspian Sea using 136 multi-temporal ENVISAT Advanced Synthetic Aperture Radar Wide Swath Medium Resolution Images acquired during 2006-2010. The following oil spill frequencies were observed around the Oil Rocks Settlement, the Chilov and Pirallahi Islands: 2-10 (3471.04 sq. km....

  19. Regularized maximum correntropy machine

    KAUST Repository

    Wang, Jim Jing-Yan

    2015-02-12

    In this paper we investigate the usage of the regularized correntropy framework for learning classifiers from noisy labels. The class label predictors learned by minimizing traditional loss functions are sensitive to noisy and outlying labels of training samples, because traditional loss functions are applied equally to all the samples. To solve this problem, we propose to learn the class label predictors by maximizing the correntropy between the predicted labels and the true labels of the training samples, under the regularized Maximum Correntropy Criterion (MCC) framework. Moreover, we regularize the predictor parameter to control the complexity of the predictor. The learning problem is formulated by an objective function considering the parameter regularization and MCC simultaneously. By optimizing the objective function alternately, we develop a novel predictor learning algorithm. The experiments on two challenging pattern classification tasks show that it significantly outperforms machines with traditional loss functions.
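
    The role of the Gaussian correntropy in down-weighting outlying labels can be sketched for a linear predictor. The data, kernel width, and plain gradient ascent below are our own simplifications; the paper's actual alternating algorithm is not reproduced.

```python
import numpy as np

# Linear predictor trained by gradient ascent on Gaussian correntropy with an
# L2 penalty; samples with outlying labels get exponentially small weight.
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 5)); w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=100)
y[:10] += 8.0                                # grossly mislabeled samples

sigma, lam, w = 2.0, 0.1, np.zeros(5)
for _ in range(500):
    e = y - X @ w
    k = np.exp(-e**2 / (2 * sigma**2))       # correntropy kernel weights
    grad = (k * e / sigma**2) @ X - 2 * lam * w
    w += 0.1 * grad / len(y)

cos = w @ w_true / (np.linalg.norm(w) * np.linalg.norm(w_true))
print(f"cosine similarity to true weights: {cos:.3f}")
```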

  20. Verifying bound entanglement of dephased Werner states

    Science.gov (United States)

    Thomas, P.; Bohmann, M.; Vogel, W.

    2017-10-01

    The verification of quantum entanglement under the influence of realistic noise and decoherence is crucial for the development of quantum technologies. Unfortunately, a full entanglement characterization is generally not possible with most entanglement criteria such as entanglement witnesses or the partial transposition criterion. In particular, so-called bound entanglement cannot be certified via the partial transposition criterion. Here we present the full entanglement verification of dephased qubit and qutrit Werner states via entanglement quasiprobabilities. Remarkably, we are able to reveal bound entanglement for noisy mixed states in the qutrit case. This example demonstrates the strength of the entanglement quasiprobabilities for verifying the full entanglement of quantum states suffering from noise.

  1. Verified Subtyping with Traits and Mixins

    Directory of Open Access Journals (Sweden)

    Asankhaya Sharma

    2014-07-01

    Traits allow decomposing programs into smaller parts, and mixins are a form of composition that resembles multiple inheritance. Unfortunately, in the presence of traits, programming languages like Scala give up on the subtyping relation between objects. In this paper, we present a method to check subtyping between objects based on entailment in separation logic. We implement our method as a domain-specific language in Scala and apply it to the Scala standard library. We have verified that 67% of mixins used in the Scala standard library do indeed conform to subtyping between the traits that are used to build them.

  2. Phenotype in 18 Danish subjects with genetically verified CHARGE syndrome

    DEFF Research Database (Denmark)

    Husu, E; Hove, Hd; Farholt, Stense

    2012-01-01

    Husu E, Hove HD, Farholt S, Bille M, Tranebjaerg L, Vogel I, Kreiborg S. Phenotype in 18 Danish subjects with genetically verified CHARGE syndrome. CHARGE (coloboma of the eye, heart defects, choanal atresia, retarded growth and development, genital hypoplasia and ear anomalies and/or hearing loss......) syndrome is a rare genetic, multiple-malformation syndrome. About 80% of patients with a clinical diagnosis have a mutation or a deletion in the gene encoding chromodomain helicase DNA-binding protein 7 (CHD7). The genotype-phenotype correlation is only partly known. In this nationwide study, phenotypic...... problems (12/15) were other frequent cranial nerve dysfunctions. Three-dimensional reconstructions of MRI scans showed temporal bone abnormalities in >85%. CHARGE syndrome presents a broad phenotypic spectrum, although some clinical features occur more frequently than others. Here, we suggest...

  3. Verifying disarmament: scientific, technological and political challenges

    Energy Technology Data Exchange (ETDEWEB)

    Pilat, Joseph R [Los Alamos National Laboratory

    2011-01-25

    There is growing interest in, and hopes for, nuclear disarmament in governments and nongovernmental organizations (NGOs) around the world. If a nuclear-weapon-free world is to be achievable, verification and compliance will be critical. Verifying disarmament would pose unprecedented scientific, technological and political challenges. Verification would have to address warheads, components, materials, testing, facilities, delivery capabilities, virtual capabilities from existing or shutdown nuclear weapon and existing nuclear energy programs, and material and weapon production and related capabilities. Moreover, it would likely have far more stringent requirements. The verification of dismantlement or elimination of nuclear warheads and components is widely recognized as the most pressing problem. There has been considerable research and development done in the United States and elsewhere on warhead and dismantlement transparency and verification since the early 1990s. However, we do not today know how to verify low numbers or zero. We need to develop the verification tools and systems approaches that would allow us to meet this complex set of challenges. There is a real opportunity to explore verification options and, given any realistic time frame for disarmament, there is considerable scope to invest resources at the national and international levels to undertake research, development and demonstrations in an effort to address the anticipated and perhaps unanticipated verification challenges of disarmament now and for the next decades. Cooperative approaches have the greatest possibility for success.

  4. Detection of Oil Pollution Hotspots and Leak Sources Through the Quantitative Assessment of the Persistence and Temporal Repetition of Regular Oil Spills in the Caspian Sea Using Remote Sensing and GIS

    Science.gov (United States)

    Bayramov, E. R.; Buchroithner, M. F.; Bayramov, R. V.

    2015-08-01

    The main goal of this research was to detect oil spills, to determine the oil spill frequencies and to approximate oil leak sources around the Oil Rocks Settlement, the Chilov and Pirallahi Islands in the Caspian Sea using 136 multi-temporal ENVISAT Advanced Synthetic Aperture Radar Wide Swath Medium Resolution images acquired during 2006-2010. The following oil spill frequencies were observed around the Oil Rocks Settlement, the Chilov and Pirallahi Islands: 2-10 (3471.04 sq. km.), 11-20 (971.66 sq. km.), 21-50 (692.44 sq. km.), 51-128 (191.38 sq. km.). The most critical oil leak sources, with a frequency range of 41-128, were observed at the Oil Rocks Settlement. An exponential regression analysis between wind speeds and oil slick areas detected from the 136 multi-temporal ENVISAT images revealed a regression coefficient of 63%. The regression model showed that larger oil spill areas were observed with decreasing wind speeds. The spatiotemporal patterns of currents in the Caspian Sea explained the multi-directional spatial distribution of oil spills around the Oil Rocks Settlement, the Chilov and Pirallahi Islands. A linear regression analysis between the detected oil spill frequencies and the oil contamination probability predicted by the stochastic model showed a positive trend with a regression coefficient of 30%.
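
    An exponential regression of the kind reported here (slick area shrinking as wind speed grows) is commonly fitted by ordinary least squares on log-transformed areas, i.e., area = a * exp(b * wind). The sketch below uses synthetic numbers, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)
wind = rng.uniform(1, 12, size=60)                    # wind speed, m/s
area = 50 * np.exp(-0.25 * wind) * rng.lognormal(0, 0.3, size=60)  # sq. km

# log(area) = log(a) + b * wind is linear, so ordinary least squares applies
b, log_a = np.polyfit(wind, np.log(area), 1)
pred = log_a + b * wind
ss_res = np.sum((np.log(area) - pred) ** 2)
ss_tot = np.sum((np.log(area) - np.log(area).mean()) ** 2)
print(f"a = {np.exp(log_a):.1f}, b = {b:.3f}, R^2 = {1 - ss_res / ss_tot:.2f}")
```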

  5. (2 + 1)-dimensional regular black holes with nonlinear electrodynamics sources

    Science.gov (United States)

    He, Yun; Ma, Meng-Sen

    2017-11-01

    On the basis of two requirements: the avoidance of the curvature singularity and the Maxwell theory as the weak field limit of the nonlinear electrodynamics, we find two restricted conditions on the metric function of (2 + 1)-dimensional regular black hole in general relativity coupled with nonlinear electrodynamics sources. By the use of the two conditions, we obtain a general approach to construct (2 + 1)-dimensional regular black holes. In this manner, we construct four (2 + 1)-dimensional regular black holes as examples. We also study the thermodynamic properties of the regular black holes and verify the first law of black hole thermodynamics.

  6. Verifying Deadlock-Freedom of Communication Fabrics

    Science.gov (United States)

    Gotmanov, Alexander; Chatterjee, Satrajit; Kishinevsky, Michael

    Avoiding message dependent deadlocks in communication fabrics is critical for modern microarchitectures. If discovered late in the design cycle, deadlocks lead to missed project deadlines and suboptimal design decisions. One approach to avoid this problem is to get high level of confidence on an early microarchitectural model. However, formal proofs of liveness even on abstract models are hard due to large number of queues and distributed control. In this work we address liveness verification of communication fabrics described in the form of high-level microarchitectural models which use a small set of well-defined primitives. We prove that under certain realistic restrictions, deadlock freedom can be reduced to unsatisfiability of a system of Boolean equations. Using this approach, we have automatically verified liveness of several non-trivial models (derived from industrial microarchitectures), where state-of-the-art model checkers failed and pen and paper proofs were either tedious or unknown.
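
    A toy version of that reduction, entirely our own miniature rather than the paper's primitive set: for queues that drain only downstream, write one Boolean equation per queue (a queue is stuck exactly when its downstream is stuck; a queue draining to the environment is never stuck) and check whether "all equations hold and some queue is blocked" is satisfiable.

```python
from itertools import product

# Queues 0..n-1; queue i can only drain into queue i+1 (mod n for a ring).
# Deadlock-free iff "equations hold and some queue is blocked" is
# unsatisfiable, checked here by brute-force enumeration.
def deadlock_free(n, ring):
    for bits in product([False, True], repeat=n):
        edges = n if ring else n - 1
        ok = all(bits[i] == bits[(i + 1) % n] for i in range(edges))
        if not ring:
            ok = ok and not bits[-1]          # last queue drains externally
        if ok and any(bits):
            return False                      # satisfiable: a deadlock scenario
    return True

print(deadlock_free(4, ring=False))   # True: the acyclic chain cannot deadlock
print(deadlock_free(4, ring=True))    # False: the all-blocked cycle satisfies it
```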

  7. Group-Interest-Based Verifiable CCN

    Directory of Open Access Journals (Sweden)

    DaeYoub Kim

    2016-01-01

    To solve various problems of the Internet, content centric networking (CCN), one of the information-centric networking (ICN) architectures, provides both an in-network content caching scheme and a built-in content verification scheme. However, a user is still asked to generate many request messages when retrieving fragmented content through CCN. This model can seriously increase the amount of network traffic. Furthermore, when receiving content, a user is asked to verify the received content before using it. This verification process can cause a serious service delay. To improve such inefficiencies, this paper proposes a transmission process to handle request messages at one time. Also, it suggests an efficient content verification method using both hash chains and a Merkle hash tree.
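
    A small sketch of the Merkle-hash-tree side of such a scheme, assuming SHA-256 and a duplicate-last-node layout (both our choices, not the paper's): the publisher needs to sign only the root, and each content fragment is verified against it with a logarithmic authentication path.

```python
import hashlib

H = lambda b: hashlib.sha256(b).digest()

def merkle_root(leaves):
    level = [H(x) for x in leaves]
    while len(level) > 1:
        if len(level) % 2: level.append(level[-1])      # duplicate odd tail
        level = [H(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def auth_path(leaves, idx):
    """Sibling hashes from leaf idx up to the root."""
    level, path = [H(x) for x in leaves], []
    while len(level) > 1:
        if len(level) % 2: level.append(level[-1])
        sib = idx ^ 1
        path.append((level[sib], sib < idx))            # (sibling, is-left?)
        level = [H(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        idx //= 2
    return path

def verify(fragment, path, root):
    h = H(fragment)
    for sib, sib_is_left in path:
        h = H(sib + h) if sib_is_left else H(h + sib)
    return h == root

frags = [b"frag0", b"frag1", b"frag2", b"frag3"]
root = merkle_root(frags)
print(verify(frags[2], auth_path(frags, 2), root))      # True
print(verify(b"tampered", auth_path(frags, 2), root))   # False
```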

  8. What proof do we prefer? Variants of verifiability in voting

    NARCIS (Netherlands)

    Pieters, Wolter

    2006-01-01

    In this paper, we discuss one particular feature of Internet voting, verifiability, against the background of scientific literature and experiments in the Netherlands. In order to conceptually clarify what verifiability is about, we distinguish classical verifiability from constructive verifiability

  9. Annotation of Regular Polysemy

    DEFF Research Database (Denmark)

    Martinez Alonso, Hector

    Regular polysemy has received a lot of attention from the theory of lexical semantics and from computational linguistics. However, there is no consensus on how to represent the sense of underspecified examples at the token level, namely when annotating or disambiguating senses of metonymic words...

  10. A Verified Algebra for Linked Data

    Directory of Open Access Journals (Sweden)

    Ross Horne

    2011-07-01

    A foundation is investigated for the application of loosely structured data on the Web. This area is often referred to as Linked Data, due to the use of URIs in data to establish links. This work focuses on emerging W3C standards which specify query languages for Linked Data. The approach is to provide an abstract syntax to capture Linked Data structures and queries, which are then internalised in a process calculus. An operational semantics for the calculus specifies how queries, data and processes interact. A labelled transition system is shown to be sound with respect to the operational semantics. Bisimulation over the labelled transition system is used to verify an algebra over queries. The derived algebra is a contribution to the application domain. For instance, the algebra may be used to rewrite a query to optimise its distribution across a cluster of servers. The framework used to provide the operational semantics is powerful enough to model related calculi for the Web.

  11. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

    Full Text Available Today, it is important for software companies to build software systems in a short time-interval, to reduce costs and to have a good market position. Therefore well organized and systematic development approaches are required. Reusing software components, which are well tested, can be a good solution to develop software applications in effective manner. The reuse of software components is less expensive and less time consuming than a development from scratch. But it is dangerous to think that software components can be match together without any problems. Software components itself are well tested, of course, but even if they composed together problems occur. Most problems are based on interaction respectively communication. Avoiding such errors a framework has to be developed for analysing software components. That framework determines the compatibility of corresponding software components. The promising approach discussed here, presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT. A supportive environment will be designed that checks the compatibility of black-box software components. This article is concerned to the question how can be coupled software components verified by using an analyzer framework and determines the usage of the ASLT. Black-box Software Components and Abstract Syntax Language Tree are the basis for developing the proposed framework and are discussed here to provide the background knowledge. The practical implementation of this framework is discussed and shows the result by using a test environment.

  12. Sparse structure regularized ranking

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-04-17

    Learning ranking scores is critical for the multimedia database retrieval problem. In this paper, we propose a novel ranking score learning algorithm by exploring the sparse structure and using it to regularize ranking scores. To explore the sparse structure, we assume that each multimedia object could be represented as a sparse linear combination of all other objects, and combination coefficients are regarded as a similarity measure between objects and used to regularize their ranking scores. Moreover, we propose to learn the sparse combination coefficients and the ranking scores simultaneously. A unified objective function is constructed with regard to both the combination coefficients and the ranking scores, and is optimized by an iterative algorithm. Experiments on two multimedia database retrieval data sets demonstrate the significant improvements of the proposed algorithm over state-of-the-art ranking score learning algorithms.
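
    A rough sketch of the alternation the abstract describes, with invented sizes, step lengths, and schedule: a few ISTA iterations approximate the sparse self-representation, whose coefficients then act as a graph regularizer on the ranking scores. The paper's actual joint objective is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(10, 30))                # 30 objects, 10 features
n = X.shape[1]
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0)

# sparse self-representation: X[:, i] ~ X @ W[:, i], with diag(W) = 0
W = np.zeros((n, n))
eta, alpha = 0.01, 0.1
for _ in range(200):                         # ISTA iterations for the lasso
    G = X.T @ (X @ W - X)                    # gradient of the fit term
    W = soft(W - eta * G, eta * alpha)
    np.fill_diagonal(W, 0.0)

S = (np.abs(W) + np.abs(W.T)) / 2            # symmetrized similarity
q = np.zeros(n); q[0] = 1.0                  # query seed: object 0 is relevant
L = np.diag(S.sum(axis=1)) - S               # graph Laplacian of the similarity
f = q.copy()
for _ in range(500):                         # min ||f - q||^2 + gamma * f' L f
    f -= 0.05 * (2 * (f - q) + 2 * 0.5 * (L @ f))
print("top-ranked objects:", np.argsort(-f)[:5])
```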

  13. Seismic methods for verifying nuclear test bans

    Science.gov (United States)

    Sykes, Lynn R.; Evernden, Jack F.; Cifuentes, Inés

    1983-10-01

    Seismological research of the past 25 years related to verification of a comprehensive test ban treaty (CTBT) indicates that a treaty banning nuclear weapons tests in all environments, including underground explosions, can be monitored with high reliability down to explosions of very small size (about one kiloton). There would be a high probability of successful identification of explosions of that size even if elaborate measures were taken to evade detection. Seismology provides the principal means of detecting, locating and identifying underground explosions and of determining their yields. We discuss a number of methods for identifying detected seismic events as being either explosions or earthquakes, including the event's location, depth and spectral character. The seismic waves generated by these two types of sources differ in a number of fundamental ways that can be utilized for identification or discrimination. All of the long-standing issues related to a comprehensive treaty were resolved in principle (and in many cases in detail) in negotiations between the U.S., the U.S.S.R. and Britain from 1977 to 1980. Those negotiations have not resumed since 1980. Inadequate seismic means of verifying a CTBT, Soviet cheating on the 150-kt limit of the Threshold Test Ban Treaty of 1976, and the need to develop and test new nuclear weapons were cited in 1982 by the U.S. government as reasons for not continuing negotiations for a CTBT. The first two reservations, which depend heavily on seismological information, are not supported scientifically. A CTBT could help to put a lid on the seemingly endless testing of new generations of nuclear weapons by both superpowers.

  14. Accelerating regular polygon beams.

    Science.gov (United States)

    Barwick, Shane

    2010-12-15

    Beams that possess high-intensity peaks that follow curved paths of propagation under linear diffraction have recently been shown to have a multitude of interesting uses. In this Letter, a family of phase-only masks is derived, and each mask gives rise to multiple accelerating intensity maxima. The curved paths of the peaks can be described by the vertices of a regular polygon that is centered on the optic axis and expands with propagation.

  15. Regular Single Valued Neutrosophic Hypergraphs

    Directory of Open Access Journals (Sweden)

    Muhammad Aslam Malik

    2016-12-01

    In this paper, we define the regular and totally regular single valued neutrosophic hypergraphs, and discuss the order and size along with properties of regular and totally regular single valued neutrosophic hypergraphs. We also extend work on completeness of single valued neutrosophic hypergraphs.

  16. The geometry of continuum regularization

    Energy Technology Data Exchange (ETDEWEB)

    Halpern, M.B.

    1987-03-01

    This lecture is primarily an introduction to coordinate-invariant regularization, a recent advance in the continuum regularization program. In this context, the program is seen as fundamentally geometric, with all regularization contained in regularized DeWitt superstructures on field deformations.

  17. Modular Regularization Algorithms

    DEFF Research Database (Denmark)

    Jacobsen, Michael

    2004-01-01

    an iterative method. The parameter choice method is also used to demonstrate the implementation of the standard-form transformation. We have implemented a simple preconditioner aimed at the preconditioning of the general-form Tikhonov problem and demonstrate its simplicity and efficiency. The steps taken...... and used to set up the ill-posed problems in the toolbox. Hereby, we are able to write regularization algorithms that automatically exploit structure in the ill-posed problem without being rewritten explicitly. We explain how to implement a stopping criterion for a parameter choice method based upon...
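
    For readers unfamiliar with the underlying problem class, a minimal standard-form Tikhonov solve, min ||Ax - b||^2 + lambda^2 ||x||^2 via the SVD, illustrates what such toolbox algorithms compute. The toy smoothing operator and parameter values below are invented, and the toolbox's own object-oriented interface is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 40
# a mildly ill-posed toy operator: a Gaussian smoothing kernel
A = np.array([[np.exp(-0.1 * (i - j) ** 2) for j in range(n)] for i in range(n)])
x_true = np.sin(np.linspace(0, 3 * np.pi, n))
b = A @ x_true + 1e-3 * rng.normal(size=n)

U, s, Vt = np.linalg.svd(A)
def tikhonov(lam):
    # spectral filtering: x = sum_i s_i / (s_i^2 + lam^2) * (u_i . b) * v_i
    # damps the noise-dominated small-singular-value modes
    return Vt.T @ ((s / (s**2 + lam**2)) * (U.T @ b))

for lam in (1e-8, 1e-3, 1e-1):
    err = np.linalg.norm(tikhonov(lam) - x_true) / np.linalg.norm(x_true)
    print(f"lambda = {lam:.0e}  relative error = {err:.3f}")
```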

  18. USCIS E-Verify Customer Satisfaction Survey, January 2013

    Data.gov (United States)

    Department of Homeland Security — This report focuses on the customer satisfaction of companies currently enrolled in the E-Verify program. Satisfaction with E-Verify remains high and follows up a...

  19. Incentivizing Verifiable Privacy-Protection Mechanisms for Offline Crowdsensing Applications

    National Research Council Canada - National Science Library

    Jiajun Sun; Ningzhong Liu

    2017-01-01

    .... In this paper, we investigate offline verifiable privacy-protection crowdsensing issues. We firstly present a general verifiable privacy-protection incentive mechanism for the offline homogeneous and heterogeneous sensing job model...

  20. 28 CFR 802.13 - Verifying your identity.

    Science.gov (United States)

    2010-07-01

    28 CFR 802.13 (2010-07-01): Verifying your identity. Judicial Administration; District of Columbia; Disclosure of Records; Privacy Act. § 802.13 Verifying your identity. (a) Requests for your own records. When you make a request for access to records about yourself, you must verify your identity. You...

  1. Single Cell “Glucose Nanosensor” Verifies Elevated Glucose Levels in Individual Cancer Cells

    OpenAIRE

    Nascimento, Raphael A. S.; Özel, Rıfat Emrah; Mak, Wai Han; Mulato, Marcelo; Singaram, Bakthan; Pourmand, Nader

    2016-01-01

    Because the transition from oxidative phosphorylation to anaerobic glycolytic metabolism is a hallmark of cancer progression, approaches to identify single living cancer cells by their unique glucose metabolic signature would be useful. Here, we present nanopipettes specifically developed to measure glucose levels in single cells with temporal and spatial resolution, and we use this technology to verify the hypothesis that individual cancer cells can indeed display higher intracellular glucos...

  2. Adaptive Regularization of Neural Classifiers

    DEFF Research Database (Denmark)

    Andersen, Lars Nonboe; Larsen, Jan; Hansen, Lars Kai

    1997-01-01

    We present a regularization scheme which iteratively adapts the regularization parameters by minimizing the validation error. It is suggested to use the adaptive regularization scheme in conjunction with optimal brain damage pruning to optimize the architecture and to avoid overfitting. Furthermore......, we propose an improved neural classification architecture eliminating an inherent redundancy in the widely used SoftMax classification network. Numerical results demonstrate the viability of the method...

  3. Identity-Based Verifiably Encrypted Signatures without Random Oracles

    Science.gov (United States)

    Zhang, Lei; Wu, Qianhong; Qin, Bo

    Fair exchange protocol plays an important role in electronic commerce in the case of exchanging digital contracts. Verifiably encrypted signatures provide an optimistic solution to these scenarios with an off-line trusted third party. In this paper, we propose an identity-based verifiably encrypted signature scheme. The scheme is non-interactive to generate verifiably encrypted signatures and the resulting encrypted signature consists of only four group elements. Based on the computational Diffie-Hellman assumption, our scheme is proven secure without using random oracles. To the best of our knowledge, this is the first identity-based verifiably encrypted signature scheme provably secure in the standard model.

  4. Existence of connected regular and nearly regular graphs

    OpenAIRE

    Ganesan, Ghurumuruhan

    2018-01-01

    For integers $k \geq 2$ and $n \geq k+1$, we prove the following: If $n \cdot k$ is even, there is a connected $k$-regular graph on $n$ vertices. If $n \cdot k$ is odd, there is a connected nearly $k$-regular graph on $n$ vertices.
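
    A hands-on witness for the first claim is a circulant (Harary-type) construction: connect each vertex to its k//2 nearest neighbours on each side, plus its antipode when k is odd (which forces n to be even). Whether this matches the paper's own construction is not claimed; the sketch below just demonstrates existence.

```python
def circulant_k_regular(n, k):
    """Connected k-regular graph on vertices 0..n-1, requires n*k even."""
    assert k >= 2 and n >= k + 1 and (n * k) % 2 == 0
    adj = {v: set() for v in range(n)}
    for v in range(n):
        for d in range(1, k // 2 + 1):            # k//2 neighbours on each side
            adj[v].add((v + d) % n); adj[v].add((v - d) % n)
        if k % 2:                                  # antipodal edge for odd k
            adj[v].add((v + n // 2) % n)
    return adj

g = circulant_k_regular(8, 3)
print(all(len(nb) == 3 for nb in g.values()))      # 3-regular: True

# connectivity check by depth-first search
seen, stack = set(), [0]
while stack:
    v = stack.pop()
    if v not in seen:
        seen.add(v); stack.extend(g[v])
print(len(seen) == 8)                              # connected: True
```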

  5. A criterion for regular sequences

    Indian Academy of Sciences (India)

    Patil, D P; Storch, U; Stückrad, J

    ... For general notations in commutative algebra we also refer to [1]. ... Note that every sequence is a strongly regular as well as regular sequence on the zero module. Further, it is clear that a ...

  6. Online co-regularized algorithms

    NARCIS (Netherlands)

    Ruijter, T. de; Tsivtsivadze, E.; Heskes, T.

    2012-01-01

    We propose an online co-regularized learning algorithm for classification and regression tasks. We demonstrate that by sequentially co-regularizing prediction functions on unlabeled data points, our algorithm provides improved performance in comparison to supervised methods on several UCI benchmarks

  7. A Mollification Regularization Method for a Fractional-Diffusion Inverse Heat Conduction Problem

    Directory of Open Access Journals (Sweden)

    Zhi-Liang Deng

    2013-01-01

    An inverse heat conduction problem in which the governing linear diffusion equation is of fractional type is discussed. A simple regularization method based on Dirichlet kernel mollification techniques is introduced. We also propose a priori and a posteriori parameter choice rules and obtain the corresponding error estimate between the exact solution and its regularized approximation. Moreover, a numerical example is provided to verify our theoretical results.
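
    The mollification step can be sketched in one dimension: for periodic data, convolving with the Dirichlet kernel amounts to truncating the Fourier series, which is easy to do with the FFT. The frequency cutoff plays the role of the regularization parameter; the signal and noise level below are invented, and the fractional diffusion problem itself is omitted.

```python
import numpy as np

rng = np.random.default_rng(6)
m = 256
x = np.linspace(0, 2 * np.pi, m, endpoint=False)
clean = np.sin(x) + 0.5 * np.sin(3 * x)
noisy = clean + 0.2 * rng.normal(size=m)

def mollify(u, n_modes):
    """Dirichlet-kernel mollification = keep Fourier modes |k| <= n_modes."""
    U = np.fft.fft(u)
    U[n_modes + 1 : m - n_modes] = 0.0
    return np.fft.ifft(U).real

for n_modes in (2, 8, 64):
    err = np.linalg.norm(mollify(noisy, n_modes) - clean) / np.linalg.norm(clean)
    print(f"cutoff {n_modes:3d}: relative error {err:.3f}")
```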

  8. National, Regional and Global Certification Bodies for Polio Eradication: A Framework for Verifying Measles Elimination.

    Science.gov (United States)

    Deblina Datta, S; Tangermann, Rudolf H; Reef, Susan; William Schluter, W; Adams, Anthony

    2017-07-01

    The Global Certification Commission (GCC), Regional Certification Commissions (RCCs), and National Certification Committees (NCCs) provide a framework of independent bodies to assist the Global Polio Eradication Initiative (GPEI) in certifying and maintaining polio eradication in a standardized, ongoing, and credible manner. Their members meet regularly to comprehensively review population immunity, surveillance, laboratory, and other data to assess polio status in the country (NCC), World Health Organization (WHO) region (RCC), or globally (GCC). These highly visible bodies provide a framework to be replicated to independently verify measles and rubella elimination in the regions and globally. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.

  9. Natural selection and mechanistic regularity.

    Science.gov (United States)

    DesAutels, Lane

    2016-06-01

    In this article, I address the question of whether natural selection operates regularly enough to qualify as a mechanism of the sort characterized by Machamer, Darden, and Craver (2000). Contrary to an influential critique by Skipper and Millstein (2005), I argue that natural selection can be seen to be regular enough to qualify as an MDC mechanism, as long as we pay careful attention to some important distinctions regarding mechanistic regularity and abstraction. Specifically, I suggest that when we distinguish between process vs. product regularity, mechanism-internal vs. mechanism-external sources of irregularity, and abstract vs. concrete regularity, we can see that natural selection is only irregular in senses that are unthreatening to its status as an MDC mechanism. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Appraising the value of independent EIA follow-up verifiers

    Energy Technology Data Exchange (ETDEWEB)

    Wessels, Jan-Albert, E-mail: janalbert.wessels@nwu.ac.za [School of Geo and Spatial Sciences, Department of Geography and Environmental Management, North-West University, C/O Hoffman and Borcherd Street, Potchefstroom, 2520 (South Africa); Retief, Francois, E-mail: francois.retief@nwu.ac.za [School of Geo and Spatial Sciences, Department of Geography and Environmental Management, North-West University, C/O Hoffman and Borcherd Street, Potchefstroom, 2520 (South Africa); Morrison-Saunders, Angus, E-mail: A.Morrison-Saunders@murdoch.edu.au [School of Geo and Spatial Sciences, Department of Geography and Environmental Management, North-West University, C/O Hoffman and Borcherd Street, Potchefstroom, 2520 (South Africa); Environmental Assessment, School of Environmental Science, Murdoch University, Australia. (Australia)

    2015-01-15

    Independent Environmental Impact Assessment (EIA) follow-up verifiers such as monitoring agencies, checkers, supervisors and control officers are active on various construction sites across the world. There are, however, differing views on the value that these verifiers add and very limited learning in EIA has been drawn from independent verifiers. This paper aims to appraise how and to what extent independent EIA follow-up verifiers add value in major construction projects in the developing country context of South Africa. A framework for appraising the role of independent verifiers was established and four South African case studies were examined through a mixture of site visits, project document analysis, and interviews. Appraisal results were documented in the performance areas of: planning, doing, checking, acting, public participating and integration with other programs. The results indicate that independent verifiers add most value to major construction projects when involved with screening EIA requirements of new projects, allocation of financial and human resources, checking legal compliance, influencing implementation, reporting conformance results, community and stakeholder engagement, integration with self-responsibility programs such as environmental management systems (EMS), and controlling records. It was apparent that verifiers could be more creatively utilized in pre-construction preparation, providing feedback of knowledge into assessment of new projects, giving input to the planning and design phase of projects, and performance evaluation. The study confirms the benefits of proponent and regulator follow-up, specifically in having independent verifiers that disclose information, facilitate discussion among stakeholders, are adaptable and proactive, aid in the integration of EIA with other programs, and instill trust in EIA enforcement by conformance evaluation. Overall, the study provides insight on how to harness the learning opportunities

  11. Mathematical Modeling the Geometric Regularity in Proteus Mirabilis Colonies

    Science.gov (United States)

    Zhang, Bin; Jiang, Yi; Minsu Kim Collaboration

    Proteus mirabilis colonies exhibit striking spatiotemporal regularity: concentric ring patterns of alternating high and low bacterial density in space, and periodic repetition of the growth-and-swarm process in time. We present a simple mathematical model to explain the spatiotemporal regularity of P. mirabilis colonies. We study a one-dimensional system. Using a reaction-diffusion model with thresholds in cell density and nutrient concentration, we recreated periodic growth and spread patterns, suggesting that nutrient constraint and cell density regulation might be sufficient to explain the spatiotemporal periodicity of P. mirabilis colonies. We further verify this result using a cell-based model.
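
    A minimal sketch of such a threshold-gated update (our own toy discretization with invented constants and periodic boundaries, not the authors' model): growth is switched off once local nutrient drops below a threshold, and diffusion (swarming) is switched on only above a cell-density threshold.

```python
import numpy as np

n, dt = 400, 0.05
u = np.zeros(n); u[:5] = 1.0    # cell density: inoculum at the left edge
s = np.ones(n)                  # nutrient, initially uniform
D, r, c, u_th, s_th = 1.0, 0.5, 0.4, 0.8, 0.05

for _ in range(20000):
    growth = r * u * s * (s > s_th)          # nutrient-gated growth
    motility = np.where(u > u_th, D, 0.0)    # density-gated swarming
    lap = np.roll(u, 1) - 2 * u + np.roll(u, -1)
    u = u + dt * (motility * lap + growth)
    s = np.clip(s - dt * c * growth, 0.0, None)

print("approximate colony front index:", int(np.argmax(u < 0.01)))
```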

  12. Attacks on One Designated Verifier Proxy Signature Scheme

    Directory of Open Access Journals (Sweden)

    Baoyuan Kang

    2012-01-01

    Full Text Available In a designated verifier proxy signature scheme, there are three participants, namely, the original signer, the proxy signer, and the designated verifier. The original signer delegates his or her signing right to the proxy signer, then the proxy signer can generate valid signatures on behalf of the original signer. But only the designated verifier can verify the proxy signature. Several designated verifier proxy signature schemes have been proposed. However, most of them were proven secure in the random oracle model, which has received a lot of criticism since the security proofs in the random oracle model are not sound with respect to the standard model. Recently, by employing Waters' hashing technique, Yu et al. proposed a new construction of designated verifier proxy signature. They claimed that the new construction is the first designated verifier proxy signature whose security does not rely on random oracles. But, in this paper, we will show some attacks on Yu et al.'s scheme. So, their scheme is not secure.

  13. vVote: Verifiable Electronic Voting in Practice

    OpenAIRE

    Burton, C; Culnane, C; Schneider, S.

    2016-01-01

    This paper reports on the experience of deploying the vVote verifiable voting system in the November 2014 State election in Victoria, Australia. It describes the system that was deployed, discusses its end-to-end verifiability, and reports on the voters’ and poll workers’ experience with the system. Blind voters were able to cast a fully secret ballot in a verifiable way, as were voters in remote locations. The feedback finds the system to be acceptably usable with an electronic interface, th...

  14. NOS CO-OPS Water Level Data, Verified, Hourly

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset has verified (quality-controlled), hourly, water level (tide) data from NOAA NOS Center for Operational Oceanographic Products and Services (CO-OPS)....

  15. NOS CO-OPS Water Level Data, Verified, 6-Minute

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset has verified (quality-controlled), 6-minute, water level (tide) data from NOAA NOS Center for Operational Oceanographic Products and Services (CO-OPS)....

  16. NOS CO-OPS Water Level Data, Verified, High Low

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset has verified (quality-controlled), daily, high low water level (tide) data from NOAA NOS Center for Operational Oceanographic Products and Services...

  17. Learn About SmartWay Verified Aerodynamic Devices

    Science.gov (United States)

    Installing EPA-verified aerodynamic technologies on your trailer can help fleet and truck owners save fuel. Options include gap reducers, skirts, or tails and can be installed individually or in combination.

  18. NONCONVEX REGULARIZATION FOR SHAPE PRESERVATION

    Energy Technology Data Exchange (ETDEWEB)

    CHARTRAND, RICK [Los Alamos National Laboratory]

    2007-01-16

    The authors show that using a nonconvex penalty term to regularize image reconstruction can substantially improve the preservation of object shapes. The commonly used total-variation regularization, $\int|\nabla u|$, penalizes the length of the object edges. They show that $\int|\nabla u|^p$, $0 < p < 1$, only penalizes edges of dimension at least $2-p$, and thus finite-length edges not at all. We give numerical examples showing the resulting improvement in shape preservation.
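
    To make the penalty concrete, a toy 1-D denoising sketch: minimize $\frac{1}{2}\|u-f\|^2 + \lambda\sum_i((\nabla u)_i^2+\epsilon)^{p/2}$ with $p<1$ by plain gradient descent (the smoothing $\epsilon$, step size, and scheme are our own choices, not the paper's numerical method):

```python
import numpy as np

rng = np.random.default_rng(2)
n, lam, p, eps, eta = 200, 0.5, 0.5, 1e-2, 0.01

# Piecewise-constant signal (one sharp edge) plus noise.
f = np.where(np.arange(n) < n // 2, 0.0, 1.0) + 0.1 * rng.normal(size=n)
u = f.copy()

for _ in range(5000):
    du = np.diff(u)                             # forward differences
    w = p * (du**2 + eps) ** (p / 2 - 1) * du   # d/d(du) of the penalty
    grad = np.zeros(n)
    grad[:-1] -= w                              # contribution to u_i
    grad[1:] += w                               # contribution to u_{i+1}
    u -= eta * ((u - f) + lam * grad)

print("edge height after denoising:", round(u[n // 2 + 3] - u[n // 2 - 3], 3))
```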

  19. Regularization with a pruning prior

    DEFF Research Database (Denmark)

    Goutte, Cyril; Hansen, Lars Kai

    1997-01-01

    We investigate the use of a regularization prior that we show has pruning properties. Analyses are conducted both using a Bayesian framework and with the generalization method, on a simple toy problem. Results are thoroughly compared with those obtained with a traditional weight decay.

  20. Commuting Π-regular rings

    Directory of Open Access Journals (Sweden)

    Shervin Sahebi

    2014-05-01

    Full Text Available $R$ is called a commuting regular ring (resp. semigroup) if for each $x,y\in R$ there exists $a\in R$ such that $xy=yxayx$. In this paper, we introduce the concept of commuting $\pi$-regular rings (resp. semigroups) and study various properties of them.
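
    For intuition, a brute-force check of the defining identity in the commutative rings Z_n (a toy experiment of ours, not from the paper):

```python
from itertools import product

def is_commuting_regular(n):
    """Z_n version of the identity: for all x, y there is an a with
    x*y == y*x*a*y*x (mod n)."""
    return all(
        any((y * x * a * y * x - x * y) % n == 0 for a in range(n))
        for x, y in product(range(n), repeat=2)
    )

print([n for n in range(2, 20) if is_commuting_regular(n)])
```

    In these commutative cases the identity collapses to $xy = a(xy)^2$, and the printout lists exactly the squarefree moduli.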

  1. Implicit Regularization in Deep Learning

    OpenAIRE

    Neyshabur, Behnam

    2017-01-01

    In an attempt to better understand generalization in deep learning, we study several possible explanations. We show that implicit regularization induced by the optimization method is playing a key role in generalization and success of deep learning models. Motivated by this view, we study how different complexity measures can ensure generalization and explain how optimization algorithms can implicitly regularize complexity measures. We empirically investigate the ability of these measures to ...

  2. A Tutorial on Using Dafny to Construct Verified Software

    Directory of Open Access Journals (Sweden)

    Paqui Lucio

    2017-01-01

    Full Text Available This paper is a tutorial for newcomers to the field of automated verification tools, though we assume the reader to be relatively familiar with Hoare-style verification. In this paper, besides introducing the most basic features of the language and verifier Dafny, we place special emphasis on how to use Dafny as an assistant in the development of verified programs. Our main aim is to encourage the software engineering community to make the move towards using formal verification tools.

  3. TrustGuard: A Containment Architecture with Verified Output

    Science.gov (United States)

    2017-01-01

    ... support structures. Finally, the link consumes a geomean 5.0% of the energy of the untrusted processor. ... TrustGuard: A Containment Architecture with Verified Output. Soumyadeep Ghosh, a dissertation presented to the faculty of Princeton University. ... depends on claims made by each party supplying the system's components. This dissertation presents the Containment Architecture with Verified Output

  4. Regularization of Instantaneous Frequency Attribute Computations

    Science.gov (United States)

    Yedlin, M. J.; Margrave, G. F.; Van Vorst, D. G.; Ben Horin, Y.

    2014-12-01

    We compare two different methods of computation of a temporally local frequency: (1) a stabilized instantaneous frequency using the theory of the analytic signal; (2) a temporally variant centroid (or dominant) frequency estimated from a time-frequency decomposition. The first method derives from Taner et al. (1979) as modified by Fomel (2007) and utilizes the derivative of the instantaneous phase of the analytic signal. The second method computes the power centroid (Cohen, 1995) of the time-frequency spectrum, obtained using either the Gabor or Stockwell transform. Common to both methods is the necessity of division by a diagonal matrix, which requires appropriate regularization. We modify Fomel's (2007) method by explicitly penalizing the roughness of the estimate. Following Farquharson and Oldenburg (2004), we employ both the L-curve and GCV methods to obtain the smoothest model that fits the data in the L2 norm. Using synthetic data, quarry blasts, earthquakes and the DPRK tests, our results suggest that the optimal method depends on the data. One of the main applications for this work is the discrimination between blast events and earthquakes. References: Fomel, Sergey. "Local seismic attributes." Geophysics 72.3 (2007): A29-A33. Cohen, Leon. "Time Frequency Analysis: Theory and Applications." USA: Prentice Hall, 1995. Farquharson, Colin G., and Douglas W. Oldenburg. "A comparison of automatic techniques for estimating the regularization parameter in non-linear inverse problems." Geophysical Journal International 156.3 (2004): 411-425. Taner, M. Turhan, Fulton Koehler, and R. E. Sheriff. "Complex seismic trace analysis." Geophysics 44.6 (1979): 1041-1063.
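
    A bare-bones sketch of the first method (analytic-signal phase derivative via scipy, with a simple moving average standing in for the L-curve/GCV-chosen roughness penalty; the test signal and constants are invented):

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(3)
# Linear chirp: instantaneous frequency 5 + 20 t Hz, plus noise.
x = np.sin(2 * np.pi * (5 * t + 10 * t**2)) + 0.05 * rng.normal(size=t.size)

phase = np.unwrap(np.angle(hilbert(x)))          # analytic-signal phase
f_inst = np.gradient(phase, 1 / fs) / (2 * np.pi)

w = 51                                           # crude regularization
f_smooth = np.convolve(f_inst, np.ones(w) / w, mode="same")
print(f"estimate at t = 1 s: {f_smooth[t.size // 2]:.1f} Hz (true 25.0 Hz)")
```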

  5. Incentivizing Verifiable Privacy-Protection Mechanisms for Offline Crowdsensing Applications

    Science.gov (United States)

    Sun, Jiajun; Liu, Ningzhong

    2017-01-01

    Incentive mechanisms of crowdsensing have recently been intensively explored. Most of these mechanisms mainly focus on the standard economical goals like truthfulness and utility maximization. However, enormous privacy and security challenges need to be faced directly in real-life environments, such as cost privacies. In this paper, we investigate offline verifiable privacy-protection crowdsensing issues. We firstly present a general verifiable privacy-protection incentive mechanism for the offline homogeneous and heterogeneous sensing job model. In addition, we also propose a more complex verifiable privacy-protection incentive mechanism for the offline submodular sensing job model. The two mechanisms not only explore the private protection issues of users and platform, but also ensure the verifiable correctness of payments between platform and users. Finally, we demonstrate that the two mechanisms satisfy privacy-protection, verifiable correctness of payments and the same revenue as the generic one without privacy protection. Our experiments also validate that the two mechanisms are both scalable and efficient, and applicable for mobile devices in crowdsensing applications based on auctions, where the main incentive for the user is the remuneration. PMID:28869574

  6. Incentivizing Verifiable Privacy-Protection Mechanisms for Offline Crowdsensing Applications

    Directory of Open Access Journals (Sweden)

    Jiajun Sun

    2017-09-01

    Full Text Available Incentive mechanisms of crowdsensing have recently been intensively explored. Most of these mechanisms mainly focus on the standard economical goals like truthfulness and utility maximization. However, enormous privacy and security challenges need to be faced directly in real-life environments, such as cost privacies. In this paper, we investigate offline verifiable privacy-protection crowdsensing issues. We firstly present a general verifiable privacy-protection incentive mechanism for the offline homogeneous and heterogeneous sensing job model. In addition, we also propose a more complex verifiable privacy-protection incentive mechanism for the offline submodular sensing job model. The two mechanisms not only explore the private protection issues of users and platform, but also ensure the verifiable correctness of payments between platform and users. Finally, we demonstrate that the two mechanisms satisfy privacy-protection, verifiable correctness of payments and the same revenue as the generic one without privacy protection. Our experiments also validate that the two mechanisms are both scalable and efficient, and applicable for mobile devices in crowdsensing applications based on auctions, where the main incentive for the user is the remuneration.

  7. Incentivizing Verifiable Privacy-Protection Mechanisms for Offline Crowdsensing Applications.

    Science.gov (United States)

    Sun, Jiajun; Liu, Ningzhong

    2017-09-04

    Incentive mechanisms of crowdsensing have recently been intensively explored. Most of these mechanisms mainly focus on the standard economical goals like truthfulness and utility maximization. However, enormous privacy and security challenges need to be faced directly in real-life environments, such as cost privacies. In this paper, we investigate offline verifiable privacy-protection crowdsensing issues. We firstly present a general verifiable privacy-protection incentive mechanism for the offline homogeneous and heterogeneous sensing job model. In addition, we also propose a more complex verifiable privacy-protection incentive mechanism for the offline submodular sensing job model. The two mechanisms not only explore the private protection issues of users and platform, but also ensure the verifiable correctness of payments between platform and users. Finally, we demonstrate that the two mechanisms satisfy privacy-protection, verifiable correctness of payments and the same revenue as the generic one without privacy protection. Our experiments also validate that the two mechanisms are both scalable and efficient, and applicable for mobile devices in crowdsensing applications based on auctions, where the main incentive for the user is the remuneration.

  8. The perception of regularity in an isochronous stimulus in zebra finches (Taeniopygia guttata) and humans

    NARCIS (Netherlands)

    van der Aa, J.; Honing, H.; ten Cate, C.

    2015-01-01

    Perceiving temporal regularity in an auditory stimulus is considered one of the basic features of musicality. Here we examine whether zebra finches can detect regularity in an isochronous stimulus. Using a go/no go paradigm we show that zebra finches are able to distinguish between an isochronous

  9. Toric Geometry of the Regular Convex Polyhedra

    Directory of Open Access Journals (Sweden)

    Fiammetta Battaglia

    2017-01-01

    Full Text Available We describe symplectic and complex toric spaces associated with the five regular convex polyhedra. The regular tetrahedron and the cube are rational and simple, the regular octahedron is not simple, the regular dodecahedron is not rational, and the regular icosahedron is neither simple nor rational. We remark that the last two cases cannot be treated via standard toric geometry.

  10. Word regularity affects orthographic learning.

    Science.gov (United States)

    Wang, Hua-Chen; Castles, Anne; Nickels, Lyndsey

    2012-01-01

    Share's self-teaching hypothesis proposes that orthographic representations are acquired via phonological decoding. A key, yet untested, prediction of this theory is that there should be an effect of word regularity on the number and quality of word-specific orthographic representations that children acquire. Thirty-four Grade 2 children were exposed to the sound and meaning of eight novel words and were then presented with those words in written form in short stories. Half the words were assigned regular pronunciations and half irregular pronunciations. Lexical decision and spelling tasks conducted 10 days later revealed that the children's orthographic representations of the regular words appeared to be stronger and more extensive than those of the irregular words.

  11. Vertex-deleted subgraphs and regular factors from regular graph

    Science.gov (United States)

    Lu, Hongliang; Wang, Wei; Bai, Bing

    2011-01-01

    Let k, m, and r be three integers such that 2≤k≤m≤r. Let G be a 2r-regular, 2m-edge-connected graph of odd order. We obtain some sufficient conditions for G−v to contain a k-factor for all v∈V(G). PMID:21984842

  12. (2+1)-dimensional regular black holes with nonlinear electrodynamics sources

    Directory of Open Access Journals (Sweden)

    Yun He

    2017-11-01

    Full Text Available On the basis of two requirements, the avoidance of the curvature singularity and the Maxwell theory as the weak field limit of the nonlinear electrodynamics, we find two restricted conditions on the metric function of a (2+1)-dimensional regular black hole in general relativity coupled with nonlinear electrodynamics sources. By the use of the two conditions, we obtain a general approach to construct (2+1)-dimensional regular black holes. In this manner, we construct four (2+1)-dimensional regular black holes as examples. We also study the thermodynamic properties of the regular black holes and verify the first law of black hole thermodynamics.

  13. Temporal bone imaging

    Energy Technology Data Exchange (ETDEWEB)

    Lemmerling, Marc [Algemeen Ziekenhuis Sint-Lucas, Gent (Belgium). Dept. of Radiology; Foer, Bert de (ed.) [Sint-Augustinus Ziekenhuis, Wilrijk (Belgium). Dept. of Radiology

    2015-04-01

    Complete overview of imaging of normal and diseased temporal bone. Straightforward structure to facilitate learning. Detailed consideration of newer imaging techniques, including the hot topic of diffusion-weighted imaging. Includes a chapter on anatomy that will be of great help to the novice interpreter of imaging findings. Excellent illustrations throughout. This book provides a complete overview of imaging of normal and diseased temporal bone. After description of indications for imaging and the cross-sectional imaging anatomy of the area, subsequent chapters address the various diseases and conditions that affect the temporal bone and are likely to be encountered regularly in clinical practice. The classic imaging methods are described and discussed in detail, and individual chapters are included on newer techniques such as functional imaging and diffusion-weighted imaging. There is also a strong focus on postoperative imaging. Throughout, imaging findings are documented with the aid of numerous informative, high-quality illustrations. Temporal Bone Imaging, with its straightforward structure based essentially on topography, will prove of immense value in daily practice.

  14. From random to regular: Neural constraints on the emergence of isochronous rhythm during cultural transmission

    DEFF Research Database (Denmark)

    Lumaca, Massimo; Haumann, Niels Trusbak; Brattico, Elvira

    2017-01-01

    A core design feature of human communication systems and expressive behaviours is their temporal organization. The cultural evolutionary origins of this feature remain unclear. Here, we test the hypothesis that regularities in the temporal organization of signalling sequences arise in the course...

  15. Empirical laws, regularity and necessity

    NARCIS (Netherlands)

    Koningsveld, H.

    1973-01-01

    In this book I have tried to develop an analysis of the concept of an empirical law, an analysis that differs in many ways from the alternative analyse's found in contemporary literature dealing with the subject.

    I am referring especially to two well-known views, viz. the regularity and

  16. Regularization in Matrix Relevance Learning

    NARCIS (Netherlands)

    Schneider, Petra; Bunte, Kerstin; Stiekema, Han; Hammer, Barbara; Villmann, Thomas; Biehl, Michael

    In this paper, we present a regularization technique to extend recently proposed matrix learning schemes in learning vector quantization (LVQ). These learning algorithms extend the concept of adaptive distance measures in LVQ to the use of relevance matrices. In general, metric learning can

  17. Regularity of conservative inductive limits

    Directory of Open Access Journals (Sweden)

    Jan Kucera

    1999-01-01

    Full Text Available A sequentially complete inductive limit of Fréchet spaces is regular, see [3]. With a minor modification, this property can be extended to inductive limits of arbitrary locally convex spaces under an additional assumption of conservativeness.

  18. Regular Matroids with Graphic Cocircuits

    OpenAIRE

    Papalamprou, Konstantinos; Pitsoulis, Leonidas

    2009-01-01

    We introduce the notion of graphic cocircuits and show that a large class of regular matroids with graphic cocircuits belongs to the class of signed-graphic matroids. Moreover, we provide an algorithm which determines whether a cographic matroid with graphic cocircuits is signed-graphic or not.

  19. Interval matrices: Regularity generates singularity

    Czech Academy of Sciences Publication Activity Database

    Rohn, Jiří; Shary, S.P.

    2018-01-01

    Roč. 540, 1 March (2018), s. 149-159 ISSN 0024-3795 Institutional support: RVO:67985807 Keywords: interval matrix * regularity * singularity * P-matrix * absolute value equation * diagonally singularizable matrix Subject RIV: BA - General Mathematics Impact factor: 0.973, year: 2016

  20. Regularized Statistical Analysis of Anatomy

    DEFF Research Database (Denmark)

    Sjöstrand, Karl

    2007-01-01

    This thesis presents the application and development of regularized methods for the statistical analysis of anatomical structures. Focus is on structure-function relationships in the human brain, such as the connection between early onset of Alzheimer’s disease and shape changes of the corpus cal...

  1. Verifying continuous-variable entanglement in finite spaces

    Science.gov (United States)

    Sperling, J.; Vogel, W.

    2009-05-01

    Starting from arbitrary Hilbert spaces, we reduce the problem of verifying entanglement of any bipartite quantum state to finite-dimensional subspaces. Entanglement can be fully characterized as a finite-dimensional property, even though in general the truncation of the Hilbert space may cause fake nonclassicality. A generalization for multipartite quantum states is also given.

  2. Verifying real-time systems against scenario-based requirements

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Li, Shuhao; Nielsen, Brian

    2009-01-01

    subset of the LSC language. By equivalently translating an LSC chart into an observer TA and then non-intrusively composing this observer with the original system model, the problem of verifying a real-time system against a scenario-based requirement reduces to a classical real-time model checking...

  3. 20 CFR 401.45 - Verifying your identity.

    Science.gov (United States)

    2010-04-01

    ... records such as medical records. (3) Electronic requests. If you make a request by computer or other... personally identifiable information over open networks such as the Internet, we use encryption in all of our... verifying your own identity, by providing a copy of the minor's birth certificate, a court order, or other...

  4. Firming up the Foundations: Reflections on Verifying the 248 ...

    African Journals Online (AJOL)

    Firming up the Foundations: Reflections on Verifying the 248 Quotations in a Historical Dictionary, with Reference to "A Dictionary of South African English on ... In addition to this, the article looks at the necessarily systematic nature of quotation handling and the main types of considerations determining methodology (for ...

  5. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    Energy Technology Data Exchange (ETDEWEB)

    D. E. Shropshire; W. H. West

    2005-11-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including costs estimates) and Generation IV reactor development studies.

  6. Calling Out Cheaters: Covert Security with Public Verifiability

    DEFF Research Database (Denmark)

    Asharov, Gilad; Orlandi, Claudio

    2012-01-01

    We introduce the notion of covert security with public verifiability, building on the covert security model introduced by Aumann and Lindell (TCC 2007). Protocols that satisfy covert security guarantee that the honest parties involved in the protocol will notice any cheating attempt with some...

  7. Project Temporalities

    DEFF Research Database (Denmark)

    Tryggestad, Kjell; Justesen, Lise; Mouritsen, Jan

    2013-01-01

    Purpose – The purpose of this paper is to explore how animals can become stakeholders in interaction with project management technologies and what happens with project temporalities when new and surprising stakeholders become part of a project and a recognized matter of concern to be taken into account. Design/methodology/approach – The paper is based on a qualitative case study of a project in the building industry. The authors use actor-network theory (ANT) to analyze the emergence of animal stakeholders, stakes and temporalities. Findings – The study shows how project temporalities can multiply in interaction with project management technologies and how conventional linear conceptions of project time may be contested with the emergence of new non-human stakeholders and temporalities. Research limitations/implications – The study draws on ANT to show how animals can become stakeholders...

  8. Regular languages, regular grammars and automata in splicing systems

    Science.gov (United States)

    Mohamad Jan, Nurhidaya; Fong, Wan Heng; Sarmin, Nor Haniza

    2013-04-01

    Splicing system is known as a mathematical model that initiates the connection between the study of DNA molecules and formal language theory. In splicing systems, languages called splicing languages refer to the set of double-stranded DNA molecules that may arise from an initial set of DNA molecules in the presence of restriction enzymes and ligase. In this paper, some splicing languages resulted from their respective splicing systems are shown. Since all splicing languages are regular, languages which result from the splicing systems can be further investigated using grammars and automata in the field of formal language theory. The splicing language can be written in the form of regular languages generated by grammar. Besides that, splicing systems can be accepted by automata. In this research, two restriction enzymes are used in splicing systems namely BfuCI and NcoI.
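
    The splicing operation itself is easy to prototype. A sketch of the basic one-recombinant splicing rule on plain strings (illustrative only; real splicing systems also track the second recombinant and double-strandedness):

```python
def splice(x, y, rule):
    """With rule (u1, v1, u2, v2): from x = p u1 v1 q and y = r u2 v2 s,
    produce every recombinant p u1 v2 s."""
    u1, v1, u2, v2 = rule
    out = set()
    i = x.find(u1 + v1)
    while i != -1:
        j = y.find(u2 + v2)
        while j != -1:
            out.add(x[: i + len(u1)] + y[j + len(u2):])
            j = y.find(u2 + v2, j + 1)
        i = x.find(u1 + v1, i + 1)
    return out

# Toy rule loosely modeled on a cut within the NcoI site CCATGG.
rule = ("C", "CATGG", "C", "CATGG")
print(splice("AACCATGGTT", "GGCCATGGAA", rule))   # {'AACCATGGAA'}
```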

  9. Regular and conformal regular cores for static and rotating solutions

    Energy Technology Data Exchange (ETDEWEB)

    Azreg-Aïnou, Mustapha

    2014-03-07

    Using a new metric for generating rotating solutions, we derive in a general fashion the solution of an imperfect fluid and that of its conformal homolog. We discuss the conditions that the stress–energy tensors and invariant scalars be regular. On classical physical grounds, it is stressed that conformal fluids used as cores for static or rotating solutions are exempt from any malicious behavior in that they are finite and defined everywhere.

  10. Recursively-regular subdivisions and applications

    Directory of Open Access Journals (Sweden)

    Rafel Jaume

    2016-05-01

    Full Text Available We generalize regular subdivisions (polyhedral complexes resulting from the projection of the lower faces of a polyhedron) by introducing the class of recursively-regular subdivisions. Informally speaking, a recursively-regular subdivision is a subdivision that can be obtained by splitting some faces of a regular subdivision by other regular subdivisions (and continuing recursively). We also define the finest regular coarsening and the regularity tree of a polyhedral complex. We prove that recursively-regular subdivisions are not necessarily connected by flips and that they are acyclic with respect to the in-front relation. We show that the finest regular coarsening of a subdivision can be efficiently computed, and that whether a subdivision is recursively regular can be efficiently decided. As an application, we also extend a theorem known since 1981 on illuminating space by cones and present connections of recursive regularity to tensegrity theory and graph-embedding problems.

  11. Auditory feedback in error-based learning of motor regularity.

    Science.gov (United States)

    van Vugt, Floris T; Tillmann, Barbara

    2015-05-05

    Music and speech are skills that require high temporal precision of motor output. A key question is how humans achieve this timing precision given the poor temporal resolution of somatosensory feedback, which is classically considered to drive motor learning. We hypothesise that auditory feedback critically contributes to learning timing and that, similarly to visuo-spatial learning models, learning proceeds by correcting a proportion of perceived timing errors. Thirty-six participants learned to tap a sequence regularly in time. For participants in the synchronous-sound group, a tone was presented simultaneously with every keystroke. For the jittered-sound group, the tone was presented after a random delay of 10-190 ms following the keystroke, thus degrading the temporal information that the sound provided about the movement. For the mute group, no keystroke-triggered sound was presented. In line with the model predictions, participants in the synchronous-sound group were able to improve tapping regularity, whereas the jittered-sound and mute groups were not. The improved tapping regularity of the synchronous-sound group also transferred to a novel sequence and was maintained when sound was subsequently removed. The present findings provide evidence that humans engage in auditory feedback error-based learning to improve movement quality (here, reducing variability in sequence tapping). We thus elucidate the mechanism by which high temporal precision of movement can be achieved through sound in a way that may not be possible with less temporally precise somatosensory modalities. Furthermore, the finding that sound-supported learning generalises to novel sequences suggests potential rehabilitation applications. Copyright © 2015 Elsevier B.V. All rights reserved.
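
    The hypothesized rule (correct a fixed proportion of the perceived timing error) is simple to simulate. In this sketch with invented constants, jittered feedback corrupts the error signal and inflates tapping variability, mirroring the group difference reported above:

```python
import numpy as np

def simulate(feedback_jitter_ms, alpha=0.3, n_taps=500,
             motor_noise=10.0, timekeeper_wander=5.0):
    rng = np.random.default_rng(4)
    target, drift, intervals = 500.0, 0.0, []
    for _ in range(n_taps):
        drift += timekeeper_wander * rng.normal()  # slow internal drift
        produced = target + drift + motor_noise * rng.normal()
        intervals.append(produced)
        jitter = feedback_jitter_ms * rng.uniform(-1, 1)
        perceived_error = (produced + jitter) - target
        drift -= alpha * perceived_error           # correct a proportion
    return np.std(intervals[-200:])

for j in [0.0, 90.0]:   # synchronous-sound vs jittered-sound condition
    print(f"feedback jitter {j:3.0f} ms -> tapping SD {simulate(j):5.1f} ms")
```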

  12. Verifying Multi-Agent Systems via Unbounded Model Checking

    Science.gov (United States)

    Kacprzak, M.; Lomuscio, A.; Lasica, T.; Penczek, W.; Szreter, M.

    2004-01-01

    We present an approach to the problem of verification of epistemic properties in multi-agent systems by means of symbolic model checking. In particular, it is shown how to extend the technique of unbounded model checking from a purely temporal setting to a temporal-epistemic one. In order to achieve this, we base our discussion on interpreted systems semantics, a popular semantics used in multi-agent systems literature. We give details of the technique and show how it can be applied to the well known train, gate and controller problem. Keywords: model checking, unbounded model checking, multi-agent systems

  13. From inactive to regular jogger

    DEFF Research Database (Denmark)

    Lund-Cramer, Pernille; Brinkmann Løite, Vibeke; Bredahl, Thomas Viskum Gjelstrup

    to translate intention into regular behavior. TTM: Informants expressed rapid progression from the pre-contemplation to the action stage caused by an early shift in the decisional balance towards advantages outweighing disadvantages. This was followed by a continuous improvement in self-efficacy, which ... jogging-related self-efficacy, and deployment of realistic goal setting was significant in the achievement of regular jogging behavior. Cognitive factors included a positive change in both affective and instrumental beliefs about jogging. Expectations from society and social relations had limited effect ... health, and well-being. An experience-driven change in affective beliefs contributed to the shift in the intention. Informants' initial expectations of their own jogging abilities were realistic and modest, which in combination with a GPS training watch applied sufficient perceived behavioral control...

  14. Maximum mutual information regularized classification

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-09-07

    In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced by knowing its classification response as much as possible. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real-world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.
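
    As a sketch of the regularizer's central quantity, a plug-in estimate of the mutual information between (binned) classifier responses and labels; the binning and data are our own illustration, not the paper's entropy estimator or optimizer:

```python
import numpy as np

def mutual_information(responses, labels, n_bins=10):
    """Plug-in MI estimate (in nats) between binned real-valued
    classifier responses and binary class labels."""
    bins = np.quantile(responses, np.linspace(0, 1, n_bins + 1))
    r = np.clip(np.digitize(responses, bins[1:-1]), 0, n_bins - 1)
    joint = np.zeros((n_bins, 2))
    for ri, yi in zip(r, labels):
        joint[ri, yi] += 1
    joint /= joint.sum()
    pr, py = joint.sum(1, keepdims=True), joint.sum(0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (pr @ py)[nz])).sum())

rng = np.random.default_rng(5)
y = rng.integers(0, 2, 1000)
good = y + 0.5 * rng.normal(size=1000)   # informative responses
bad = rng.normal(size=1000)              # uninformative responses
print(mutual_information(good, y), mutual_information(bad, y))
```

    Informative responses yield a markedly larger estimate than uninformative ones, which is exactly what the regularization term rewards.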

  15. Physical model of dimensional regularization

    Energy Technology Data Exchange (ETDEWEB)

    Schonfeld, Jonathan F.

    2016-12-15

    We explicitly construct fractals of dimension 4-ε on which dimensional regularization approximates scalar-field-only quantum-field theory amplitudes. The construction does not require fractals to be Lorentz-invariant in any sense, and we argue that there probably is no Lorentz-invariant fractal of dimension greater than 2. We derive dimensional regularization's power-law screening first for fractals obtained by removing voids from 3-dimensional Euclidean space. The derivation applies techniques from elementary dielectric theory. Surprisingly, fractal geometry by itself does not guarantee the appropriate power-law behavior; boundary conditions at fractal voids also play an important role. We then extend the derivation to 4-dimensional Minkowski space. We comment on generalization to non-scalar fields, and speculate about implications for quantum gravity. (orig.)

  16. Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing.

    Science.gov (United States)

    Hayashi, Masahito; Morimae, Tomoyuki

    2015-11-27

    We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends Alice each qubit of them one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and, therefore, Alice can obtain the correct computation result. Regarding the security, whatever Bob does, Bob cannot get any information about Alice's computation because of the no-signaling principle. Furthermore, malicious Bob does not necessarily send the copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.
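
    The stabilizer test at the protocol's core is easy to illustrate classically for a tiny instance. Below is a dense-linear-algebra check that the 3-qubit path graph state satisfies its stabilizers $K_i = X_i \prod_{j\sim i} Z_j$ (a sketch of the object being tested, not of the protocol itself):

```python
import numpy as np
from functools import reduce

I = np.eye(2)
X = np.array([[0.0, 1], [1, 0]])
Z = np.diag([1.0, -1])
CZ = np.diag([1.0, 1, 1, -1])                 # controlled-Z on two qubits
kron = lambda ops: reduce(np.kron, ops)

# Graph state for the path 0-1-2: |G> = CZ_01 CZ_12 |+>|+>|+>
plus = np.ones(2) / np.sqrt(2)
state = kron([CZ, I]) @ kron([I, CZ]) @ kron([plus, plus, plus])

# Stabilizers: X on vertex i, Z on its neighbors; expectations must be +1.
for i, K in enumerate([kron([X, Z, I]), kron([Z, X, Z]), kron([I, Z, X])]):
    print(f"<G|K_{i}|G> = {state @ K @ state:+.3f}")
```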

  17. Verifying Service Choreography Model Based on Description Logic

    Directory of Open Access Journals (Sweden)

    Minggang Yu

    2016-01-01

    Full Text Available Web Services Choreography Description Language lacks a formal system to accurately express the semantics of service behaviors and verify the correctness of a service choreography model. The paper presents a new approach of choreography model verification based on Description Logic. A metamodel of service choreography is built to provide a conceptual framework to capture the formal syntax and semantics of service choreography. Based on the framework, a set of rules and constraints are defined in Description Logic for choreography model verification. To automate model verification, the UML-based service choreography model will be transformed, by the given algorithms, into the DL-based ontology, and thus the model properties can be verified by reasoning through the ontology with the help of a popular DL reasoner. A case study is given to demonstrate applicability of the method. Furthermore, the work will be compared with other related researches.

  18. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    Energy Technology Data Exchange (ETDEWEB)

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-11-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including costs estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a “living document” that will be modified over the course of the execution of this work.

  19. How to Verify and Manage the Translational Plagiarism?

    OpenAIRE

    Viroj Wiwanitkit

    2016-01-01

    The use of Google Translate as a tool for determining translational plagiarism is a big challenge. As noted, plagiarism of original papers written in Macedonian and translated into other languages can be verified after computerised translation into other languages. Attempts to screen for translational plagiarism should be supported. The use of the Google Translate tool might be helpful. Special focus should be on any non-English reference that might be the source of plagiarised material and no...

  20. Real-Time Projection to Verify Plan Success During Execution

    Science.gov (United States)

    Wagner, David A.; Dvorak, Daniel L.; Rasmussen, Robert D.; Knight, Russell L.; Morris, John R.; Bennett, Matthew B.; Ingham, Michel D.

    2012-01-01

    The Mission Data System provides a framework for modeling complex systems in terms of system behaviors and goals that express intent. Complex activity plans can be represented as goal networks that express the coordination of goals on different state variables of the system. Real-time projection extends the ability of this system to verify plan achievability (all goals can be satisfied over the entire plan) into the execution domain so that the system is able to continuously re-verify a plan as it is executed, and as the states of the system change in response to goals and the environment. Previous versions were able to detect and respond to goal violations when they actually occur during execution. This new capability enables the prediction of future goal failures; specifically, goals that were previously found to be achievable but are no longer achievable due to unanticipated faults or environmental conditions. Early detection of such situations enables operators or an autonomous fault response capability to deal with the problem at a point that maximizes the available options. For example, this system has been applied to the problem of managing battery energy on a lunar rover as it is used to explore the Moon. Astronauts drive the rover to waypoints and conduct science observations according to a plan that is scheduled and verified to be achievable with the energy resources available. As the astronauts execute this plan, the system uses this new capability to continuously re-verify the plan as energy is consumed to ensure that the battery will never be depleted below safe levels across the entire plan.
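
    A toy sketch of the rover example (invented numbers and function names; the real system projects over goal networks and state models rather than a flat activity list):

```python
def verify_energy_plan(current_wh, activities, floor_wh=200.0):
    """Project battery energy across the remaining plan and report the
    first activity whose completion would violate the energy floor."""
    level = current_wh
    for name, cost_wh in activities:
        level -= cost_wh
        if level < floor_wh:
            return f"plan no longer achievable: '{name}' projects {level:.0f} Wh"
    return f"plan achievable: final level {level:.0f} Wh (floor {floor_wh:.0f} Wh)"

plan = [("drive to waypoint A", 300), ("science obs 1", 120),
        ("drive to waypoint B", 350), ("science obs 2", 120)]
print(verify_energy_plan(1100, plan))  # verified achievable at plan time
print(verify_energy_plan(950, plan))   # re-verified after unexpected drain
```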

  1. AnnaBot: A Static Verifier for Java Annotation Usage

    OpenAIRE

    Ian Darwin

    2010-01-01

    This paper describes AnnaBot, one of the first tools to verify correct use of Annotation-based metadata in the Java programming language. These Annotations are a standard Java 5 mechanism used to attach metadata to types, methods, or fields without using an external configuration file. A binary representation of the Annotation becomes part of the compiled “.class” file, for inspection by another component or library at runtime. Java Annotations were introduced into the Java language in ...

  2. Verifying the Absence of Common Runtime Errors in Computer Programs

    Science.gov (United States)

    1981-06-01

    the language designer another way to test a design, in addition to the usual ways based on experience with other languages and difficulty of... simplified verification conditions is a special skill that one must learn in order to use the verifier. In the process of analyzing a VC, one notes...

  3. Notes toward a verifiable vector algebraic basis for colorimetric modeling

    OpenAIRE

    Oulton, David P.

    2009-01-01

    The presented notes aim toward improved models of the scalar visual response to flat-field stimuli, and are prompted by unease over the complexity of existing colour difference models. Some of the basic assumptions of colorimetry are examined in detail, and analytical methods whereby these assumptions can be investigated experimentally are presented. A key finding is that the standard CIE colorimetric model is verifiably correct as a predictor of point colour identity and metameric visual equ...

  4. Verifiable fault tolerance in measurement-based quantum computation

    Science.gov (United States)

    Fujii, Keisuke; Hayashi, Masahito

    2017-09-01

    Quantum systems, in general, cannot be simulated efficiently by a classical computer, and hence are useful for solving certain mathematical problems and simulating quantum many-body systems. This also implies, unfortunately, that verification of the output of the quantum systems is not so trivial, since predicting the output is exponentially hard. As another problem, the quantum system is very sensitive to noise and thus needs error correction. Here, we propose a framework for verification of the output of fault-tolerant quantum computation in a measurement-based model. In contrast to existing analyses on fault tolerance, we do not assume any noise model on the resource state; rather, an arbitrary resource state is tested by using only single-qubit measurements to verify whether or not the output of measurement-based quantum computation on it is correct. Verifiability is equipped by a constant-time repetition of the original measurement-based quantum computation in appropriate measurement bases. Since full characterization of quantum noise is exponentially hard for large-scale quantum computing systems, our framework provides an efficient way to practically verify the experimental quantum error correction.

  5. Verifying three-dimensional skull model reconstruction using cranial index of symmetry.

    Directory of Open Access Journals (Sweden)

    Woon-Man Kung

    Full Text Available BACKGROUND: Difficulty exists in scalp adaptation for cranioplasty with customized computer-assisted design/manufacturing (CAD/CAM) implants in situations of excessive wound tension and sub-cranioplasty dead space. To solve this clinical problem, the CAD/CAM technique should include algorithms to reconstruct a depressed contour to cover the skull defect. Satisfactory CAM-derived alloplastic implants are based on highly accurate three-dimensional (3-D) CAD modeling. Thus, it is quite important to establish a symmetrically regular CAD/CAM reconstruction prior to depressing the contour. The purpose of this study is to verify the aesthetic outcomes of CAD models with regular contours using the cranial index of symmetry (CIS). MATERIALS AND METHODS: From January 2011 to June 2012, decompressive craniectomy (DC) was performed for 15 consecutive patients in our institute. 3-D CAD models of skull defects were reconstructed using commercial software. These models were checked in terms of symmetry by CIS scores. RESULTS: CIS scores of CAD reconstructions were 99.24±0.004% (range 98.47-99.84). CIS scores of these CAD models were statistically significantly greater than 95%, identical to 99.5%, but lower than 99.6% (p<0.001, p = 0.064, p = 0.021 respectively, Wilcoxon matched pairs signed rank test). These data evidenced the highly accurate symmetry of these CAD models with regular contours. CONCLUSIONS: CIS calculation is beneficial to assess aesthetic outcomes of CAD-reconstructed skulls in terms of cranial symmetry. This enables further accurate CAD models and CAM cranial implants with depressed contours, which are essential in patients with difficult scalp adaptation.

  6. Tree games with regular objectives

    Directory of Open Access Journals (Sweden)

    Marcin Przybyłko

    2014-08-01

    Full Text Available We study tree games developed recently by Matteo Mio as a game interpretation of the probabilistic μ-calculus. With expressive power comes complexity. Mio showed that tree games are able to encode Blackwell games and, consequently, are not determined under deterministic strategies. We show that non-stochastic tree games with objectives recognisable by so-called game automata are determined under deterministic, finite-memory strategies. Moreover, we give an elementary algorithmic procedure which, for an arbitrary regular language L and a finite non-stochastic tree game with winning objective L, decides whether the game is determined under deterministic strategies.

  7. Academic Training Lecture - Regular Programme

    CERN Multimedia

    PH Department

    2011-01-01

    Regular Lecture Programme
    9 May 2011: ACT Lectures on Detectors - Inner Tracking Detectors, by Pippa Wells (CERN)
    10 May 2011: ACT Lectures on Detectors - Calorimeters (2/5), by Philippe Bloch (CERN)
    11 May 2011: ACT Lectures on Detectors - Muon systems (3/5), by Kerstin Hoepfner (RWTH Aachen)
    12 May 2011: ACT Lectures on Detectors - Particle Identification and Forward Detectors, by Peter Krizan (University of Ljubljana and J. Stefan Institute, Ljubljana, Slovenia)
    13 May 2011: ACT Lectures on Detectors - Trigger and Data Acquisition (5/5), by Dr. Brian Petersen (CERN)
    All lectures take place from 11:00 to 12:00 at CERN (Bldg. 222-R-001 - Filtration Plant).

  8. Regular nanofabrics in emerging technologies

    CERN Document Server

    Jamaa, M Haykel Ben

    2011-01-01

    ""Regular Nanofabrics in Emerging Technologies"" gives a deep insight into both fabrication and design aspects of emerging semiconductor technologies, that represent potential candidates for the post-CMOS era. Its approach is unique, across different fields, and it offers a synergetic view for a public of different communities ranging from technologists, to circuit designers, and computer scientists. The book presents two technologies as potential candidates for future semiconductor devices and systems and it shows how fabrication issues can be addressed at the design level and vice versa. The

  9. Regularization methods in Banach spaces

    CERN Document Server

    Schuster, Thomas; Hofmann, Bernd; Kazimierski, Kamil S

    2012-01-01

    Regularization methods aimed at finding stable approximate solutions are a necessary tool to tackle inverse and ill-posed problems. Usually the mathematical model of an inverse problem consists of an operator equation of the first kind and often the associated forward operator acts between Hilbert spaces. However, for numerous problems the reasons for using a Hilbert space setting seem to be based rather on conventions than on an appropriate and realistic model choice, so often a Banach space setting would be closer to reality. Furthermore, sparsity constraints using general Lp-norms or the B

  10. Regular capacities on metrizable spaces

    Directory of Open Access Journals (Sweden)

    T. M. Cherkovskyi

    2014-07-01

    Full Text Available It is proved that for a (not necessarily compact) metric space: the metrics on the space of capacities in the sense of Zarichnyi and Prokhorov are equal; completeness of the space of capacities is equivalent to completeness of the original space. It is shown that for the capacities on metrizable spaces the properties of $\omega$-smoothness and of $\tau$-smoothness are equivalent precisely on the separable spaces, and the properties of $\omega$-smoothness and of regularity w.r.t. some (then w.r.t. any) admissible metric are equivalent precisely on the compact spaces.

  11. Verifying Architectural Design Rules of the Flight Software Product Line

    Science.gov (United States)

    Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen

    2009-01-01

    This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps a) identifying architecturally significant deviations that were eluded during code reviews, b) clarifying the design rules to the team, and c) assessing the overall implementation quality. Furthermore, it helps connecting business goals to architectural principles, and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.

  12. Lucid dreaming verified by volitional communication during REM sleep.

    Science.gov (United States)

    La Berge, S P; Nagel, L E; Dement, W C; Zarcone, V P

    1981-06-01

    The occurrence of lucid dreaming (dreaming while being conscious that one is dreaming) has been verified for 5 selected subjects who signaled that they knew they were dreaming while continuing to dream during unequivocal REM sleep. The signals consisted of particular dream actions having observable concomitants and were performed in accordance with pre-sleep agreement. The ability of proficient lucid dreamers to signal in this manner makes possible a new approach to dream research--such subjects, while lucid, could carry out diverse dream experiments marking the exact time of particular dream events, allowing derivation of precise psychophysiological correlations and methodical testing of hypotheses.

  13. Countering Ballot Stuffing and Incorporating Eligibility Verifiability in Helios

    OpenAIRE

    Srinivasan, S.; Culnane, C; Heather, J; Schneider, SA; Z. Xia

    2014-01-01

    Helios is a web-based end-to-end verifiable electronic voting system which has been said to be suitable for low-coercion environments. Although many Internet voting schemes have been proposed in the literature, Helios stands out for its real world relevance. It has been used in a number of elections in university campuses around the world and it has also been used recently by the IACR to elect its board members. It was noted that a dishonest server in Helios can stuff ballots and this seems t...

  14. Verifying Galileo's discoveries: telescope-making at the Collegio Romano

    Science.gov (United States)

    Reeves, Eileen; van Helden, Albert

    The Jesuits of the Collegio Romano in Rome, especially the mathematicians Clavius and Grienberger, were very interested in Galilei's discoveries. After they had failed to observe the celestial phenomena with telescopes of their own construction, they expressed serious doubts. But from November 1610 onward, after they had built a better telescope and obtained another one from Venice, and could verify Galilei's observations, they accepted them completely. Clavius, who stuck to the Ptolemaic system until his death in 1612, even pointed out these facts in his last edition of Sacrobosco's Sphaera. He, as well as his fellow Jesuits, however, avoided any conclusions with respect to the planetary system.

  15. The Guided System Development Framework: Modeling and Verifying Communication Systems

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.; Nielson, Flemming

    2014-01-01

    The Guided System Development framework contributes to more secure communication systems by aiding the development of such systems. The framework features a simple modelling language, step-wise refinement from models to implementation, interfaces to security verification tools, and code generation from the verified specification. The refinement process thus carries security properties from the model to the implementation. Our approach also supports verification of systems previously developed and deployed. Internally, the reasoning in our framework is based on the Beliefs and Knowledge tool, a verification...

  16. Multiparametric tissue abnormality characterization using manifold regularization

    Science.gov (United States)

    Batmanghelich, Kayhan; Wu, Xiaoying; Zacharaki, Evangelia; Markowitz, Clyde E.; Davatzikos, Christos; Verma, Ragini

    2008-03-01

    Tissue abnormality characterization is a generalized segmentation problem which aims at determining a continuous score that can be assigned to the tissue and that characterizes the extent of tissue deterioration, with completely healthy tissue at one end of the spectrum and fully abnormal tissue, such as lesions, at the other. Our method is based on the assumptions that there is some tissue that is neither fully healthy nor completely abnormal but lies between the two in terms of abnormality, and that the voxel-wise score of tissue abnormality lies on a spatially and temporally smooth manifold of abnormality. Unlike a pure classification problem, which associates an independent label with each voxel without considering correlation with neighbors, or an absolute clustering problem, which does not consider a priori knowledge of tissue type, we assume that diseased and healthy tissue lie on a manifold that encompasses the healthy tissue and diseased tissue, stretching from one to the other. We propose a semi-supervised method for determining such an abnormality manifold, using multi-parametric features incorporated into a support vector machine framework in combination with manifold regularization. We apply the framework to the characterization of tissue abnormality in brains of multiple sclerosis patients.
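
    The manifold-regularization idea can be illustrated with a small graph-based sketch in which scores are forced to vary smoothly over a k-nearest-neighbor graph of voxel features. This is a generic harmonic-style solver under assumed toy features and labels, not the paper's SVM formulation.

        import numpy as np
        from sklearn.neighbors import kneighbors_graph

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 4))          # toy multi-parametric voxel features
        y = np.full(200, np.nan)               # NaN marks unlabeled voxels
        y[:10], y[10:20] = 0.0, 1.0            # a few healthy (0) / abnormal (1) labels

        W = kneighbors_graph(X, 8, mode="connectivity").toarray()
        W = np.maximum(W, W.T)                 # symmetrize the k-NN graph
        L = np.diag(W.sum(axis=1)) - W         # combinatorial graph Laplacian

        lab = ~np.isnan(y)
        mu = 1.0                               # smoothness weight (assumed)
        # Minimize sum_labeled (f_i - y_i)^2 + mu * f' L f, i.e. solve a linear system:
        A = np.diag(lab.astype(float)) + mu * L
        f = np.linalg.solve(A, np.where(lab, y, 0.0))
        print(f[:20].round(2))                 # continuous abnormality scores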

  17. Verifiable Rational Secret Sharing Scheme in Mobile Networks

    Directory of Open Access Journals (Sweden)

    En Zhang

    2015-01-01

    Full Text Available With the development of mobile networks, many people now have access to mobile phones, and mobile networks give users ubiquitous connectivity. However, smart phones and tablets are poor in computational resources such as memory size, processor speed, and disk capacity, so existing rational secret sharing schemes are not suitable for mobile networks. In this paper, we propose a verifiable rational secret sharing scheme for mobile networks. The scheme provides a noninteractively verifiable proof of the correctness of each participant's share, and no handshake protocol is necessary; there is no need for certificate generation, propagation, or storage, which makes the scheme more suitable for devices of limited size and processing power. In the scheme, every participant uses her encryption of the number of each round as the secret share, and the dealer does not have to distribute any secret share; no participant can gain more by deviating from the protocol, so a rational participant has an incentive to abide by it; finally, every participant can obtain the secret fairly (either everyone receives the secret or no one does) in mobile networks. The scheme is coalition-resilient, and its security relies on a computational assumption.
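
    The paper's construction is not reproduced here, but the flavor of non-interactive share verification can be seen in classical Feldman-style verifiable secret sharing: public commitments to the polynomial coefficients let anyone check a share without interaction. The group below (p, q, g) is toy-sized for illustration only; real deployments use large standardized groups.

        import random

        p, q, g = 23, 11, 4                    # toy Schnorr group: g has prime order q mod p

        def share(secret, t, n):
            """Split secret (in Z_q) into n shares with threshold t."""
            coeffs = [secret] + [random.randrange(q) for _ in range(t - 1)]
            commitments = [pow(g, a, p) for a in coeffs]      # published by the dealer
            shares = [(i, sum(a * pow(i, j, q) for j, a in enumerate(coeffs)) % q)
                      for i in range(1, n + 1)]
            return shares, commitments

        def verify(i, s_i, commitments):
            """Check share (i, s_i): g^s_i must equal prod_j C_j^(i^j) mod p."""
            rhs = 1
            for j, C in enumerate(commitments):
                rhs = rhs * pow(C, pow(i, j, q), p) % p
            return pow(g, s_i, p) == rhs

        shares, comms = share(secret=7, t=3, n=5)
        print(all(verify(i, s, comms) for i, s in shares))    # True
        print(verify(1, (shares[0][1] + 1) % q, comms))       # tampered share: False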

  18. Pantheon 1.0, a manually verified dataset of globally famous biographies.

    Science.gov (United States)

    Yu, Amy Zhao; Ronen, Shahar; Hu, Kevin; Lu, Tiffany; Hidalgo, César A

    2016-01-05

    We present the Pantheon 1.0 dataset: a manually verified dataset of individuals that have transcended linguistic, temporal, and geographic boundaries. The Pantheon 1.0 dataset includes the 11,341 biographies present in more than 25 languages in Wikipedia and is enriched with: (i) manually verified demographic information (place and date of birth, gender); (ii) a taxonomy of occupations classifying each biography at three levels of aggregation; and (iii) two measures of global popularity, including the number of languages in which a biography is present in Wikipedia (L) and the Historical Popularity Index (HPI), a metric that combines information on L, time since birth, and page-views (2008-2013). We compare the Pantheon 1.0 dataset to data from the 2003 book, Human Accomplishments, and also to external measures of accomplishment in individual games and sports: Tennis, Swimming, Car Racing, and Chess. In all of these cases we find that measures of popularity (L and HPI) correlate highly with individual accomplishment, suggesting that measures of global popularity proxy the historical impact of individuals.

  19. A Formal Specification Framework for Designing and Verifying Reliable and Dependable Software for CNC Systems

    Directory of Open Access Journals (Sweden)

    Yunan Cao

    2014-06-01

    Full Text Available As a distributed computing system, a CNC system needs to operate reliably, dependably, and safely. How to design reliable and dependable software and perform effective verification for CNC systems has therefore become an important research problem. In this paper, we propose a new modeling method called TTM/ATRTTL (timed transition models/all-time real-time temporal logics) for specifying CNC systems. TTM/ATRTTL provides full support for specifying the hard real-time behavior and feedback needed to model CNC systems. We also propose a verification framework with verification rules and theorems and implement it with STeP and SF2STeP. The proposed verification framework can check the reliability, dependability, and safety of systems specified with our TTM/ATRTTL method. We apply our modeling and verification techniques to an open architecture CNC (OAC) system and conduct comprehensive studies on modeling and verifying a system controller, the key part of OAC. The results show that our method can effectively model and verify CNC systems and generate CNC software that satisfies system requirements in reliability, dependability, and safety.

  20. Temporal naturalism

    Science.gov (United States)

    Smolin, Lee

    2015-11-01

    Two people may claim both to be naturalists, but have divergent conceptions of basic elements of the natural world which lead them to mean different things when they talk about laws of nature, or states, or the role of mathematics in physics. These disagreements do not much affect the ordinary practice of science which is about small subsystems of the universe, described or explained against a background, idealized to be fixed. But these issues become crucial when we consider including the whole universe within our system, for then there is no fixed background to reference observables to. I argue here that the key issue responsible for divergent versions of naturalism and divergent approaches to cosmology is the conception of time. One version, which I call temporal naturalism, holds that time, in the sense of the succession of present moments, is real, and that laws of nature evolve in that time. This is contrasted with timeless naturalism, which holds that laws are immutable and the present moment and its passage are illusions. I argue that temporal naturalism is empirically more adequate than the alternatives, because it offers testable explanations for puzzles its rivals cannot address, and is likely a better basis for solving major puzzles that presently face cosmology and physics. This essay also addresses the problem of qualia and experience within naturalism and argues that only temporal naturalism can make a place for qualia as intrinsic qualities of matter.

  1. Verifying Embedded Systems using Component-based Runtime Observers

    DEFF Research Database (Denmark)

    Guan, Wei; Marian, Nicolae; Angelov, Christo K.

    Formal verification methods, such as exhaustive model checking, are often infeasible because of high computational complexity. Runtime observers (monitors) provide an alternative, light-weight verification method, which offers a non-exhaustive yet feasible approach to monitoring system behavior against formally specified properties. This paper presents a component-based design method for runtime observers, which are configured from instances of prefabricated reusable components---Predicate Evaluator (PE) and Temporal Evaluator (TE). The PE computes atomic propositions for the TE; the latter checks the specified properties via simulation. The presented method has been experimentally validated in an industrial case study---a control system for a safety-critical medical ventilator unit.
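
    A minimal sketch of the PE/TE split, with hypothetical predicates and a bounded response property; the property language and component interfaces of the actual framework are not reproduced here.

        # Predicate Evaluator: turns raw state into atomic propositions.
        def predicate_evaluator(state):
            return {"alarm": state["pressure"] > 40.0,
                    "vented": state["valve_open"]}

        # Temporal Evaluator: every 'alarm' must be followed by 'vented' within k steps.
        def temporal_evaluator(trace, k=2):
            for i, props in enumerate(trace):
                if props["alarm"] and not any(p["vented"] for p in trace[i:i + k + 1]):
                    return False
            return True

        states = [{"pressure": 30.0, "valve_open": False},
                  {"pressure": 45.0, "valve_open": False},
                  {"pressure": 46.0, "valve_open": True}]
        trace = [predicate_evaluator(s) for s in states]
        print(temporal_evaluator(trace))       # True: alarm at t=1 is vented by t=2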

  2. Modeling polycrystals with regular polyhedra

    Directory of Open Access Journals (Sweden)

    Paulo Rangel Rios

    2006-06-01

    Full Text Available Polycrystalline structure is of paramount importance to materials science and engineering. It provides an important example of a space-filling irregular network structure that also occurs in foams as well as in certain biological tissues. Therefore, seeking an accurate description of the characteristics of polycrystals is of fundamental importance. Recently, one of the authors (MEG) published a paper devising a method for representing irregular networks by regular polyhedra with curved faces. In Glicksman's method, a whole class of irregular polyhedra with a given number of faces, N, is represented by a single symmetrical polyhedron with N curved faces. This paper briefly describes the topological and metric properties of these special polyhedra. They are then applied to two important problems of irregular networks: the dimensionless energy 'cost' of irregular networks, and the derivation of a 3D analogue of the von Neumann-Mullins equation for the growth rate of grains in a polycrystal.

  3. Regularity of Dual Gabor Windows

    Directory of Open Access Journals (Sweden)

    Ole Christensen

    2013-01-01

    Full Text Available We present a construction of dual windows associated with Gabor frames with compactly supported windows. The size of the support of the dual windows is comparable to that of the given window. Under certain conditions, we prove that there exist dual windows with higher regularity than the canonical dual window. On the other hand, there are cases where no differentiable dual window exists, even in the overcomplete case. As a special case of our results, we show that there exists a common smooth dual window for an interesting class of Gabor frames. In particular, for any value of $K\in\mathbb{N}$, there is a smooth function $h$ which simultaneously is a dual window for all B-spline generated Gabor frames $\{E_{mb}T_nB_N(x/2)\}_{m,n\in\mathbb{Z}}$ for B-splines $B_N$ of order $N=1,\dots,2K+1$ with a fixed and sufficiently small value of $b$.

  4. Bayesian regularization of neural networks.

    Science.gov (United States)

    Burden, Frank; Winkler, Dave

    2008-01-01

    Bayesian regularized artificial neural networks (BRANNs) are more robust than standard back-propagation nets and can reduce or eliminate the need for lengthy cross-validation. Bayesian regularization is a mathematical process that converts a nonlinear regression into a "well-posed" statistical problem in the manner of a ridge regression. The advantage of BRANNs is that the models are robust and the validation process, which scales as O(N^2) in normal regression methods, such as back propagation, is unnecessary. These networks provide solutions to a number of problems that arise in QSAR modeling, such as choice of model, robustness of model, choice of validation set, size of validation effort, and optimization of network architecture. They are difficult to overtrain, since evidence procedures provide an objective Bayesian criterion for stopping training. They are also difficult to overfit, because the BRANN calculates and trains on a number of effective network parameters or weights, effectively turning off those that are not relevant. This effective number is usually considerably smaller than the number of weights in a standard fully connected back-propagation neural net. Automatic relevance determination (ARD) of the input variables can be used with BRANNs, and this allows the network to "estimate" the importance of each input. The ARD method ensures that irrelevant or highly correlated indices used in the modeling are neglected as well as showing which are the most important variables for modeling the activity data. This chapter outlines the equations that define the BRANN method plus a flowchart for producing a BRANN-QSAR model. Some results of the use of BRANNs on a number of data sets are illustrated and compared with other linear and nonlinear models.
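
    As an illustration of evidence-based regularization, the sketch below uses scikit-learn's BayesianRidge, a linear analogue of the BRANN idea in which the regularization strengths are likewise inferred from the data rather than cross-validated; the descriptors and targets are made up.

        import numpy as np
        from sklearn.linear_model import BayesianRidge

        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 20))         # 20 toy descriptors, only 3 relevant
        w_true = np.zeros(20)
        w_true[:3] = [1.5, -2.0, 0.7]
        y = X @ w_true + rng.normal(scale=0.3, size=100)

        model = BayesianRidge().fit(X, y)
        print(model.alpha_, model.lambda_)     # noise/weight precisions, learned from data
        print(np.round(model.coef_, 2))        # irrelevant weights shrink toward zero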

  5. Generalization performance of regularized neural network models

    DEFF Research Database (Denmark)

    Larsen, Jan; Hansen, Lars Kai

    1994-01-01

    Architecture optimization is a fundamental problem of neural network modeling. The optimal architecture is defined as the one which minimizes the generalization error. This paper addresses estimation of the generalization performance of regularized, complete neural network models. Regularization...

  6. EXPRESSÃO REGULAR NUMÉRICA

    Directory of Open Access Journals (Sweden)

    Bruno Vier Hoffmeister

    2014-06-01

    Full Text Available This article gives the formal definition of the Numeric Regular Expression programming language: a language concept inspired by regular-expression syntax that applies its power and flexibility to numeric sequences.

  7. A detailed and verified wind resource atlas for Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Mortensen, N.G.; Landberg, L.; Rathmann, O.; Nielsen, M.N. [Risoe National Lab., Roskilde (Denmark)]; Nielsen, P. [Energy and Environmental Data, Aalborg (Denmark)]

    1999-03-01

    A detailed and reliable wind resource atlas covering the entire land area of Denmark has been established. Key words of the methodology are wind atlas analysis, interpolation of wind atlas data sets, automated generation of digital terrain descriptions and modelling of local wind climates. The atlas contains wind speed and direction distributions, as well as mean energy densities of the wind, for 12 sectors and four heights above ground level: 25, 45, 70 and 100 m. The spatial resolution is 200 meters in the horizontal. The atlas has been verified by comparison with actual wind turbine power productions from over 1200 turbines. More than 80% of these turbines were predicted to within 10%. The atlas will become available on CD-ROM and on the Internet. (au)

  8. Leveraging Parallel Data Processing Frameworks with Verified Lifting

    Directory of Open Access Journals (Sweden)

    Maaz Bin Safeer Ahmad

    2016-11-01

    Full Text Available Many parallel data frameworks have been proposed in recent years that let sequential programs access parallel processing. To capitalize on the benefits of such frameworks, existing code must often be rewritten to the domain-specific languages that each framework supports. This rewriting, tedious and error-prone, also requires developers to choose the framework that best optimizes performance given a specific workload. This paper describes Casper, a novel compiler that automatically retargets sequential Java code for execution on Hadoop, a parallel data processing framework that implements the MapReduce paradigm. Given a sequential code fragment, Casper uses verified lifting to infer a high-level summary expressed in our program specification language that is then compiled for execution on Hadoop. We demonstrate that Casper automatically translates Java benchmarks into Hadoop. The translated results execute on average 3.3x faster than the sequential implementations and scale better, as well, to larger datasets.
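
    The lifting idea can be illustrated independently of Casper: a sequential accumulation is re-expressed as a map followed by an associative reduce, which is exactly the shape a MapReduce framework can parallelize. The sketch below is a hand-written Python illustration, not Casper output.

        from functools import reduce

        data = [3, 1, 4, 1, 5, 9, 2, 6]

        # Sequential original: accumulate the sum of squares in a loop.
        acc = 0
        for x in data:
            acc += x * x

        # Lifted summary: map each element to x*x, then fold with associative +.
        lifted = reduce(lambda a, b: a + b, map(lambda x: x * x, data), 0)
        assert lifted == acc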

  9. VERIFYING OF BETA CONVERGENCE FOR SOUTH EAST COUNTRIES OF ASIA

    Directory of Open Access Journals (Sweden)

    Michaela Blaško

    2017-05-01

    Full Text Available Convergence means the process of balancing disparities in chosen indicators across homogeneous economic groups. β-convergence is based on the assumption that a less developed economy grows faster than an advanced one, so GDP per capita rises more quickly in the less developed economy. This article verifies β-convergence based on the dependency between the growth of real GDP per capita and the initial level of real GDP per capita (in PPP), and on modifications of this relationship, using the Least Squares Method for 9 countries of South East Asia in different samples from 2000 to 2015. To explain the dependency fully and to calculate a consistent, minimal estimator, dummy variables are used and a structural parameter is created, which eliminates shocks and possible disparities between the chosen countries. Based on the results, convergence was proved only in the sample from 2004 to 2008 among the chosen nine countries of South East Asia.
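
    The core regression is easy to reproduce. The sketch below fits the standard unconditional β-convergence specification by ordinary least squares; the GDP figures are placeholders, not the article's data.

        import numpy as np

        # Hypothetical real GDP per capita (thousands, PPP) for 9 economies.
        gdp_2000 = np.array([1.2, 2.5, 4.0, 0.9, 3.1, 1.8, 5.2, 2.2, 0.7])
        gdp_2015 = np.array([3.0, 4.6, 6.1, 2.4, 5.0, 3.9, 7.0, 4.3, 2.1])

        growth = (np.log(gdp_2015) - np.log(gdp_2000)) / 15.0   # average annual growth
        X = np.column_stack([np.ones_like(growth), np.log(gdp_2000)])
        (alpha, beta), *_ = np.linalg.lstsq(X, growth, rcond=None)
        print(beta)    # beta < 0 indicates convergence: poorer economies grew faster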

  10. How to Verify and Manage the Translational Plagiarism?

    Science.gov (United States)

    Wiwanitkit, Viroj

    2016-01-01

    The use of Google Translate as a tool for determining translational plagiarism is a big challenge. As noted, plagiarism of original papers written in Macedonian and translated into other languages can be verified after computerised translation into those languages. Attempts to screen for translational plagiarism should be supported, and the Google Translate tool might be helpful. Special focus should be placed on any non-English reference that might be the source of plagiarised material, and on any non-English article that might be translated from an original English article; neither can be detected by a simple plagiarism screening tool. It is a hard job for any journal to detect complex translational plagiarism, but the harder job might be managing a detected case effectively. PMID:27703588

  11. Developing an Approach for Analyzing and Verifying System Communication

    Science.gov (United States)

    Stratton, William C.; Lindvall, Mikael; Ackermann, Chris; Sibol, Deane E.; Godfrey, Sally

    2009-01-01

    This slide presentation reviews a project for developing an approach for analyzing and verifying inter-system communications. The motivation for the study was that software systems in the aerospace domain are inherently complex and operate under tight resource constraints, so systems of systems must communicate with each other to fulfill their tasks, and such systems of systems require reliable communications. The technical approach was to develop a system, DynSAVE, that detects communication problems among the systems. The project enhanced the proven Software Architecture Visualization and Evaluation (SAVE) tool to create Dynamic SAVE (DynSAVE). The approach monitors and records low-level network traffic, converts the low-level traffic into meaningful messages, and displays the messages in a way that lets issues be detected.

  13. Modelling and Verifying Communication Failure of Hybrid Systems in HCSP

    DEFF Research Database (Denmark)

    Wang, Shuling; Nielson, Flemming; Nielson, Hanne Riis

    2016-01-01

    Hybrid systems are dynamic systems with interacting discrete computation and continuous physical processes. They have become ubiquitous in our daily life, e.g. automotive, aerospace and medical systems, and in particular, many of them are safety-critical. For a safety-critical hybrid system, the physical process evolves continuously with respect to time, and the discrete controller monitors and controls the physical process in a correct way such that the whole system satisfies the given safety requirements. The safety of hybrid systems depends heavily on the control from the controllers. However, a message may fail to be received, i.e. the communication itself fails to occur. To address this issue, this paper proposes a formal framework by extending HCSP, a formal modeling language for hybrid systems, for modeling and verifying hybrid systems in the absence of receiving messages due to communication failure. We present two inference systems...

  14. Monitoring and verifying changes of organic carbon in soil

    Science.gov (United States)

    Post, W.M.; Izaurralde, R. C.; Mann, L. K.; Bliss, Norman B.

    2001-01-01

    Changes in soil and vegetation management can impact strongly on the rates of carbon (C) accumulation and loss in soil, even over short periods of time. Detecting the effects of such changes in accumulation and loss rates on the amount of C stored in soil presents many challenges. Consideration of the temporal and spatial heterogeneity of soil properties, general environmental conditions, and management history is essential when designing methods for monitoring and projecting changes in soil C stocks. Several approaches and tools will be required to develop reliable estimates of changes in soil C at scales ranging from the individual experimental plot to whole regional and national inventories. In this paper we present an overview of soil properties and processes that must be considered. We classify the methods for determining soil C changes as direct or indirect. Direct methods include field and laboratory measurements of total C, various physical and chemical fractions, and C isotopes. A promising direct method is eddy covariance measurement of CO2 fluxes. Indirect methods include simple and stratified accounting, use of environmental and topographic relationships, and modeling approaches. We present a conceptual plan for monitoring soil C changes at regional scales that can be readily implemented. Finally, we anticipate significant improvements in soil C monitoring with the advent of instruments capable of direct and precise measurements in the field as well as methods for interpreting and extrapolating spatial and temporal information.

  15. The perception of regularity in an isochronous stimulus in zebra finches (Taeniopygia guttata) and humans.

    Science.gov (United States)

    van der Aa, Jeroen; Honing, Henkjan; ten Cate, Carel

    2015-06-01

    Perceiving temporal regularity in an auditory stimulus is considered one of the basic features of musicality. Here we examine whether zebra finches can detect regularity in an isochronous stimulus. Using a go/no go paradigm we show that zebra finches are able to distinguish between an isochronous and an irregular stimulus. However, when the tempo of the isochronous stimulus is changed, it is no longer treated as similar to the training stimulus. Training with three isochronous and three irregular stimuli did not result in improvement of the generalization. In contrast, humans, exposed to the same stimuli, readily generalized across tempo changes. Our results suggest that zebra finches distinguish the different stimuli by learning specific local temporal features of each individual stimulus rather than attending to the global structure of the stimuli, i.e., to the temporal regularity. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Supervised scale-regularized linear convolutionary filters

    DEFF Research Database (Denmark)

    Loog, Marco; Lauze, Francois Bernard

    2017-01-01

    benefit from some form of regularization and, secondly, arguing that the problem of scale has not been taken care of in a very satisfactory manner, we come to a combined resolution of both of these shortcomings by proposing a technique that we coin scale regularization. This regularization problem can...

  17. Regular expressions for decoding of neural network outputs.

    Science.gov (United States)

    Strauß, Tobias; Leifert, Gundram; Grüning, Tobias; Labahn, Roger

    2016-07-01

    This article proposes a convenient tool for decoding the output of neural networks trained by Connectionist Temporal Classification (CTC) for handwritten text recognition. We use regular expressions to describe the complex structures expected in the writing, and the corresponding finite automata are employed to build a decoder. We analyze theoretically which calculations are relevant and which can be avoided; a great speed-up results from an approximation. We conclude that the approximation most likely fails if the regular expression does not match the ground truth, which is not harmful for many applications since the low probability will only be underestimated further. The proposed decoder is very efficient compared to other decoding methods. The variety of applications reaches from information retrieval to full text recognition. We refer to applications where we integrated the proposed decoder successfully. Copyright © 2016 Elsevier Ltd. All rights reserved.
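
    A full automaton-based decoder is beyond a short sketch, but the ingredients can be illustrated: best-path CTC collapsing followed by a regular-expression check of the decoded string. The alphabet, frame posteriors, and pattern below are toy assumptions.

        import re
        import numpy as np

        alphabet = ["-", "a", "b", "1"]        # index 0 is the CTC blank
        probs = np.array([[.1, .7, .1, .1],    # frame-wise label posteriors
                          [.8, .1, .05, .05],
                          [.1, .1, .7, .1],
                          [.1, .1, .1, .7]])

        collapsed, prev = [], 0
        for k in probs.argmax(axis=1):         # CTC collapse: drop repeats and blanks
            if k != prev and k != 0:
                collapsed.append(alphabet[k])
            prev = k
        text = "".join(collapsed)

        pattern = re.compile(r"[ab]+\d")       # expected structure of the writing
        print(text, bool(pattern.fullmatch(text)))   # ab1 True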

  18. Structured Sparsity Regularization Approach to the EEG Inverse Problem

    DEFF Research Database (Denmark)

    Montoya-Martinez, Jair; Artes-Rodriguez, Antonio; Hansen, Lars Kai

    2012-01-01

    Localization of brain activity involves solving the EEG inverse problem, which is an underdetermined ill-posed problem. We propose a novel approach consisting in estimating, using structured sparsity regularization techniques, the Brain Electrical Sources (BES) matrix directly in the spatio-temporal source space. We use proximal splitting optimization methods, which are efficient optimization techniques with good convergence rates and with the ability to handle large nonsmooth convex problems, which is the typical scenario in the EEG inverse problem. We have evaluated our approach in a simulated scenario consisting in estimating a synthetic BES matrix with 5124 sources. We report results using ℓ1 (LASSO), ℓ1/ℓ2 (Group LASSO) and ℓ1 + ℓ1/ℓ2 (Sparse Group LASSO) regularizers.
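
    Proximal splitting methods reduce each regularizer to its proximal operator. The sketch below shows the two building blocks, soft-thresholding for ℓ1 and block soft-thresholding for ℓ1/ℓ2; the Sparse Group LASSO prox composes the two.

        import numpy as np

        def prox_l1(v, t):
            """Soft-thresholding: prox of t * ||v||_1."""
            return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

        def prox_group_l2(v, t, groups):
            """Block soft-thresholding: prox of t * sum_g ||v_g||_2."""
            out = np.zeros_like(v)
            for g in groups:                   # g is an index array for one group
                norm = np.linalg.norm(v[g])
                if norm > t:
                    out[g] = (1.0 - t / norm) * v[g]
            return out

        v = np.array([3.0, -0.5, 0.2, -2.0])
        print(prox_l1(v, 1.0))                                   # [ 2. -0.  0. -1.]
        print(prox_group_l2(v, 1.0, [np.array([0, 1]), np.array([2, 3])]))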

  19. Slimming and regularization of cozero maps

    Directory of Open Access Journals (Sweden)

    Mohamad Mehdi Ebrahimi

    2017-01-01

    Full Text Available Cozero maps are generalized forms of cozero elements. Two particular cases of cozero maps, slim and regular cozero maps, are significant. In this paper we present methods to construct slim and regular cozero maps from a given cozero map; these constructions are called the slimming and the regularization of the cozero map, respectively. Also, we prove that slimming and regularization create reflector functors, so we may say that they are the best methods of constructing slim and regular cozero maps, in the sense of category theory. Finally, we give the slim regularization for a cozero map c:M→L in the general case where A is not a Q-algebra, using the ring and module of fractions in this construction process.

  20. VDVR: verifiable visualization of projection-based data.

    Science.gov (United States)

    Zheng, Ziyi; Xu, Wei; Mueller, Klaus

    2010-01-01

    Practical volume visualization pipelines are never without compromises and errors. A delicate and often-studied component is the interpolation of off-grid samples, where aliasing can lead to misleading artifacts and blurring, potentially hiding fine details of critical importance. The verifiable visualization framework we describe aims to account for these errors directly in the volume generation stage, and we specifically target volumetric data obtained via computed tomography (CT) reconstruction. In this case the raw data are the X-ray projections obtained from the scanner and the volume data generation process is the CT algorithm. Our framework informs the CT reconstruction process of the specific filter intended for interpolation in the subsequent visualization process, and this in turn ensures an accurate interpolation there at a set tolerance. Here, we focus on fast trilinear interpolation in conjunction with an octree-type mixed resolution volume representation without T-junctions. Efficient rendering is achieved by a space-efficient and locality-optimized representation, which can straightforwardly exploit fast fixed-function pipelines on GPUs.
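
    For reference, the trilinear interpolation that the framework accounts for blends the eight voxels surrounding an off-grid position; a plain sketch follows, not the paper's GPU implementation.

        import numpy as np

        def trilinear(vol, x, y, z):
            """Interpolate volume vol (indexed [z, y, x]) at a fractional position."""
            x0, y0, z0 = int(x), int(y), int(z)
            dx, dy, dz = x - x0, y - y0, z - z0
            c = 0.0
            for i in (0, 1):
                for j in (0, 1):
                    for k in (0, 1):
                        w = ((dx if i else 1 - dx) *
                             (dy if j else 1 - dy) *
                             (dz if k else 1 - dz))
                        c += w * vol[z0 + k, y0 + j, x0 + i]
            return c

        vol = np.arange(27, dtype=float).reshape(3, 3, 3)
        print(trilinear(vol, 0.5, 0.5, 0.5))   # midpoint of the first cell: 6.5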

  1. A credit card verifier structure using diffraction and spectroscopy concepts

    Science.gov (United States)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2008-04-01

    We propose and experimentally demonstrate an angle-multiplexing-based optical structure for verifying a credit card. Our key idea comes from the fact that the fine detail of the embossed hologram stamped on the credit card is hard to duplicate, and therefore its key color features can be used to distinguish between real and counterfeit cards. As the embossed hologram is a diffractive optical element, we shine, one at a time, a number of broadband light sources, each at a different incident angle, on the embossed hologram of the credit card, in such a way that a different color spectrum per incident angle is diffracted and separated in space. The number of pixels in each color plane is then investigated, and we apply a feed-forward back-propagation neural network to separate counterfeit credit cards from real ones. Our experimental demonstration, using two off-the-shelf broadband white light emitting diodes, one digital camera, a 3-layer neural network, and a notebook computer, can distinguish all 69 counterfeit credit cards from the eight real credit cards.
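
    The classification stage can be sketched generically: per-color-plane pixel counts become a feature vector for a small feed-forward network. Everything below (feature values, network size) is a toy stand-in, not the paper's measured data.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(1)
        # Six hypothetical color-count features per card.
        real = rng.normal([200, 150, 90, 180, 140, 80], 10, size=(8, 6))
        fake = rng.normal([120, 160, 150, 110, 150, 140], 10, size=(69, 6))
        X = np.vstack([real, fake])
        y = np.array([1] * 8 + [0] * 69)       # 1 = genuine, 0 = counterfeit

        clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                            random_state=0).fit(X, y)
        print(clf.score(X, y))                 # training accuracy on the toy data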

  2. Verifying the Simulation Hypothesis via Infinite Nested Universe Simulacrum Loops

    Science.gov (United States)

    Sharma, Vikrant

    2017-01-01

    The simulation hypothesis proposes that local reality exists as a simulacrum within a hypothetical computer's dimension. More specifically, Bostrom's trilemma proposes that the number of simulations an advanced 'posthuman' civilization could produce makes the proposition very likely. In this paper a hypothetical method to verify the simulation hypothesis is discussed using infinite regression applied to a new type of infinite loop. Assign dimension n to any computer in our present reality, where dimension signifies the hierarchical level in nested simulations our reality exists in. A computer simulating known reality would be dimension (n-1), and likewise a computer simulating an artificial reality, such as a video game, would be dimension (n+1). In this method, among others, four key assumptions are made about the nature of the original computer dimension n. Summations show that regressing such a reality infinitely will create convergence, implying that the verification of whether local reality is a grand simulation is feasible to detect with adequate compute capability. The action of reaching said convergence point halts the simulation of local reality. Sensitivities to the four assumptions and implications are discussed.

  3. Updating representations of temporal intervals.

    Science.gov (United States)

    Danckert, James; Anderson, Britt

    2015-12-01

    Effectively engaging with the world depends on accurate representations of the regularities that make up that world-what we call mental models. The success of any mental model depends on the ability to adapt to changes-to 'update' the model. In prior work, we have shown that damage to the right hemisphere of the brain impairs the ability to update mental models across a range of tasks. Given the disparate nature of the tasks we have employed in this prior work (i.e. statistical learning, language acquisition, position priming, perceptual ambiguity, strategic game play), we propose that a cognitive module important for updating mental representations should be generic, in the sense that it is invoked across multiple cognitive and perceptual domains. To date, the majority of our tasks have been visual in nature. Given the ubiquity and import of temporal information in sensory experience, we examined the ability to build and update mental models of time. We had healthy individuals complete a temporal prediction task in which intervals were initially drawn from one temporal range before an unannounced switch to a different range of intervals. Separate groups had the second range of intervals switch to one that contained either longer or shorter intervals than the first range. Both groups showed significant positive correlations between perceptual and prediction accuracy. While each group updated mental models of temporal intervals, those exposed to shorter intervals did so more efficiently. Our results support the notion of generic capacity to update regularities in the environment-in this instance based on temporal information. The task developed here is well suited to investigations in neurological patients and in neuroimaging settings.

  4. Regularity extraction from non-adjacent sounds.

    Science.gov (United States)

    Bendixen, Alexandra; Schröger, Erich; Ritter, Walter; Winkler, István

    2012-01-01

    The regular behavior of sound sources helps us to make sense of the auditory environment. Regular patterns may, for instance, convey information on the identity of a sound source (such as the acoustic signature of a train moving on the rails). Yet typically, this signature overlaps in time with signals emitted from other sound sources. It is generally assumed that auditory regularity extraction cannot operate upon this mixture of signals because it only finds regularities between adjacent sounds. In this view, the auditory environment would be grouped into separate entities by means of readily available acoustic cues such as separation in frequency and location. Regularity extraction processes would then operate upon the resulting groups. Our new experimental evidence challenges this view. We presented two interleaved sound sequences which overlapped in frequency range and shared all acoustic parameters. The sequences only differed in their underlying regular patterns. We inserted deviants into one of the sequences to probe whether the regularity was extracted. In the first experiment, we found that these deviants elicited the mismatch negativity (MMN) component. Thus the auditory system was able to find the regularity between the non-adjacent sounds. Regularity extraction was not influenced by sequence cohesiveness as manipulated by the relative duration of tones and silent inter-tone-intervals. In the second experiment, we showed that a regularity connecting non-adjacent sounds was discovered only when the intervening sequence also contained a regular pattern, but not when the intervening sounds were randomly varying. This suggests that separate regular patterns are available to the auditory system as a cue for identifying signals coming from distinct sound sources. Thus auditory regularity extraction is not necessarily confined to a processing stage after initial sound grouping, but may precede grouping when other acoustic cues are unavailable.

  5. Scenarios for exercising technical approaches to verified nuclear reductions

    Energy Technology Data Exchange (ETDEWEB)

    Doyle, James [Los Alamos National Laboratory

    2010-01-01

    Presidents Obama and Medvedev in April 2009 committed to a continuing process of step-by-step nuclear arms reductions beyond the new START treaty that was signed April 8, 2010, and to the eventual goal of a world free of nuclear weapons. In addition, the US Nuclear Posture Review released April 6, 2010 commits the US to initiate a comprehensive national research and development program to support continued progress toward a world free of nuclear weapons, including expanded work on verification technologies and the development of transparency measures. It is impossible to predict the specific directions that US-RU nuclear arms reductions will take over the next 5-10 years. Additional bilateral treaties could be reached requiring effective verification, as indicated by statements made by the Obama administration. There could also be transparency agreements or other initiatives (unilateral, bilateral or multilateral) that require monitoring with a standard of verification lower than formal arms control, but that still need to establish confidence for domestic, bilateral and multilateral audiences that declared actions are implemented. The US Nuclear Posture Review and other statements give some indication of the kinds of actions and declarations that may need to be confirmed in a bilateral or multilateral setting. Several new elements of the nuclear arsenals could be directly limited. For example, it is likely that both strategic and nonstrategic nuclear warheads (deployed and in storage), warhead components, and aggregate stocks of such items could be accountable under a future treaty or transparency agreement. In addition, new initiatives or agreements may require the verified dismantlement of a certain number of nuclear warheads over a specified time period. Eventually, procedures for confirming the elimination of nuclear warheads, components and fissile materials from military stocks will need to be established. This paper is intended to provide useful background information.

  6. Relative clock verifies endogenous bursts of human dynamics

    Science.gov (United States)

    Zhou, Tao; Zhao, Zhi-Dan; Yang, Zimo; Zhou, Changsong

    2012-01-01

    Temporal bursts are widely observed in many human-activated systems; they may result both from endogenous mechanisms, like the highest-priority-first protocol, and from exogenous factors, like the seasonality of activities. Distinguishing the effects of these different mechanisms is thus of theoretical significance. This letter reports a new timing method using a relative clock: the time length between two consecutive events of an agent is counted as the number of other agents' events that appear during this interval. We propose a model in which agents act either at a constant rate or with a power-law inter-event time distribution, and the global activity either stays constant or varies periodically with time. Our analysis shows that the bursts caused by the heterogeneity of global activity can be eliminated by setting the relative clock, yet the bursts from real individual behaviors still exist. We perform extensive experiments on four large-scale systems: the search engine of AOL, a social bookmarking system (Delicious), a short-message communication network, and a microblogging system (Twitter). Seasonality of global activity is observed, yet the bursts cannot be eliminated by using the relative clock.
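
    Computing relative-clock inter-event times is straightforward; a sketch under the stated definition (function and variable names are ours):

        from bisect import bisect_left, bisect_right

        def relative_gaps(agent_times, other_times):
            """Gap between consecutive events of one agent, counted in number
            of other agents' events falling strictly inside the interval."""
            other_times = sorted(other_times)
            return [bisect_left(other_times, t1) - bisect_right(other_times, t0)
                    for t0, t1 in zip(agent_times, agent_times[1:])]

        print(relative_gaps([1, 10, 12], [2, 3, 5, 11, 20]))   # [3, 1]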

  8. Regularity effect in prospective memory during aging

    Directory of Open Access Journals (Sweden)

    Geoffrey Blondelle

    2016-10-01

    Full Text Available Background: Regularity effects can affect performance in prospective memory (PM), but little is known about the cognitive processes linked to this effect. Moreover, its impact with regard to aging remains unknown. To our knowledge, this study is the first to examine the regularity effect in PM in a lifespan perspective, with a sample of young, intermediate, and older adults. Objective and design: Our study examined the regularity effect in PM in three groups of participants: 28 young adults (18–30), 16 intermediate adults (40–55), and 25 older adults (65–80). The task, adapted from the Virtual Week, was designed to manipulate the regularity of the various activities of daily life that were to be recalled (regular repeated activities vs. irregular non-repeated activities). We examine the role of several cognitive functions, including certain dimensions of executive functions (planning, inhibition, shifting, and binding), short-term memory, and retrospective episodic memory, to identify those involved in PM, according to regularity and age. Results: A mixed-design ANOVA showed a main effect of task regularity and an interaction between age and regularity: an age-related difference in PM performance was found for irregular activities (older < young), but not for regular activities. All participants recalled more regular activities than irregular ones, with no age effect. It appeared that recall of regular activities only involved planning for both intermediate and older adults, while recall of irregular ones was linked to planning, inhibition, short-term memory, binding, and retrospective episodic memory. Conclusion: Taken together, our data suggest that planning capacities seem to play a major role in remembering to perform intended actions with advancing age. Furthermore, the age-PM paradox may be attenuated when the experimental design is adapted by implementing a familiar context through the use of activities of daily living. The clinical

  9. The regularization of Old English weak verbs

    OpenAIRE

    Marta Tío Sáenz

    2015-01-01

    [EN] This article deals with the regularization of non-standard spellings of the verbal forms extracted from a corpus. It addresses the question of what the limits of regularization are when lemmatizing Old English weak verbs. The purpose of such regularization, also known as normalization, is to carry out lexicological analysis or lexicographical work. The analysis concentrates on weak verbs from the second class and draws on the lexical database of Old English Nerthus, which has incorporate...

  10. Verifying cell loss requirements in high-speed communication networks

    Directory of Open Access Journals (Sweden)

    Kerry W. Fendick

    1998-01-01

    Full Text Available In high-speed communication networks it is common to have requirements of very small cell loss probabilities due to buffer overflow. Losses are measured to verify that the cell loss requirements are being met, but it is not clear how to interpret such measurements. We propose methods for determining whether or not cell loss requirements are being met. A key idea is to look at the stream of losses as successive clusters of losses. Often clusters of losses, rather than individual losses, should be regarded as the important “loss events”. Thus we propose modeling the cell loss process by a batch Poisson stochastic process. Successive clusters of losses are assumed to arrive according to a Poisson process. Within each cluster, cell losses do not occur at a single time, but the distance between losses within a cluster should be negligible compared to the distance between clusters. Thus, for the purpose of estimating the cell loss probability, we ignore the spaces between successive cell losses in a cluster of losses. Asymptotic theory suggests that the counting process of losses initiating clusters often should be approximately a Poisson process even though the cell arrival process is not nearly Poisson. The batch Poisson model is relatively easy to test statistically and fit; e.g., the batch-size distribution and the batch arrival rate can readily be estimated from cell loss data. Since batch (cluster) sizes may be highly variable, it may be useful to focus on the number of batches instead of the number of cells in a measurement interval. We also propose a method for approximately determining the parameters of a special batch Poisson cell loss process with geometric batch-size distribution from a queueing model of the buffer content. For this step, we use a reflected Brownian motion (RBM) approximation of a G/D/1/C queueing model. We also use the RBM model to estimate the input burstiness given the cell loss rate. In addition, we use the RBM model to
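
    A minimal sketch of the fitting step this model suggests: merge losses separated by less than a gap threshold into clusters, then estimate the Poisson cluster rate and the geometric batch-size parameter by moments. The loss times and threshold below are illustrative.

        import numpy as np

        loss_times = np.array([0.01, 0.012, 0.013, 4.2, 9.7, 9.71, 9.72, 9.74, 15.0])
        GAP = 0.5                              # assumed cluster-separation threshold

        splits = np.where(np.diff(loss_times) > GAP)[0] + 1
        clusters = np.split(loss_times, splits)

        T = loss_times[-1] - loss_times[0]
        batch_rate = len(clusters) / T         # Poisson rate of cluster arrivals
        sizes = np.array([len(c) for c in clusters])
        p_geo = 1.0 / sizes.mean()             # geometric(p): mean batch size is 1/p
        print(len(clusters), batch_rate, p_geo)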

  11. VISION User Guide - VISION (Verifiable Fuel Cycle Simulation) Model

    Energy Technology Data Exchange (ETDEWEB)

    Jacob J. Jacobson; Robert F. Jeffers; Gretchen E. Matthern; Steven J. Piet; Benjamin A. Baker; Joseph Grimm

    2009-08-01

    The purpose of this document is to provide a guide for using the current version of the Verifiable Fuel Cycle Simulation (VISION) model. This is a complex model with many parameters; the user is strongly encouraged to read this user guide before attempting to run the model. This model is an R&D work in progress and may contain errors and omissions. It is based upon numerous assumptions. This model is intended to assist in evaluating “what if” scenarios and in comparing fuel, reactor, and fuel processing alternatives at a systems level for U.S. nuclear power. The model is not intended as a tool for process flow and design modeling of specific facilities nor for tracking individual units of fuel or other material through the system. The model is intended to examine the interactions among the components of a fuel system as a function of time-varying system parameters; this model represents a dynamic rather than steady-state approximation of the nuclear fuel system. VISION models the nuclear cycle at the system level, not individual facilities, e.g., “reactor types” not individual reactors and “separation types” not individual separation plants. Natural uranium can be enriched, which produces enriched uranium, which goes into fuel fabrication, and depleted uranium (DU), which goes into storage. Fuel is transformed (transmuted) in reactors and then goes into a storage buffer. Used fuel can be pulled from storage into either separations or disposal. If sent to separations, fuel is transformed (partitioned) into fuel products, recovered uranium, and various categories of waste. Recycled material is stored until used by its assigned reactor type. Note that recovered uranium is itself often partitioned: some RU flows with recycled transuranic elements, some flows with wastes, and the rest is designated RU. RU comes out of storage if needed to correct the U/TRU ratio in new recycled fuel. Neither RU nor DU are designated as wastes. VISION is comprised of several

  12. Iterative regularization with minimum-residual methods

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg; Hansen, Per Christian

    2006-01-01

    We study the regularization properties of iterative minimum-residual methods applied to discrete ill-posed problems. In these methods, the projection onto the underlying Krylov subspace acts as a regularizer, and the emphasis of this work is on the role played by the basis vectors of these Krylov subspaces. We provide a combination of theory and numerical examples, and our analysis confirms the experience that MINRES and MR-II can work as general regularization methods. We also demonstrate theoretically and experimentally that the same is not true, in general, for GMRES and RRGMRES - their success as regularization methods is highly problem dependent.

  14. Bit-coded regular expression parsing

    DEFF Research Database (Denmark)

    Nielsen, Lasse; Henglein, Fritz

    2011-01-01

    Regular expression parsing is the problem of producing a parse tree of a string for a given regular expression. We show that a compact bit representation of a parse tree can be produced efficiently, in time linear in the product of input string size and regular expression size, by simplifying the DFA-based parsing algorithm due to Dubé and Feeley to emit the bits of the bit representation without explicitly materializing the parse tree itself. We furthermore show that Frisch and Cardelli’s greedy regular expression parsing algorithm can be straightforwardly modified to produce bit codings...
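
    The bit-coding idea can be sketched with a naive matcher that emits one bit at each alternation and each star iteration, so the bit string uniquely identifies the parse tree. This is a simplified greedy matcher, not the paper's DFA-based linear-time algorithm, and it does not backtrack across concatenations.

        from dataclasses import dataclass

        @dataclass
        class Chr:                             # literal character
            c: str

        @dataclass
        class Alt:                             # e1 | e2
            left: object
            right: object

        @dataclass
        class Seq:                             # e1 e2
            left: object
            right: object

        @dataclass
        class Star:                            # e*
            inner: object

        def bitcode(rx, s, i=0):
            """Greedy parse of s[i:]; returns (bits, next_i) or None. Bits appear
            only at choice points: Alt (0=left, 1=right), Star (0=again, 1=stop)."""
            if isinstance(rx, Chr):
                return ("", i + 1) if i < len(s) and s[i] == rx.c else None
            if isinstance(rx, Seq):
                m1 = bitcode(rx.left, s, i)
                if m1 is None:
                    return None
                m2 = bitcode(rx.right, s, m1[1])
                if m2 is None:
                    return None
                return (m1[0] + m2[0], m2[1])
            if isinstance(rx, Alt):
                for bit, branch in (("0", rx.left), ("1", rx.right)):
                    m = bitcode(branch, s, i)
                    if m is not None:
                        return (bit + m[0], m[1])
                return None
            if isinstance(rx, Star):
                m = bitcode(rx.inner, s, i)
                if m is not None and m[1] > i:     # progress check: no empty loops
                    rest = bitcode(rx, s, m[1])
                    if rest is not None:
                        return ("0" + m[0] + rest[0], rest[1])
                return ("1", i)

        rx = Star(Alt(Chr("a"), Chr("b")))     # (a|b)*
        print(bitcode(rx, "ab"))               # ('00011', 2): a, then b, then stop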

  15. Adaptive regularization using the entire solution surface.

    Science.gov (United States)

    Wu, S; Shen, X; Geyer, C J

    2009-09-01

    Several sparseness penalties have been suggested for delivery of good predictive performance in automatic variable selection within the framework of regularization. All assume that the true model is sparse. We propose a penalty, a convex combination of the L1- and L∞-norms, that adapts to a variety of situations including sparseness and nonsparseness, grouping and nongrouping. The proposed penalty performs grouping and adaptive regularization. In addition, we introduce a novel homotopy algorithm utilizing subgradients for developing regularization solution surfaces involving multiple regularizers. This permits efficient computation and adaptive tuning. Numerical experiments are conducted using simulation. In simulated and real examples, the proposed penalty compares well against popular alternatives.
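
    The penalty itself is easy to state; a sketch follows, where the placement of the mixing weight tau is our assumption rather than necessarily the paper's exact parameterization.

        import numpy as np

        def l1_linf_penalty(beta, tau):
            """Convex combination of the L1 and L-infinity norms."""
            return tau * np.abs(beta).sum() + (1.0 - tau) * np.abs(beta).max()

        beta = np.array([0.0, 0.5, -0.5, 2.0])
        for tau in (0.0, 0.5, 1.0):            # pure L-inf, mixed, pure L1
            print(tau, l1_linf_penalty(beta, tau))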

  16. Channeling power across ecological systems: social regularities in community organizing.

    Science.gov (United States)

    Christens, Brian D; Inzeo, Paula Tran; Faust, Victoria

    2014-06-01

    Relational and social network perspectives provide opportunities for more holistic conceptualizations of phenomena of interest in community psychology, including power and empowerment. In this article, we apply these tools to build on multilevel frameworks of empowerment by proposing that networks of relationships between individuals constitute the connective spaces between ecological systems. Drawing on an example of a model for grassroots community organizing practiced by WISDOM—a statewide federation supporting local community organizing initiatives in Wisconsin—we identify social regularities (i.e., relational and temporal patterns) that promote empowerment and the development and exercise of social power through building and altering relational ties. Through an emphasis on listening-focused one-to-one meetings, reflection, and social analysis, WISDOM organizing initiatives construct and reinforce social regularities that develop social power in the organizing initiatives and advance psychological empowerment among participant leaders in organizing. These patterns are established by organizationally driven brokerage and mobilization of interpersonal ties, some of which span ecological systems. Hence, elements of these power-focused social regularities can be conceptualized as cross-system channels through which micro-level empowerment processes feed into macro-level exercise of social power, and vice versa. We describe examples of these channels in action, and offer recommendations for theory and design of future action research.

  18. Regularity and predictability of human mobility in personal space.

    Science.gov (United States)

    Austin, Daniel; Cross, Robin M; Hayes, Tamara; Kaye, Jeffrey

    2014-01-01

    Fundamental laws governing human mobility have many important applications such as forecasting and controlling epidemics or optimizing transportation systems. These mobility patterns, studied in the context of out of home activity during travel or social interactions with observations recorded from cell phone use or diffusion of money, suggest that in extra-personal space humans follow a high degree of temporal and spatial regularity - most often in the form of time-independent universal scaling laws. Here we show that mobility patterns of older individuals in their home also show a high degree of predictability and regularity, although in a different way than has been reported for out-of-home mobility. Studying a data set of almost 15 million observations from 19 adults spanning up to 5 years of unobtrusive longitudinal home activity monitoring, we find that in-home mobility is not well represented by a universal scaling law, but that significant structure (predictability and regularity) is uncovered when explicitly accounting for contextual data in a model of in-home mobility. These results suggest that human mobility in personal space is highly stereotyped, and that monitoring discontinuities in routine room-level mobility patterns may provide an opportunity to predict individual human health and functional status or detect adverse events and trends.

  19. A regularized stationary mean-field game

    KAUST Repository

    Yang, Xianjin

    2016-04-19

    In this thesis, we discuss the existence and numerical approximation of solutions of a regularized mean-field game with a low-order regularization. In the first part, we prove a priori estimates and use the continuation method to obtain the existence of a solution with a positive density. Finally, we introduce the monotone flow method and solve the system numerically.

  20. Automating InDesign with Regular Expressions

    CERN Document Server

    Kahrel, Peter

    2006-01-01

    If you need to make automated changes to InDesign documents beyond what basic search and replace can handle, you need regular expressions, and a bit of scripting to make them work. This Short Cut explains both how to write regular expressions, so you can find and replace the right things, and how to use them in InDesign specifically.
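
    To give a flavor of the patterns the Short Cut covers: InDesign's GREP find/change is scripted from JavaScript, but the expressions themselves carry over directly, so the hedged sketch below uses Python's re module purely for illustration; the sample text and substitutions are invented.

        import re

        text = 'She said "fine" -- then waited...   and left.'

        # Typical typographic cleanup beyond plain search and replace:
        # collapse runs of spaces, turn double hyphens into an en dash,
        # and curl straight double quotes.
        text = re.sub(r" {2,}", " ", text)
        text = re.sub(r"--", "\u2013", text)
        text = re.sub(r'"([^"]*)"', "\u201c\\1\u201d", text)
        print(text)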

  1. Regular Event Structures and Finite Petri Nets

    DEFF Research Database (Denmark)

    Nielsen, M.; Thiagarajan, P.S.

    2002-01-01

    We present the notion of regular event structures and conjecture that they correspond exactly to finite 1-safe Petri nets. We show that the conjecture holds for the conflict-free case. Even in this restricted setting, the proof is non-trivial and involves a natural subclass of regular event...

  2. On infinite regular and chiral maps

    OpenAIRE

    Arredondo, John A.; Valdez, Camilo Ramírez y Ferrán

    2015-01-01

    We prove that infinite regular and chiral maps take place on surfaces with at most one end. Moreover, we prove that an infinite regular or chiral map on an orientable surface with genus can only be realized on the Loch Ness monster, that is, the topological surface of infinite genus with one end.

  3. Declining trend in the incidence of biopsy-verified coeliac disease in the adult population of Finland, 2005-2014.

    Science.gov (United States)

    Virta, L J; Saarinen, M M; Kolho, K-L

    2017-12-01

    The frequency of coeliac disease (CD) has been on the rise over the past decades, especially in Western Europe, but current trends are unclear. To investigate recent temporal changes in the incidence of adult, biopsy-verified coeliac disease and dermatitis herpetiformis (DH) in Finland, a country with a high frequency of coeliac disease, all coeliac disease and DH cases diagnosed at age 20-79 years during 2005-2014 were retrieved from a nationwide database documenting all applicants for monthly compensation to cover the extra cost of maintaining a gluten-free diet. This benefit is granted on the basis of histology, not socioeconomic status. Temporal trends in the annual incidences were estimated using Poisson regression analyses. The total incidence of coeliac disease decreased from 33/100 000 during the years 2005-2006 to 29/100 000 during 2013-2014. The mean annual incidence of coeliac disease was nearly twice as high among women as among men, 42 vs 22 per 100 000, respectively. For middle- and old-aged women, the average rate of decrease in incidence was 4.8% (95% CI 3.9-5.7) per year and for men 3.0% (1.8-4.1) (P for linear trend significant for both). Although the frequency of coeliac disease has increased during the past decades, the incidence of biopsy-verified diagnoses is not increasing, which suggests that exposure to yet unidentified triggering factors for coeliac disease has plateaued among the Finnish adult population. © 2017 John Wiley & Sons Ltd.

  4. Visual mismatch negativity reveals automatic detection of sequential regularity violation.

    Science.gov (United States)

    Stefanics, Gábor; Kimura, Motohiro; Czigler, István

    2011-01-01

    Sequential regularities are abstract rules based on repeating sequences of environmental events, which are useful for making predictions about future events. Here, we tested whether the visual system is capable of detecting sequential regularity in unattended stimulus sequences. The visual mismatch negativity (vMMN) component of the event-related potentials is sensitive to the violation of complex regularities (e.g., object-related characteristics, temporal patterns). We used the vMMN component as an index of the violation of conditional (if-then) regularities. In the first experiment, to investigate the emergence of vMMN and other change-related activity to the violation of conditional rules, red and green disk patterns were delivered in pairs. The majority of pairs comprised disk patterns with identical colors, whereas in deviant pairs the colors were different. The probabilities of the two colors were equal. The second member of the deviant pairs elicited a vMMN, with longer latency and more extended spatial distribution for deviants with lower probability (10 vs. 30%). In the second (control) experiment the emergence of vMMN to the violation of a simple, feature-related rule was studied using oddball sequences of stimulus pairs where deviant colors were presented with 20% probability. Deviant colored patterns elicited a vMMN, and this component was larger for the second member of the pair, i.e., after a shorter inter-stimulus interval. This result corresponds to the SOA/(v)MMN relationship expected on the basis of a memory-mismatch process. Our results show that the system underlying vMMN is sensitive to abstract, conditional rules. Representation of such rules implicates expectation of a subsequent event; therefore vMMN can be considered a correlate of violated predictions about the characteristics of environmental events.

  5. Regular Simple Queues of Protein Contact Maps.

    Science.gov (United States)

    Guo, Qiang-Hui; Sun, Lisa Hui; Wang, Jian

    2017-01-01

    A protein fold can be viewed as a self-avoiding walk in certain lattice model, and its contact map is a graph that represents the patterns of contacts in the fold. Goldman, Istrail, and Papadimitriou showed that a contact map in the 2D square lattice can be decomposed into at most two stacks and one queue. In the terminology of combinatorics, stacks and queues are noncrossing and nonnesting partitions, respectively. In this paper, we are concerned with 2-regular and 3-regular simple queues, for which the degree of each vertex is at most one and the arc lengths are at least 2 and 3, respectively. We show that 2-regular simple queues are in one-to-one correspondence with hill-free Motzkin paths, which have been enumerated by Barcucci, Pergola, Pinzani, and Rinaldi by using the Enumerating Combinatorial Objects method. We derive a recurrence relation for the generating function of Motzkin paths with [Formula: see text] peaks at level i, which reduces to the generating function for hill-free Motzkin paths. Moreover, we show that 3-regular simple queues are in one-to-one correspondence with Motzkin paths avoiding certain patterns. Then we obtain a formula for the generating function of 3-regular simple queues. Asymptotic formulas for 2-regular and 3-regular simple queues are derived based on the generating functions.
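
    To make the objects in this correspondence tangible, here is a brute-force sketch (an illustration of the definitions, not the authors' enumeration method): it generates all Motzkin paths of a given length and counts those avoiding a hill, i.e. an up step taken on the x-axis followed immediately by a down step.

        def motzkin_paths(n):
            """All Motzkin paths of length n as strings over U (up),
            D (down), F (flat) that stay >= 0 and end at height 0."""
            def rec(prefix, height, remaining):
                if remaining == 0:
                    if height == 0:
                        yield prefix
                    return
                if height + 1 <= remaining - 1:  # room left to descend
                    yield from rec(prefix + "U", height + 1, remaining - 1)
                yield from rec(prefix + "F", height, remaining - 1)
                if height > 0:
                    yield from rec(prefix + "D", height - 1, remaining - 1)
            yield from rec("", 0, n)

        def has_hill(path):
            """A hill: a U step starting at level 0, immediately followed by D."""
            height = 0
            for a, b in zip(path, path[1:]):
                if height == 0 and a == "U" and b == "D":
                    return True
                height += {"U": 1, "D": -1, "F": 0}[a]
            return False

        for n in range(1, 8):
            paths = list(motzkin_paths(n))
            print(n, len(paths), sum(not has_hill(p) for p in paths))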

  6. Multiple graph regularized protein domain ranking

    Directory of Open Access Journals (Sweden)

    Wang Jim

    2012-11-01

    Full Text Available Background: Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods. Results: To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods. Conclusion: The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications.

  7. Multiple graph regularized protein domain ranking

    KAUST Repository

    Wang, Jim Jing-Yan

    2012-11-19

    Background: Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods. Results: To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods. Conclusion: The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications. © 2012 Wang et al; licensee BioMed Central Ltd.
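
    The core computation is easy to sketch. The snippet below solves a graph-regularized ranking problem with several graphs combined under fixed weights; MultiG-Rank's joint, alternating learning of those weights is deliberately omitted, and all matrices here are toy assumptions rather than protein-domain data.

        import numpy as np

        def graph_laplacian(W):
            """Unnormalized Laplacian L = D - W of a similarity matrix W."""
            return np.diag(W.sum(axis=1)) - W

        def multi_graph_rank(similarities, weights, y, lam=1.0):
            """Ranking scores f minimizing ||f - y||^2 + lam * f'(sum_k w_k L_k)f.
            Joint learning of `weights`, as in MultiG-Rank, is omitted here."""
            n = len(y)
            L = sum(w * graph_laplacian(S) for w, S in zip(weights, similarities))
            return np.linalg.solve(np.eye(n) + lam * L, y)

        # Toy example: two graphs over 4 items; the query relates to item 0.
        S1 = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
        S2 = np.array([[0, 0, 1, 0], [0, 0, 0, 1], [1, 0, 0, 0], [0, 1, 0, 0]], float)
        y = np.array([1.0, 0.0, 0.0, 0.0])
        print(multi_graph_rank([S1, S2], [0.5, 0.5], y))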

  8. UTP and Temporal Logic Model Checking

    Science.gov (United States)

    Anderson, Hugh; Ciobanu, Gabriel; Freitas, Leo

    In this paper we give an additional perspective on the formal verification of programs through temporal logic model checking, which uses Hoare and He's Unifying Theories of Programming (UTP). Our perspective emphasizes the use of UTP designs, an alphabetised relational calculus expressed as a pre/post condition pair of relations, to verify state or temporal assertions about programs. The temporal model checking relation is derived from a satisfaction relation between the model and its properties. The contribution of this paper is that it shows a UTP perspective on temporal logic model checking. The approach includes the notion of efficiency found in traditional model checkers, which reduce the state-explosion problem through the use of efficient data structures.

  9. Completeness and regularity of generalized fuzzy graphs.

    Science.gov (United States)

    Samanta, Sovan; Sarkar, Biswajit; Shin, Dongmin; Pal, Madhumangal

    2016-01-01

    Fuzzy graphs are the backbone of many real systems like networks, images, scheduling, etc. However, restrictions on edges limit the systems that fuzzy graphs can represent; generalized fuzzy graphs are appropriate for avoiding such restrictions. In this study, generalized fuzzy graphs are introduced and their matrix representation is described. Completeness and regularity are two important parameters of graph theory. Here, regular and complete generalized fuzzy graphs are introduced and some of their properties are discussed. Finally, effective regular graphs are exemplified.

  10. J-regular rings with injectivities

    OpenAIRE

    Shen, Liang

    2010-01-01

    A ring $R$ is called a J-regular ring if R/J(R) is von Neumann regular, where J(R) is the Jacobson radical of R. It is proved that if R is J-regular, then (i) R is right n-injective if and only if every homomorphism from an $n$-generated small right ideal of $R$ to $R_{R}$ can be extended to one from $R_{R}$ to $R_{R}$; (ii) R is right FP-injective if and only if R is right (J, R)-FP-injective. Some known results are improved.

  11. 75 FR 876 - Agency Information Collection Activities: E-Verify Data Collection Survey, New Information...

    Science.gov (United States)

    2010-01-06

    ... SECURITY U.S. Citizenship and Immigration Services Agency Information Collection Activities: E-Verify Data... Collection Under Review: E-Verify Data Collection Survey, Control No. OMB-55. The Department of Homeland... Collection: New information collection. (2) Title of the Form/Collection: E-Verify Data Collection. (3...

  12. 31 CFR 363.14 - How will you verify my identity?

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false How will you verify my identity? 363... you verify my identity? (a) Individual. When you establish an account, we may use a verification service to verify your identity using information you provide about yourself on the online application. At...

  13. Lattice-Valued Convergence Spaces: Weaker Regularity and p-Regularity

    Directory of Open Access Journals (Sweden)

    Lingqiang Li

    2014-01-01

    Full Text Available By using some lattice-valued Kowalsky dual diagonal conditions, some weaker regularities for Jäger’s generalized stratified L-convergence spaces and those for Boustique et al’s stratified L-convergence spaces are defined and studied. Here, the lattice L is a complete Heyting algebra. Some characterizations and properties of the weaker regularities are presented. For Jäger’s generalized stratified L-convergence spaces, a notion of closures of stratified L-filters is introduced and then a new p-regularity is defined. Finally, the relationships between p-regularities and weaker regularities are established.

  14. Regularity of optimal transport maps and applications

    CERN Document Server

    Philippis, Guido

    2013-01-01

    In this thesis, we study the regularity of optimal transport maps and its applications to the semi-geostrophic system. The first two chapters survey the known theory; in particular there is a self-contained proof of Brenier's theorem on the existence of optimal transport maps and of Caffarelli's theorem on the Hölder continuity of optimal maps. In the third and fourth chapters we start investigating the Sobolev regularity of optimal transport maps, while in Chapter 5 we show how the above-mentioned results allow us to prove the existence of Eulerian solutions to the semi-geostrophic equation. In Chapter 6 we prove partial regularity of optimal maps with respect to generic cost functions (it is well known that in this case global regularity cannot be expected). More precisely, we show that if the target and source measures have smooth densities, the optimal map is always smooth outside a closed set of measure zero.

  15. Mixed-norm regularization for brain decoding.

    Science.gov (United States)

    Flamary, R; Jrad, N; Phlypo, R; Congedo, M; Rakotomamonjy, A

    2014-01-01

    This work investigates the use of mixed-norm regularization for sensor selection in event-related potential (ERP) based brain-computer interfaces (BCI). The classification problem is cast as a discriminative optimization framework where sensor selection is induced through the use of mixed-norms. This framework is extended to the multitask learning situation where several similar classification tasks related to different subjects are learned simultaneously. In this case, multitask learning helps to mitigate the data scarcity issue, yielding more robust classifiers. For this purpose, we have introduced a regularizer that induces both sensor selection and classifier similarities. The different regularization approaches are compared on three ERP datasets, showing the interest of mixed-norm regularization in terms of sensor selection. The multitask approaches are evaluated when a small number of learning examples are available, yielding significant performance improvements, especially for subjects performing poorly.

  16. Iterative regularization method in generalized inverse beamforming

    Science.gov (United States)

    Zhang, Zhifei; Chen, Si; Xu, Zhongming; He, Yansong; Li, Shu

    2017-05-01

    Beamforming based on a microphone array is a method to identify sound sources. It can visualize the sound field of the source plane and reveal interesting acoustic information. Generalized inverse beamforming (GIB) is one important branch of beamforming techniques due to its high identification accuracy and computational efficiency. However, in real testing situations, errors caused by measurement noise and configuration problems may seriously reduce the beamforming accuracy. As an inverse problem, the stability of GIB can be improved with regularization methods. We propose a new iterative regularization method for GIB that iteratively redefines the form of the regularization matrix and calculates the corresponding solution. Moreover, the new method is applied to functional beamforming and double-layer antenna beamforming, respectively. Numerical simulations and experiments are implemented. The results show that the proposed regularization method leads to more robust beamforming output and higher accuracy in both applications.
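
    One generic way to "iteratively redefine the regularization matrix" is an iteratively re-weighted Tikhonov solve, in which the penalty is rebuilt from the current source estimate so that weak sources are driven toward zero. The sketch below is a stand-in under that assumption, not the paper's exact update rule; the array sizes are toy values.

        import numpy as np

        def irls_generalized_inverse(A, p, lam=1e-2, iters=20, eps=1e-6):
            """Iteratively re-weighted Tikhonov estimate of source strengths q
            from microphone pressures p = A q; rebuilding the penalty from the
            current estimate favors sparse source maps."""
            q = np.linalg.lstsq(A, p, rcond=None)[0]
            for _ in range(iters):
                w = 1.0 / np.maximum(np.abs(q), eps)  # small |q_i| -> big penalty
                H = A.conj().T @ A + lam * np.diag(w ** 2)
                q = np.linalg.solve(H, A.conj().T @ p)
            return q

        # Toy test: 8 microphones, 12 candidate sources, 2 truly active.
        rng = np.random.default_rng(0)
        A = rng.standard_normal((8, 12)) + 1j * rng.standard_normal((8, 12))
        q_true = np.zeros(12, dtype=complex)
        q_true[[2, 7]] = [1.0, 0.5]
        print(np.round(np.abs(irls_generalized_inverse(A, A @ q_true)), 2))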

  17. The regularization of Old English weak verbs

    Directory of Open Access Journals (Sweden)

    Marta Tío Sáenz

    2015-07-01

    Full Text Available This article deals with the regularization of non-standard spellings of verbal forms extracted from a corpus. It addresses the question of what the limits of regularization are when lemmatizing Old English weak verbs. The purpose of such regularization, also known as normalization, is to carry out lexicological analysis or lexicographical work. The analysis concentrates on weak verbs from the second class and draws on the lexical database of Old English, Nerthus, which has incorporated the texts of the Dictionary of Old English Corpus. As regards the question of the limits of normalization, the solutions adopted are, first, that when it is necessary to regularize, normalization is restricted to correspondences based on dialectal and diachronic variation and, second, that normalization has to be unidirectional.

  18. Mixed-Norm Regularization for Brain Decoding

    Directory of Open Access Journals (Sweden)

    R. Flamary

    2014-01-01

    Full Text Available This work investigates the use of mixed-norm regularization for sensor selection in event-related potential (ERP) based brain-computer interfaces (BCI). The classification problem is cast as a discriminative optimization framework where sensor selection is induced through the use of mixed-norms. This framework is extended to the multitask learning situation where several similar classification tasks related to different subjects are learned simultaneously. In this case, multitask learning helps to mitigate the data scarcity issue, yielding more robust classifiers. For this purpose, we have introduced a regularizer that induces both sensor selection and classifier similarities. The different regularization approaches are compared on three ERP datasets, showing the interest of mixed-norm regularization in terms of sensor selection. The multitask approaches are evaluated when a small number of learning examples are available, yielding significant performance improvements, especially for subjects performing poorly.

  19. Regularization of B-Spline Objects.

    Science.gov (United States)

    Xu, Guoliang; Bajaj, Chandrajit

    2011-01-01

    By a d-dimensional B-spline object (denoted as ), we mean a B-spline curve (d = 1), a B-spline surface (d = 2) or a B-spline volume (d = 3). By regularization of a B-spline object we mean the process of relocating the control points of such that they approximate an isometric map of its definition domain in certain directions and is shape preserving. In this paper we develop an efficient regularization method for , d = 1, 2, 3, based on solving weak-form L^2-gradient flows constructed from the minimization of certain regularizing energy functionals. These flows are integrated via the finite element method using B-spline basis functions. Our experimental results demonstrate that our new regularization method is very effective.

  20. On Comparison of Adaptive Regularization Methods

    DEFF Research Database (Denmark)

    Sigurdsson, Sigurdur; Larsen, Jan; Hansen, Lars Kai

    2000-01-01

    Modeling with flexible models, such as neural networks, requires careful control of the model complexity and generalization ability of the resulting model, which finds expression in the ubiquitous bias-variance dilemma. Regularization is a tool for optimizing the model structure, reducing variance at the expense of introducing extra bias. The overall objective of adaptive regularization is to tune the amount of regularization, ensuring minimal generalization error. Regularization is a supplement to direct model selection techniques like step-wise selection, and one would prefer a hybrid scheme; however … different criteria, e.g., the Bayesian evidence. The evidence expresses basically the probability of the model, which is conceptually different from generalization error; however, asymptotically for large training data sets they will converge. First the basic model definition, training and generalization …

  1. Fast and compact regular expression matching

    DEFF Research Database (Denmark)

    Bille, Philip; Farach-Colton, Martin

    2008-01-01

    We study 4 problems in string matching, namely, regular expression matching, approximate regular expression matching, string edit distance, and subsequence indexing, on a standard word RAM model of computation that allows logarithmic-sized words to be manipulated in constant time. We show how to improve the space and/or remove a dependency on the alphabet size for each problem, using either an improved tabulation technique of an existing algorithm or by combining known algorithms in a new way.

  2. Regularity of Tor for weakly stable ideals

    Directory of Open Access Journals (Sweden)

    Katie Ansaldi

    2015-05-01

    Full Text Available It is proved that if I and J are weakly stable ideals in a polynomial ring R = k[x_1, . . ., x_n], with k a field, then the regularity of Tor^R_i(R/I, R/J) has the expected upper bound. We also give a bound for the regularity of Ext^i_R(R/I, R) for I a weakly stable ideal.

  3. Deterministic automata for extended regular expressions

    Directory of Open Access Journals (Sweden)

    Syzdykov Mirzakhmet

    2017-12-01

    Full Text Available In this work we present algorithms to produce a deterministic finite automaton (DFA) for extended operators in regular expressions, such as intersection, subtraction and complement. A method of “overriding” the source NFA (an NFA not defined with the subset construction rules) is used. Past work described only the algorithm for the AND-operator (intersection of regular languages); in this paper the construction for the MINUS-operator (and complement) is shown.
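
    The abstract's "overriding" construction is not spelled out here, so the sketch below shows the textbook alternative for the AND-operator: a product construction over two DFAs. The encoding of a DFA as a tuple is my own assumption for illustration.

        def intersect_dfa(d1, d2):
            """Product DFA accepting L(d1) ∩ L(d2).  A DFA is a tuple
            (states, alphabet, delta, start, accepting), with delta a dict
            mapping (state, symbol) -> state."""
            states1, alpha, delta1, s1, acc1 = d1
            states2, _, delta2, s2, acc2 = d2
            states = {(q1, q2) for q1 in states1 for q2 in states2}
            delta = {((q1, q2), a): (delta1[(q1, a)], delta2[(q2, a)])
                     for (q1, q2) in states for a in alpha}
            accepting = {(q1, q2) for q1 in acc1 for q2 in acc2}
            return states, alpha, delta, (s1, s2), accepting

        def accepts(dfa, word):
            _, _, delta, q, acc = dfa
            for a in word:
                q = delta[(q, a)]
            return q in acc

        # "Even number of a's" AND "ends with b", over the alphabet {a, b}.
        even_a = ({0, 1}, {"a", "b"},
                  {(0, "a"): 1, (1, "a"): 0, (0, "b"): 0, (1, "b"): 1}, 0, {0})
        ends_b = ({0, 1}, {"a", "b"},
                  {(0, "a"): 0, (1, "a"): 0, (0, "b"): 1, (1, "b"): 1}, 0, {1})
        both = intersect_dfa(even_a, ends_b)
        print(accepts(both, "aab"), accepts(both, "ab"))  # True False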

  4. Towards General Temporal Aggregation

    DEFF Research Database (Denmark)

    Boehlen, Michael H.; Gamper, Johann; Jensen, Christian Søndergaard

    2008-01-01

    Most database applications manage time-referenced, or temporal, data. Temporal data management is difficult when using conventional database technology, and many contributions have been made for how to better model, store, and query temporal data. Temporal aggregation illustrates well the problem...

  5. Applying the Water Vapor Radiometer to Verify the Precipitable Water Vapor Measured by GPS

    Directory of Open Access Journals (Sweden)

    Ta-Kang Yeh

    2014-01-01

    Full Text Available Taiwan is located at the land-sea interface in a subtropical region. Because the climate is warm and moist year round, there is a large and highly variable amount of water vapor in the atmosphere. In this study, we calculated the Zenith Wet Delay (ZWD of the troposphere using the ground-based Global Positioning System (GPS. The ZWD measured by two Water Vapor Radiometers (WVRs was then used to verify the ZWD that had been calculated using GPS. We also analyzed the correlation between the ZWD and the precipitation data of these two types of station. Moreover, we used the observational data from 14 GPS and rainfall stations to evaluate three cases. The offset between the GPS-ZWD and the WVR-ZWD ranged from 1.31 to 2.57 cm. The correlation coefficient ranged from 0.89 to 0.93. The results calculated from GPS and those measured using the WVR were very similar. Moreover, when there was no rain, light rain, moderate rain, or heavy rain, the flatland station ZWD was 0.31, 0.36, 0.38, or 0.40 m, respectively. The mountain station ZWD exhibited the same trend. Therefore, these results have demonstrated that the potential and strength of precipitation in a region can be estimated according to its ZWD values. Now that the precision of GPS-ZWD has been confirmed, this method can eventually be expanded to the more than 400 GPS stations in Taiwan and its surrounding islands. The near real-time ZWD data with improved spatial and temporal resolution can be provided to the city and countryside weather-forecasting system that is currently under development. Such an exchange would fundamentally improve the resources used to generate weather forecasts.

  6. Single Cell "Glucose Nanosensor" Verifies Elevated Glucose Levels in Individual Cancer Cells.

    Science.gov (United States)

    Nascimento, Raphael A S; Özel, Rıfat Emrah; Mak, Wai Han; Mulato, Marcelo; Singaram, Bakthan; Pourmand, Nader

    2016-02-10

    Because the transition from oxidative phosphorylation to anaerobic glycolytic metabolism is a hallmark of cancer progression, approaches to identify single living cancer cells by their unique glucose metabolic signature would be useful. Here, we present nanopipettes specifically developed to measure glucose levels in single cells with temporal and spatial resolution, and we use this technology to verify the hypothesis that individual cancer cells can indeed display higher intracellular glucose levels. The nanopipettes were functionalized as glucose nanosensors by immobilizing glucose oxidase (GOx) covalently to the tip so that the interaction of glucose with GOx resulted in a catalytic oxidation of β-d-glucose to d-gluconic acid, which was measured as a change in impedance due to drop in pH of the medium at the nanopipette tip. Calibration studies showed a direct relationship between impedance changes at the tip and glucose concentration in solution. The glucose nanosensor quantified single cell intracellular glucose levels in human fibroblasts and the metastatic breast cancer lines MDA-MB-231 and MCF7 and revealed that the cancer cells expressed reproducible and reliable increases in glucose levels compared to the nonmalignant cells. Nanopipettes allow repeated sampling of the same cell, as cells remain viable during and after measurements. Therefore, nanopipette-based glucose sensors provide an approach to compare changes in glucose levels with changes in proliferative or metastatic state. The platform has great promise for mechanistic investigations, as a diagnostic tool to distinguish cancer cells from nonmalignant cells in heterogeneous tissue biopsies, as well as a tool for monitoring cancer progression in situ.

  7. A New Objective Technique for Verifying Mesoscale Numerical Weather Prediction Models

    Science.gov (United States)

    Case, Jonathan L.; Manobianco, John; Lane, John E.; Immer, Christopher D.

    2003-01-01

    This report presents a new objective technique to verify predictions of the sea-breeze phenomenon over east-central Florida by the Regional Atmospheric Modeling System (RAMS) mesoscale numerical weather prediction (NWP) model. The Contour Error Map (CEM) technique identifies sea-breeze transition times in objectively-analyzed grids of observed and forecast wind, verifies the forecast sea-breeze transition times against the observed times, and computes the mean post-sea-breeze wind direction and speed to compare the observed and forecast winds behind the sea-breeze front. The CEM technique is superior to traditional objective verification techniques and previously-used subjective verification methodologies because it (1) is automated, requiring little manual intervention; (2) accounts for both spatial and temporal scales and variations; (3) accurately identifies and verifies the sea-breeze transition times; and (4) provides verification contour maps and simple statistical parameters for easy interpretation. The CEM uses a parallel lowpass boxcar filter and a high-order bandpass filter to identify the sea-breeze transition times in the observed and model grid points. Once the transition times are identified, the CEM fits a Gaussian histogram function to the actual histogram of transition time differences between the model and observations. The fitted parameters of the Gaussian function subsequently explain the timing bias and variance of the timing differences across the valid comparison domain. Once the transition times are all identified at each grid point, the CEM computes the mean wind direction and speed during the remainder of the day for all times and grid points after the sea-breeze transition time. The CEM technique performed quite well when compared to independent meteorological assessments of the sea-breeze transition times and results from a previously published subjective evaluation. The algorithm correctly identified a forecast or observed sea-breeze occurrence

  8. A two-component Matched Interface and Boundary (MIB) regularization for charge singularity in implicit solvation

    Science.gov (United States)

    Geng, Weihua; Zhao, Shan

    2017-12-01

    We present a new Matched Interface and Boundary (MIB) regularization method for treating charge singularity in solvated biomolecules whose electrostatics are described by the Poisson-Boltzmann (PB) equation. In a regularization method, by decomposing the potential function into two or three components, the singular component can be analytically represented by the Green's function, while the other components possess a higher regularity. Our new regularization combines the efficiency of two-component schemes with the accuracy of three-component schemes. Based on this regularization, a new MIB finite difference algorithm is developed for solving both linear and nonlinear PB equations, where the nonlinearity is handled by using the inexact-Newton method. Compared with the existing MIB PB solver based on a three-component regularization, the present algorithm is simpler to implement by circumventing the work to solve a boundary value Poisson equation inside the molecular interface and to compute related interface jump conditions numerically. Moreover, the new MIB algorithm becomes computationally less expensive, while maintaining the same second order accuracy. This is numerically verified by calculating the electrostatic potential and solvation energy on the Kirkwood sphere, on which analytical solutions are available, and on a series of proteins of various sizes.
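
    The decomposition at the heart of such regularizations is that the potential splits as phi = phi_singular + phi_regular, where the singular Coulomb part is known analytically from the Green's function and only the smoother remainder is solved numerically. A hedged sketch of that analytic piece (generic formula, illustrative units and points, not the paper's code):

        import numpy as np

        def singular_component(x, charges, positions, eps=1.0):
            """Green's function part of the potential at points x:
            phi_s(x) = sum_i q_i / (4 pi eps |x - x_i|).  In a regularization
            scheme this piece is handled analytically, so the grid solver
            only sees the smoother remainder."""
            x = np.atleast_2d(x)
            phi = np.zeros(len(x))
            for q, xi in zip(charges, positions):
                r = np.linalg.norm(x - xi, axis=1)
                phi += q / (4.0 * np.pi * eps * r)
            return phi

        # One unit charge at the center of a Kirkwood-style sphere of radius 2;
        # evaluate the singular part at a few points near the boundary.
        pts = np.array([[1.9, 0, 0], [0, 1.9, 0], [0, 0, -1.9]])
        print(singular_component(pts, charges=[1.0], positions=[np.zeros(3)]))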

  9. MAP-MRF-Based Super-Resolution Reconstruction Approach for Coded Aperture Compressive Temporal Imaging

    Directory of Open Access Journals (Sweden)

    Tinghua Zhang

    2018-02-01

    Full Text Available Coded Aperture Compressive Temporal Imaging (CACTI) can afford low-cost temporal super-resolution (SR), but limits are imposed by noise and compression ratio on reconstruction quality. To utilize inter-frame redundant information from multiple observations and sparsity in multi-transform domains, a robust reconstruction approach based on a maximum a posteriori probability and Markov random field (MAP-MRF) model for CACTI is proposed. The proposed approach adopts a weighted 3D neighbor system (WNS) and the coordinate descent method to perform joint estimation of model parameters, to achieve robust super-resolution reconstruction. The proposed multi-reconstruction algorithm considers both total variation (TV) and the ℓ2,1 norm in the wavelet domain to address the minimization problem of compressive sensing, and solves it using an accelerated generalized alternating projection algorithm. The weighting coefficients for different regularizations and frames are resolved by the motion characteristics of pixels. The proposed approach can provide high visual quality in the foreground and background of a scene simultaneously and enhance the fidelity of the reconstruction results. Simulation results have verified the efficacy of our new optimization framework and the proposed reconstruction approach.
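
    To ground the role of the total-variation regularizer in such reconstructions, here is a deliberately bare-bones 1-D sketch: plain subgradient descent on a TV-regularized least-squares objective. The paper's solver (accelerated generalized alternating projection, wavelet-domain ℓ2,1 term, motion-adaptive weights) is far more elaborate; every name and number below is an illustrative assumption.

        import numpy as np

        def tv_reconstruct(Phi, y, lam=0.05, step=2e-3, iters=5000):
            """Subgradient descent on 0.5*||y - Phi x||^2 + lam*TV(x)
            for a 1-D signal."""
            x = Phi.T @ y
            for _ in range(iters):
                grad_fid = Phi.T @ (Phi @ x - y)
                d = np.sign(np.diff(x))  # subgradient of sum |x_{i+1} - x_i|
                grad_tv = np.concatenate([[0.0], d]) - np.concatenate([d, [0.0]])
                x -= step * (grad_fid + lam * grad_tv)
            return x

        # Toy compressive measurement of a piecewise-constant signal.
        rng = np.random.default_rng(1)
        x_true = np.concatenate([np.zeros(20), np.ones(20), np.zeros(20)])
        Phi = rng.standard_normal((30, 60)) / np.sqrt(30)  # 2x compression
        x_hat = tv_reconstruct(Phi, Phi @ x_true)
        print(np.round(x_hat[18:24], 2))  # rough recovery of the 0 -> 1 step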

  10. Temporal logic runtime verification of dynamic systems

    CSIR Research Space (South Africa)

    Seotsanyana, M

    2010-07-01

    Full Text Available linear temporal logic as well as extended regular expressions. Java with assertions (Jass) [8] is a general monitoring methodology implemented for sequential, concurrent and reactive systems written in Java. The Jass tool is a pre-compiler that translates annotations into pure Java code in which compliance with the specification is tested dynamically at runtime. Assertions extend Design by Contract [11], which allows the specification of assertions in the form of pre- and post-conditions, class...
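
    The flavor of such runtime monitors is easy to convey with a toy example. The class below checks a bounded-response property in the spirit of the LTL formula G(request -> F<=k grant): every request must be granted within k steps. It is a from-scratch illustration, not the Jass tool's syntax or API.

        class BoundedResponseMonitor:
            """Runtime monitor: every request must be granted within k steps."""

            def __init__(self, k):
                self.k = k
                self.pending = []          # ages of outstanding requests

            def step(self, request, grant):
                if grant:
                    self.pending.clear()   # a grant discharges all requests
                self.pending = [age + 1 for age in self.pending]
                if request:
                    self.pending.append(0)
                return all(age <= self.k for age in self.pending)

        m = BoundedResponseMonitor(k=2)
        trace = [(True, False), (False, False), (False, True),
                 (True, False), (False, False), (False, False), (False, False)]
        for i, (req, gr) in enumerate(trace):
            if not m.step(req, gr):
                print("violation at step", i)  # request at step 3 never granted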

  11. Gravitational Quasinormal Modes of Regular Phantom Black Hole

    Directory of Open Access Journals (Sweden)

    Jin Li

    2017-01-01

    Full Text Available We investigate the gravitational quasinormal modes (QNMs) for a type of regular black hole (BH) known as the phantom BH, which is a static self-gravitating solution of a minimally coupled phantom scalar field with a potential. The studies are carried out for three different spacetimes: asymptotically flat, de Sitter (dS), and anti-de Sitter (AdS). In order to consider the standard odd parity and even parity of gravitational perturbations, the corresponding master equations are derived. The QNMs are discussed by evaluating the temporal evolution of the perturbation field, which, in turn, provides direct information on the stability of the BH spacetime. It is found that in asymptotically flat, dS, and AdS spacetimes the gravitational perturbations have similar characteristics for both odd and even parities. The decay rate of the perturbation is strongly dependent on the scale parameter b, which measures the coupling strength between the phantom scalar field and gravity. Furthermore, through the analysis of Hawking radiation, it is shown that the thermodynamics of such a regular phantom BH is also influenced by b. The obtained results might shed some light on the quantum interpretation of QNM perturbation.

  12. Verified Gaming

    DEFF Research Database (Denmark)

    Kiniry, Joseph Roland; Zimmerman, Daniel

    2011-01-01

    … falls every year and any mention of mathematics in the classroom seems to frighten students away. So the question is: How do we attract new students in computing to the area of dependable software systems? Over the past several years at three universities we have experimented with the use of computer games as a target domain for software engineering project courses that focus on reliable systems engineering. This position paper summarizes our experiences in incorporating rigorous software engineering into courses whose projects include computer games.

  13. Conservative regularization of ideal hydrodynamics and magnetohydrodynamics

    Science.gov (United States)

    Thyagaraja, A.

    2010-03-01

    Inviscid, incompressible hydrodynamics and incompressible ideal magnetohydrodynamics (MHD) share many properties such as time-reversal invariance of equations, conservation laws, and certain topological features. In three dimensions, these systems may lead to singular solutions (involving vortex and current sheets). While dissipative (viscoresistive) effects can regularize the equations leading to bounded solutions to the initial-boundary value (Cauchy) problem which presumably exist uniquely, the time-reversal symmetry and associated conservation properties are certainly destroyed. The present work is analogous to (and suggested by) the Korteweg-de Vries regularization of the one-dimensional, nonlinear kinematic wave equation. Thus the regularizations applied to the original equations of hydrodynamics and ideal MHD retain conservation properties and the symmetries of the original equations. Integral invariants which generalize those known for the original systems are shown to imply bounded enstrophy. The regularization developed can also be applied to the corresponding dissipative models (such as the Navier-Stokes equations and the viscoresistive MHD equations) and may imply interesting regularity properties for the solutions of the latter as well. The models developed thus have intrinsic mathematical interest as well as possible applications to large-scale numerical simulations in systems where dissipative effects are extremely small or even absent.

  14. Regularities, Natural Patterns and Laws of Nature

    Directory of Open Access Journals (Sweden)

    Stathis Psillos

    2014-02-01

    Full Text Available  The goal of this paper is to sketch an empiricist metaphysics of laws of nature. The key idea is that there are regularities without regularity-enforcers. Differently put, there are natural laws without law-makers of a distinct metaphysical kind. This sketch will rely on the concept of a natural pattern and more significantly on the existence of a network of natural patterns in nature. The relation between a regularity and a pattern will be analysed in terms of mereology.  Here is the road map. In section 2, I will briefly discuss the relation between empiricism and metaphysics, aiming to show that an empiricist metaphysics is possible. In section 3, I will offer arguments against stronger metaphysical views of laws. Then, in section 4 I will motivate nomic objectivism. In section 5, I will address the question ‘what is a regularity?’ and will develop a novel answer to it, based on the notion of a natural pattern. In section 6, I will raise the question: ‘what is a law of nature?’, the answer to which will be: a law of nature is a regularity that is characterised by the unity of a natural pattern.

  15. Fractional Regularization Term for Variational Image Registration

    Directory of Open Access Journals (Sweden)

    Rafael Verdú-Monedero

    2009-01-01

    Full Text Available Image registration is a widely used task of image analysis with applications in many fields. Its classical formulation and current improvements are given in the spatial domain. In this paper a regularization term based on fractional order derivatives is formulated. This term is defined and implemented in the frequency domain by translating the energy functional into the frequency domain and obtaining the Euler-Lagrange equations which minimize it. The new regularization term leads to a simple formulation and design, being applicable to higher dimensions by using the corresponding multidimensional Fourier transform. The proposed regularization term allows for a real, gradual transition from a diffusion registration to a curvature registration, which is best suited to some applications and is not possible in the spatial domain. Results with 3D actual images show the validity of this approach.
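
    The frequency-domain definition the term is built on can be sketched directly: a fractional derivative of order alpha is multiplication by the Fourier symbol (i*omega)^alpha. The snippet below is a minimal periodic-signal illustration of that definition, not the registration code itself.

        import numpy as np

        def fractional_derivative(f, alpha, dx=1.0):
            """Fractional derivative of order alpha of a periodic signal,
            via the Fourier symbol (i*omega)**alpha."""
            n = len(f)
            omega = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)
            return np.real(np.fft.ifft((1j * omega) ** alpha * np.fft.fft(f)))

        # Sanity check on sin(x): alpha = 1 must give cos(x).
        x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
        d1 = fractional_derivative(np.sin(x), 1.0, dx=x[1] - x[0])
        print(np.max(np.abs(d1 - np.cos(x))))  # ~1e-12: matches cos(x)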

  16. Stream Processing Using Grammars and Regular Expressions

    DEFF Research Database (Denmark)

    Rasmussen, Ulrik Terp

    In this dissertation we study regular expression based parsing and the use of grammatical specifications for the synthesis of fast, streaming string-processing programs. In the first part we develop two linear-time algorithms for regular expression based parsing with Perl-style greedy disambiguation. The first algorithm operates in two passes in a semi-streaming fashion, using a constant amount of working memory and an auxiliary tape storage which is written in the first pass and consumed by the second. The second algorithm is a single-pass and optimally streaming algorithm which outputs as much of the parse tree as is semantically possible based on the input prefix read so far, and resorts to buffering as many symbols as is required to resolve the next choice. Optimality is obtained by performing a PSPACE-complete pre-analysis on the regular expression. In the second part we present...

  17. Surface counterterms and regularized holographic complexity

    Science.gov (United States)

    Yang, Run-Qiu; Niu, Chao; Kim, Keun-Young

    2017-09-01

    The holographic complexity is UV divergent. As a finite complexity, we propose a "regularized complexity" by employing a similar method to the holographic renormalization. We add codimension-two boundary counterterms which do not contain any boundary stress tensor information. It means that we subtract only the non-dynamic background and all the dynamic information of holographic complexity is contained in the regularized part. After showing the general counterterms for both CA and CV conjectures in holographic spacetime dimension 5 and less, we give concrete examples: the BTZ black holes and the four and five dimensional Schwarzschild AdS black holes. We propose how to obtain the counterterms in higher spacetime dimensions and show explicit formulas only for some special cases with enough symmetries. We also compute the complexity of formation by using the regularized complexity.

  18. REX XML shallow parsing with regular expressions

    CERN Document Server

    Cameron, R D

    1999-01-01

    The syntax of XML is simple enough that it is possible to parse an XML document into a list of its markup and text items using a single regular expression. Such a shallow parse of an XML document can be very useful for the construction of a variety of lightweight XML processing tools. However, complex regular expressions can be difficult to construct and even more difficult to read. Using a form of literate programming for regular expressions, this paper documents a set of XML shallow parsing expressions that can be used as a basis for simple, correct, efficient, robust and language-independent XML shallow parsing. Complete shallow parser implementations of less than 50 lines each in Perl, JavaScript and Lex/Flex are also given.
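
    A drastically simplified version of the idea can be shown in a few lines; Cameron's full expression also recognizes comments, CDATA sections, processing instructions and malformed markup, so the pattern below (in Python rather than the paper's Perl/JavaScript/Lex) only separates tags from text and is an illustration, not the REX expression itself.

        import re

        # One regular expression: each match is either a tag or a text run.
        XML_SHALLOW = re.compile(r"<[^>]*>|[^<]+")

        doc = '<p class="x">Hello <b>world</b>!</p>'
        for item in XML_SHALLOW.findall(doc):
            kind = "markup" if item.startswith("<") else "text"
            print(kind, repr(item))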

  19. Contour Propagation With Riemannian Elasticity Regularization

    DEFF Research Database (Denmark)

    Bjerre, Troels; Hansen, Mads Fogtmann; Sapru, W.

    2011-01-01

    … guided corrections. This study compares manual delineations in replanning CT scans of head-and-neck patients to automatic contour propagation using deformable registration with Riemannian regularization. The potential benefit of locally assigned regularization parameters according to tissue type is investigated. Materials/Methods: Planning PET-CT scans plus 2-4 subsequent replanning CTs for five head-and-neck cancer patients were obtained. The Gross Tumor Volume (GTV) was manually delineated on the planning CT by an experienced clinician and manually propagated by pasting the set of contours from the planning CT onto the rescans and correcting to reflect actual anatomical changes. For deformable registration, a free-form, multi-level, B-spline deformation model with Riemannian elasticity, penalizing non-rigid local deformations and volumetric changes, was used. Regularization parameters were defined …

  20. Generalized inverse beamforming with optimized regularization strategy

    Science.gov (United States)

    Zavala, P. A. G.; De Roeck, W.; Janssens, K.; Arruda, J. R. F.; Sas, P.; Desmet, W.

    2011-04-01

    A promising recent development on acoustic source localization and source strength estimation is the generalized inverse beamforming, which is based on the microphone array cross-spectral matrix eigenstructure. This method presents several advantages over the conventional beamforming, including a higher accuracy on the source center localization and strength estimation even with distributed coherent sources. This paper aims to improve the strength estimation of the generalized inverse beamforming method with an automated regularization factor definition. Also in this work, a virtual target grid is introduced, and source mapping and strength estimation are obtained disregarding, as much as possible, the reflections influence. Two simple problems are used to compare the generalized inverse performance with fixed regularization factor to performance obtained using the optimized regularization strategy. Numerical and experimental data are used, and two other strength estimation methods are also evaluated for reference.

  1. Regularization and Migration Policy in Europe

    Directory of Open Access Journals (Sweden)

    Philippe de Bruycker

    2001-05-01

    Full Text Available The following pages present, in a general way, the contents of Regularization of illegal immigrants in the European Union, which includes a comparative synthesis and statistical information for each of the eight countries involved; a description of actions since the beginning of the year 2000; and a systematic analysis of the different categories of foreigners, the types of regularization carried out, and the rules that have governed these actions. In relation to regularization, the author considers the political coherence of the actions taken by the member states as well as how they relate to two ever more crucial aspects of immigration policy – the integration of legal resident immigrants and the fight against illegal immigration in the context of a control of migratory flows.

  2. Regular-fat dairy and human health

    DEFF Research Database (Denmark)

    Astrup, Arne; Bradley, Beth H Rice; Brenna, J Thomas

    2016-01-01

    … to disseminate, explore and discuss the state of the science on the relationship between regular-fat dairy products and health, symposia were programmed by dairy industry organizations in Europe and North America at The Eurofed Lipids Congress (2014) in France, The Dairy Nutrition Annual Symposium (2014) in Canada, The American Society for Nutrition Annual Meeting held in conjunction with Experimental Biology (2015) in the United States, and The Federation of European Nutrition Societies (2015) in Germany. This synopsis of these symposia describes the complexity of dairy fat and the effects regular …

  3. Deconvolution and Regularization with Toeplitz Matrices

    DEFF Research Database (Denmark)

    Hansen, Per Christian

    2002-01-01

    … treatment requires the use of regularization methods. The corresponding computational problem takes the form of a structured matrix problem with a Toeplitz or block Toeplitz coefficient matrix. The aim of this paper is to present a tutorial survey of numerical algorithms for the practical treatment of these discretized deconvolution problems, with emphasis on methods that take the special structure of the matrix into account. Wherever possible, analogies to classical DFT-based deconvolution problems are drawn. Among other things, we present direct methods for regularization with Toeplitz matrices, and we show …
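
    A minimal sketch of the setup, assuming a 1-D blur so the coefficient matrix is Toeplitz, with truncated SVD as the regularization method (the PSF, sizes and truncation level are invented; structure-exploiting algorithms would avoid forming the matrix densely):

        import numpy as np
        from scipy.linalg import toeplitz

        def tsvd_deconvolve(psf, b, k):
            """Deconvolve b = A x where A is the lower-triangular Toeplitz
            blurring matrix built from psf, keeping only the k largest
            singular values as regularization."""
            n = len(b)
            col = np.zeros(n)
            col[:len(psf)] = psf
            A = toeplitz(col, np.zeros(n))
            U, s, Vt = np.linalg.svd(A)
            return Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])

        # Blur a spike train with a short PSF, add noise, then deconvolve.
        rng = np.random.default_rng(2)
        x = np.zeros(64)
        x[[10, 40]] = 1.0
        psf = np.array([0.5, 0.3, 0.2])
        col = np.zeros(64)
        col[:3] = psf
        b = toeplitz(col, np.zeros(64)) @ x + 1e-3 * rng.standard_normal(64)
        print(np.round(tsvd_deconvolve(psf, b, k=50)[[10, 40]], 2))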

  4. Evaluation of verifiability in HAL/S. [programming language for aerospace computers

    Science.gov (United States)

    Young, W. D.; Tripathi, A. R.; Good, D. I.; Browne, J. C.

    1979-01-01

    HAL/S provides limited support for writing verifiable programs, a characteristic highly desirable in aerospace applications, since many of the features of HAL/S do not lend themselves to existing verification techniques. The methods of language evaluation are described along with the means by which language features are evaluated for verifiability. These methods are applied in this study to various features of HAL/S to identify specific areas in which the language fails with respect to verifiability. Some conclusions are drawn for the design of programming languages for aerospace applications, and ongoing work to identify a verifiable subset of HAL/S is described.

  5. On a correspondence between regular and non-regular operator monotone functions

    DEFF Research Database (Denmark)

    Gibilisco, P.; Hansen, Frank; Isola, T.

    2009-01-01

    We prove the existence of a bijection between the regular and the non-regular operator monotone functions satisfying a certain functional equation. As an application we give a new proof of the operator monotonicity of certain functions related to the Wigner-Yanase-Dyson skew information....

  6. Semantics of Temporal Models with Multiple Temporal Dimensions

    DEFF Research Database (Denmark)

    Kraft, Peter; Sørensen, Jens Otto

    Semantics of temporal models with multiple temporal dimensions are examined, progressing from non-temporal models to uni-temporal, and further to bi- and tri-temporal models. An example of a uni-temporal model is the valid-time model; an example of a bi-temporal model is the valid time/transactio...

  7. Orientating Children in Regular Schools towards Impairments.

    Science.gov (United States)

    Gorman, P. P.

    1979-01-01

    The author discusses audio-visual material and other educational techniques which eventually may facilitate the integration of children with mental, physical, and sensory impairments in the regular classroom. Films, radio texts, theater plays, and books are considered. Emphasis is on the introduction of impairments to the nonhandicapped population.

  8. Regular ultrafilters and finite square principles

    NARCIS (Netherlands)

    Kennedy, J.; Shelah, S.; Väänänen, J.

    2008-01-01

    We show that many singular cardinals λ above a strongly compact cardinal have regular ultrafilters D that violate the finite square principle □^fin_{λ,D} introduced in [3]. For such ultrafilters D and cardinals λ there are models of size λ for which M^λ/D is not …

  9. Broad temperature range antiferroelectric regular mixtures

    Science.gov (United States)

    Dabrowski, Roman; Czuprynski, Krzysztof; Gasowska, J.; Oton, Jose; Quintana, Xabier; Castillo, P. L.; Bennis, N.

    2004-09-01

    Tristate regular mixtures with different electro-optical properties such as threshold voltage, saturation voltage, holding ratio and response time are presented. The relation of these properties to the structure of the compounds is discussed. All of the mixtures show only moderate dynamic and static contrast but a large gray-level scale without hysteresis for positive- and negative-field driving.

  10. Regular matrix transformation on triple sequence spaces

    Directory of Open Access Journals (Sweden)

    Shyamal Debnath

    2017-10-01

    Full Text Available The main aim of this paper is to introduce necessary and sufficient conditions for a particular type of matrix transformation of the form A = (a… to be regular from a triple sequence space to another triple sequence space.

  11. Regular and context-free nominal traces

    DEFF Research Database (Denmark)

    Degano, Pierpaolo; Ferrari, Gian-Luigi; Mezzetti, Gianluca

    2017-01-01

    Two kinds of automata are presented, for recognising new classes of regular and context-free nominal languages. We compare their expressive power with analogous proposals in the literature, showing that they express novel classes of languages. Although many properties of classical languages hold ...

  12. On bigraded regularities of Rees algebra

    Indian Academy of Sciences (India)

    Ramakrishna Nanduri

    2017-08-03


  13. From recreational to regular drug use

    DEFF Research Database (Denmark)

    Järvinen, Margaretha; Ravn, Signe

    2011-01-01

    … to extricate themselves from this pattern. Hence, when regular drug users talk about their future, it is not a future characterised by total abstinence from illegal drugs but a future where they have rolled back their drug use career to the recreational drug use pattern they started out with. Empirically...

  14. On regular riesz operators | Raubenheimer | Quaestiones ...

    African Journals Online (AJOL)

    The r-asymptotically quasi finite rank operators on Banach lattices are examples of regular Riesz operators. We characterise Riesz elements in a subalgebra of a Banach algebra in terms of Riesz elements in the Banach algebra. This enables us to characterise r-asymptotically quasi finite rank operators in terms of adjoint ...

  15. Regular Sleep Makes for Happier College Students

    Science.gov (United States)

    Regular Sleep Makes for Happier College Students: when erratic snoozers improve shut-eye habits, … (MedlinePlus health news: https://medlineplus.gov/news/fullstory_166856.html. Related MedlinePlus health topics: College Health; Healthy Sleep.)

  16. 28 CFR 540.44 - Regular visitors.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Regular visitors. 540.44 Section 540.44... and if there exists no reason to exclude them. (c) Friends and associates. The visiting privilege ordinarily will be extended to friends and associates having an established relationship with the inmate...

  17. Neural Classifier Construction using Regularization, Pruning

    DEFF Research Database (Denmark)

    Hintz-Madsen, Mads; Hansen, Lars Kai; Larsen, Jan

    1998-01-01

    In this paper we propose a method for construction of feed-forward neural classifiers based on regularization and adaptive architectures. Using a penalized maximum likelihood scheme, we derive a modified form of the entropic error measure and an algebraic estimate of the test error. In conjunction...

  18. Regularization algorithms based on total least squares

    DEFF Research Database (Denmark)

    Hansen, Per Christian; O'Leary, Dianne P.

    1996-01-01

    or truncated {\\em SVD}, are not designed for problems in which both the coefficient matrix and the right-hand side are known only approximately. For this reason, we develop {\\em TLS}\\/-based regularization methods that take this situation into account. Here, we survey two different approaches to incorporation...

  19. Stabilization, pole placement, and regular implementability

    NARCIS (Netherlands)

    Belur, MN; Trentelman, HL

    In this paper, we study control by interconnection of linear differential systems. We give necessary and sufficient conditions for regular implementability of a given linear differential system. We formulate the problems of stabilization and pole placement as problems of finding a suitable,

  20. Stabilization, Pole Placement, and Regular Implementability

    NARCIS (Netherlands)

    Belur, Madhu N.; Trentelman, H.L.

    2002-01-01

    In this paper, we study control by interconnection of linear differential systems. We give necessary and sufficient conditions for regular implementability of a given linear differential system. We formulate the problems of stabilization and pole placement as problems of finding a suitable,

  1. A Sim(2) invariant dimensional regularization

    Directory of Open Access Journals (Sweden)

    J. Alfaro

    2017-09-01

    Full Text Available We introduce a Sim(2) invariant dimensional regularization of loop integrals. Then we can compute the one-loop quantum corrections to the photon self-energy, electron self-energy and vertex in the Electrodynamics sector of the Very Special Relativity Standard Model (VSRSM).

  2. Annotation of regular polysemy and underspecification

    DEFF Research Database (Denmark)

    Martínez Alonso, Héctor; Pedersen, Bolette Sandford; Bel, Núria

    2013-01-01

    We present the result of an annotation task on regular polysemy for a series of semantic classes or dot types in English, Danish and Spanish. This article describes the annotation process, the results in terms of inter-encoder agreement, and the sense distributions obtained with two methods...

  3. Motor Activity Improves Temporal Expectancy

    Science.gov (United States)

    Fautrelle, Lilian; Mareschal, Denis; French, Robert; Addyman, Caspar; Thomas, Elizabeth

    2015-01-01

    Certain brain areas involved in interval timing are also important in motor activity. This raises the possibility that motor activity might influence interval timing. To test this hypothesis, we assessed interval timing in healthy adults following different types of training. The pre- and post-training tasks consisted of a button press in response to the presentation of a rhythmic visual stimulus. Alterations in temporal expectancy were evaluated by measuring response times. Training consisted of responding to the visual presentation of regularly appearing stimuli by either: (1) pointing with a whole-body movement, (2) pointing only with the arm, (3) imagining pointing with a whole-body movement, (4) simply watching the stimulus presentation, (5) pointing with a whole-body movement in response to a target that appeared at irregular intervals, or (6) reading a newspaper. Participants performing a motor activity in response to the regular target showed significant improvements in judgment times compared to individuals with no associated motor activity. Individuals who only imagined pointing with a whole-body movement also showed significant improvements. No improvements were observed in the group that trained with a motor response to an irregular stimulus, hence eliminating the explanation that the improved temporal expectations of the other motor training groups were purely due to an improved motor capacity to press the response button. All groups performed a secondary task equally well, hence indicating that our results could not simply be attributed to differences in attention between the groups. Our results show that motor activity, even when it does not play a causal or corrective role, can lead to improved interval timing judgments. PMID:25806813

  4. Experimental reduction of pain catastrophizing modulates pain report but not spinal nociception as verified by mediation analyses.

    Science.gov (United States)

    Terry, Ellen L; Thompson, Kathryn A; Rhudy, Jamie L

    2015-08-01

    Pain catastrophizing is associated with enhanced pain; however, the mechanisms by which it modulates pain are poorly understood. Evidence suggests that catastrophizing modulates supraspinal processing of pain but does not modulate spinal nociception (as assessed by the nociceptive flexion reflex [NFR]). Unfortunately, most NFR studies have been correlational. To address this, this study experimentally reduced catastrophizing to determine whether it modulates spinal nociception (NFR). Healthy pain-free participants (N = 113) were randomly assigned to a brief 30-minute catastrophizing reduction manipulation or a control group that received pain education. Before and after the manipulations, 2 types of painful stimuli were delivered to elicit (1) NFR (single trains of stimuli) and (2) temporal summation of NFR (3 stimulations at 2 Hz). After each set of stimuli, participants were asked to report their pain intensity and unpleasantness, as well as their situation-specific catastrophizing. Manipulation checks verified that catastrophizing was effectively reduced. Furthermore, pain intensity and unpleasantness to both stimulation types were reduced by the catastrophizing manipulation, effects that were mediated by catastrophizing. Although NFRs were not affected by the catastrophizing manipulation, temporal summation of NFR was reduced. However, this effect was not mediated by catastrophizing. These results indicate that reductions in catastrophizing lead to reductions in pain perception but do not modulate spinal nociception, and provide further evidence that catastrophizing modulates pain at the supraspinal, not the spinal, level.

  5. The verification, refinement and application of lexicographic rulers ...

    African Journals Online (AJOL)

    rbr

    The aim of this article is therefore to verify and refine the ruler for Afrikaans. Verification is done with the help of a .... category B, category C, etc. Formulated in simple Afrikaans, the ... Absolute versus relative ruler values. The aim of Prinsloo and De Schryver with the compilation of lexicographic...

  6. Business rescue decision making through verifier determinants – ask the specialists

    Directory of Open Access Journals (Sweden)

    Marius Pretorius

    2013-11-01

    Practical/Managerial implications: Decision makers and affected persons could benefit from the insights obtained through this study. Confirming early warning signs through verifier determinants would be beneficial for entrepreneurs who are creditors, rescue practitioners, government regulators, court officials and educators alike. Contribution/Value add: Knowing the verifier determinants could assist decision making and improve the effectiveness of rescue strategies.

  7. 77 FR 70484 - Preoperational Testing of Onsite Electric Power Systems To Verify Proper Load Group Assignments...

    Science.gov (United States)

    2012-11-26

    ... COMMISSION Preoperational Testing of Onsite Electric Power Systems To Verify Proper Load Group Assignments... Power Systems to Verify Proper Load Group Assignments, Electrical Separation, and Redundancy.'' DG-1294... encompass preoperational testing of electrical power systems used to meet current Station Blackout...

  8. An approach for verifying biogenic greenhouse gas emissions inventories with atmospheric CO2 concentration data

    Science.gov (United States)

    Stephen M Ogle; Kenneth Davis; Thomas Lauvaux; Andrew Schuh; Dan Cooley; Tristram O West; Linda S Heath; Natasha L Miles; Scott Richardson; F Jay Breidt; James E Smith; Jessica L McCarty; Kevin R Gurney; Pieter Tans; A Scott. Denning

    2015-01-01

    Verifying national greenhouse gas (GHG) emissions inventories is a critical step to ensure that reported emissions data to the United Nations Framework Convention on Climate Change (UNFCCC) are accurate and representative of a country's contribution to GHG concentrations in the atmosphere. Furthermore, verifying biogenic fluxes provides a check on estimated...

  9. High quality 4D cone-beam CT reconstruction using motion-compensated total variation regularization

    Science.gov (United States)

    Zhang, Hua; Ma, Jianhua; Bian, Zhaoying; Zeng, Dong; Feng, Qianjin; Chen, Wufan

    2017-04-01

    Four-dimensional cone-beam computed tomography (4D-CBCT) has great potential clinical value because of its ability to describe tumor and organ motion. But the challenge in 4D-CBCT reconstruction is the limited number of projections at each phase, which results in a reconstruction full of noise and streak artifacts with conventional analytical algorithms. To address this problem, in this paper we propose a motion-compensated total variation regularization approach which tries to fully exploit the temporal coherence of the spatial structures among the 4D-CBCT phases. In this work, we additionally conduct motion estimation/motion compensation (ME/MC) on the 4D-CBCT volume by using inter-phase deformation vector fields (DVFs). The motion-compensated 4D-CBCT volume is then viewed as a pseudo-static sequence, on which the regularization function is imposed. The regularization used in this work is the 3D spatial total variation minimization combined with 1D temporal total variation minimization. We subsequently construct a cost function for a reconstruction pass, and minimize this cost function using a variable splitting algorithm. Simulation and real patient data were used to evaluate the proposed algorithm. Results show that the introduction of additional temporal correlation along the phase direction can improve the 4D-CBCT image quality.
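
    As a rough illustration of the regularizer described above (not the authors' implementation), the following sketch evaluates an anisotropic 3D spatial plus 1D temporal total variation on a motion-compensated 4D volume; the axis layout, the temporal weight mu, and all names are assumptions.

        import numpy as np

        def mc_total_variation(vol4d, mu):
            """vol4d: motion-compensated volume with axes (phase, z, y, x);
            mu weights the 1D temporal TV along the phase direction."""
            spatial = sum(np.abs(np.diff(vol4d, axis=ax)).sum() for ax in (1, 2, 3))
            temporal = np.abs(np.diff(vol4d, axis=0)).sum()
            return spatial + mu * temporal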

  10. Regular Magnetic Black Hole Gravitational Lensing

    Science.gov (United States)

    Liang, Jun

    2017-05-01

    The Bronnikov regular magnetic black hole as a gravitational lens is studied. In nonlinear electrodynamics, photons do not follow null geodesics of the background geometry, but move along null geodesics of a corresponding effective geometry. To study Bronnikov regular magnetic black hole gravitational lensing in the strong deflection limit, the corresponding effective geometry must be obtained first; this is the key step. We obtain the deflection angle in the strong deflection limit, and further calculate the angular positions and magnifications of relativistic images as well as the time delay between different relativistic images. The influence of the magnetic charge on the black hole gravitational lensing is also discussed. Supported by the Natural Science Foundation of the Education Department of Shaanxi Province under Grant No 15JK1077, and the Doctorial Scientific Research Starting Fund of Shaanxi University of Science and Technology under Grant No BJ12-02.

  11. Exploring the structural regularities in networks

    CERN Document Server

    Shen, Hua-Wei; Guo, Jia-Feng

    2011-01-01

    In this paper, we consider the problem of exploring structural regularities of networks by dividing the nodes of a network into groups such that the members of each group have similar patterns of connections to other groups. Specifically, we propose a general statistical model to describe network structure. In this model, group is viewed as hidden or unobserved quantity and it is learned by fitting the observed network data using the expectation-maximization algorithm. Compared with existing models, the most prominent strength of our model is the high flexibility. This strength enables it to possess the advantages of existing models and overcomes their shortcomings in a unified way. As a result, not only broad types of structure can be detected without prior knowledge of what type of intrinsic regularities exist in the network, but also the type of identified structure can be directly learned from data. Moreover, by differentiating outgoing edges from incoming edges, our model can detect several types of stru...

  12. Regularized algorithm for Raman lidar data processing.

    Science.gov (United States)

    Shcherbakov, Valery

    2007-08-01

    A regularized algorithm that has the potential to improve the quality of Raman lidar data processing is presented. Compared to the conventional scheme, the proposed algorithm has the advantage of being based on a well-posed procedure. That is, the profile of the aerosol backscatter coefficient is computed directly, using the explicit relationships, without numerical differentiation. Thereafter, the profile of the lidar ratio is retrieved as a regularized solution of a first-kind Volterra integral equation. Once these two steps have been completed, the profile of the aerosol extinction coefficient is computed by a straightforward multiplication. The numerical simulations demonstrated that the proposed algorithm provides good accuracy and resolution of aerosol profile retrievals. The error analysis showed that the retrieved profiles are continuous functions of the measurement errors and of the a priori information uncertainties.
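
    A minimal numerical sketch of the middle step, retrieving a profile as a regularized solution of a discretized first-kind Volterra equation, assuming a plain Tikhonov penalty on second differences; the penalty choice, lam, and all names are illustrative and not the paper's exact scheme.

        import numpy as np

        def tikhonov_volterra(K, g, lam):
            """Solve K u = g for a lower-triangular K (discretized Volterra
            kernel) by minimizing ||K u - g||^2 + lam * ||L u||^2."""
            n = len(g)
            L = np.diff(np.eye(n), n=2, axis=0)        # second-difference operator
            A = np.vstack([K, np.sqrt(lam) * L])
            b = np.concatenate([g, np.zeros(n - 2)])
            u, *_ = np.linalg.lstsq(A, b, rcond=None)
            return u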

  13. Modeling Regular Replacement for String Constraint Solving

    Science.gov (United States)

    Fu, Xiang; Li, Chung-Chih

    2010-01-01

    Bugs in user input sanitation of software systems often lead to vulnerabilities. Among them many are caused by improper use of regular replacement. This paper presents a precise modeling of various semantics of regular substitution, such as the declarative, finite, greedy, and reluctant, using finite state transducers (FST). By projecting an FST to its input/output tapes, we are able to solve atomic string constraints, which can be applied to both the forward and backward image computation in model checking and symbolic execution of text processing programs. We report several interesting discoveries, e.g., certain fragments of the general problem can be handled using less expressive deterministic FST. A compact representation of FST is implemented in SUSHI, a string constraint solver. It is applied to detecting vulnerabilities in web applications
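
    The semantic distinctions the paper models with finite state transducers show up in any regex engine; a tiny Python illustration of the greedy versus reluctant replacement semantics (Python's backtracking engine here stands in for the FSTs of the paper):

        import re

        s = "<a><b>"
        print(re.sub(r"<.*>", "X", s))    # greedy: one match spans all -> "X"
        print(re.sub(r"<.*?>", "X", s))   # reluctant: shortest matches -> "XX"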

  14. Effort variation regularization in sound field reproduction

    DEFF Research Database (Denmark)

    Stefanakis, Nick; Jacobsen, Finn; Sarris, Ioannis

    2010-01-01

    In this paper, active control is used in order to reproduce a given sound field in an extended spatial region. A method is proposed which minimizes the reproduction error at a number of control positions with the reproduction sources holding a certain relation within their complex strengths. Specifically, it is suggested that the phase differential of the source driving signals should be in agreement with the phase differential of the desired sound pressure field. The performance of the suggested method is compared with that of conventional effort regularization, wave field synthesis (WFS), and adaptive wave field synthesis (AWFS), both under free-field conditions and in reverberant rooms. It is shown that effort variation regularization overcomes the problems associated with small spaces and with a low ratio of direct to reverberant energy, thus improving the reproduction accuracy...

  15. Basic analysis of regularized series and products

    CERN Document Server

    Jorgenson, Jay A

    1993-01-01

    Analytic number theory and part of the spectral theory of operators (differential, pseudo-differential, elliptic, etc.) are being merged under a more general analytic theory of regularized products of certain sequences satisfying a few basic axioms. The most basic examples consist of the sequence of natural numbers, the sequence of zeros with positive imaginary part of the Riemann zeta function, and the sequence of eigenvalues, say of a positive Laplacian on a compact or certain cases of non-compact manifolds. The resulting theory is applicable to ergodic theory and dynamical systems; to the zeta and L-functions of number theory or representation theory and modular forms; to Selberg-like zeta functions; and to the theory of regularized determinants familiar in physics and other parts of mathematics. Aside from presenting a systematic account of widely scattered results, the theory also provides new results. One part of the theory deals with complex analytic properties, and another part deals with Fourier analys...

  16. Chiral Perturbation Theory With Lattice Regularization

    CERN Document Server

    Ouimet, P P A

    2005-01-01

    In this work, alternative methods to regularize chiral perturbation theory are discussed. First, Long Distance Regularization will be considered in the presence of the decuplet of the lightest spin 3/2 baryons for several different observables. This serves as motivation and introduction to the use of the lattice regulator for chiral perturbation theory. The mesonic, baryonic and anomalous sectors of chiral perturbation theory will be formulated on a lattice of space-time points. The consistency of the lattice as a regulator will be discussed in the context of the meson and baryon masses. Order-a effects will also be discussed for the baryon masses, sigma terms and magnetic moments. The work will close with an attempt to derive an effective Wess-Zumino-Witten Lagrangian for Wilson fermions at non-zero a. Following this discussion, there will be a proposal for a phenomenologically useful WZW Lagrangian at non-zero a.

  17. Testing the Equivalence of Regular Languages

    Directory of Open Access Journals (Sweden)

    Marco Almeida

    2009-07-01

    Full Text Available The minimal deterministic finite automaton is generally used to decide regular language equality. Antimirov and Mosses proposed a rewrite system for deciding regular expression equivalence, of which Almeida et al. presented an improved variant. Hopcroft and Karp proposed an almost linear algorithm for testing the equivalence of two deterministic finite automata that avoids minimisation. In this paper we improve the best-case running time, present an extension of this algorithm to non-deterministic finite automata, and establish a relationship between this algorithm and the one proposed in Almeida et al. We also present some experimental comparative results. All these algorithms are closely related to the recent coalgebraic approach to automata proposed by Rutten.
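
    A compact sketch of the Hopcroft-Karp style test mentioned above, assuming complete DFAs over a shared alphabet whose (disjoint) state sets are merged into one transition table; the data layout and names are illustrative, not from the paper.

        def dfa_equivalent(start1, start2, accept, delta, alphabet):
            """Near-linear equivalence test without minimisation.
            accept: set of accepting states; delta[(state, symbol)] -> state."""
            parent = {}

            def find(x):
                parent.setdefault(x, x)
                while parent[x] != x:
                    parent[x] = parent[parent[x]]   # path halving
                    x = parent[x]
                return x

            stack = [(start1, start2)]
            parent[find(start2)] = find(start1)
            while stack:
                p, q = stack.pop()
                if (p in accept) != (q in accept):
                    return False                    # distinguishing pair found
                for a in alphabet:
                    rp, rq = find(delta[(p, a)]), find(delta[(q, a)])
                    if rp != rq:
                        parent[rp] = rq             # merge the two classes
                        stack.append((delta[(p, a)], delta[(q, a)]))
            return True

    Because each merge shrinks the number of equivalence classes, only linearly many pairs are ever pushed, which is the source of the almost-linear running time discussed in the paper.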

  18. Extreme values, regular variation and point processes

    CERN Document Server

    Resnick, Sidney I

    1987-01-01

    Extreme Values, Regular Variation and Point Processes is a readable and efficient account of the fundamental mathematical and stochastic process techniques needed to study the behavior of extreme values of phenomena based on independent and identically distributed random variables and vectors. It presents a coherent treatment of the distributional and sample path fundamental properties of extremes and records. It emphasizes the core primacy of three topics necessary for understanding extremes: the analytical theory of regularly varying functions; the probabilistic theory of point processes and random measures; and the link to asymptotic distribution approximations provided by the theory of weak convergence of probability measures in metric spaces. The book is self-contained and requires an introductory measure-theoretic course in probability as a prerequisite. Almost all sections have an extensive list of exercises which extend developments in the text, offer alternate approaches, test mastery and provide for enj...

  19. Spatio-Temporal Regularization for Longitudinal Registration to Subject-Specific 3d Template

    OpenAIRE

    Guizard, Nicolas; Fonov, Vladimir S.; García-Lorenzo, Daniel; Nakamura, Kunio; Aubert-Broche, Bérengère; Collins, D. Louis

    2015-01-01

    Neurodegenerative diseases such as Alzheimer's disease present subtle anatomical brain changes before the appearance of clinical symptoms. Manual structure segmentation is long and tedious and although automatic methods exist, they are often performed in a cross-sectional manner where each time-point is analyzed independently. With such analysis methods, bias, error and longitudinal noise may be introduced. Noise due to MR scanners and other physiological effects may also introduce variabilit...

  20. Long-period seismic events with strikingly regular temporal patterns on Katla volcano's south flank (Iceland)

    OpenAIRE

    Sgattoni, Giulia; Jeddi, Zeinab; Guðmundsson, Ólafur; Einarsson, Páll; Tryggvason, Ari; LUND, Björn; Lucchi, Federico

    2015-01-01

    Katla is a threatening volcano in Iceland, partly covered by the Mýrdalsjökull ice cap. The volcano has a large caldera with several active geothermal areas. A peculiar cluster of long-period seismic events started on Katla's south flank in July 2011, during an unrest episode in the caldera that culminated in a glacier outburst. The seismic events were tightly clustered at shallow depth in the Gvendarfell area, 4 km south of the caldera, under a small glacier stream on the southern margin...

  1. Regularization in Orbital Mechanics; Theory and Practice

    Science.gov (United States)

    Roa, Javier

    2017-09-01

    Regularized equations of motion can improve numerical integration for the propagation of orbits, and simplify the treatment of mission design problems. This monograph discusses standard techniques and recent research in the area. While each scheme is derived analytically, its accuracy is investigated numerically. Algebraic and topological aspects of the formulations are studied, as well as their application to practical scenarios such as spacecraft relative motion and new low-thrust trajectories.

  2. Implementing regularization implicitly via approximate eigenvector computation

    OpenAIRE

    Mahoney, Michael W.; Orecchia, Lorenzo

    2010-01-01

    Regularization is a powerful technique for extracting useful information from noisy data. Typically, it is implemented by adding some sort of norm constraint to an objective function and then exactly optimizing the modified objective function. This procedure often leads to optimization problems that are computationally more expensive than the original problem, a fact that is clearly problematic if one is interested in large-scale applications. On the other hand, a large body of empirical work...

  3. Regular black hole in three dimensions

    OpenAIRE

    Myung, Yun Soo; Yoon, Myungseok

    2008-01-01

    We find a new black hole in three-dimensional anti-de Sitter space by introducing an anisotropic perfect fluid inspired by the noncommutative black hole. This is a regular black hole with two horizons. We compare the thermodynamics of this black hole with that of the non-rotating BTZ black hole. The first law of thermodynamics is not compatible with the Bekenstein-Hawking entropy.

  4. Bifurcations in the regularized Ericksen bar model

    OpenAIRE

    Grinfeld, M.; Lord, G. J. (Gabriel J.)

    2007-01-01

    We consider the regularized Ericksen model of an elastic bar on an elastic foundation on an interval with Dirichlet boundary conditions as a two-parameter bifurcation problem. We explore, using local bifurcation analysis and continuation methods, the structure of bifurcations from double zero eigenvalues. Our results provide evidence in support of Müller's conjecture [Müller] concerning the symmetry of local minimizers of the associated energy functional and describe in detail the stru...

  5. On bigraded regularities of Rees algebra

    Indian Academy of Sciences (India)

    For any homogeneous ideal $I$ in $K[x_1, \ldots, x_n]$ of analytic spread $\ell$, we show that for the Rees algebra $R(I)$, $\operatorname{reg}^{\text{syz}}_{(0,1)}(R(I)) = \operatorname{reg}^{T}_{(0,1)}(R(I))$. We compute a formula for the $(0,1)$-regularity of $R(I)$, which is a bigraded analog of Theorem 1.1 of Aramova and Herzog (Am. J. Math. 122(4) ...

  6. Preconditioners for regularized saddle point matrices

    Czech Academy of Sciences Publication Activity Database

    Axelsson, Owe

    2011-01-01

    Vol. 19, No. 2 (2011), pp. 91-112. ISSN 1570-2820. Institutional research plan: CEZ:AV0Z30860518. Keywords: saddle point matrices; preconditioning; regularization; eigenvalue clustering. Subject RIV: BA - General Mathematics. Impact factor: 0.533, year: 2011. http://www.degruyter.com/view/j/jnma.2011.19.issue-2/jnum.2011.005/jnum.2011.005.xml

  7. A short proof of increased parabolic regularity

    Directory of Open Access Journals (Sweden)

    Stephen Pankavich

    2015-08-01

    Full Text Available We present a short proof of the increased regularity obtained by solutions to uniformly parabolic partial differential equations. Though this setting is fairly introductory, our new method of proof, which uses a priori estimates and an inductive method, can be extended to prove analogous results for problems with time-dependent coefficients, advection-diffusion or reaction diffusion equations, and nonlinear PDEs even when other tools, such as semigroup methods or the use of explicit fundamental solutions, are unavailable.

  8. Regularity bounds on Zakharov system evolutions

    Directory of Open Access Journals (Sweden)

    James Colliander

    2002-08-01

    Full Text Available Spatial regularity properties of certain global-in-time solutions of the Zakharov system are established. In particular, the evolving solution $u(t)$ is shown to satisfy an estimate $\|u(t)\|_{H^s} \leq C |t|^{(s-1)+}$, where $H^s$ is the standard spatial Sobolev norm. The proof is an adaptation of earlier work on the nonlinear Schrödinger equation which reduces matters to bilinear estimates.

  9. Automatic detection of regularly repeating vocalizations

    Science.gov (United States)

    Mellinger, David

    2005-09-01

    Many animal species produce repetitive sounds at regular intervals. This regularity can be used for automatic recognition of the sounds, providing improved detection at a given signal-to-noise ratio. Here, the detection of sperm whale sounds is examined. Sperm whales produce highly repetitive "regular clicks" at periods of about 0.2-2 s, and faster click trains in certain behavioral contexts. The following detection procedure was tested: a spectrogram was computed; values within a certain frequency band were summed; time windowing was applied; each windowed segment was autocorrelated; and the maximum of the autocorrelation within a certain periodicity range was chosen. This procedure was tested on sets of recordings containing sperm whale sounds and interfering sounds, both low-frequency recordings from autonomous hydrophones and high-frequency ones from towed hydrophone arrays. An optimization procedure iteratively varies detection parameters (spectrogram frame length and frequency range, window length, periodicity range, etc.). Performance of various sets of parameters was measured by setting a standard level of allowable missed calls, and the resulting optimum parameters are described. Performance is also compared to that of a neural network trained using the data sets. The method is also demonstrated for sounds of blue whales, minke whales, and seismic airguns. [Funding from ONR.]
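
    A rough sketch of the detection chain as described (spectrogram, in-band summation, time windowing, autocorrelation, peak within a periodicity range); every parameter value and name here is illustrative, not taken from the study.

        import numpy as np
        from scipy.signal import spectrogram

        def periodicity_scores(x, fs, f_lo, f_hi, win_s=10.0, p_lo=0.2, p_hi=2.0):
            f, t, S = spectrogram(x, fs=fs, nperseg=1024, noverlap=512)
            band = S[(f >= f_lo) & (f <= f_hi)].sum(axis=0)   # sum the band
            band = band - band.mean()
            dt = t[1] - t[0]
            n = int(win_s / dt)                               # window length
            scores = []
            for i in range(0, len(band) - n, n // 2):
                seg = band[i:i + n]
                ac = np.correlate(seg, seg, mode='full')[n - 1:]
                ac = ac / (ac[0] + 1e-12)                     # normalise at lag 0
                lo = max(1, int(p_lo / dt))                   # periodicity range
                hi = min(len(ac) - 1, int(p_hi / dt))
                scores.append(ac[lo:hi + 1].max())
            return np.array(scores)

    A detection is declared where the score exceeds a threshold; the optimization loop described in the abstract would tune the band, window length, and periodicity range against labelled recordings.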

  10. Hyperspectral Image Recovery via Hybrid Regularization.

    Science.gov (United States)

    Arablouei, Reza; de Hoog, Frank

    2016-09-27

    Natural images tend to mostly consist of smooth regions with individual pixels having highly correlated spectra. This information can be exploited to recover hyperspectral images of natural scenes from their incomplete and noisy measurements. To perform the recovery while taking full advantage of the prior knowledge, we formulate a composite cost function containing a square-error data-fitting term and two distinct regularization terms pertaining to spatial and spectral domains. The regularization for the spatial domain is the sum of total-variation of the image frames corresponding to all spectral bands. The regularization for the spectral domain is the ℓ1-norm of the coefficient matrix obtained by applying a suitable sparsifying transform to the spectra of the pixels. We use an accelerated proximal-subgradient method to minimize the formulated cost function. We analyse the performance of the proposed algorithm and prove its convergence. Numerical simulations using real hyperspectral images exhibit that the proposed algorithm offers an excellent recovery performance with a number of measurements that is only a small fraction of the hyperspectral image data size. Simulation results also show that the proposed algorithm significantly outperforms an accelerated proximal-gradient algorithm that solves the classical basis-pursuit denoising problem to recover the hyperspectral image.
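
    A sketch of the composite cost described above, assuming an anisotropic discretization of the spatial total variation and a DCT along the spectral axis as the "suitable sparsifying transform"; the transform choice, the sampling-mask data model, and all names are assumptions for illustration. The paper minimizes this kind of cost with an accelerated proximal-subgradient method.

        import numpy as np
        from scipy.fft import dct

        def hybrid_cost(X, Y, M, lam_tv, lam_sp):
            """X: (bands, rows, cols) estimate; Y: noisy samples on mask M."""
            fit = 0.5 * np.sum((M * X - Y) ** 2)              # data fitting
            tv = (np.abs(np.diff(X, axis=1)).sum()            # spatial TV,
                  + np.abs(np.diff(X, axis=2)).sum())         # per band
            sp = np.abs(dct(X, axis=0, norm='ortho')).sum()   # spectral sparsity
            return fit + lam_tv * tv + lam_sp * sp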

  11. Bypassing the Limits of L1 Regularization: Convex Sparse Signal Processing Using Non-Convex Regularization

    Science.gov (United States)

    Parekh, Ankit

    Sparsity has become the basis of some important signal processing methods over the last ten years. Many signal processing problems (e.g., denoising, deconvolution, non-linear component analysis) can be expressed as inverse problems. Sparsity is invoked through the formulation of an inverse problem with suitably designed regularization terms. The regularization terms alone encode sparsity into the problem formulation. Often, the ℓ1 norm is used to induce sparsity, so much so that ℓ1 regularization is considered to be `modern least-squares'. The use of ℓ1 norm, as a sparsity-inducing regularizer, leads to a convex optimization problem, which has several benefits: the absence of extraneous local minima, well developed theory of globally convergent algorithms, even for large-scale problems. Convex regularization via the ℓ1 norm, however, tends to under-estimate the non-zero values of sparse signals. In order to estimate the non-zero values more accurately, non-convex regularization is often favored over convex regularization. However, non-convex regularization generally leads to non-convex optimization, which suffers from numerous issues: convergence may be guaranteed to only a stationary point, problem specific parameters may be difficult to set, and the solution is sensitive to the initialization of the algorithm. The first part of this thesis is aimed toward combining the benefits of non-convex regularization and convex optimization to estimate sparse signals more effectively. To this end, we propose to use parameterized non-convex regularizers with designated non-convexity and provide a range for the non-convex parameter so as to ensure that the objective function is strictly convex. By ensuring convexity of the objective function (sum of data-fidelity and non-convex regularizer), we can make use of a wide variety of convex optimization algorithms to obtain the unique global minimum reliably. The second part of this thesis proposes a non-linear signal
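
    One concrete instance of a parameterized non-convex regularizer with the convexity-preserving property described above: for the scalar denoising problem min_x (1/2)(y - x)^2 + phi(x), a minimax-concave type penalty with thresholds lam < mu keeps the overall objective convex while shrinking large coefficients less than the soft threshold does. The closed-form minimizer is the firm threshold; parameter names here are illustrative.

        import numpy as np

        def firm_threshold(y, lam, mu):
            """Minimizer under a minimax-concave type penalty (requires
            mu > lam, which keeps the scalar objective convex): zero below
            lam, identity above mu, linear interpolation in between."""
            a = np.abs(y)
            return np.where(a <= lam, 0.0,
                   np.where(a >= mu, y,
                            np.sign(y) * mu * (a - lam) / (mu - lam)))

    Unlike the soft threshold, values above mu pass through unshrunk, which is the bias reduction the thesis is after.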

  12. Votail: A Formally Specified and Verified Ballot Counting System for Irish PR-STV Elections

    DEFF Research Database (Denmark)

    Cochran, Dermot Robert; Kiniry, Joseph Roland

    2010-01-01

    Votail is an open source Java implementation of Irish Proportional Representation by Single Transferable Vote (PR-STV). Its functional requirements, derived from Irish electoral law, are formally specified using the Business Object Notation (BON) and refined to a Java Modeling Language (JML) specification. Formal methods are used to verify and validate the correctness of the software. This is the first public release of a formally verified PR-STV open source system for ballot counting and the most recent of only about half a dozen releases of formally verified e-voting software...

  13. The Agrarian Regularization for Social Interest in Law 11.977 of July 7, 2009: The Excess Demanded in the Legitimacy of Possession Procedure

    OpenAIRE

    Faria, Edimur Ferreira de; Matosinhos, Ana Paula

    2016-01-01

    The purpose of this article is to analyze the institute of agrarian regularization for social interest provided for in Law 11.977/2009, as an innovative and important instrument of public policy in the reordering of urban spaces and the consequent social inclusion, mainly of the low-income population. This study will address its phases, focusing on the legitimacy of possession. It will be verified that agrarian regularization for social interest actually has a relevance in the organization of ...

  14. How to Verify Plagiarism of the Paper Written in Macedonian and Translated in Foreign Language?

    Directory of Open Access Journals (Sweden)

    Mirko Spiroski

    2016-02-01

    CONCLUSION: Plagiarism of original papers written in Macedonian and translated into other languages can be verified after computerised translation into those languages. The original and translated documents can then be compared with available software for plagiarism detection.

  15. Regular-, irregular-, and pseudo-character processing in Chinese: The regularity effect in normal adult readers

    Directory of Open Access Journals (Sweden)

    Dustin Kai Yan Lau

    2014-03-01

    Full Text Available Background: Unlike alphabetic languages, Chinese uses a logographic script. However, the phonetic radical of many characters has the same pronunciation as the character as a whole. These are considered regular characters and can be read through a lexical non-semantic route (Weekes & Chen, 1999). Pseudocharacters are another way to study this non-semantic route. A pseudocharacter is the combination of existing semantic and phonetic radicals in their legal positions, resulting in a non-existing character (Ho, Chan, Chung, Lee, & Tsang, 2007). Pseudocharacters can be pronounced by direct derivation from the sound of the phonetic radical. Conversely, if the pronunciation of a character does not follow that of the phonetic radical, it is considered irregular and can only be correctly read through the lexical-semantic route. The aim of the current investigation was to examine reading aloud in normal adults. We hypothesized that the regularity effect, previously described for alphabetical scripts and acquired dyslexic patients of Chinese (Weekes & Chen, 1999; Wu, Liu, Sun, Chromik, & Zhang, 2014), would also be present in normal adult Chinese readers. Method: Participants. Thirty (50% female) native Hong Kong Cantonese speakers with a mean age of 19.6 years and a mean education of 12.9 years. Stimuli. Sixty regular-, 60 irregular-, and 60 pseudo-characters (with at least 75% name agreement in Chinese) were matched by initial phoneme, number of strokes and family size. Additionally, regular- and irregular-characters were matched by frequency (low) and consistency. Procedure. Each participant was asked to read aloud the stimuli presented on a laptop using the DMDX software. The order of stimuli presentation was randomized. Data analysis. ANOVAs were carried out by participants and items with RTs and errors as dependent variables and type of stimuli (regular-, irregular- and pseudo-character) as repeated measures (F1 or between subject...

  16. Inductive Temporal Logic Programming

    OpenAIRE

    Kolter, Robert

    2009-01-01

    We study the extension of techniques from Inductive Logic Programming (ILP) to temporal logic programming languages. To this end, we present two temporal logic programming languages and analyse the learnability of programs in these languages from finite sets of examples. In first-order temporal logic the following topics are analysed: - How can we characterize the denotational semantics of programs? - Which proof techniques are best suited? - How complex is the learning task? In propositional ...

  17. Convergence and Fluctuations of Regularized Tyler Estimators

    Science.gov (United States)

    Kammoun, Abla; Couillet, Romain; Pascal, Ferderic; Alouini, Mohamed-Slim

    2016-02-01

    This article studies the behavior of regularized Tyler estimators (RTEs) of scatter matrices. The key advantages of these estimators are twofold. First, they guarantee by construction a good conditioning of the estimate and, second, being a derivative of robust Tyler estimators, they inherit their robustness properties, notably their resilience to the presence of outliers. Nevertheless, one major problem posed by the use of RTEs in practice is the question of setting the regularization parameter $\rho$. While a high value of $\rho$ is likely to push all the eigenvalues away from zero, it comes at the cost of a larger bias with respect to the population covariance matrix. A deep understanding of the statistics of RTEs is essential to come up with appropriate choices for the regularization parameter. This is not an easy task and might be out of reach, unless one considers asymptotic regimes wherein the number of observations $n$ and/or their size $N$ increase together. First asymptotic results have recently been obtained under the assumption that $N$ and $n$ are large and commensurable. Interestingly, no results concerning the regime of $n$ going to infinity with $N$ fixed exist, even though the investigation of this assumption has usually predated the analysis of the most difficult $N$ and $n$ large case. This motivates our work. In particular, we prove in the present paper that the RTEs converge to a deterministic matrix when $n\to\infty$ with $N$ fixed, which is expressed as a function of the theoretical covariance matrix. We also derive the fluctuations of the RTEs around this deterministic matrix and establish that these fluctuations converge in distribution to a multivariate Gaussian distribution with zero mean and a covariance depending on the population covariance and the parameter $\rho$.
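
    A sketch of a standard regularized Tyler fixed-point iteration of the kind analysed here (a Chen-Wiesel-Hero style recursion with trace normalization); the iteration count, the normalization convention, and all names are assumptions for illustration.

        import numpy as np

        def regularized_tyler(X, rho, n_iter=50):
            """X: (n, N) array, rows are the N-dimensional observations;
            rho in (0, 1] is the regularization parameter."""
            n, N = X.shape
            C = np.eye(N)
            for _ in range(n_iter):
                Ci = np.linalg.inv(C)
                q = np.einsum('ij,jk,ik->i', X, Ci, X)   # x_i^T C^{-1} x_i
                S = (X.T / q) @ X                        # sum_i x_i x_i^T / q_i
                C = (1 - rho) * (N / n) * S + rho * np.eye(N)
                C = N * C / np.trace(C)                  # trace normalization
            return C

    The article's asymptotics describe the fixed point this recursion approaches as n grows with N fixed, and the fluctuations around that limit.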

  18. Convergence and fluctuations of Regularized Tyler estimators

    KAUST Repository

    Kammoun, Abla

    2015-10-26

    This article studies the behavior of regularized Tyler estimators (RTEs) of scatter matrices. The key advantages of these estimators are twofold. First, they guarantee by construction a good conditioning of the estimate and, second, being a derivative of robust Tyler estimators, they inherit their robustness properties, notably their resilience to the presence of outliers. Nevertheless, one major problem posed by the use of RTEs in practice is the question of setting the regularization parameter ρ. While a high value of ρ is likely to push all the eigenvalues away from zero, it comes at the cost of a larger bias with respect to the population covariance matrix. A deep understanding of the statistics of RTEs is essential to come up with appropriate choices for the regularization parameter. This is not an easy task and might be out of reach, unless one considers asymptotic regimes wherein the number of observations n and/or their size N increase together. First asymptotic results have recently been obtained under the assumption that N and n are large and commensurable. Interestingly, no results concerning the regime of n going to infinity with N fixed exist, even though the investigation of this assumption has usually predated the analysis of the most difficult N and n large case. This motivates our work. In particular, we prove in the present paper that the RTEs converge to a deterministic matrix when n → ∞ with N fixed, which is expressed as a function of the theoretical covariance matrix. We also derive the fluctuations of the RTEs around this deterministic matrix and establish that these fluctuations converge in distribution to a multivariate Gaussian distribution with zero mean and a covariance depending on the population covariance and the parameter ρ.

  19. Indeterministic Temporal Logic

    Directory of Open Access Journals (Sweden)

    Trzęsicki Kazimierz

    2015-09-01

    Full Text Available The questions of determinism, causality, and freedom have been the main philosophical problems debated since the beginning of temporal logic. The issue of the logical value of sentences about the future was stated by Aristotle in the famous tomorrow sea-battle passage. The question has inspired Łukasiewicz's idea of many-valued logics and was a motive of A. N. Prior's considerations about the logic of tenses. In the scheme of temporal logic there are different solutions to the problem. In the paper we consider indeterministic temporal logic based on the idea of temporal worlds and the relation of accessibility between them.

  20. Two-pass greedy regular expression parsing

    DEFF Research Database (Denmark)

    Grathwohl, Niels Bjørn Bugge; Henglein, Fritz; Nielsen, Lasse

    2013-01-01

    We present new algorithms for producing greedy parses for regular expressions (REs) in a semi-streaming fashion. Our lean-log algorithm executes in time O(mn) for REs of size m and input strings of size n and outputs a compact bit-coded parse tree representation. It improves on previous algorithms...... by: operating in only 2 passes; using only O(m) words of random-access memory (independent of n); requiring only kn bits of sequentially written and read log storage, where k

  1. Multichannel image regularization using anisotropic geodesic filtering

    Energy Technology Data Exchange (ETDEWEB)

    Grazzini, Jacopo A [Los Alamos National Laboratory

    2010-01-01

    This paper extends a recent image-dependent regularization approach, introduced in earlier work, aiming at edge-preserving smoothing. For that purpose, geodesic distances equipped with a Riemannian metric need to be estimated in local neighbourhoods. By deriving an appropriate metric from the gradient structure tensor, the associated geodesic paths are constrained to follow salient features in images. Building on this, we design a generalized anisotropic geodesic filter, incorporating not only a measure of the edge strength, as in the original method, but also further directional information about the image structures. The proposed filter is particularly efficient at smoothing heterogeneous areas while preserving relevant structures in multichannel images.

  2. Regular and Chaotic Dynamics of Flexible Plates

    Directory of Open Access Journals (Sweden)

    J. Awrejcewicz

    2014-01-01

    Full Text Available Nonlinear dynamics of flexible rectangular plates subjected to the action of longitudinal and time-periodic load distributed on the plate perimeter is investigated. Applying both classical Fourier and wavelet analysis, we illustrate three different Feigenbaum-type scenarios of transition from regular to chaotic dynamics. We show that the system vibrations change with respect not only to the change of control parameters, but also to all fixed parameters (system dynamics changes when the independent variable, time, increases). In addition, we show that chaotic dynamics may appear also after the second Hopf bifurcation. Curves of equal deflections (isoclines) lose their previous symmetry while transiting into chaotic vibrations.

  3. Spherical Averages on Regular and Semiregular Graphs

    OpenAIRE

    Douma, Femke

    2008-01-01

    In 1966, P. Guenther proved the following result: Given a continuous function f on a compact surface M of constant curvature -1 and its periodic lift g to the universal covering, the hyperbolic plane, then the averages of the lift g over increasing spheres converge to the average of the function f over the surface M. In this article, we prove similar results for functions on the vertices and edges of regular and semiregular graphs, with special emphasis on the convergence rate. We also consid...

  4. Equivariant semidefinite lifts of regular polygons

    OpenAIRE

    Fawzi, Hamza; Saunderson, James; Parrilo, Pablo A.

    2014-01-01

    Given a polytope P in $\\mathbb{R}^n$, we say that P has a positive semidefinite lift (psd lift) of size d if one can express P as the linear projection of an affine slice of the positive semidefinite cone $\\mathbf{S}^d_+$. If a polytope P has symmetry, we can consider equivariant psd lifts, i.e. those psd lifts that respect the symmetry of P. One of the simplest families of polytopes with interesting symmetries are regular polygons in the plane, which have played an important role in the stud...

  5. Equivariant semidefinite lifts of regular polygons

    OpenAIRE

    Fawzi, Hamza; Saunderson, J.; Parrilo, PA

    2017-01-01

    Given a polytope P in $\\mathbb{R}^n$, we say that P has a positive semidefinite lift (psd lift) of size d if one can express P as the linear projection of an affine slice of the positive semidefinite cone $\\mathbf{S}^d_+$. If a polytope P has symmetry, we can consider equivariant psd lifts, i.e. those psd lifts that respect the symmetry of P. One of the simplest families of polytopes with interesting symmetries are regular polygons in the plane, which have played an importan...

  6. Foaming behaviour of organic and regular milk

    OpenAIRE

    Pijnenburg, J.; Sala, G.; Valenberg, van, H.J.F.; Meinders, M.B.J.

    2012-01-01

    Organic milk is used more and more by consumers to froth milk that is used e.g. for the preparation of a cappuccino. Frequently, organic milk turns out not to foam properly. This report describes a study to find the main cause of this poor foamability of organic milk. The focus of the research was to gain insight into the foaming behaviour of a specific brand, indicated as A. The foamability and stability of different milks, both organic and regular, as well as skimmed, semi-skimmed, and full fat, ...

  7. Wave regularity in curve integrable spacetimes

    CERN Document Server

    Sanchez, Yafet Sanchez

    2015-01-01

    The idea of defining a gravitational singularity as an obstruction to the dynamical evolution of a test field (described by a PDE) rather than the dynamical evolution of a particle (described by geodesics) is explored. In particular, the concept of wave regularity is introduced, which serves to show that the classical singularities in curve integrable spacetimes do not interrupt the well-posedness of the wave equation. The techniques used also provide arguments that can be extended to establish when a classically singular spacetime remains singular in a semi-classical picture.

  8. Electronic Structure of Regular Bacterial Surface Layers

    Science.gov (United States)

    Vyalikh, Denis V.; Danzenbächer, Steffen; Mertig, Michael; Kirchner, Alexander; Pompe, Wolfgang; Dedkov, Yuriy S.; Molodtsov, Serguei L.

    2004-12-01

    We report photoemission and near-edge x-ray absorption fine structure measurements of the occupied and unoccupied valence electronic states of the regular surface layer of Bacillus sphaericus, which is widely used as the protein template for the fabrication of metallic nanostructures. The two-dimensional protein crystal shows a semiconductorlike behavior with a gap value of ˜3.0 eV and the Fermi energy close to the bottom of the lowest unoccupied molecular orbital. We anticipate that these results will open up new possibilities for the electric addressability of biotemplated low-dimensional hybrid structures.

  9. Strategies for regular segmented reductions on GPU

    DEFF Research Database (Denmark)

    Larsen, Rasmus Wriedt; Henriksen, Troels

    2017-01-01

    We present and evaluate an implementation technique for regular segmented reductions on GPUs. Existing techniques tend to be either consistent in performance but relatively inefficient in absolute terms, or optimised for specific workloads and thereby exhibiting bad performance for certain inputs. ... Although our implementation is in the context of the Futhark compiler, the implementation technique is applicable to any library or language that has a need for segmented reductions. We evaluate the technique on four microbenchmarks, two of which we also compare to implementations in the CUB library for GPU programming, as well as on two...

  10. A prospective randomised study of dense Infinity cytological brush versus regularly used brush in pancreaticobiliary malignancy.

    Science.gov (United States)

    Kylänpää, Leena; Boyd, Sonja; Ristimäki, Ari; Lindström, Outi; Udd, Marianne; Halttunen, Jorma

    2016-01-01

    Endoscopic retrograde cholangiopancreatography (ERCP) with a cytological sample is a valuable tool in the diagnosis of the aetiology of biliary stricture. Our aim was to evaluate whether a more dense Infinity® cytological brush is more sensitive in diagnosing malignancy than the regularly used brush. We recruited 60 patients with a biliary stricture suspicious for malignancy for a randomised controlled trial. Patients were randomly assigned to an Infinity® brush group (n = 30) and a regularly used cytology brush group (n = 30). All the patients had verified cancer during follow-up. Crossing the brush over the stricture was possible in each case without dilatation of the biliary duct. Brush cytology yield was good or excellent in 86.7% of cases with the Infinity® brush and 96.7% with the regular brush (p = 0.161). The cytological sample showed clear malignancy in three patients (10.0%) in the Infinity® group and in 12 (40.0%) patients of the regular brush group (p = 0.007). The cytological diagnosis was highly suspicious for malignancy or malignant in 14 patients (46.7%) in the Infinity® group and in 23 patients (76.7%) in the regular brush group (p = 0.017). The result was benign in 10 patients (33.3%) in the Infinity® group and in four patients (13.6%) in the regular brush group (p = 0.067). With the standardised technique, the sensitivity of brush cytology is fairly good. The dense Infinity® brush does not show any advantage regarding sensitivity compared with the conventional cytology brush.

  11. Learning optimal spatially-dependent regularization parameters in total variation image denoising

    Science.gov (United States)

    Van Chung, Cao; De los Reyes, J. C.; Schönlieb, C. B.

    2017-07-01

    We consider a bilevel optimization approach in function space for the choice of spatially dependent regularization parameters in TV image denoising models. First- and second-order optimality conditions for the bilevel problem are studied when the spatially-dependent parameter belongs to the Sobolev space $H^1(\Omega)$. A combined Schwarz domain decomposition-semismooth Newton method is proposed for the solution of the full optimality system and local superlinear convergence of the semismooth Newton method is verified. Exhaustive numerical computations are finally carried out to show the suitability of the approach.
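
    For concreteness, the lower-level denoising objective uses a spatially dependent weight multiplying the local total variation contribution, and the bilevel problem learns that weight; a minimal sketch of evaluating such a weighted anisotropic TV term (the discretization and names are illustrative, not the paper's exact formulation).

        import numpy as np

        def weighted_tv(u, alpha):
            """u: 2D image; alpha: nonnegative weight image of the same
            shape (the spatially dependent regularization parameter)."""
            gx = np.abs(np.diff(u, axis=0, append=u[-1:, :]))
            gy = np.abs(np.diff(u, axis=1, append=u[:, -1:]))
            return np.sum(alpha * (gx + gy))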

  12. Special microstructured fibers with irregular and regular claddings for supercontinuum generation

    Science.gov (United States)

    Minkovich, Vladimir P.; Vaca Pereira G., M.; Villatoro, Joel; Sotsky, Alexander B.; Illarramendi, Maria Asunción; Zubia, Joseba

    2017-08-01

    In this report, we describe in detail the fabrication of a special (3 rings of air-holes) index-guiding air-silica microstructured optical fiber (IG MOF) with different air-hole diameters in the cladding (irregular cladding) and its application to broadband supercontinuum (SC) generation by femtosecond laser pulses. For comparison, supercontinuum generation in a special nonlinear air-silica IG MOF with regular cladding is also investigated. Dispersion properties of the investigated fibers were numerically predicted and experimentally verified. Broadband SC generation from visible wavelengths up to 1600 nm was observed in such fibers, both 1 m in length.

  13. How color, regularity, and good Gestalt determine backward masking.

    Science.gov (United States)

    Sayim, Bilge; Manassi, Mauro; Herzog, Michael

    2014-06-18

    The strength of visual backward masking depends on the stimulus onset asynchrony (SOA) between target and mask. Recently, it was shown that the conjoint spatial layout of target and mask is as crucial as SOA. Particularly, masking strength depends on whether target and mask group with each other. The same is true in crowding where the global spatial layout of the flankers and target-flanker grouping determine crowding strength. Here, we presented a vernier target followed by different flanker configurations at varying SOAs. Similar to crowding, masking of a red vernier target was strongly reduced for arrays of 10 green compared with 10 red flanking lines. Unlike crowding, single green lines flanking the red vernier showed strong masking. Irregularly arranged flanking lines yielded stronger masking than did regularly arranged lines, again similar to crowding. While cuboid flankers reduced crowding compared with single lines, this was not the case in masking. We propose that, first, masking is reduced when the flankers are part of a larger spatial structure. Second, spatial factors counteract color differences between the target and the flankers. Third, complex Gestalts, such as cuboids, seem to need longer processing times to show ungrouping effects as observed in crowding. Strong parallels between masking and crowding suggest similar underlying mechanisms; however, temporal factors in masking additionally modulate performance, acting as an additional grouping cue. © 2014 ARVO.

  14. Human action recognition with group lasso regularized-support vector machine

    Science.gov (United States)

    Luo, Huiwu; Lu, Huanzhang; Wu, Yabei; Zhao, Fei

    2016-05-01

    The bag-of-visual-words (BOVW) and Fisher kernel are two popular models in human action recognition, and support vector machine (SVM) is the most commonly used classifier for the two models. We show two kinds of group structures in the feature representation constructed by BOVW and Fisher kernel, respectively, since the structural information of feature representation can be seen as a prior for the classifier and can improve the performance of the classifier, which has been verified in several areas. However, the standard SVM employs L2-norm regularization in its learning procedure, which penalizes each variable individually and cannot express the structural information of feature representation. We replace the L2-norm regularization with group lasso regularization in standard SVM, and a group lasso regularized-support vector machine (GLRSVM) is proposed. Then, we embed the group structural information of feature representation into GLRSVM. Finally, we introduce an algorithm to solve the optimization problem of GLRSVM by the alternating direction method of multipliers. The experiments evaluated on KTH, YouTube, and Hollywood2 datasets show that our method achieves promising results and improves the state-of-the-art methods on KTH and YouTube datasets.
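
    The group lasso regularizer replaces the usual squared L2 penalty with a sum of Euclidean norms over predefined feature groups, which drives whole groups of weights to zero together; its proximal operator (block soft-thresholding) is the building block an ADMM solver such as the one described applies at every iteration. A sketch under these assumptions, with illustrative names:

        import numpy as np

        def group_lasso_penalty(w, groups):
            """sum_g ||w_g||_2 for a list of index arrays 'groups'."""
            return sum(np.linalg.norm(w[g]) for g in groups)

        def block_soft_threshold(w, groups, t):
            """Proximal operator of t * group_lasso_penalty."""
            out = w.copy()
            for g in groups:
                nrm = np.linalg.norm(w[g])
                out[g] = 0.0 if nrm <= t else (1.0 - t / nrm) * w[g]
            return out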

  15. Supporting Regularized Logistic Regression Privately and Efficiently.

    Science.gov (United States)

    Li, Wenfa; Liu, Hongzhe; Yang, Peng; Xie, Wei

    2016-01-01

    As one of the most popular statistical and machine learning models, logistic regression with regularization has found wide adoption in biomedicine, social sciences, information technology, and so on. These domains often involve data of human subjects that are contingent upon strict privacy regulations. Concerns over data privacy make it increasingly difficult to coordinate and conduct large-scale collaborative studies, which typically rely on cross-institution data sharing and joint analysis. Our work here focuses on safeguarding regularized logistic regression, a widely-used statistical model while at the same time has not been investigated from a data security and privacy perspective. We consider a common use scenario of multi-institution collaborative studies, such as in the form of research consortia or networks as widely seen in genetics, epidemiology, social sciences, etc. To make our privacy-enhancing solution practical, we demonstrate a non-conventional and computationally efficient method leveraging distributing computing and strong cryptography to provide comprehensive protection over individual-level and summary data. Extensive empirical evaluations on several studies validate the privacy guarantee, efficiency and scalability of our proposal. We also discuss the practical implications of our solution for large-scale studies and applications from various disciplines, including genetic and biomedical studies, smart grid, network analysis, etc.

  16. Multiple graph regularized nonnegative matrix factorization

    KAUST Repository

    Wang, Jim Jing-Yan

    2013-10-01

    Non-negative matrix factorization (NMF) has been widely used as a data representation method based on components. To overcome the disadvantage of NMF in failing to consider the manifold structure of a data set, graph regularized NMF (GrNMF) has been proposed by Cai et al. by constructing an affinity graph and searching for a matrix factorization that respects graph structure. Selecting a graph model and its corresponding parameters is critical for this strategy. This process is usually carried out by cross-validation or discrete grid search, which are time consuming and prone to overfitting. In this paper, we propose a GrNMF, called MultiGrNMF, in which the intrinsic manifold is approximated by a linear combination of several graphs with different models and parameters inspired by ensemble manifold regularization. Factorization metrics and linear combination coefficients of graphs are determined simultaneously within a unified objective function. They are alternately optimized in an iterative algorithm, thus resulting in a novel data representation algorithm. Extensive experiments on a protein subcellular localization task and an Alzheimer's disease diagnosis task demonstrate the effectiveness of the proposed algorithm. © 2013 Elsevier Ltd. All rights reserved.
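
    For reference, a sketch of the single-graph building block (graph regularized NMF with the multiplicative updates of Cai et al.) that MultiGrNMF extends by combining several graphs; the initialization, iteration count, and names are illustrative.

        import numpy as np

        def grnmf(X, W, k, lam, n_iter=200, eps=1e-9):
            """min ||X - U V^T||_F^2 + lam * Tr(V^T L V), with L = D - W.
            X: (m, n) nonnegative data; W: (n, n) affinity graph."""
            m, n = X.shape
            D = np.diag(W.sum(axis=1))
            rng = np.random.default_rng(0)
            U = rng.random((m, k))
            V = rng.random((n, k))
            for _ in range(n_iter):
                U *= (X @ V) / (U @ (V.T @ V) + eps)
                V *= (X.T @ U + lam * (W @ V)) / (V @ (U.T @ U) + lam * (D @ V) + eps)
            return U, V

    MultiGrNMF's extra step is to make the graph term a learned combination of several candidate graphs inside the same objective.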

  17. Supporting Regularized Logistic Regression Privately and Efficiently.

    Directory of Open Access Journals (Sweden)

    Wenfa Li

    Full Text Available As one of the most popular statistical and machine learning models, logistic regression with regularization has found wide adoption in biomedicine, social sciences, information technology, and so on. These domains often involve data of human subjects that are contingent upon strict privacy regulations. Concerns over data privacy make it increasingly difficult to coordinate and conduct large-scale collaborative studies, which typically rely on cross-institution data sharing and joint analysis. Our work here focuses on safeguarding regularized logistic regression, a widely-used statistical model while at the same time has not been investigated from a data security and privacy perspective. We consider a common use scenario of multi-institution collaborative studies, such as in the form of research consortia or networks as widely seen in genetics, epidemiology, social sciences, etc. To make our privacy-enhancing solution practical, we demonstrate a non-conventional and computationally efficient method leveraging distributing computing and strong cryptography to provide comprehensive protection over individual-level and summary data. Extensive empirical evaluations on several studies validate the privacy guarantee, efficiency and scalability of our proposal. We also discuss the practical implications of our solution for large-scale studies and applications from various disciplines, including genetic and biomedical studies, smart grid, network analysis, etc.

  18. Maze navigation by honeybees: learning path regularity.

    Science.gov (United States)

    Zhang, S; Mizutani, A; Srinivasan, M V

    2000-01-01

    We investigated the ability of honeybees to learn mazes of four types: constant-turn mazes, in which the appropriate turn is always in the same direction in each decision chamber; zig-zag mazes, in which the appropriate turn is alternately left and right in successive decision chambers; irregular mazes, in which there is no readily apparent pattern to the turns; and variable irregular mazes, in which the bees were trained to learn several irregular mazes simultaneously. The bees were able to learn to navigate all four types of maze. Performance was best in the constant-turn mazes, somewhat poorer in the zig-zag mazes, poorer still in the irregular mazes, and poorest in the variable irregular mazes. These results demonstrate that bees do not navigate such mazes simply by memorizing the entire sequence of appropriate turns. Rather, performance in the various configurations depends on the existence of regularity in the structure of the maze and on the ease with which this regularity is recognized and learned.

  19. Handicap Labelings of 4-Regular Graphs

    Directory of Open Access Journals (Sweden)

    Petr Kovar

    2017-01-01

    Full Text Available Let G be a simple graph and let f : V(G) → {1, 2, ..., |V(G)|} be a bijective mapping. The weight of v ∈ V(G) is the sum of the labels of all vertices adjacent to v. We say that f is a distance magic labeling of G if the weight of every vertex is the same constant k, and we say that f is a handicap magic labeling of G if the weight of every vertex v is l + f(v) for some constant l. Graphs that allow such labelings are called distance magic or handicap, respectively. Distance magic and handicap labelings of regular graphs are used for scheduling incomplete tournaments. While distance magic labelings correspond to so-called equalized tournaments, handicap labelings can be used to schedule incomplete tournaments that are more challenging to stronger teams or players; hence they increase competition and yield attractive schemes in which every game counts. We summarize known results on distance magic and handicap labelings and construct a new infinite class of 4-regular handicap graphs.
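
    The two defining conditions translate directly into code. A brute-force sketch, practical only for very small graphs and using our own adjacency-list representation:

```python
from itertools import permutations

def weights(adj, f):
    """Weight of v = sum of labels f[u] over the neighbors u of v."""
    return [sum(f[u] for u in adj[v]) for v in range(len(adj))]

def is_distance_magic(adj, f):
    # every vertex has the same weight k
    return len(set(weights(adj, f))) == 1

def is_handicap(adj, f):
    # w(v) = l + f(v) for one common constant l
    w = weights(adj, f)
    return len({w[v] - f[v] for v in range(len(adj))}) == 1

def find_handicap_labeling(adj):
    """Exhaustive search over bijections V -> {1..n}; exponential cost."""
    n = len(adj)
    for perm in permutations(range(1, n + 1)):
        if is_handicap(adj, list(perm)):
            return list(perm)
    return None
```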

  20. From Regular to Strictly Locally Testable Languages

    Directory of Open Access Journals (Sweden)

    Stefano Crespi Reghizzi

    2011-08-01

    Full Text Available A classical result (often credited to Y. Medvedev) states that every language recognized by a finite automaton is the homomorphic image of a local language over a much larger, so-called local alphabet, namely the alphabet of the edges of the transition graph. Local languages are characterized by the value k = 2 of the sliding-window width in McNaughton and Papert's infinite hierarchy of strictly locally testable languages (k-slt). We generalize Medvedev's result in a new direction, studying the relationship between the width and the alphabetic ratio, which tells how much larger the local alphabet is. We prove that every regular language is the image of a k-slt language on an alphabet of doubled size, where the width depends logarithmically on the automaton size, and we exhibit regular languages for which any smaller alphabetic ratio is insufficient. More generally, we express the trade-off between alphabetic ratio and width as a mathematical relation derived from a careful encoding of the states. Finally, we mention some directions for theoretical development and application.
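
    For readers unfamiliar with strictly locally testable languages: membership in a k-slt language depends only on a sliding window of width k plus the word's two ends. A hedged sketch of that membership test, where the sets `prefixes`, `suffixes`, and `factors` are the language's defining parameters and the short-word case is deliberately simplified:

```python
def slt_member(word, k, prefixes, suffixes, factors):
    """Membership test for a strictly k-locally testable language:
    the (k-1)-prefix, the (k-1)-suffix, and every length-k factor
    (sliding window) must belong to the allowed sets."""
    if len(word) < k:
        return word in prefixes | suffixes   # simplified edge case
    return (word[:k - 1] in prefixes
            and word[-(k - 1):] in suffixes
            and all(word[i:i + k] in factors
                    for i in range(len(word) - k + 1)))

# tiny usage: the local (k=2) language a(b a)* over {a, b}
print(slt_member("ababa", 2, {"a"}, {"a"}, {"ab", "ba"}))  # True
```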

  1. Words cluster phonetically beyond phonotactic regularities.

    Science.gov (United States)

    Dautriche, Isabelle; Mahowald, Kyle; Gibson, Edward; Christophe, Anne; Piantadosi, Steven T

    2017-06-01

    Recent evidence suggests that cognitive pressures associated with language acquisition and use could affect the organization of the lexicon. On one hand, consistent with noisy channel models of language (e.g., Levy, 2008), the phonological distance between wordforms should be maximized to avoid perceptual confusability (a pressure for dispersion). On the other hand, a lexicon with high phonological regularity would be simpler to learn, remember and produce (e.g., Monaghan et al., 2011) (a pressure for clumpiness). Here we investigate wordform similarity in the lexicon, using measures of word distance (e.g., phonological neighborhood density) to ask whether there is evidence for dispersion or clumpiness of wordforms in the lexicon. We develop a novel method to compare lexicons to phonotactically-controlled baselines that provide a null hypothesis for how clumpy or sparse wordforms would be as the result of only phonotactics. Results for four languages, Dutch, English, German and French, show that the space of monomorphemic wordforms is clumpier than what would be expected by the best chance model according to a wide variety of measures: minimal pairs, average Levenshtein distance and several network properties. This suggests a fundamental drive for regularity in the lexicon that conflicts with the pressure for words to be as phonologically distinct as possible. Copyright © 2017 Elsevier B.V. All rights reserved.
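
    The distance measures involved are standard. A minimal sketch of two of the statistics mentioned, minimal-pair counts (edit distance 1) and average Levenshtein distance, which would be compared against the same statistics computed on phonotactically matched baseline lexicons:

```python
from functools import lru_cache

def levenshtein(a, b):
    """Classic edit distance via memoized recursion."""
    @lru_cache(maxsize=None)
    def d(i, j):
        if i == 0:
            return j
        if j == 0:
            return i
        return min(d(i - 1, j) + 1,
                   d(i, j - 1) + 1,
                   d(i - 1, j - 1) + (a[i - 1] != b[j - 1]))
    return d(len(a), len(b))

def clumpiness_stats(lexicon):
    """Return (number of minimal pairs, mean pairwise edit distance)."""
    pairs = [(a, b) for i, a in enumerate(lexicon) for b in lexicon[i + 1:]]
    dists = [levenshtein(a, b) for a, b in pairs]
    return sum(d == 1 for d in dists), sum(dists) / len(dists)
```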

  2. The Existence of Quasi Regular and Bi-Regular Self-Complementary 3-Uniform Hypergraphs

    Directory of Open Access Journals (Sweden)

    Kamble Lata N.

    2016-05-01

    Full Text Available A k-uniform hypergraph H = (V; E) is called self-complementary if there is a permutation σ : V → V, called a complementing permutation, such that for every k-subset e of V, e ∈ E if and only if σ(e) ∉ E. In other words, H is isomorphic with H′ = (V; V(k) − E), where V(k) denotes the set of all k-subsets of V. In this paper we define a bi-regular hypergraph and prove that there exists a bi-regular self-complementary 3-uniform hypergraph on n vertices if and only if n is congruent to 0 or 2 modulo 4. We also prove that there exists a quasi regular self-complementary 3-uniform hypergraph on n vertices if and only if n is congruent to 0 modulo 4.
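
    The defining condition is easy to test exhaustively on small vertex sets. A sketch, using our own representation (vertices 0..n-1, edges as frozensets):

```python
from itertools import combinations

def is_complementing(sigma, edges, n, k=3):
    """sigma: dict mapping each vertex to its image; edges: set of
    frozensets (the k-uniform edge set). Checks the definition:
    e in E  <=>  sigma(e) not in E, for every k-subset e of V."""
    for e in combinations(range(n), k):
        e = frozenset(e)
        image = frozenset(sigma[v] for v in e)
        if (e in edges) == (image in edges):
            return False
    return True
```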

  3. Temporal properties of stereopsis

    NARCIS (Netherlands)

    Gheorghiu, E.

    2005-01-01

    The goal of the research presented in this thesis was to investigate temporal properties of disparity processing and depth perception in human subjects, in response to dynamic stimuli. The results presented in the various chapters, reporting findings about different temporal aspects of disparity processing, are based on psychophysical experiments and computational model analysis.

  4. Temporal Linear System Structure

    NARCIS (Netherlands)

    Willigenburg, van L.G.; Koning, de W.L.

    2008-01-01

    Piecewise constant rank systems and the differential Kalman decomposition are introduced in this note. Together these enable the detection of temporal uncontrollability/unreconstructability of linear continuous-time systems. These temporal properties are not detected by any of the four conventional

  5. Temporal Photon Differentials

    DEFF Research Database (Denmark)

    Schjøth, Lars; Frisvad, Jeppe Revall; Erleben, Kenny

    2010-01-01

    The finite frame rate used in computer-animated films is a cause of adverse temporal aliasing effects. The most noticeable of these is a stroboscopic effect that is seen as intermittent movement of fast-moving illumination. This effect can be mitigated using non-zero shutter times, effectively constituting a temporal smoothing of rapidly changing illumination. In global illumination, temporal smoothing can be achieved with distribution ray tracing (Cook et al., 1984). Unfortunately, this and resembling methods require a high temporal resolution, as samples have to be drawn from in-between frames. We present a novel method which is able to produce high-quality temporal smoothing for indirect illumination without using in-between frames. Our method is based on ray differentials (Igehy, 1999) as extended in (Sporring et al., 2009). Light rays are traced as bundles, creating footprints, which...

  6. The Method of a Standalone Functional Verifying Operability of Sonar Control Systems

    Directory of Open Access Journals (Sweden)

    A. A. Sotnikov

    2014-01-01

    Full Text Available This article describes a method for standalone verification of a sonar control system, based on functional checking of the control system's operability. The main features of the method are the development of a valid mathematical model for simulating sonar signals at the point of the hydroacoustic antenna, a valid representation of the sonar control system modes as a discrete Markov model, and functional verification of the object in real-time mode. Two ways are proposed to control computational complexity when the computing resources of the simulation equipment are insufficient: reducing model functionality and reducing model adequacy. Experiments were conducted using testing equipment developed by a department of the Research Institute of Information Control Systems at Bauman Moscow State Technical University to verify the technical validity of industrial sonar complexes. During the verification process, the on-board software was artificially modified to introduce malfunctions into the sonar control systems in order to estimate the performance of the verifying system. The method's efficiency was demonstrated both theoretically and experimentally in comparison with the basic methodology for verifying technical systems. The method can also be used in debugging the on-board software of sonar complexes and in developing new, promising algorithms for sonar signal processing.

  7. Modeling correlated human dynamics with temporal preference

    Science.gov (United States)

    Wang, Peng; Zhou, Tao; Han, Xiao-Pu; Wang, Bing-Hong

    2014-03-01

    We empirically study the activity pattern of individual blog-posting and observe that the interevent time distributions decay as power laws at both the individual and population levels. In contrast to previous studies, we find significant short-term memory in these patterns. Moreover, the memory coefficient first decays as a power law and then turns to an exponential form. Our findings provide evidence for strong short-term memory in human dynamics and challenge previous models. Accordingly, we propose a simple model based on temporal preference, which can reproduce both the heavy-tailed nature and the strong memory effects. This work helps in understanding the temporal regularities of online human behaviors.

  8. Temporal properties of stereopsis

    Science.gov (United States)

    Gheorghiu, E.

    2005-03-01

    The goal of the research presented in this thesis was to investigate temporal properties of disparity processing and depth perception in human subjects, in response to dynamic stimuli. The results presented in various chapters, reporting findings about different temporal aspects of disparity processing, are based on psychophysical experiments and computational model analysis. In chapter 1 we investigated which processes of binocular depth perception in dynamic random-dot stereograms (DRS), i.e., tolerance for interocular delays and temporal integration of correlation, are responsible for the temporal flexibility of the stereoscopic system. Our results demonstrate that (i) disparities from simultaneous monocular inputs dominate those from interocular delayed inputs; (ii) stereopsis is limited by temporal properties of monocular luminance mechanisms; (iii) depth perception in DRS results from cross-correlation-like operation on two simultaneous monocular inputs that represent the retinal images after having been subjected to a process of monocular temporal integration of luminance. In chapter 2 we examined what temporal information is exploited by the mechanisms underlying stereoscopic motion in depth. We investigated systematically the influence of temporal frequency on binocular depth perception in temporally correlated and temporally uncorrelated DRS. Our results show that disparity-defined depth is judged differently in temporally correlated and uncorrelated DRS above a temporal frequency of about 3 Hz. The results and simulations indicate that: (i) above about 20 Hz, the complete absence of stereomotion is caused by temporal integration of luminance; (ii) the difference in perceived depth in temporally correlated and temporally uncorrelated DRS for temporal frequencies between 20 and 3 Hz, is caused by temporal integration of disparity. In chapter 3 we investigated temporal properties of stereopsis at different spatial scales in response to sustained and

  9. Thermodynamics of regular accelerating black holes

    Science.gov (United States)

    Astorino, Marco

    2017-03-01

    Using the covariant phase space formalism, we compute the conserved charges for a solution describing an accelerating, electrically charged Reissner-Nordstrom black hole. The metric is regular provided that the acceleration is driven by an external electric field, in spite of the usual string of the standard C-metric. The Smarr formula and the first law of black hole thermodynamics are fulfilled. The resulting mass has the same form as the Christodoulou-Ruffini irreducible mass. On the basis of these results, we extrapolate the mass and thermodynamics of the rotating C-metric, which describes a Kerr-Newman-(A)dS black hole accelerated by a pulling string.

  10. Explicit formulas for regularized products and series

    CERN Document Server

    Jorgenson, Jay; Goldfeld, Dorian

    1994-01-01

    The theory of explicit formulas for regularized products and series forms a natural continuation of the analytic theory developed in LNM 1564. These explicit formulas can be used to describe the quantitative behavior of various objects in analytic number theory and spectral theory. The present book deals with other applications arising from Gaussian test functions, leading to theta inversion formulas and corresponding new types of zeta functions which are Gaussian transforms of theta series rather than Mellin transforms, and satisfy additive functional equations. Their wide range of applications includes the spectral theory of a broad class of manifolds and also the theory of zeta functions in number theory and representation theory. Here the hyperbolic 3-manifolds are given as a significant example.

  11. Generalized equations of state and regular universes

    CERN Document Server

    Contreras, Felipe; González, Esteban

    2015-01-01

    We find nonsingular solutions for universes filled with a fluid obeying a generalized equation of state of the form $P(\rho) = -A\rho + \gamma\rho^{\lambda}$. An emergent universe is obtained if $A = 1$ and $\lambda = 1/2$. If the matter source is reinterpreted as that of a scalar field with some potential, the corresponding potential is derived. For a closed universe, an exact bounce solution is found for $A = 1/3$ and the same $\lambda$. We also explore how the composition of these universes can be interpreted in terms of known fluids. It is of interest to note that accelerated solutions previously found for the late-time evolution also represent regular solutions at early times.
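
    As a quick consistency check, the emergent-universe case $A = 1$, $\lambda = 1/2$ integrates in closed form from the continuity equation; a sketch in our own notation, with $\rho_0 = \rho(a_0)$:

```latex
\dot{\rho} + 3H(\rho + P) = 0, \qquad P = -\rho + \gamma\rho^{1/2}
\;\Rightarrow\; \frac{d\rho}{d\ln a} = -3\gamma\,\rho^{1/2}
\;\Rightarrow\; \rho(a) = \left(\sqrt{\rho_0} - \frac{3\gamma}{2}\,\ln\frac{a}{a_0}\right)^{2}
```

    so the energy density stays finite at every finite value of $\ln a$, in line with the nonsingular behavior described above.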

  12. Regularity and Complexity in Dynamical Systems

    CERN Document Server

    Luo, Albert C J

    2012-01-01

    Regularity and Complexity in Dynamical Systems describes periodic and chaotic behaviors in dynamical systems, including continuous, discrete, impulsive, discontinuous, and switching systems. In traditional analysis, periodic and chaotic behaviors in continuous, nonlinear dynamical systems have been extensively discussed, even if unsolved. In recent years, there has been an increasing amount of interest in periodic and chaotic behaviors in discontinuous dynamical systems, because such systems are prevalent in engineering. Usually, a smoothing of the discontinuous dynamical system is adopted in order to use the theory of continuous dynamical systems; however, such a technique cannot provide suitable results in such discontinuous systems. In this book, an alternative way is presented to discuss the periodic and chaotic behaviors in discontinuous dynamical systems. This book also: Illustrates new concepts and methodology in discontinuous dynamical systems Uses different ideas to describe complicated dynamical...

  13. Regularization destriping of remote sensing imagery

    Directory of Open Access Journals (Sweden)

    R. Basnayake

    2017-07-01

    Full Text Available We illustrate the utility of variational destriping for ocean color images from both multispectral and hyperspectral sensors. In particular, we examine data from a filter spectrometer, the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi National Polar-orbiting Partnership (NPP) orbiter, and an airborne grating spectrometer, the Jet Propulsion Laboratory's (JPL) hyperspectral Portable Remote Imaging Spectrometer (PRISM) sensor. We solve the destriping problem using a variational regularization method by giving weights spatially to preserve the other features of the image during the destriping process. The target functional penalizes the neighborhood of stripes (strictly, directionally uniform features) while promoting data fidelity, and the functional is minimized by solving the Euler–Lagrange equations with an explicit finite-difference scheme. We show the accuracy of our method on a benchmark data set which represents the sea surface temperature off the coast of Oregon, USA. Technical details, such as how to impose continuity across data gaps using inpainting, are also described.

  14. Regularization destriping of remote sensing imagery

    Science.gov (United States)

    Basnayake, Ranil; Bollt, Erik; Tufillaro, Nicholas; Sun, Jie; Gierach, Michelle

    2017-07-01

    We illustrate the utility of variational destriping for ocean color images from both multispectral and hyperspectral sensors. In particular, we examine data from a filter spectrometer, the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi National Polar-orbiting Partnership (NPP) orbiter, and an airborne grating spectrometer, the Jet Propulsion Laboratory's (JPL) hyperspectral Portable Remote Imaging Spectrometer (PRISM) sensor. We solve the destriping problem using a variational regularization method by giving weights spatially to preserve the other features of the image during the destriping process. The target functional penalizes the neighborhood of stripes (strictly, directionally uniform features) while promoting data fidelity, and the functional is minimized by solving the Euler-Lagrange equations with an explicit finite-difference scheme. We show the accuracy of our method on a benchmark data set which represents the sea surface temperature off the coast of Oregon, USA. Technical details, such as how to impose continuity across data gaps using inpainting, are also described.
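
    A toy version of the variational idea: minimize a data-fidelity term plus a quadratic penalty on differences taken across the stripe direction, and solve the resulting Euler-Lagrange condition by explicit iteration. This is a deliberately simplified, unweighted stand-in for the spatially weighted functional the authors describe.

```python
import numpy as np

def destripe(f, lam=2.0, lr=0.1, n_iter=500):
    """Gradient descent on E(u) = 0.5*||u - f||^2 + 0.5*lam*||D u||^2,
    where D differences across rows (perpendicular to horizontal stripes).
    Explicit steps are stable for lr < 2 / (1 + 4*lam)."""
    u = f.astype(float).copy()
    for _ in range(n_iter):
        dy = np.diff(u, axis=0)        # across-stripe differences (D u)
        dtd = np.zeros_like(u)         # D^T D u, per column
        dtd[:-1] -= dy
        dtd[1:] += dy
        u -= lr * ((u - f) + lam * dtd)
    return u
```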

  15. A Formal Approach to Verify Parameterized Protocols in Mobile Cyber-Physical Systems

    Directory of Open Access Journals (Sweden)

    Long Zhang

    2017-01-01

    Full Text Available Mobile cyber-physical systems (CPSs) are very hard to verify because of asynchronous communication and an arbitrary number of components. Verification via model checking typically becomes impracticable due to the state-space explosion caused by the system parameters and concurrency. In this paper, we propose a formal approach to verify the safety properties of parameterized protocols in mobile CPS. By using counter abstraction, the protocol is modeled as a Petri net. Then, a novel algorithm, which uses IC3 (the state-of-the-art model checking algorithm) as the back-end engine, is presented to verify the Petri net model. The experimental results show that our new approach greatly scales up verification capability and compares favorably against several recently published approaches. In addition to solving the instances fast, our method is notable for its lower memory consumption.

  16. Reports of envenomation by brown recluse spiders exceed verified specimens of Loxosceles spiders in South Carolina.

    Science.gov (United States)

    Frithsen, Ivar L; Vetter, Richard S; Stocks, Ian C

    2007-01-01

    To determine whether the number of brown recluse spider bites diagnosed by South Carolina physicians coincides with evidence of brown recluse spiders found in the state. Brown recluse spider bite diagnosis data were extracted from 1990 and 2004 surveys of South Carolina physicians. This was compared with the known historical evidence of brown recluse spiders collected in South Carolina and derived from various sources, including state agencies, arachnologists, and museum specimens. South Carolina physicians diagnosed 478 brown recluse spider bites in 1990 and 738 in 2004. Dating to 1953, 44 brown recluse spider specimens have been verified from 6 locations in South Carolina. The number of brown recluse bites reportedly diagnosed in South Carolina greatly outnumbers the verified brown recluse specimens that have been collected in the state. The pattern of bite diagnoses outnumbering verified brown recluse specimens has been reported in other areas outside of this spider's known endemic range.

  17. Dynamic Symmetric Key Mobile Commerce Scheme Based on Self-Verified Mechanism

    Directory of Open Access Journals (Sweden)

    Jiachen Yang

    2014-01-01

    Full Text Available Considering the security and efficiency of mobile e-commerce, the authors summarized the advantages and disadvantages of several related schemes, especially the self-verified mobile payment scheme based on the elliptic curve cryptosystem (ECC), and then proposed a new type of dynamic symmetric-key mobile commerce scheme based on a self-verified mechanism. The authors analyzed the basic algorithm based on self-verified mechanisms and detailed the complete transaction process of the proposed scheme. The authors analyzed the payment scheme with respect to security and efficiency. The analysis shows that the proposed scheme not only meets the premise of highly efficient mobile electronic payment, but also takes security into account. The user confirmation mechanism at the end of the proposed scheme further strengthens its security. In brief, the proposed scheme is more efficient and practical than most of the existing schemes.

  18. What are the ultimate limits to computational techniques: verifier theory and unverifiability

    Science.gov (United States)

    Yampolskiy, Roman V.

    2017-09-01

    Despite significant developments in proof theory, surprisingly little attention has been devoted to the concept of proof verifiers. In particular, the mathematical community may be interested in studying different types of proof verifiers (people, programs, oracles, communities, superintelligences) as mathematical objects. Such an effort could reveal their properties, their powers and limitations (particularly in human mathematicians), minimum and maximum complexity, as well as self-verification and self-reference issues. We propose an initial classification system for verifiers and provide some rudimentary analysis of solved and open problems in this important domain. Our main contribution is a formal introduction of the notion of unverifiability, for which the paper could serve as a general citation in domains of theorem proving, as well as software and AI verification.

  19. Evolution of optically nondestructive and data-non-intrusive credit card verifiers

    Science.gov (United States)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2010-04-01

    Since the deployment of the credit card, the number of credit card fraud cases has grown rapidly, with losses amounting to millions of US dollars. Instead of asking the cardholder for more information or taking a risk by approving the payment, a nondestructive and data-non-intrusive credit card verifier is highly desirable before a transaction begins. In this paper, we review optical techniques that have been proposed and invented in order to make a genuine credit card more distinguishable from a counterfeit one. Several optical approaches for the implementation of credit card verifiers are also included. In particular, we highlight our invention of a hyperspectral-imaging-based portable credit card verifier structure that offers a very low false error rate of 0.79%. Other key features include low cost, simplicity in design and implementation, no moving parts, no need for an additional decoding key, and adaptive learning.

  20. The cerebellum predicts the temporal consequences of observed motor acts.

    Science.gov (United States)

    Avanzino, Laura; Bove, Marco; Pelosin, Elisa; Ogliastro, Carla; Lagravinese, Giovanna; Martino, Davide

    2015-01-01

    It is increasingly clear that we extract patterns of temporal regularity between events to optimize information processing. The ability to extract temporal patterns and regularity of events is referred to as temporal expectation. Temporal expectation activates the same cerebral network usually engaged in action selection, comprising the cerebellum. However, it is unclear whether the cerebellum is directly involved in temporal expectation when timing information is processed to make predictions about the outcome of a motor act. Healthy volunteers received one session of either active (inhibitory, 1 Hz) or sham repetitive transcranial magnetic stimulation covering the right lateral cerebellum prior to the execution of a temporal expectation task. Subjects were asked to predict the end of a visually perceived human body motion (right-hand handwriting) and of an inanimate object motion (a moving circle reaching a target). Videos representing the movements were shown in full; the actual tasks consisted of watching the same videos, but interrupted after a variable interval from onset by a dark interval of variable duration. During the dark interval, subjects were asked to indicate when the movement represented in the video reached its end by clicking the spacebar of the keyboard. Performance on the timing task was analyzed by measuring the absolute value of the timing error, the coefficient of variability and the percentage of anticipation responses. The active group exhibited a greater absolute timing error compared with the sham group only in the human body motion task. Our findings suggest that the cerebellum is engaged in cognitive and perceptual domains that are strictly connected to motor control.

  1. Origins of forecast skill of weather and climate events on verifiable time scales

    CSIR Research Space (South Africa)

    Landman, WA

    2012-07-01

    Full Text Available Verification of weather and seasonal forecasts, as well as statistical analysis of the spatial and temporal description of forecast and observed fields, is necessary to improve our understanding of the capabilities of models to describe...

  2. Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis

    Science.gov (United States)

    Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.

    2017-01-01

    This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-off errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-off Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.
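
    To see what such a bound computes, here is a crude, first-order sketch of forward round-off error propagation under the IEEE-754 model fl(x op y) = (x op y)(1 + d), |d| <= u. It is far weaker than PRECiSA's certified symbolic analysis and is meant only to illustrate the kind of quantity being bounded.

```python
U = 2.0 ** -53  # unit roundoff, IEEE-754 binary64

class Err:
    """A value together with a first-order absolute round-off bound.
    Each operation rounds its exact result (adding |result| * U) and
    propagates the incoming errors; second-order terms are ignored."""
    def __init__(self, v, e=0.0):
        self.v, self.e = v, e
    def __add__(self, o):
        v = self.v + o.v
        return Err(v, self.e + o.e + abs(v) * U)
    def __mul__(self, o):
        v = self.v * o.v
        return Err(v, abs(self.v) * o.e + abs(o.v) * self.e + abs(v) * U)

# bound the error of x*x + y; 0.1 carries a decimal-conversion error
x, y = Err(0.1, abs(0.1) * U), Err(3.0)
r = x * x + y
print(r.v, "error <=", r.e)
```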

  3. A Non-interleaving Timed Process Algebra and a Process Logic for Verifying Composition of Agents

    OpenAIRE

    Isobe, Yoshinao; Ohmaki, Kazuhito; National Institute of Advanced Industrial Science and Technology

    2003-01-01

    We present formal frameworks tCCA, tLCA, and tICCA for verifying composition of agents. Behaviors of composite agents are described in tCCA and specifications for them are described in tLCA. Since consistency between specifications in tLCA is undecidable as proven in this paper, we propose to use intermediate specifications described in tICCA instead of directly checking the consistency, and then give useful propositions for verifying composition of agents in tICCA.

  4. Association between cotinine-verified smoking status and hypertension in 167,868 Korean adults.

    Science.gov (United States)

    Kim, Byung Jin; Han, Ji Min; Kang, Jung Gyu; Kim, Bum Soo; Kang, Jin Ho

    2017-10-01

    Previous studies showed inconsistent results concerning the relationship between chronic smoking and blood pressure, and most of them involved self-reported smoking status. This study was performed to evaluate the association of urinary cotinine or self-reported smoking status with hypertension and blood pressure in Korean adults. Among individuals enrolled in the Kangbuk Samsung Health Study and Kangbuk Samsung Cohort Study, 167,868 participants (men, 55.7%; age, 37.5 ± 6.9 years) between 2011 and 2013 who had urinary cotinine measurements were included. Individuals with urinary cotinine levels ≥50 ng/mL were defined as cotinine-verified current smokers. The prevalence of hypertension and cotinine-verified current smokers in the overall population was 6.8% and 22.7%, respectively (10.0% in men and 2.8% in women for hypertension; 37.7% in men and 3.9% in women for cotinine-verified current smokers). In a multivariate regression analysis adjusted for age, sex, body mass index, waist circumference, alcohol drinking, vigorous exercise, and diabetes, cotinine-verified current smoking was associated with a lower prevalence of hypertension compared with cotinine-verified never smoking (OR [95% CI], 0.79 [0.75, 0.84]). Log-transformed cotinine levels and unobserved smoking were negatively associated with hypertension, respectively (0.96 [0.96, 0.97] and 0.55 [0.39, 0.79]). In a multivariate linear regression analysis, cotinine-verified current smoking was inversely associated with systolic and diastolic blood pressure (BP) (regression coefficient [95% CI], -1.23 [-1.39, -1.07] for systolic BP and -0.71 [-0.84, -0.58] for diastolic BP). In subgroup analyses according to sex, the inverse associations between cotinine-verified current smoking and hypertension were observed only in men. This large observational study showed that cotinine-verified current smoking and unobserved smoking were inversely associated with hypertension in Korean adults, especially in men.

  5. Regular Exercisers Have Stronger Pelvic Floor Muscles than Non-Regular Exercisers at Midpregnancy.

    Science.gov (United States)

    Bø, Kari; Ellstrøm Engh, Marie; Hilde, Gunvor

    2017-12-26

    Today, all healthy pregnant women are encouraged to be physically active throughout pregnancy, with recommendations to participate in at least 30 min of aerobic activity on most days of the week, in addition to performing strength training of the major muscle groups 2-3 days per week, and also pelvic floor muscle training. There is, however, an ongoing debate whether general physical activity enhances or declines pelvic floor muscle function. To compare vaginal resting pressure, pelvic floor muscle strength and endurance in regular exercisers (exercise ≥ 30 minutes ≥ 3 times per week) and non-exercisers at mid-pregnancy. Furthermore, to assess whether regular general exercise or pelvic floor muscle strength was associated with urinary incontinence. This was a cross-sectional study at mean gestational week 20.9 (± 1.4) including 218 nulliparous pregnant women, mean age 28.6 years (range 19-40) and pre-pregnancy body mass index 23.9 kg/m2 (SD 4.0). Vaginal resting pressure, pelvic floor muscle strength and pelvic floor muscle endurance were measured by a high-precision pressure transducer connected to a vaginal balloon. The International Consultation on Incontinence Questionnaire Urinary Incontinence Short Form was used to assess urinary incontinence. Differences between groups were analyzed using the Independent Sample T-test. Linear regression analysis was conducted to adjust for pre-pregnancy body mass index, age, smoking during pregnancy and regular pelvic floor muscle training during pregnancy. The P-value was set to ≤ 0.05. Regular exercisers had statistically significantly stronger (mean 6.4 cmH2O (95% CI: 1.7, 11.2)) and more enduring (mean 39.9 cmH2O·s (95% CI: 42.2, 75.7)) pelvic floor muscles. Only pelvic floor muscle strength remained statistically significant when adjusting for possible confounders. Pelvic floor muscle strength, and not regular general exercise, was associated with urinary continence (adjusted B: -6.4 (95% CI: -11.5, -1.4)). Regular

  6. Morphisms on infinite alphabets, countable states automata and regular sequences

    Science.gov (United States)

    Zhang, Jie-Meng; Chen, Jin; Guo, Ying-Jun; Wen, Zhi-Xiong

    2017-06-01

    In this paper, we prove that a class of regular sequences can be viewed as projections of fixed points of uniform morphisms on a countable alphabet, and also can be generated by countable states automata. Moreover, we prove that the regularity of some regular sequences is invariant under some codings.
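
    As a toy illustration of the theme, the ruler sequence ν₂(n), the exponent of 2 in n, is a classic 2-regular sequence; it can be generated by an automaton with countably many counting states reading the binary digits of n least-significant first. A sketch:

```python
def ruler(n):
    """nu_2(n): exponent of 2 in n (n >= 1), a classic 2-regular sequence.
    Implemented as a countable-state automaton reading binary digits
    least-significant first: each 0 digit moves to the next counting state;
    the first 1 digit halts and outputs the state index."""
    state = 0
    while n % 2 == 0:
        n //= 2
        state += 1
    return state

print([ruler(n) for n in range(1, 17)])  # 0 1 0 2 0 1 0 3 0 1 0 2 0 1 0 4
```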

  7. Exclusion of children with intellectual disabilities from regular ...

    African Journals Online (AJOL)

    This study investigated why teachers exclude children with intellectual disability from regular classrooms in Nigeria. Participants were 169 regular teachers randomly selected from Oyo and Ogun states. A questionnaire was used to collect data. Results revealed that 57.4% of regular teachers could not cope with children with ID ...

  8. Towards Temporal Graph Databases

    OpenAIRE

    Campos, Alexander; Mozzino, Jorge; Vaisman, Alejandro

    2016-01-01

    In spite of the extensive literature on graph databases (GDBs), temporal GDBs have not received much attention so far. Temporal GDBs can capture, for example, the evolution of social networks across time, a relevant topic in data analysis nowadays. In this paper we propose a data model and query language (denoted TEG-QL) for temporal GDBs, based on the notion of attribute graphs. This allows a straightforward translation to Neo4J, a well-known GDB. We present extensive examples of the use...

  9. Dictionary learning-based spatiotemporal regularization for 3D dense speckle tracking

    Science.gov (United States)

    Lu, Allen; Zontak, Maria; Parajuli, Nripesh; Stendahl, John C.; Boutagy, Nabil; Eberle, Melissa; O'Donnell, Matthew; Sinusas, Albert J.; Duncan, James S.

    2017-03-01

    Speckle tracking is a common method for non-rigid tissue motion analysis in 3D echocardiography, where unique texture patterns are tracked through the cardiac cycle. However, poor tracking often occurs due to inherent ultrasound issues, such as image artifacts and speckle decorrelation; thus regularization is required. Various methods, such as optical flow, elastic registration, and block matching techniques, have been proposed to track speckle motion. Such methods typically apply spatial and temporal regularization separately. In this paper, we propose a joint spatiotemporal regularization method based on an adaptive dictionary representation of the dense 3D+time Lagrangian motion field. Sparse dictionaries have good signal-adaptive and noise-reduction properties; however, they are prone to quantization errors. Our method takes advantage of the desirable noise suppression while avoiding the undesirable quantization error. The idea is to enforce regularization only on the poorly tracked trajectories. Specifically, our method 1) builds a data-driven 4-dimensional dictionary of Lagrangian displacements using sparse learning, 2) automatically identifies poorly tracked trajectories (outliers) based on sparse reconstruction errors, and 3) performs sparse reconstruction of the outliers only. Our approach can be applied to dense Lagrangian motion fields calculated by any method. We demonstrate the effectiveness of our approach on a baseline block matching speckle tracker and evaluate performance of the proposed algorithm using tracking and strain accuracy analysis.
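
    The outlier-only regularization idea can be sketched compactly. For illustration, the sparse dictionary learning step is replaced here by a plain principal-component basis; everything else (reconstruction-error-based outlier detection, replacing only the outliers) follows the described pipeline under that simplification.

```python
import numpy as np

def regularize_trajectories(T, n_atoms=10, thresh=2.0):
    """T: (num_traj, dims) flattened Lagrangian trajectories.
    1) fit a linear basis (top principal components, standing in for
       the learned sparse dictionary),
    2) flag trajectories with large reconstruction error as outliers,
    3) replace only the outliers by their reconstructions."""
    mean = T.mean(axis=0)
    Tc = T - mean
    _, _, Vt = np.linalg.svd(Tc, full_matrices=False)
    D = Vt[:n_atoms]                        # (n_atoms, dims) basis
    recon = Tc @ D.T @ D + mean             # project and reconstruct
    err = np.linalg.norm(T - recon, axis=1)
    outliers = err > err.mean() + thresh * err.std()
    out = T.copy()
    out[outliers] = recon[outliers]         # regularize poorly tracked only
    return out, outliers
```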

  10. Analysis of regularized long-wave equation associated with a new fractional operator with Mittag-Leffler type kernel

    Science.gov (United States)

    Kumar, Devendra; Singh, Jagdev; Baleanu, Dumitru; Sushila

    2018-02-01

    In this work, we present a new fractional extension of the regularized long-wave equation. The regularized long-wave equation is a very important mathematical model in the physical sciences, describing the nature of shallow-water waves and ion-acoustic plasma waves. The existence and uniqueness of the solution of the regularized long-wave equation associated with the Atangana-Baleanu fractional derivative having a Mittag-Leffler type kernel is verified by applying the fixed-point theorem. The numerical results are derived with the help of an iterative algorithm. In order to show the effects of various parameters and variables on the displacement, the numerical results are presented in graphical and tabular form.
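
    For reference, a commonly used form of the regularized long-wave (RLW) equation and the standard Atangana-Baleanu (Caputo-sense) derivative with Mittag-Leffler kernel are shown below; the paper's exact fractional formulation and normalization function B(α) may differ in detail.

```latex
% classical RLW equation (shallow-water / ion-acoustic waves)
u_t + u_x + u\,u_x - u_{xxt} = 0
% Atangana-Baleanu derivative replacing the integer-order time derivative:
{}^{ABC}_{\;\;\;a}D^{\alpha}_{t} f(t)
  = \frac{B(\alpha)}{1-\alpha}\int_{a}^{t} f'(\tau)\,
    E_{\alpha}\!\left[-\frac{\alpha\,(t-\tau)^{\alpha}}{1-\alpha}\right] d\tau
```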

  11. Multiple parameters anomalies for verifying the geosystem spheres coupling effect: a case study of the 2010 Ms7.1 Yushu earthquake in China

    Directory of Open Access Journals (Sweden)

    Shuo Zheng

    2014-08-01

    Full Text Available In research on earthquake anomaly recognition, the coupling effect of multiple geosystem spheres can be expected to reasonably interpret the correlation between various anomalous signals before strong earthquakes. In particular, the Lithosphere–Atmosphere–Ionosphere (LAI) coupling model has been accepted as verified by some experimental, thermal and electromagnetic data. However, quasi-synchronous anomalies of multiple parameters, including thermal, radon and electromagnetic data, have not been reported for a single event as verification of the geosystem spheres coupling effect. In this paper, we first summarize the reported studies on power spectral density (PSD) in the ELF/VLF band and radon data recorded at Guza seismic station. Then, historical surface latent heat flux (SLHF) data from the NCEP/NCAR Reanalysis Project are employed to investigate anomalous changes in the month before the April 14, 2010, Ms7.1 Yushu earthquake, one of the typical intra-continental earthquakes in the Tibet Plateau. The results of the spatial and temporal analysis revealed that anomalous fields of PSD and SLHF data were located close to the epicenter and to the ends of some active faults of the Bayan Har Block, and all anomalous dates converged between April 8 and 11 (6 to 3 days before the Yushu earthquake). We therefore suggest that the anomalies of multiple parameters before the main shock are related to the Yushu earthquake. This paper provides an ideal case study for verifying the geosystem spheres coupling effect in a single event.

  12. Descriptional complexity of non-unary self-verifying symmetric difference automata

    CSIR Research Space (South Africa)

    Marais, Laurette

    2017-09-01


  13. Developing a corpus to verify the performance of a tone labelling algorithm

    CSIR Research Space (South Africa)

    Raborife, M

    2011-11-01

    Full Text Available The authors report on a study that involved the development of a corpus used to verify the performance of two tone labelling algorithms, with one algorithm being an improvement on the other. These algorithms were developed for speech synthesis...

  14. Verifying the Efficacy of Vocational Guidance Programs: Procedures, Problems, and Potential Directions

    Science.gov (United States)

    Perry, Justin C.; Dauwalder, Jean Pierre; Bonnett, Heather R.

    2009-01-01

    This article summarizes 12 presentations in Group 7 of the 2007 joint symposium of the International Association for Educational and Vocational Guidance, Society for Vocational Psychology, and National Career Development Association held in Padua, Italy, that focused on procedures for verifying the efficacy of vocational guidance programs. Three…

  15. 40 CFR 8.9 - Measures to assess and verify environmental impacts.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment; Environmental Protection Agency; General; Environmental Impact Assessment of Nongovernmental Activities in Antarctica; § 8.9 Measures to assess and verify environmental impacts. (a) The operator shall conduct appropriate monitoring of key environmental indicators as...

  16. Astronaut Edwin Aldrin in EMU verifies fit of Portable Life Support System

    Science.gov (United States)

    1969-01-01

    Astronaut Edwin E. Aldrin Jr., wearing an Extravehicular Mobility Unit (EMU), verifies fit of the Portable Life Support System (PLSS) strap length during lunar surface training at the Kennedy Space Center. Aldrin is the prime crew lunar module pilot of the Apollo 11 lunar landing mission. Aldrin's PLSS backpack is attached to a lunar weight simulator.

  17. Eddy-Current Testing of Welded Stainless Steel Storage Containers to Verify Integrity and Identity

    Energy Technology Data Exchange (ETDEWEB)

    Tolk, Keith M.; Stoker, Gerald C.

    1999-07-20

    An eddy-current scanning system is being developed to allow the International Atomic Energy Agency (IAEA) to verify the integrity of nuclear material storage containers. Such a system is necessary to detect attempts to remove material from the containers in facilities where continuous surveillance of the containers is not practical. Initial tests have shown that the eddy-current system is also capable of verifying the identity of each container using the electromagnetic signature of its welds. The DOE-3013 containers proposed for use in some US facilities are made of an austenitic stainless steel alloy, which is nonmagnetic in its normal condition. When the material is cold worked by forming or by local stresses experienced in welding, it loses its austenitic grain structure and its magnetic permeability increases. This change in magnetic permeability can be measured using an eddy-current probe specifically designed for this purpose. Initial tests have shown that variations of magnetic permeability and material conductivity in and around welds can be detected, and form a pattern unique to the container. The changes in conductivity that are present around a mechanically inserted plug can also be detected. Further development of the system is currently underway to adapt the system to verifying the integrity and identity of sealable, tamper-indicating enclosures designed to prevent unauthorized access to measurement equipment used to verify international agreements.

  18. Why Verifying Diagnostic Decisions with a Checklist Can Help: Insights from Eye Tracking

    Science.gov (United States)

    Sibbald, Matthew; de Bruin, Anique B. H.; Yu, Eric; van Merrienboer, Jeroen J. G.

    2015-01-01

    Making a diagnosis involves ratifying or verifying a proposed answer. Formalizing this verification process with checklists, which highlight key variables involved in the diagnostic decision, is often advocated. However, the mechanisms by which a checklist might allow clinicians to improve their verification process have not been well studied. We…

  19. 25 CFR 20.602 - How does the Bureau verify eligibility for social services?

    Science.gov (United States)

    2010-04-01

    Title 25; Financial Assistance and Social Services Programs; Administrative Procedures; § 20.602 How does the Bureau verify eligibility for social services? (a) You, the applicant, are the primary source of information... to your social services worker any changes in circumstances that may affect your eligibility or the...

  20. Methods for verifying compliance with low-level radioactive waste acceptance criteria

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-09-01

    This report summarizes the methods that are currently employed, and those that can be used, to verify compliance with low-level radioactive waste (LLW) disposal facility waste acceptance criteria (WAC). This report presents the applicable regulations representing the Federal, State, and site-specific criteria for accepting LLW. Typical LLW generators are summarized, along with descriptions of their waste streams and final waste forms. General procedures and methods used by the LLW generators to verify compliance with the disposal facility WAC are presented. The report was written to provide an understanding of how a regulator could verify compliance with an LLW disposal facility's WAC. A comprehensive study of the methodology used to verify waste generator compliance with the disposal facility WAC is presented in this report. The study involved compiling the relevant regulations to define the WAC, reviewing regulatory agency inspection programs, and summarizing waste verification technology and equipment. The results of the study indicate that waste generators conduct verification programs that include packaging, classification, characterization, and stabilization elements. The current LLW disposal facilities perform waste verification steps on incoming shipments. A model inspection and verification program, which emphasizes documentation of the generator's waste verification program in the waste application, is recommended. The disposal facility verification procedures primarily involve the use of portable radiological survey instrumentation. The actual verification of generator compliance with the LLW disposal facility WAC is performed through a combination of incoming shipment checks and generator site audits.

  1. Temporal Lobe Seizure

    Science.gov (United States)


  2. Temporal Lobe Seizure

    Science.gov (United States)

    ... functions, including having odd feelings — such as euphoria, deja vu or fear. During a temporal lobe seizure, you ... include: A sudden sense of unprovoked fear A deja vu experience — a feeling that what's happening has happened ...

  3. Multisensory temporal numerosity judgment

    NARCIS (Netherlands)

    Philippi, T.; Erp, J.B.F. van; Werkhoven, P.J.

    2008-01-01

    In temporal numerosity judgment, observers systematically underestimate the number of pulses. The strongest underestimations occur when stimuli are presented with a short interstimulus interval (ISI) and are stronger for vision than for audition and touch. We investigated if multisensory

  4. Neocortical Temporal Lobe Epilepsy

    Science.gov (United States)

    Bercovici, Eduard; Kumar, Balagobal Santosh; Mirsattari, Seyed M.

    2012-01-01

    Complex partial seizures (CPSs) can present with various semiologies. While mesial temporal lobe epilepsy (mTLE) is a well-recognized cause of CPSs, neocortical temporal lobe epilepsy (nTLE), albeit less common, is increasingly recognized as a separate disease entity. Differentiating the two remains a challenge for epileptologists, as many symptoms overlap due to reciprocal connections between the neocortical and the mesial temporal regions. Various studies have attempted to correctly localize the seizure focus in nTLE, as patients with this disorder may benefit from surgery. While earlier work predicted poor outcomes in this population, recent work challenges those ideas, yielding good outcomes in part due to better localization using improved anatomical and functional techniques. This paper provides a comprehensive review of the diagnostic workup, particularly the application of recent advances in electroencephalography and functional brain imaging, in neocortical temporal lobe epilepsy. PMID:22953057

  5. Massive temporal lobe cholesteatoma

    National Research Council Canada - National Science Library

    Waidyasekara, Pasan; Dowthwaite, Samuel A; Stephenson, Ellison; Bhuta, Sandeep; McMonagle, Brent

    2015-01-01

    .... There had been no relevant symptoms in the interim until 6 weeks prior to this presentation. Imaging demonstrated a large right temporal lobe mass contiguous with the middle ear and mastoid cavity with features consistent with cholesteatoma...

  6. Massive Temporal Lobe Cholesteatoma

    National Research Council Canada - National Science Library

    Waidyasekara, Pasan; Dowthwaite, Samuel A; Stephenson, Ellison; Bhuta, Sandeep; McMonagle, Brent

    2015-01-01

    .... There had been no relevant symptoms in the interim until 6 weeks prior to this presentation. Imaging demonstrated a large right temporal lobe mass contiguous with the middle ear and mastoid cavity with features consistent with cholesteatoma...

  7. Iron status of regular voluntary blood donors

    Directory of Open Access Journals (Sweden)

    Mahida Vilsu

    2008-01-01

    Full Text Available Background: Our blood bank is a regional blood transfusion centre, which accepts blood only from voluntary donors. Aim: To study the iron status of regular voluntary donors who donated blood at least twice a year. Materials and Methods: Prior to blood donation, blood samples of 220 male and 30 female voluntary donors were collected. Controls included 100 male and 100 female healthy individuals in the 18- to 60-year age group who had never donated blood and did not have any chronic infection. In both the study and control groups, about 10% of subjects consumed a non-vegetarian diet. After investigation, 85 males and 56 females with haemoglobin (Hb) levels above 12.5 g/dl were selected as controls. Donors were divided into ≤10, 11-20, 21-50 and >50 blood donation categories. The majority of donors in the >50 donation category donated blood four times a year, whereas the remaining donors donated two to three times per year. Haematological parameters were measured on a fully automatic haematology analyzer, serum iron and total iron-binding capacity (TIBC) by biochemical methods, ferritin using ELISA kits and transferrin using immunoturbidimetry kits. The iron/TIBC ratio × 100 gave the percentage of transferrin saturation. Statistical Analysis: Statistical evaluation was done by mean, standard deviation, paired t-test, χ2 and ANOVA (F-test). Results: Preliminary analysis revealed no significant difference in the iron profile of vegetarian and non-vegetarian subjects, or between controls and donors donating <20 times. Significant increases or decreases were observed in the mean values of various haematological and iron parameters in donors who donated blood more than 20 times (P < 0.001) compared to controls. Anaemia, iron deficiency and depletion of iron stores were more prevalent in female donors (P < 0.05) compared to males, and especially in those male donors who donated blood more than 20 times. Conclusion: Regular voluntary blood

  8. Accreting fluids onto regular black holes via Hamiltonian approach

    Energy Technology Data Exchange (ETDEWEB)

    Jawad, Abdul [COMSATS Institute of Information Technology, Department of Mathematics, Lahore (Pakistan); Shahzad, M.U. [COMSATS Institute of Information Technology, Department of Mathematics, Lahore (Pakistan); University of Central Punjab, CAMS, UCP Business School, Lahore (Pakistan)

    2017-08-15

    We investigate the accretion of test fluids onto regular black holes such as Kehagias-Sfetsos black holes and regular black holes with a Dagum distribution function. We analyze the accretion process when different test fluids are falling onto these regular black holes. The accreting fluid is classified through the equation of state according to the features of the regular black holes. The behavior of the fluid flow and the existence of sonic points are checked for these regular black holes. It is noted that the three-velocity depends on the critical points and on the equation of state parameter in phase space. (orig.)

  9. An approach for verifying biogenic greenhouse gas emissions inventories with atmospheric CO2 concentration data

    Science.gov (United States)

    Ogle, Stephen M.; Davis, Kenneth; Lauvaux, Thomas; Schuh, Andrew; Cooley, Dan; West, Tristram O.; Heath, Linda S.; Miles, Natasha L.; Richardson, Scott; Breidt, F. Jay; Smith, James E.; McCarty, Jessica L.; Gurney, Kevin R.; Tans, Pieter; Denning, A. Scott

    2015-03-01

    Verifying national greenhouse gas (GHG) emissions inventories is a critical step to ensure that reported emissions data to the United Nations Framework Convention on Climate Change (UNFCCC) are accurate and representative of a country’s contribution to GHG concentrations in the atmosphere. Furthermore, verifying biogenic fluxes provides a check on estimated emissions associated with managing lands for carbon sequestration and other activities, which often have large uncertainties. We report here on the challenges and results associated with a case study using atmospheric measurements of CO2 concentrations and inverse modeling to verify nationally-reported biogenic CO2 emissions. The biogenic CO2 emissions inventory was compiled for the Mid-Continent region of United States based on methods and data used by the US government for reporting to the UNFCCC, along with additional sources and sinks to produce a full carbon balance. The biogenic emissions inventory produced an estimated flux of -408 ± 136 Tg CO2 for the entire study region, which was not statistically different from the biogenic flux of -478 ± 146 Tg CO2 that was estimated using the atmospheric CO2 concentration data. At sub-regional scales, the spatial density of atmospheric observations did not appear sufficient to verify emissions in general. However, a difference between the inventory and inversion results was found in one isolated area of West-central Wisconsin. This part of the region is dominated by forestlands, suggesting that further investigation may be warranted into the forest C stock or harvested wood product data from this portion of the study area. The results suggest that observations of atmospheric CO2 concentration data and inverse modeling could be used to verify biogenic emissions, and provide more confidence in biogenic GHG emissions reporting to the UNFCCC.
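
    At its core, the verification compares a bottom-up inventory against a top-down atmospheric inversion. A minimal sketch of the classic Bayesian synthesis inversion that underlies such estimates (matrix names are our own; operational systems use mesoscale transport models and far larger state vectors):

```python
import numpy as np

def invert_fluxes(x_prior, B, H, y, R):
    """Bayesian synthesis inversion.
    x_prior: prior (inventory-based) flux vector
    B:       prior flux error covariance
    H:       transport operator mapping fluxes to CO2 concentrations
    y:       observed CO2 concentrations
    R:       observation error covariance
    Returns the posterior flux estimate and its covariance."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
    x_post = x_prior + K @ (y - H @ x_prior)       # posterior fluxes
    A = B - K @ H @ B                              # posterior covariance
    return x_post, A
```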

  10. A sub-domain based regularization method with prior information for human thorax imaging using electrical impedance tomography

    Science.gov (United States)

    In Kang, Suk; Khambampati, Anil Kumar; Jeon, Min Ho; Kim, Bong Seok; Kim, Kyung Youn

    2016-02-01

    Electrical impedance tomography (EIT) is a non-invasive imaging technique that can be used as a bedside monitoring tool for human thorax imaging. EIT has high temporal resolution but at the same time suffers from poor spatial resolution due to the ill-posedness of the inverse problem. Regularization methods are often used as a penalty term in the cost function to stabilize sudden changes in resistivity. In human thorax monitoring with conventional methods employing Tikhonov-type regularization, the reconstructed image is smoothed between the heart and the lungs; that is, it becomes difficult to distinguish the exact boundaries of the lungs and the heart. Structural information about the object obtained beforehand can sometimes be incorporated into the regularization method to improve the spatial resolution and to help create clear and distinct boundaries between the objects. However, the boundary of the heart changes rapidly over the cardiac cycle, hence there is no information concerning the exact boundary of the heart. Therefore, to improve the spatial resolution for human thorax monitoring during the cardiac cycle, in this paper a sub-domain based regularization method is proposed, assuming the lungs and part of the background region are known. In the proposed method, the regularization matrix is modified anisotropically to include sub-domains as prior information, and the regularization parameter is assigned different weights for each sub-domain. Numerical simulations and phantom experiments for 2D human thorax monitoring are performed to evaluate the performance of the proposed regularization method. The results show better reconstruction performance with the proposed regularization method.
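
    The per-sub-domain weighting idea reduces, in its simplest form, to a weighted Tikhonov step. A sketch of one linearized update (the paper's anisotropic regularization matrix is richer than the diagonal weighting used here; variable names are ours):

```python
import numpy as np

def subdomain_tikhonov(J, v, labels, weights):
    """One linearized EIT update with a sub-domain weighted penalty.
    J:       Jacobian (measurements x pixels)
    v:       measurement residual vector
    labels:  labels[i] = sub-domain id of pixel i (e.g. lung, heart, bg)
    weights: weights[d] = regularization weight for sub-domain d
    Solves min ||J x - v||^2 + sum_i w_i x_i^2."""
    w = np.array([weights[d] for d in labels])   # per-pixel weight
    L = np.diag(np.sqrt(w))                      # weighted identity penalty
    return np.linalg.solve(J.T @ J + L.T @ L, J.T @ v)
```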

  11. A regularity-based modeling of oil borehole logs

    Science.gov (United States)

    Gaci, Said; Zaourar, Naima

    2013-04-01

    Multifractional Brownian motions (mBms) have been used successfully to describe the behavior of borehole logs. These local fractal models allow the depth evolution of the logs' regularity, quantified by the Hölder exponent (H), to be investigated. In this study, a regularity analysis is carried out on datasets recorded in Algerian oil boreholes located in different geological settings. The obtained regularity profiles show a clear correlation with lithology: each lithological discontinuity corresponds to a jump in the H value. Moreover, for a given borehole, all the regularity logs are significantly correlated and lead to similar lithological segmentations. The Hölderian regularity is therefore a robust property that can be used to characterize lithological heterogeneities. However, the study does not establish a relation between the recorded physical property and its estimated degree of regularity across the analyzed logs. Keywords: well logs, regularity, Hölder exponent, multifractional Brownian motion
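
    One common way to estimate such a depth-dependent regularity is to regress the local oscillation of the log against window size; the sketch below is a crude generic estimator, not the authors' mBm-based method:

        import numpy as np

        def local_holder(x, i, scales=(2, 4, 8, 16, 32)):
            """Crude local Hoelder exponent at index i: the slope of
            log(oscillation) versus log(scale), where the oscillation is
            max - min over a window of half-width s around i."""
            osc = []
            for s in scales:
                w = x[max(0, i - s): i + s + 1]
                osc.append(w.max() - w.min())
            slope, _ = np.polyfit(np.log(scales), np.log(osc), 1)
            return slope

        # Example: a Brownian path should give H close to 0.5.
        rng = np.random.default_rng(0)
        path = np.cumsum(rng.standard_normal(4096))
        print(local_holder(path, 2048))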

  12. Regularities and irregularities in order flow data

    Science.gov (United States)

    Theissen, Martin; Krause, Sebastian M.; Guhr, Thomas

    2017-11-01

    We identify and analyze statistical regularities and irregularities in the recent order flow of different NASDAQ stocks, focusing on the positions where orders are placed in the order book. This includes limit orders being placed outside of the spread, inside the spread and (effective) market orders. Based on the pairwise comparison of the order flow of different stocks, we perform a clustering of stocks into groups with similar behavior. This is useful to assess systemic aspects of stock price dynamics. We find that limit order placement inside the spread is strongly determined by the dynamics of the spread size. Most orders, however, arrive outside of the spread. While for some stocks order placement on or next to the quotes is dominating, deeper price levels are more important for other stocks. As market orders are usually adjusted to the quote volume, the impact of market orders depends on the order book structure, which we find to be quite diverse among the analyzed stocks as a result of the way limit order placement takes place.

  13. Flip to Regular Triangulation and Convex Hull.

    Science.gov (United States)

    Gao, Mingcen; Cao, Thanh-Tung; Tan, Tiow-Seng

    2017-02-01

    Flip is a simple and local operation that transforms one triangulation into another. It changes only some neighboring simplices, without considering any attribute or configuration that is global in nature to the triangulation. Thanks to this characteristic, several flips can be applied independently to different small, non-overlapping regions of one triangulation. Such an operation is favored when designing algorithms for data-parallel, massively multithreaded hardware, such as the GPU. However, most existing flip algorithms are designed to be executed sequentially, and usually need some restrictions on the execution order of flips, making them hard to adapt to parallel computation. In this paper, we present an in-depth study of flip algorithms in low dimensions, with emphasis on the flexibility of their execution order. In particular, we propose a series of provably correct flip algorithms for regular triangulation and convex hull in 2D and 3D, with implementations for both CPUs and GPUs. Our experiments show that our GPU implementation for constructing these structures from a given point set achieves up to two orders of magnitude of speedup over popular single-threaded CPU implementations of existing algorithms.
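
    The local test that drives such flips can be stated compactly. For a regular (weighted Delaunay) triangulation in 2D, an edge is flippable when the weighted in-circle (power) predicate is positive; the sketch below is the textbook lifted-determinant formulation, not the paper's GPU code:

        import numpy as np

        def power_test(p, q, r, s):
            """Weighted in-circle predicate. Points are (x, y, weight).
            For p, q, r in counter-clockwise order, a positive value means s
            lies inside the power circle of pqr, so the edge shared by the
            two triangles should be flipped. With zero weights this reduces
            to the classic Delaunay in-circle test."""
            def lift(a):
                x, y, w = a
                return (x, y, x * x + y * y - w)
            m = np.hstack([np.array([lift(a) for a in (p, q, r, s)]),
                           np.ones((4, 1))])
            return np.linalg.det(m)

        # s inside the circumcircle of the unit right triangle: positive.
        print(power_test((0, 0, 0), (1, 0, 0), (0, 1, 0), (0.25, 0.25, 0)))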

  14. Regularities development of entrepreneurial structures in regions

    Directory of Open Access Journals (Sweden)

    Julia Semenovna Pinkovetskaya

    2012-12-01

    Full Text Available We consider regularities and tendencies for three types of entrepreneurial structures: small enterprises, medium enterprises and individual entrepreneurs. The aim of the research was to confirm that the indicators of aggregates of entrepreneurial structures can be described with normal distribution functions. We present the author's methodological approach and the results of constructing density distribution functions for the main indicators of various objects: the Russian Federation, its regions, and aggregates of entrepreneurial structures specialized in certain forms of economic activity. All the developed functions, as shown by logical and statistical analysis, are of high quality and approximate the original data well. In general, the proposed methodological approach is versatile and can be used in further studies of aggregates of entrepreneurial structures. The results can be applied to a wide range of problems justifying the need for personnel and financial resources at the federal, regional and municipal levels, as well as to the formation of plans and forecasts for the development of entrepreneurship and the improvement of this sector of the economy.

  15. Toroidal regularization of the guiding center Lagrangian

    Science.gov (United States)

    Burby, J. W.; Ellison, C. L.

    2017-11-01

    In the Lagrangian theory of guiding center motion, an effective magnetic field B* = B + (m/e) v∥ ∇×b appears prominently in the equations of motion. Because the parallel component of this field can vanish, there is a range of parallel velocities where the Lagrangian guiding center equations of motion are either ill-defined or very badly behaved. Moreover, the velocity dependence of B* greatly complicates the identification of canonical variables and therefore the formulation of symplectic integrators for guiding center dynamics. This letter introduces a simple coordinate transformation that alleviates both these problems simultaneously. In the new coordinates, the Liouville volume element is equal to the toroidal contravariant component of the magnetic field. Consequently, the large-velocity singularity is completely eliminated. Moreover, passing from the new coordinate system to canonical coordinates is extremely simple, even if the magnetic field is devoid of flux surfaces. We demonstrate the utility of this approach in regularizing the guiding center Lagrangian by presenting a new and stable one-step variational integrator for guiding centers moving in arbitrary time-dependent electromagnetic fields.
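
    Restated in display form (a transcription of the field above, together with the parallel component whose vanishing causes the trouble):

        \[
        \mathbf{B}^{*} = \mathbf{B} + \frac{m}{e}\, v_{\parallel}\, \nabla\times\mathbf{b},
        \qquad
        B^{*}_{\parallel} = \mathbf{b}\cdot\mathbf{B}^{*}
                          = B + \frac{m}{e}\, v_{\parallel}\, \mathbf{b}\cdot(\nabla\times\mathbf{b}),
        \]

    so the Lagrangian equations degenerate at parallel velocities where B*∥ approaches zero, which is the singularity the toroidal coordinate transformation removes.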

  16. Identifying Cognitive States Using Regularity Partitions.

    Science.gov (United States)

    Pappas, Ioannis; Pardalos, Panos

    2015-01-01

    Functional Magnetic Resonance (fMRI) data can be used to depict functional connectivity of the brain. Standard techniques have been developed to construct brain networks from this data; typically nodes are considered as voxels or sets of voxels with weighted edges between them representing measures of correlation. Identifying cognitive states based on fMRI data is connected with recording voxel activity over a certain time interval. Using this information, network and machine learning techniques can be applied to discriminate the cognitive states of the subjects by exploring different features of data. In this work we wish to describe and understand the organization of brain connectivity networks under cognitive tasks. In particular, we use a regularity partitioning algorithm that finds clusters of vertices such that they all behave with each other almost like random bipartite graphs. Based on the random approximation of the graph, we calculate a lower bound on the number of triangles as well as the expectation of the distribution of the edges in each subject and state. We investigate the results by comparing them to the state of the art algorithms for exploring connectivity and we argue that during epochs in which the subject is exposed to a stimulus, the inspected part of the brain is organized in an efficient way that enables enhanced functionality.
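
    For context on the triangle bound mentioned above: the triangle count that such regularity-based bounds approximate can be computed exactly from the adjacency matrix, since every triangle contributes six closed walks of length three. A generic illustration, not the authors' pipeline:

        import numpy as np

        def triangle_count(A):
            """Triangles in an undirected simple graph: tr(A^3) counts each
            triangle 6 times (3 starting vertices x 2 directions)."""
            A = np.asarray(A, dtype=float)
            return int(round(np.trace(A @ A @ A) / 6))

        # The complete graph K4 has C(4,3) = 4 triangles.
        K4 = np.ones((4, 4)) - np.eye(4)
        print(triangle_count(K4))  # -> 4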

  17. Identifying Cognitive States Using Regularity Partitions.

    Directory of Open Access Journals (Sweden)

    Ioannis Pappas

    Full Text Available Functional Magnetic Resonance (fMRI) data can be used to depict functional connectivity of the brain. Standard techniques have been developed to construct brain networks from this data; typically nodes are considered as voxels or sets of voxels with weighted edges between them representing measures of correlation. Identifying cognitive states based on fMRI data is connected with recording voxel activity over a certain time interval. Using this information, network and machine learning techniques can be applied to discriminate the cognitive states of the subjects by exploring different features of data. In this work we wish to describe and understand the organization of brain connectivity networks under cognitive tasks. In particular, we use a regularity partitioning algorithm that finds clusters of vertices such that they all behave with each other almost like random bipartite graphs. Based on the random approximation of the graph, we calculate a lower bound on the number of triangles as well as the expectation of the distribution of the edges in each subject and state. We investigate the results by comparing them to the state of the art algorithms for exploring connectivity and we argue that during epochs in which the subject is exposed to a stimulus, the inspected part of the brain is organized in an efficient way that enables enhanced functionality.

  18. Regularity and approximability of electronic wave functions

    CERN Document Server

    Yserentant, Harry

    2010-01-01

    The electronic Schrödinger equation describes the motion of N-electrons under Coulomb interaction forces in a field of clamped nuclei. The solutions of this equation, the electronic wave functions, depend on 3N variables, with three spatial dimensions for each electron. Approximating these solutions is thus inordinately challenging, and it is generally believed that a reduction to simplified models, such as those of the Hartree-Fock method or density functional theory, is the only tenable approach. This book seeks to show readers that this conventional wisdom need not be ironclad: the regularity of the solutions, which increases with the number of electrons, the decay behavior of their mixed derivatives, and the antisymmetry enforced by the Pauli principle contribute properties that allow these functions to be approximated with an order of complexity which comes arbitrarily close to that for a system of one or two electrons. The text is accessible to a mathematical audience at the beginning graduate level as...

  19. One-dimensional QCD in thimble regularization

    Science.gov (United States)

    Di Renzo, F.; Eruzzi, G.

    2018-01-01

    QCD in 0+1 dimensions is numerically solved via thimble regularization. In the context of this toy model, a general formalism is presented for SU(N) theories. The sign problem that the theory displays is a genuine one, stemming from a (quark) chemical potential. Three stationary points are present in the original (real) domain of integration, so contributions from all the thimbles associated with them must be taken into account: we show how semiclassical computations can provide hints on the regions of parameter space where this is absolutely crucial. Known analytical results for the chiral condensate and the Polyakov loop are correctly reproduced: this is in particular trivial at high values of the number of flavors Nf. In this regime we notice that the single-thimble dominance scenario takes place (the dominant thimble is the one associated with the identity). At low values of Nf computations can be more difficult. It is important to stress that this is not at all a consequence of the original sign problem (not even via the residual phase). The latter is always under control, while accidental, delicate cancellations of contributions coming from different thimbles can take place in (restricted) regions of the parameter space.

  20. Elementary Particle Spectroscopy in Regular Solid Rewrite

    Science.gov (United States)

    Trell, Erik

    2008-10-01

    The Nilpotent Universal Computer Rewrite System (NUCRS) has operationalized the radical ontological dilemma of Nothing at All versus Anything at All down to the ground recursive syntax and principal mathematical realisation of this categorical dichotomy as such and so governing all its sui generis modalities, leading to fulfilment of their individual terms and compass when the respective choice sequence operations are brought to closure. Focussing on the general grammar, NUCRS by pure logic and its algebraic notations hence bootstraps Quantum Mechanics, aware that it "is the likely keystone of a fundamental computational foundation" also for e.g. physics, molecular biology and neuroscience. The present work deals with classical geometry where morphology is the modality, and ventures that the ancient regular solids are its specific rewrite system, in effect extensively anticipating the detailed elementary particle spectroscopy, and further on to essential structures at large both over the inorganic and organic realms. The geodetic antipode to Nothing is extension, with natural eigenvector the endless straight line which when deployed according to the NUCRS as well as Plotelemeian topographic prescriptions forms a real three-dimensional eigenspace with cubical eigenelements where observed quark-skewed quantum-chromodynamical particle events self-generate as an Aristotelean phase transition between the straight and round extremes of absolute endlessness under the symmetry- and gauge-preserving, canonical coset decomposition SO(3)×O(5) of Lie algebra SU(3). The cubical eigen-space and eigen-elements are the parental state and frame, and the other solids are a range of transition matrix elements and portions adapting to the spherical root vector symmetries and so reproducibly reproducing the elementary particle spectroscopy, including a modular, truncated octahedron nano-composition of the Electron which piecemeal enter into molecular structures or compressed to each

  1. A formal algorithm for verifying the validity of clustering results based on model checking.

    Science.gov (United States)

    Huang, Shaobin; Cheng, Yuan; Lang, Dapeng; Chi, Ronghua; Liu, Guofeng

    2014-01-01

    The limitations in general methods to evaluate clustering will remain difficult to overcome if verifying the clustering validity continues to be based on clustering results and evaluation index values. This study focuses on the clustering process itself to analyze crisp clustering validity. First, we define the properties that must be satisfied by valid clustering processes and model clustering processes based on program graphs and transition systems. We then recast the analysis of clustering validity as the problem of verifying whether the model of clustering processes satisfies the specified properties with model checking. That is, we try to build a bridge between clustering and model checking. Experiments on several datasets indicate the effectiveness and suitability of our algorithms. Compared with traditional evaluation indices, our formal method can not only indicate whether the clustering results are valid but, when the results are invalid, can also detect the objects that have led to the invalidity.

  2. A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing.

    Science.gov (United States)

    Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang

    2017-07-24

    With the rapid development of big data and the Internet of Things (IoT), the number of networked devices and the data volume are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network, can effectively solve the bottleneck problems of data transmission and data storage. However, security and privacy challenges are also arising in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices, and the computation results can be verified using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method. Finally, analysis and simulation results show that our scheme is both secure and highly efficient.

  3. A formal algorithm for verifying the validity of clustering results based on model checking.

    Directory of Open Access Journals (Sweden)

    Shaobin Huang

    Full Text Available The limitations in general methods to evaluate clustering will remain difficult to overcome if verifying the clustering validity continues to be based on clustering results and evaluation index values. This study focuses on the clustering process itself to analyze crisp clustering validity. First, we define the properties that must be satisfied by valid clustering processes and model clustering processes based on program graphs and transition systems. We then recast the analysis of clustering validity as the problem of verifying whether the model of clustering processes satisfies the specified properties with model checking. That is, we try to build a bridge between clustering and model checking. Experiments on several datasets indicate the effectiveness and suitability of our algorithms. Compared with traditional evaluation indices, our formal method can not only indicate whether the clustering results are valid but, when the results are invalid, can also detect the objects that have led to the invalidity.

  4. Efficient Verifiable Range and Closest Point Queries in Zero-Knowledge

    Directory of Open Access Journals (Sweden)

    Ghosh Esha

    2016-10-01

    Full Text Available We present an efficient method for answering one-dimensional range and closest-point queries in a verifiable and privacy-preserving manner. We consider a model where a data owner outsources a dataset of key-value pairs to a server, who answers range and closest-point queries issued by a client and provides proofs of the answers. The client verifies the correctness of the answers while learning nothing about the dataset besides the answers to the current and previous queries. Our work yields for the first time a zero-knowledge privacy assurance to authenticated range and closest-point queries. Previous work leaked the size of the dataset and used an inefficient proof protocol. Our construction is based on hierarchical identity-based encryption. We prove its security and analyze its efficiency both theoretically and with experiments on synthetic and real data (Enron email and Boston taxi datasets).

  5. Leaflet anatomy verifies relationships within Syagrus (Arecaceae and aids in identification

    Directory of Open Access Journals (Sweden)

    Larry Noblick

    2013-09-01

    Full Text Available The current investigation was carried out to examine how palm anatomy may coincide with the current molecular analysis, including the three recognized clades of Syagrus Mart., and to justify the splitting of acaulescent Syagrus species (e.g. S. petraea (Mart.) Becc.) into several species. Free-hand cross-sections of leaflets were made, and their comparison verifies the relationships suggested by the molecular data. Free-hand leaflet sections were also found to be useful in the identification of otherwise difficult-to-identify acaulescent Syagrus species. The result and conclusion are that anatomical data are valuable in helping to verify molecular data and that splitting the acaulescent species of Syagrus is justified by the differences discovered in their field habit and anatomy. These differences were used to produce an identification key based on the anatomy.

  6. Regularization and a general linear model for event-related potential estimation.

    Science.gov (United States)

    Kristensen, Emmanuelle; Guerin-Dugué, Anne; Rivet, Bertrand

    2017-12-01

    The usual event-related potential (ERP) estimate is the average across epochs time-locked to the stimuli of interest. These stimuli are repeated several times to improve the signal-to-noise ratio (SNR), and only one evoked potential is estimated inside the temporal window of interest. Consequently, the average estimate does not take into account other neural responses within the same epoch that are due to short inter-stimulus intervals. These adjacent neural responses may overlap and distort the evoked potential of interest. This overlapping is a significant issue for the eye fixation-related potential (EFRP) technique, in which the epochs are time-locked to ocular fixations. The inter-fixation intervals are not experimentally controlled and can be shorter than the neural response's latency. First, Tikhonov regularization, applied to the classical average estimate, was introduced to improve the SNR for a given number of trials; generalized cross-validation was chosen to obtain the optimal value of the ridge parameter. Then, to deal with the issue of overlapping, the general linear model (GLM) was used to extract all neural responses inside an epoch. Finally, the regularization was also applied to it. The models (the classical average and the GLM, with and without regularization) were compared on both simulated data and real datasets from a visual scene exploration in co-registration with an eye-tracker, and from a P300 Speller experiment. The regularization was found to improve the estimation by average for a given number of trials. The GLM was more robust and efficient, its efficiency further reinforced by the regularization.
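
    A compact sketch of ridge (Tikhonov) estimation with generalized cross-validation, the combination described above; the design matrix X (for the GLM, one column per overlapping-response sample) and the grid of ridge values are illustrative assumptions:

        import numpy as np

        def ridge_gcv(X, y, lambdas):
            """Fit b = argmin ||y - X b||^2 + lam ||b||^2 over a grid of lam,
            choosing lam by generalized cross-validation:
            GCV(lam) = n ||y - X b||^2 / (n - tr(H))^2, with H the hat matrix."""
            n, p = X.shape
            best = (np.inf, None, None)
            for lam in lambdas:
                core = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
                beta = core @ y
                resid = y - X @ beta
                edf = np.trace(X @ core)        # effective degrees of freedom
                score = n * (resid @ resid) / (n - edf) ** 2
                if score < best[0]:
                    best = (score, lam, beta)
            return best                          # (gcv, lambda, coefficients)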

  7. Calculating integral dose using data exported from a commercial record and verify system.

    Science.gov (United States)

    Fox, C; Hardcastle, N; Lim, A; Khor, R

    2015-06-01

    Integral dose has been useful in investigations into the incidence of second primary malignancies in radiotherapy patients. This note outlines an approach to calculation of integral dose for a group of prostate patients using only data exported from a commercial record and verify system. Even though it was necessary to make some assumptions about patient anatomy, comparison with integral dose calculated from data exported from the planning system showed good agreement.
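
    Integral dose is, in essence, the mass-weighted sum of dose over the irradiated volume. A minimal sketch under the simplifying assumption of uniform, water-equivalent density (comparable in spirit to the anatomy assumptions mentioned above):

        import numpy as np

        def integral_dose(dose_gy, voxel_volume_cm3, density_g_cm3=1.0):
            """Integral dose in joules: sum over voxels of dose (Gy = J/kg)
            times voxel mass (kg). Uniform density is an assumption."""
            voxel_mass_kg = voxel_volume_cm3 * density_g_cm3 / 1000.0
            return float(np.sum(dose_gy) * voxel_mass_kg)

        # 2 Gy uniform dose to one litre of water deposits about 2 J.
        dose = np.full((10, 10, 10), 2.0)                 # Gy, 1 cm^3 voxels
        print(integral_dose(dose, voxel_volume_cm3=1.0))  # -> 2.0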

  8. Verifying the model of predicting entrepreneurial intention among students of business and non-business orientation

    OpenAIRE

    Zoran Sušanj; Ana Jakopec; Irena Miljković Krečar

    2015-01-01

    This study aims to verify whether certain entrepreneurial characteristics, such as entrepreneurial potential and entrepreneurial propensity, affect the level of entrepreneurial self-efficacy and the desirability of entrepreneurship, and further have direct and indirect effects on entrepreneurial intentions. Furthermore, this study seeks to compare the strength of the relationship between these variables among groups of students who receive some entrepreneurship education and students outside the busi...

  9. The AutoProof Verifier: Usability by Non-Experts and on Standard Code

    Directory of Open Access Journals (Sweden)

    Carlo A. Furia

    2015-08-01

    Full Text Available Formal verification tools are often developed by experts for experts; as a result, their usability by programmers with little formal methods experience may be severely limited. In this paper, we discuss this general phenomenon with reference to AutoProof: a tool that can verify the full functional correctness of object-oriented software. In particular, we present our experiences of using AutoProof in two contrasting contexts representative of non-expert usage. First, we discuss its usability by students in a graduate course on software verification, who were tasked with verifying implementations of various sorting algorithms. Second, we evaluate its usability in verifying code developed for programming assignments of an undergraduate course. The first scenario represents usability by serious non-experts; the second represents usability on "standard code", developed without full functional verification in mind. We report our experiences and lessons learnt, from which we derive some general suggestions for furthering the development of verification tools with respect to improving their usability.

  10. Recognition Memory is Improved by a Structured Temporal Framework During Encoding.

    Science.gov (United States)

    Thavabalasingam, Sathesan; O'Neil, Edward B; Zeng, Zheng; Lee, Andy C H

    2015-01-01

    In order to function optimally within our environment, we continuously extract temporal patterns from our experiences and formulate expectations that facilitate adaptive behavior. Given that our memories are embedded within spatiotemporal contexts, an intriguing possibility is that mnemonic processes are sensitive to the temporal structure of events. To test this hypothesis, in a series of behavioral experiments we manipulated the regularity of interval durations at encoding to create temporally structured and unstructured frameworks. Our findings revealed enhanced recognition memory (d') for stimuli that were explicitly encoded within a temporally structured vs. unstructured framework. Encoding information within a temporally structured framework was also associated with a reduction in the negative effects of proactive interference and was linked to greater recollective recognition memory. Furthermore, rhythmic temporal structure was found to enhance recognition memory for incidentally encoded information. Collectively, these results support the possibility that we possess a greater capacity to learn and subsequently remember temporally structured information.

  11. Recognition memory is improved by a structured temporal framework during encoding

    Directory of Open Access Journals (Sweden)

    Sathesan eThavabalasingam

    2016-01-01

    Full Text Available In order to function optimally within our environment, we continuously extract temporal patterns from our experiences and formulate expectations that facilitate adaptive behavior. Given that our memories are embedded within spatiotemporal contexts, an intriguing possibility is that mnemonic processes are sensitive to the temporal structure of events. To test this hypothesis, in a series of behavioral experiments we manipulated the regularity of interval durations at encoding to create temporally structured and unstructured frameworks. Our findings revealed enhanced recognition memory (d′) for stimuli that were explicitly encoded within a temporally structured versus unstructured framework. Encoding information within a temporally structured framework was also associated with a reduction in the negative effects of proactive interference and was linked to greater recollective recognition memory. Furthermore, rhythmic temporal structure was found to enhance recognition memory for incidentally encoded information. Collectively, these results support the possibility that we possess a greater capacity to learn and subsequently remember temporally structured information.

  12. Learning about time within the spinal cord: evidence that spinal neurons can abstract and store an index of regularity

    OpenAIRE

    Lee, Kuan H.; Turtle, Joel D.; Huang, Yung-Jen; Strain, Misty M.; Baumbauer, Kyle M.; Grau, James W.

    2015-01-01

    Prior studies have shown that intermittent noxious stimulation has divergent effects on spinal cord plasticity depending upon whether it occurs in a regular [fixed time (FT)] or irregular [variable time (VT)] manner: In spinally transected animals, VT stimulation to the tail or hind leg impaired spinal learning whereas an extended exposure to FT stimulation had a restorative/protective effect. These observations imply that lower level systems are sensitive to temporal relations. Using spinall...

  13. Temporal network epidemiology

    CERN Document Server

    Holme, Petter

    2017-01-01

    This book covers recent developments in epidemic process models and related data on temporally varying networks. It is widely recognized that contact networks are indispensable for describing, understanding, and intervening to stop the spread of infectious diseases in human and animal populations; “network epidemiology” is an umbrella term to describe this research field. More recently, contact networks have been recognized as being highly dynamic. This observation, also supported by an increasing amount of new data, has led to research on temporal networks, a rapidly growing area. Changes in network structure are often informed by epidemic (or other) dynamics, in which case they are referred to as adaptive networks. This volume gathers contributions by prominent authors working in temporal and adaptive network epidemiology, a field essential to understanding infectious diseases in real society.

  14. Temporary housing for refugees (Vivienda temporal para refugiados)

    OpenAIRE

    Amonarraiz Gutiérrez, Ana

    2015-01-01

    The project focuses on the design and development of a space intended as temporary housing, to provide a home for people who have lost theirs. This type of housing is fundamental within the post-disaster recovery process, since the immediate construction of permanent housing is utopian. The main objective is the construction of a temporary dwelling made of prefabricated elements, thereby achieving faster assembly. This will also allow any component...

  15. Tanning beauty ideals among Swedish adults who exercise regularly

    OpenAIRE

    Cedercreutz, Isabella

    2016-01-01

    Tanning beauty ideals among Swedish adults who exercise regularly. Introduction: The majority of the Swedish population exercise regularly, and it has been reported that they believe having an attractive body is important. While research has shown that Swedes wish to be tanned, it is unknown whether there are any correlations with their exercise habits. Aims: The primary aim was to determine tanned skin tone ideals and tanning beauty ideals among regularly exercising Swedish adults. Associati...

  16. Existence domains for invariant reactions in binary regular solution ...

    Indian Academy of Sciences (India)

    Unknown

    The existence domains for invariant reactions between two phases (e.g. a liquid and a solid phase) have been examined using the regular solution model. The necessary conditions for the ... Keywords: binary phase diagrams; invariant reactions; regular solution model. Analysed binary systems include Nb–Ta, Nb–W, Os–Re, Os–Ru, Pd–Pt, Pt–Rh, Re–Ru, Ta–W and V–W.

  17. Phase-regularized polygon computer-generated holograms.

    Science.gov (United States)

    Im, Dajeong; Moon, Eunkyoung; Park, Yohan; Lee, Deokhwan; Hahn, Joonku; Kim, Hwi

    2014-06-15

    The dark-line defect problem in the conventional polygon computer-generated hologram (CGH) is addressed. To resolve this problem, we clarify the physical origin of the defect and address the concept of phase-regularization. A novel synthesis algorithm for a phase-regularized polygon CGH for generating photorealistic defect-free holographic images is proposed. The optical reconstruction results of the phase-regularized polygon CGHs without the dark-line defects are presented.

  18. Dimensional regularization in position space and a forest formula for regularized Epstein-Glaser renormalization

    Energy Technology Data Exchange (ETDEWEB)

    Keller, Kai Johannes

    2010-04-15

    The present work contains a consistent formulation of the methods of dimensional regularization (DimReg) and minimal subtraction (MS) in Minkowski position space. The methods are implemented into the framework of perturbative Algebraic Quantum Field Theory (pAQFT). The developed methods are used to solve the Epstein-Glaser recursion for the construction of time-ordered products in all orders of causal perturbation theory. A solution is given in terms of a forest formula in the sense of Zimmermann. A relation to the alternative approach to renormalization theory using Hopf algebras is established. (orig.)

  19. The entire regularization path for the support vector domain description

    DEFF Research Database (Denmark)

    Sjöstrand, Karl; Larsen, Rasmus

    2006-01-01

    The support vector domain description is closely related to the one-class support vector machine classifier. Recently, it was shown that the regularization path of the support vector machine is piecewise linear, and that the entire path can be computed efficiently. This paper shows that this property carries over to the support vector domain description. Using our results, the one-class classification problem can be solved for any amount of regularization with roughly the same computational complexity required to solve for a particular value of the regularization parameter. The possibility of evaluating the results for any amount of regularization not only offers more...

  20. The analysis of 2D complicated regular polygon photonic lattices

    Science.gov (United States)

    Lv, Jing; Gao, Yuanmei

    2017-06-01

    We have numerically simulated in detail the light intensity distribution, phase distribution, and far-field diffraction of two-dimensional (2D) regular octagon and regular dodecagon lattices. In addition, using the plane wave expansion (PWE) method, we numerically calculate the energy bands of the two lattices. Both photonic lattices have a band gap: the regular octagon lattice possesses a wide complete band gap, while the regular dodecagon lattice has an incomplete gap. Moreover, we simulated preliminary transmission images of the photonic lattices. This may inspire academic research both in light control and in solitons.

  1. Adaptive Regularization of Neural Networks Using Conjugate Gradient

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Andersen et al. (1997) and Larsen et al. (1996, 1997) suggested a regularization scheme which iteratively adapts regularization parameters by minimizing validation error using simple gradient descent. In this contribution we present an improved algorithm based on the conjugate gradient technique. Numerical experiments with feedforward neural networks successfully demonstrate improved generalization ability and lower computational cost.
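
    A toy version of the underlying idea, assuming ridge regression and a finite-difference gradient of the validation error with respect to log-lambda; the cited algorithms use analytic gradients and conjugate directions, which this sketch deliberately omits:

        import numpy as np

        def val_error(lam, Xt, yt, Xv, yv):
            """Validation MSE of a ridge fit with parameter lam."""
            p = Xt.shape[1]
            beta = np.linalg.solve(Xt.T @ Xt + lam * np.eye(p), Xt.T @ yt)
            resid = yv - Xv @ beta
            return (resid @ resid) / len(yv)

        def adapt_lambda(Xt, yt, Xv, yv, lam=1.0, lr=0.5, steps=50, eps=1e-3):
            """Gradient descent on validation error in log-lambda space."""
            loglam = np.log(lam)
            for _ in range(steps):
                g = (val_error(np.exp(loglam + eps), Xt, yt, Xv, yv)
                     - val_error(np.exp(loglam - eps), Xt, yt, Xv, yv)) / (2 * eps)
                loglam -= lr * g
            return np.exp(loglam)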

  2. Multidimensional Wavelet-based Regularized Reconstruction for Parallel Acquisition in Neuroimaging

    CERN Document Server

    Chaari, Lotfi; Badillo, Solveig; Pesquet, Jean-Christophe; Ciuciu, Philippe

    2012-01-01

    Parallel MRI is a fast imaging technique that enables the acquisition of highly resolved images in space and/or in time. The performance of parallel imaging strongly depends on the reconstruction algorithm, which can proceed either in the original k-space (GRAPPA, SMASH) or in the image domain (SENSE-like methods). To improve the performance of the widely used SENSE algorithm, 2D- or slice-specific regularization in the wavelet domain has been deeply investigated. In this paper, we extend this approach using 3D-wavelet representations in order to handle all slices together and address reconstruction artifacts which propagate across adjacent slices. The gain induced by such extension (3D-Unconstrained Wavelet Regularized -SENSE: 3D-UWR-SENSE) is validated on anatomical image reconstruction where no temporal acquisition is considered. Another important extension accounts for temporal correlations that exist between successive scans in functional MRI (fMRI). In addition to the case of 2D+t acquisition schemes ad...

  3. Temporal compressive sensing systems

    Energy Technology Data Exchange (ETDEWEB)

    Reed, Bryan W.

    2017-12-12

    Methods and systems for temporal compressive sensing are disclosed, where within each of one or more sensor array data acquisition periods, one or more sensor array measurement datasets comprising distinct linear combinations of time slice data are acquired, and where mathematical reconstruction allows for calculation of accurate representations of the individual time slice datasets.
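
    A toy version of the claim above: each acquisition integrates a distinct random linear combination of time slices, and the slices are recovered afterwards by regularized least squares. The mask design and reconstruction are illustrative assumptions, not the patented method:

        import numpy as np

        rng = np.random.default_rng(1)
        T, M, N = 16, 24, 64      # time slices, measurements, pixels

        x = rng.random((T, N))    # ground-truth time-slice data
        A = rng.integers(0, 2, (M, T)).astype(float)  # random 0/1 temporal masks
        y = A @ x                 # each row: one integrated measurement

        # Recover all slices jointly; the operator acts only along time,
        # so one ridge-regularized solve handles every pixel at once.
        lam = 1e-3
        x_hat = np.linalg.solve(A.T @ A + lam * np.eye(T), A.T @ y)
        print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))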

  4. Temporal Concurrent Constraint Programming

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Palamidessi, Catuscia; Valencia, Frank Dan

    2002-01-01

    The ntcc calculus is a model of non-deterministic temporal concurrent constraint programming. In this paper we study behavioral notions for this calculus. In the underlying computational model, concurrent constraint processes are executed in discrete time intervals. The behavioral notions studied...

  5. Temporal Concurrent Constraint Programming

    DEFF Research Database (Denmark)

    Valencia, Frank Dan

    Concurrent constraint programming (ccp) is a formalism for concurrency in which agents interact with one another by telling (adding) and asking (reading) information in a shared medium. Temporal ccp extends ccp by allowing agents to be constrained by time conditions. This dissertation studies...

  6. Mesial temporal sclerosis

    African Journals Online (AJOL)

    Kurt

    2005-07-29

    Jul 29, 2005 ... Introduction. Mesial temporal sclerosis is the commonest cause of partial complex seizures. The aetiology of this condition is controversial, but it is postulated that both acquired and developmental processes may be involved. Familial cases have also been reported. Magnetic resonance imaging (MRI) ...

  7. Communication, Technology, Temporality

    Directory of Open Access Journals (Sweden)

    Mark A. Martinez

    2012-08-01

    Full Text Available This paper proposes a media studies approach that foregrounds technological objects as communicative and historical agents. Specifically, I take the digital computer as a powerful catalyst of crises in communication theories and certain key features of modernity. Finally, the computer is the motor of “New Media” which is at once a set of technologies, a historical epoch, and a field of knowledge. As such the computer shapes “the new” and “the future” as History pushes its origins further in the past and its convergent quality pushes its future as a predominant medium. As treatments of information and interface suggest, communication theories observe computers, and technologies generally, for the mediated languages they either afford or foreclose to us. My project describes the figures information and interface for the different ways they can be thought of as aspects of communication. I treat information not as semantic meaning, formal or discursive language, but rather as a physical organism. Similarly an interface is not a relationship between a screen and a human visual intelligence, but is instead a reciprocal, affective and physical process of contact. I illustrate that historically there have been conceptions of information and interface complementary to mine, fleeting as they have been in the face of a dominant temporality of mediation. I begin with a theoretically informed approach to media history, and extend it to a new theory of communication. In doing so I discuss a model of time common to popular, scientific, and critical conceptions of media technologies especially in theories of computer technology. This is a predominant model with particular rules of temporal change and causality for thinking about mediation, and limits the conditions of possibility for knowledge production about communication. I suggest a new model of time as integral to any event of observation and analysis, and that human mediation does not exhaust the

  8. Initial-fit approach versus verified prescription: comparing self-perceived hearing aid benefit.

    Science.gov (United States)

    Abrams, Harvey B; Chisolm, Theresa H; McManus, Megan; McArdle, Rachel

    2012-01-01

    Despite evidence suggesting inaccuracy in the default fittings provided by hearing aid manufacturers, the use of probe-microphone measures for the verification of fitting accuracy is routinely used by fewer than half of practicing audiologists. The present study examined whether self-perception of hearing aid benefit, as measured through the Abbreviated Profile of Hearing Aid Benefit (APHAB; Cox and Alexander, 1995), differed as a function of hearing aid fitting method, specifically, manufacturer's initial-fit approach versus a verified prescription. The prescriptive fit began at NAL-NL1 targets, with adjustments based on participant request. Each of the two fittings included probe-microphone measurement. A counterbalanced, cross-over, repeated-measures, single-blinded design was utilized to address the research objectives. Twenty-two experienced hearing aid users from the general Bay Pines VA Healthcare System audiology clinic population were randomized into one of two intervention groups. At the first visit, half of the participants were fit with new hearing aids via the manufacturer's initial fit while the second half were fit to a verified prescription using probe-microphone measurement. After a wear period of 4-6 wk, the participants' hearing aids were refit via the alternate method and worn for an additional 4-6 wk. Participants were blinded to the method of fitting by utilizing probe-microphone measures with both approaches. The APHAB was administered at baseline and at the end of each intervention trial. At the end of the second trial period, the participants were asked to identify which hearing aid fitting was "preferred." The APHAB data were subjected to a general linear model repeated-measures analysis of variance. For the three APHAB communication subscales (i.e., Ease of Communication, Reverberation, and Background Noise) mean scores obtained with the verified prescription were higher than those obtained with the initial-fit approach, indicating

  9. Two-way regularization for MEG source reconstruction via multilevel coordinate descent

    KAUST Repository

    Siva Tian, Tian

    2013-12-01

    Magnetoencephalography (MEG) source reconstruction refers to the inverse problem of recovering the neural activity from the MEG time course measurements. A spatiotemporal two-way regularization (TWR) method was recently proposed by Tian et al. to solve this inverse problem and was shown to outperform several one-way regularization methods and spatiotemporal methods. This TWR method is a two-stage procedure that first obtains a raw estimate of the source signals and then refines the raw estimate to ensure spatial focality and temporal smoothness using spatiotemporal regularized matrix decomposition. Although proven to be effective, the performance of two-stage TWR depends on the quality of the raw estimate. In this paper we directly solve the MEG source reconstruction problem using a multivariate penalized regression where the number of variables is much larger than the number of cases. A special feature of this regression is that the regression coefficient matrix has a spatiotemporal two-way structure that naturally invites a two-way penalty. Making use of this structure, we develop a computationally efficient multilevel coordinate descent algorithm to implement the method. This new one-stage TWR method has shown its superiority to the two-stage TWR method in three simulation studies with different levels of complexity and a real-world MEG data analysis. © 2013 Wiley Periodicals, Inc., A Wiley Company.
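
    The following sketch captures only the spatial-focality half of such a penalty: a generic row-sparse (group-lasso) block coordinate descent over source time courses, with the lead field G and sensor data Y as placeholders. It is not the authors' TWR algorithm, and the temporal-smoothness term is omitted for brevity:

        import numpy as np

        def row_sparse_cd(G, Y, lam, n_iter=100):
            """Block coordinate descent for
            min_S ||Y - G S||_F^2 + lam * sum_i ||S[i, :]||_2,
            where row S[i, :] is the time course of source i."""
            n_src = G.shape[1]
            S = np.zeros((n_src, Y.shape[1]))
            R = Y - G @ S                      # running residual
            a = (G ** 2).sum(axis=0)           # squared column norms (assumed > 0)
            for _ in range(n_iter):
                for i in range(n_src):
                    R += np.outer(G[:, i], S[i])   # remove source i's contribution
                    c = G[:, i] @ R
                    nc = np.linalg.norm(c)
                    # Group soft-threshold: the whole row is zeroed or shrunk.
                    S[i] = 0.0 if nc <= lam / 2 else (1 - lam / (2 * nc)) * c / a[i]
                    R -= np.outer(G[:, i], S[i])   # add it back
            return S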

  10. Biochemically verified smoking cessation and vaping beliefs among vape store customers.

    Science.gov (United States)

    Tackett, Alayna P; Lechner, William V; Meier, Ellen; Grant, DeMond M; Driskill, Leslie M; Tahirkheli, Noor N; Wagener, Theodore L

    2015-05-01

    To evaluate biochemically verified smoking status and electronic nicotine delivery systems (ENDS) use behaviors and beliefs among a sample of customers from vapor stores (stores specializing in ENDS). A cross-sectional survey of 215 adult vapor store customers at four retail locations in the Midwestern United States; a subset of participants (n = 181) also completed exhaled carbon monoxide (CO) testing to verify smoking status. Outcomes evaluated included ENDS preferences, harm beliefs, use behaviors, smoking history and current biochemically verified smoking status. Most customers reported starting ENDS as a means of smoking cessation (86%), using newer-generation devices (89%), vaping non-tobacco/non-menthol flavors (72%) and using e-liquid with nicotine strengths of ≤20 mg/ml (72%). There was a high rate of switching (91.4%) to newer-generation ENDS among those who started with a first-generation product. Exhaled CO readings confirmed that 66% of the tested sample had quit smoking. Among those who continued to smoke, mean cigarettes per day decreased from 22.1 to 7.5. Those vaping longer [odds ratio (OR) = 4.659, 95% confidence interval (CI) = 2.001-10.846], using newer-generation devices (OR = 2.950, 95% CI = 1.037-8.395) and using non-tobacco and non-menthol flavors (OR = 2.626, 95% CI = 1.133-6.085) were more likely to have quit smoking. Among vapor store customers in the United States who use electronic nicotine delivery devices to stop smoking, vaping longer, using newer-generation devices and using non-tobacco and non-menthol flavored e-liquid appear to be associated with higher rates of smoking cessation. © 2015 Society for the Study of Addiction.

  11. A two-dimensional deformable phantom for quantitatively verifying deformation algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Kirby, Neil; Chuang, Cynthia; Pouliot, Jean [Department of Radiation Oncology, University of California San Francisco, San Francisco, California 94143-1708 (United States)

    2011-08-15

    Purpose: The incorporation of deformable image registration into the treatment planning process is rapidly advancing. For this reason, the methods used to verify the underlying deformation algorithms must evolve equally fast. This manuscript proposes a two-dimensional deformable phantom, which can objectively verify the accuracy of deformation algorithms, as the next step for improving these techniques. Methods: The phantom represents a single plane of the anatomy for a head and neck patient. Inflation of a balloon catheter inside the phantom simulates tumor growth. CT and camera images of the phantom are acquired before and after its deformation. Nonradiopaque markers reside on the surface of the deformable anatomy and are visible through an acrylic plate, which enables an optical camera to measure their positions; thus, establishing the ground-truth deformation. This measured deformation is directly compared to the predictions of deformation algorithms, using several similarity metrics. The ratio of the number of points with more than a 3 mm deformation error over the number that are deformed by more than 3 mm is used for an error metric to evaluate algorithm accuracy. Results: An optical method of characterizing deformation has been successfully demonstrated. For the tests of this method, the balloon catheter deforms 32 out of the 54 surface markers by more than 3 mm. Different deformation errors result from the different similarity metrics. The most accurate deformation predictions had an error of 75%. Conclusions: The results presented here demonstrate the utility of the phantom for objectively verifying deformation algorithms and determining which is the most accurate. They also indicate that the phantom would benefit from more electron density heterogeneity. The reduction of the deformable anatomy to a two-dimensional system allows for the use of nonradiopaque markers, which do not influence deformation algorithms. This is the fundamental advantage of this
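
    The error metric quoted above is easy to state in code; ground_truth and predicted are (N, 2) arrays of marker displacements, the names being illustrative:

        import numpy as np

        def deformation_error_ratio(ground_truth, predicted, tol_mm=3.0):
            """Ratio used above: markers whose predicted displacement misses
            the measured one by more than tol_mm, divided by the number of
            markers actually deformed by more than tol_mm."""
            err = np.linalg.norm(predicted - ground_truth, axis=1)
            moved = np.linalg.norm(ground_truth, axis=1) > tol_mm
            return np.count_nonzero(err > tol_mm) / np.count_nonzero(moved)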

  12. Object-related regularities are processed automatically: Evidence from the visual mismatch negativity

    Directory of Open Access Journals (Sweden)

    Dagmar eMüller

    2013-06-01

    Full Text Available One of the most challenging tasks of our visual system is to structure and integrate the enormous amount of incoming information into distinct coherent objects. It is an ongoing debate whether or not the formation of visual objects requires attention. Implicit behavioural measures suggest that object formation can occur for task-irrelevant and unattended visual stimuli. The present study investigated pre-attentive visual object formation by combining implicit behavioural measures with an electrophysiological indicator of pre-attentive visual irregularity detection, the visual mismatch negativity (vMMN) of the event-related potential. Our displays consisted of two symmetrically arranged, task-irrelevant ellipses, the objects. In addition, there were two discs of either high or low luminance presented on the objects, which served as targets. Participants had to indicate whether the targets were of the same or different luminance. In separate conditions, the targets usually were enclosed either in the same object or in two different objects (standards). Occasionally, the regular target-to-object assignment was changed (deviants). That is, standards and deviants were defined exclusively on the basis of the task-irrelevant target-to-object assignment, not on the basis of some feature regularity. Although participants noticed neither the regularity nor the occurrence of the deviation in the sequences, task-irrelevant deviations resulted in increased reaction times. Moreover, compared with physically identical standard displays, deviating target-to-object assignments elicited a negative potential in the 246-280 ms time window over posterio-temporal electrode positions, which was identified as vMMN. With variable resolution electromagnetic tomography (VARETA), the object-related vMMN was localized to the inferior temporal gyrus. Our results support the notion that the visual system automatically structures even task-irrelevant aspects of the incoming

  13. Verifying the competition between haloperidol and biperiden in serum albumin through a model based on spectrofluorimetry

    Science.gov (United States)

    Muniz da Silva Fragoso, Viviane; Patrícia de Morais e Coura, Carla; Paulino, Erica Tex; Valdez, Ethel Celene Narvaez; Silva, Dilson; Cortez, Celia Martins

    2017-11-01

    The aim of this work was to apply mathematical-computational modeling to study the interactions of haloperidol (HLP) and biperiden (BPD) with human (HSA) and bovine (BSA) serum albumin, in order to verify the competition of these drugs for binding sites in HSA, using intrinsic tryptophan fluorescence quenching data. The association constants estimated at 37 °C were 2.17(±0.05) × 10⁷ M⁻¹ for HLP-HSA and 2.01(±0.03) × 10⁸ M⁻¹ for BPD-HSA. The results show that the drugs do not compete for the same binding sites in albumin.
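
    Association constants of this kind are commonly extracted from tryptophan quenching data via the double-logarithmic relation below (a standard analysis, stated here for orientation rather than as the authors' exact model), where F0 and F are fluorescence intensities without and with the quencher Q, Ka is the association constant and n the number of binding sites:

        \[
        \log\frac{F_{0}-F}{F} = \log K_{a} + n\,\log[Q]
        \]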

  14. Force10 networks performance in world's first transcontinental 10 gigabit ethernet network verified by Ixia

    CERN Multimedia

    2003-01-01

    Force10 Networks, Inc., today announced that the performance of the Force10 E-Series switch/routers deployed in a transcontinental network has been verified as line-rate 10 GE throughput by Ixia, a leading provider of high-speed, network performance and conformance analysis systems. The network, the world's first transcontinental 10 GE wide area network, consists of a SURFnet OC-192 lambda between Geneva and the StarLight facility in Chicago via Amsterdam and another OC-192 lambda between this same facility in Chicago and Carleton University in Ottawa, Canada provided by CANARIE and ORANO.

  15. Geant4 based Monte Carlo simulation for verifying the modified sum-peak method.

    Science.gov (United States)

    Aso, Tsukasa; Ogata, Yoshimune; Makino, Ryuta

    2017-09-14

    The modified sum-peak method can practically estimate radioactivity by using solely the peak and the sum peak count rate. In order to efficiently verify the method in various experimental conditions, a Geant4 based Monte Carlo simulation for a high-purity germanium detector system was applied. The energy spectra in the detector were simulated for a 60Co point source in various source to detector distances. The calculated radioactivity shows good agreement with the number of decays in the simulation. Copyright © 2017 Elsevier Ltd. All rights reserved.
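
    For orientation, the classical (Brinkman-style) sum-peak relation for a two-photon cascade such as 60Co is shown below, with N1 and N2 the full-energy peak counts, N12 the sum-peak counts and T the total counts of the spectrum; the modified method verified above differs precisely in that it avoids the total-count term T:

        \[
        A \simeq T + \frac{N_{1}\,N_{2}}{N_{12}}
        \]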

  16. Feature extraction using regular expression in detecting proper ...

    African Journals Online (AJOL)

    Feature extraction using regular expression in detecting proper noun for Malay news articles based on KNN algorithm. S Sulaiman, R.A. Wahid, F Morsidi. Abstract. No Abstract. Keywords: data mining; named entity recognition; regular expression; natural language processing.
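
    No abstract is given, but the title describes regular-expression features for proper-noun (named-entity) detection. A generic illustration of such a feature extractor; the pattern and the example sentence are assumptions, not taken from the paper:

        import re

        # Runs of capitalized tokens are a common regex feature for proper
        # nouns; Malay, like English, capitalizes names in running text.
        # Sentence-initial words will also match and need separate handling.
        PROPER_NOUN = re.compile(r"\b(?:[A-Z][a-z]+)(?:\s+(?:bin|binti|[A-Z][a-z]+))*\b")

        text = "Perdana Menteri melawat Kuala Lumpur bersama Ahmad bin Ali."
        print(PROPER_NOUN.findall(text))
        # -> ['Perdana Menteri', 'Kuala Lumpur', 'Ahmad bin Ali']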

  17. Chimeric mitochondrial peptides from contiguous regular and swinger RNA

    Directory of Open Access Journals (Sweden)

    Hervé Seligmann

    2016-01-01

    Full Text Available Previous mass spectrometry analyses described human mitochondrial peptides entirely translated from swinger RNAs, RNAs where polymerization systematically exchanged nucleotides. Exchanges follow one among 23 bijective transformation rules, nine symmetric exchanges (X ↔ Y, e.g. A ↔ C) and fourteen asymmetric exchanges (X → Y → Z → X, e.g. A → C → G → A), multiplying DNA's protein-coding potential by 24. Abrupt switches from regular to swinger polymerization produce chimeric RNAs. Here, human mitochondrial proteomic analyses assuming abrupt switches between regular and swinger transcription detect chimeric peptides, encoded by part regular, part swinger RNA. Contiguous regular- and swinger-encoded residues within single peptides are stronger evidence for translation of swinger RNA than the previously detected, entirely swinger-encoded peptides: regular parts are positive controls matched with contiguous swinger parts, increasing confidence in the results. Chimeric peptides are 200× rarer than swinger peptides (3/100,000 versus 6/1,000). Among 186 peptides with >8 residues for each of the regular and swinger parts, the regular parts of eleven chimeric peptides correspond to six among the thirteen recognized mitochondrial protein-coding genes. Chimeric peptides matching partly regular proteins are rarer and less expressed than chimeric peptides matching non-coding sequences, suggesting targeted degradation of misfolded proteins. The present results strengthen the hypothesis that the short mitogenome encodes far more proteins than hitherto assumed. Entirely swinger-encoded proteins could exist.

  18. Maximal regularity for non-autonomous stochastic evolution ...

    Indian Academy of Sciences (India)

    Tôn Việt Tạ

    2017-11-17

    Nov 17, 2017 ... We construct unique strict solutions to the equation and show their maximal regularity. The abstract results are then applied to a stochastic partial differential equation. Keywords. Evolution operators; stochastic linear evolution equations; strict solutions; maximal regularity; UMD Banach spaces of type 2.

  19. 39 CFR 3010.7 - Schedule of regular rate changes.

    Science.gov (United States)

    2010-07-01

    39 CFR 3010.7 (2010-07-01), General Provisions: Schedule of regular rate changes. (a) The Postal Service shall... ... of mailers of each class of mail in developing the schedule. (e) Whenever the Postal Service deems it...

  20. Analytic regularization of the Yukawa model at finite temperature

    CERN Document Server

    Malbouisson, A P C; Svaiter, N F

    1996-01-01

    We analyse the one-loop fermionic contribution to the scalar effective potential in the temperature-dependent Yukawa model. In order to regularize the model, a mix of dimensional and analytic regularization procedures is used. We find a general expression for the fermionic contribution in arbitrary spacetime dimension. It is found that in D=3 this contribution is finite.

  1. Degree-regular triangulations of torus and Klein bottle

    Indian Academy of Sciences (India)


    ...combinatorial 2-manifolds are the boundaries of the tetrahedron, the octahedron, the icosahedron and the 6-vertex real projective plane [4, 5]. The combinatorial manifolds T3,3,0 and T6,2,2 (in Examples 2 and 3) are combinatorially regular. Schulte and Wills [10, 11] have constructed two combinatorially regular triangulations of ...

  2. Inclusion Professional Development Model and Regular Middle School Educators

    Science.gov (United States)

    Royster, Otelia; Reglin, Gary L.; Losike-Sedimo, Nonofo

    2014-01-01

    The purpose of this study was to determine the impact of a professional development model on regular education middle school teachers' knowledge of best practices for teaching inclusive classes and attitudes toward teaching these classes. There were 19 regular education teachers who taught the core subjects. Findings for Research Question 1…

  3. On the Generating Power of Regularly Controlled Bidirectional Grammars

    NARCIS (Netherlands)

    Asveld, P.R.J.; Hogendorp, J.A.

    1991-01-01

    RCB-grammars or regularly controlled bidirectional grammars are context-free grammars of which the rules can be used in a productive and in a reductive fashion. In addition, the application of these rules is controlled by a regular language. Several modes of derivation can be distinguished for this

  4. On the Generating Power of Regularly Controlled Bidirectional Grammars

    NARCIS (Netherlands)

    Asveld, P.R.J.; Hogendorp, Jan Anne

    1989-01-01

    RCB-grammars or regularly controlled bidirectional grammars are context-free grammars of which the rules can be used in a productive and in a reductive fashion. In addition, the application of these rules is controlled by a regular language. Several modes of derivation can be distinguished for this

  5. Analysis of Logic Programs Using Regular Tree Languages

    DEFF Research Database (Denmark)

    Gallagher, John Patrick

    2012-01-01

    The field of finite tree automata provides fundamental notations and tools for reasoning about sets of terms called regular or recognizable tree languages. We consider two kinds of analysis using regular tree languages, applied to logic programs. The first approach is to try to discover automatically...

  6. Strictly-regular number system and data structures

    DEFF Research Database (Denmark)

    Elmasry, Amr Ahmed Abd Elmoneim; Jensen, Claus; Katajainen, Jyrki

    2010-01-01

    We introduce a new number system that we call the strictly-regular system, which efficiently supports the operations: digit-increment, digit-decrement, cut, concatenate, and add. Compared to other number systems, the strictly-regular system has distinguishable properties. It is superior to the re...

  7. Flicker regularity is crucial for entrainment of alpha oscillations

    Directory of Open Access Journals (Sweden)

    Annika Notbohm

    2016-10-01

    Full Text Available Previous studies have shown that alpha oscillations (8-13 Hz) in the human electroencephalogram (EEG) modulate perception via phase-dependent inhibition. If entrained to an external driving force, inhibition maxima and minima of the oscillation appear more distinct in time and make potential phase-dependent perception predictable. There is an ongoing debate about whether visual stimulation is suitable to entrain alpha oscillations. On the one hand, it has been argued that a series of light flashes results in transient event-related responses (ERPs) superimposed on the ongoing EEG. On the other hand, it has been demonstrated that alpha oscillations become entrained to a series of light flashes if they are presented at a certain temporal regularity. This raises the question under which circumstances a sequence of light flashes causes entrainment, i.e. whether an arrhythmic stream of light flashes would also result in entrainment. Here, we measured detection rates in response to visual targets at two opposing stimulation phases during rhythmic and arrhythmic light stimulation. We introduce a new measure called 'behavioral modulation depth' to determine differences in perception. This measure is capable of correcting for inevitable artifacts that occur in visual detection tasks during visual stimulation. The physical concept of entrainment predicts that increased stimulation intensity should produce stronger entrainment. Thus, two experiments with medium (Experiment 1) and high (Experiment 2) stimulation intensity were performed. Data from the first experiment show that the behavioral modulation depth (alpha phase-dependent differences in detection threshold) increases with increasing entrainment of alpha oscillations. Furthermore, individual alpha phase delays of entrained alpha oscillations determine the behavioral modulation depth: the largest behavioral modulation depth is found if targets are presented during the minimum of the entrained oscillation

  8. Laplacian manifold regularization method for fluorescence molecular tomography

    Science.gov (United States)

    He, Xuelei; Wang, Xiaodong; Yi, Huangjian; Chen, Yanrong; Zhang, Xu; Yu, Jingjing; He, Xiaowei

    2017-04-01

    Sparse regularization methods have been widely used in fluorescence molecular tomography (FMT) for stable three-dimensional reconstruction. Generally, ℓ1-regularization-based methods allow for utilizing the sparsity nature of the target distribution. However, in addition to sparsity, spatial structure information should be exploited as well. A joint ℓ1 and Laplacian manifold regularization model is proposed to improve the reconstruction performance, and two algorithms (with and without the Barzilai-Borwein strategy) are presented to solve the regularization model. Numerical studies and an in vivo experiment demonstrate that the proposed gradient-projection-resolved Laplacian manifold regularization method for the joint model performs better than a comparative ℓ1-minimization algorithm in both spatial aggregation and location accuracy.
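
    As a schematic of how such a joint model can be minimized, the sketch below applies a proximal gradient step with an optional Barzilai-Borwein step size to 0.5‖Ax − b‖² + λ₁‖x‖₁ + 0.5 λ₂ xᵀLx; the operator A, data b, graph Laplacian L, and the nonnegativity projection are placeholders standing in for the paper's actual FMT system, not its implementation:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def joint_l1_laplacian(A, b, L, lam1=1e-2, lam2=1e-2, iters=200, use_bb=True):
    """Minimize 0.5||Ax-b||^2 + lam1*||x||_1 + 0.5*lam2*x'Lx by
    proximal gradient, optionally with Barzilai-Borwein step sizes."""
    x = np.zeros(A.shape[1])
    # Safe initial step from the Lipschitz constant of the smooth part.
    step = 1.0 / (np.linalg.norm(A, 2) ** 2 + lam2 * np.linalg.norm(L, 2))
    x_prev = g_prev = None
    for _ in range(iters):
        g = A.T @ (A @ x - b) + lam2 * (L @ x)  # gradient of smooth part
        if use_bb and g_prev is not None:
            s, y = x - x_prev, g - g_prev
            if s @ y > 0:
                step = (s @ s) / (s @ y)        # BB1 step size
        x_prev, g_prev = x, g
        x = soft_threshold(x - step * g, step * lam1)
        x = np.maximum(x, 0.0)  # assumed: fluorophore density is nonnegative
    return x
```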

  9. Regularized Regression and Density Estimation based on Optimal Transport

    KAUST Repository

    Burger, M.

    2012-03-11

    The aim of this paper is to investigate a novel nonparametric approach for estimating and smoothing density functions as well as probability densities from discrete samples based on a variational regularization method with the Wasserstein metric as a data fidelity. The approach allows a unified treatment of discrete and continuous probability measures and is hence attractive for various tasks. In particular, the variational model for special regularization functionals yields a natural method for estimating densities and for preserving edges in the case of total variation regularization. In order to compute solutions of the variational problems, a regularized optimal transport problem needs to be solved, for which we discuss several formulations and provide a detailed analysis. Moreover, we compute special self-similar solutions for standard regularization functionals and we discuss several computational approaches and results. © 2012 The Author(s).
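
    In schematic form (our notation, with placeholders rather than the paper's exact functionals), the variational model reads

    \[
    \hat{\mu} \in \arg\min_{\mu}\ W_p(\mu, \mu_\delta)^p + \alpha\,R(\mu),
    \]

    where \(\mu_\delta\) is the empirical measure of the discrete samples, the Wasserstein distance \(W_p\) acts as the data fidelity, \(R\) is a regularization functional (for example total variation of the density, which preserves edges), and \(\alpha > 0\) balances fidelity against smoothing.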

  10. Secure and Verifiable Electronic Voting in Practice: the use of vVote in the Victorian State Election

    OpenAIRE

    Burton, Craig; Culnane, Chris; Schneider, Steve

    2015-01-01

    The November 2014 Australian State of Victoria election was the first statutory political election worldwide at State level which deployed an end-to-end verifiable electronic voting system in polling places. This was the first time blind voters have been able to cast a fully secret ballot in a verifiable way, and the first time a verifiable voting system has been used to collect remote votes in a political election. The code is open source, and the output from the election is verifiable. The ...

  11. How to Verify Plagiarism of the Paper Written in Macedonian and Translated in Foreign Language?

    Science.gov (United States)

    Spiroski, Mirko

    2016-03-15

    The aim of this study was to show how to verify plagiarism of a paper written in Macedonian and translated into a foreign language. The original article, "Ethics in Medical Research Involving Human Subjects", written in Macedonian, was submitted as an essay for the subject Ethics and published by Ilina Stefanovska, PhD candidate from the Iustinianus Primus Faculty of Law, Ss Cyril and Methodius University of Skopje (UKIM), Skopje, Republic of Macedonia, in February 2013. The article suspected of plagiarism was published by Prof. Dr. Gordana Panova from the Faculty of Medical Sciences, University Goce Delchev, Shtip, Republic of Macedonia, in English, with an identical title and identical content, in the international scientific on-line journal "SCIENCE & TECHNOLOGIES", published by the "Union of Scientists - Stara Zagora". The original document (written in Macedonian) was translated with Google Translate; the suspected article (published as an English pdf file) was converted into a Word document, and the two documents were compared with several programs for plagiarism detection. The documents were found to be 71%, 78%, and 82% identical, depending on the computer program used for plagiarism detection. It was obvious that the original paper was entirely plagiarised by Prof. Dr. Gordana Panova, including six references from the original paper. Plagiarism of original papers written in Macedonian and translated into other languages can thus be verified after computerised translation, by comparing the original and translated documents with available software for plagiarism detection.
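
    The workflow described here (machine-translate the original, extract text from the suspect PDF, then compare) can be approximated with standard tooling. A minimal sketch using Python's difflib; the file names are hypothetical, and dedicated plagiarism detectors use more robust matching than a raw character-run ratio:

```python
from difflib import SequenceMatcher

# Hypothetical inputs: the machine-translated original and the text
# extracted from the suspect article's PDF.
with open("original_translated.txt", encoding="utf-8") as f:
    original = f.read()
with open("suspect_article.txt", encoding="utf-8") as f:
    suspect = f.read()

# Ratio of matching character runs in [0, 1]; values of 0.71-0.82
# would correspond to the 71-82% overlap reported in this record.
similarity = SequenceMatcher(None, original, suspect).ratio()
print(f"Similarity: {similarity:.0%}")
```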

  12. Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol

    Science.gov (United States)

    Huang, Xiaowan; Singh, Anu; Smolka, Scott A.

    2010-01-01

    We use the UPPAAL model checker for Timed Automata to verify the Timing-Sync time-synchronization protocol for sensor networks (TPSN). The TPSN protocol seeks to provide network-wide synchronization of the distributed clocks in a sensor network. Clock-synchronization algorithms for sensor networks such as TPSN must be able to perform arithmetic on clock values to calculate clock drift and network propagation delays. They must be able to read the value of a local clock and assign it to another local clock. Such operations are not directly supported by the theory of Timed Automata. To overcome this formal-modeling obstacle, we augment the UPPAAL specification language with the integer clock derived type. Integer clocks, which are essentially integer variables that are periodically incremented by a global pulse generator, greatly facilitate the encoding of the operations required to synchronize clocks as in the TPSN protocol. With this integer-clock-based model of TPSN in hand, we use UPPAAL to verify that the protocol achieves network-wide time synchronization and is devoid of deadlock. We also use the UPPAAL Tracer tool to illustrate how integer clocks can be used to capture clock drift and resynchronization during protocol execution
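
    To illustrate the integer-clock idea outside of UPPAAL (a plain Python simulation sketch, not the paper's Timed Automata model): an integer clock is just an integer variable that a global pulse generator increments, so it supports the reads, comparisons, and assignments that TPSN's drift calculation and resynchronization require but native Timed Automata clocks do not.

```python
class IntegerClock:
    """An integer variable that a global pulse generator increments."""
    def __init__(self, value=0):
        self.value = value

    def tick(self):
        self.value += 1

# Two nodes whose local clocks start out of sync (hypothetical values).
node_a, node_b = IntegerClock(0), IntegerClock(5)
clocks = [node_a, node_b]

# The global pulse generator drives every integer clock in the model.
for pulse in range(10):
    for clock in clocks:
        clock.tick()

# Unlike native Timed Automata clocks, integer clocks support arithmetic
# and assignment, which TPSN needs for drift estimation and resync.
drift = node_b.value - node_a.value  # -> 5
node_a.value = node_b.value          # synchronize node_a to node_b
print(drift, node_a.value == node_b.value)  # -> 5 True
```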

  13. Development of an integrated reporting system for verifying hemolysis, icterus, and lipemia in clinical chemistry results.

    Science.gov (United States)

    Shin, Dong Hoon; Kim, Juwon; Uh, Young; Lee, Se Il; Seo, Dong Min; Kim, Kab Seung; Jang, Jae Yun; Lee, Man Hee; Yoon, Kwang Ro; Yoon, Kap Jun

    2014-07-01

    Hemolysis, icterus, and lipemia (HIL) cause preanalytical interference and vary unpredictably with different analytical equipment and measurement methods. We developed an integrated reporting system for verifying HIL status in order to identify the extent of interference by HIL on clinical chemistry results. HIL interference data for 30 chemical analytes were provided by the manufacturers and were used to generate a table of clinically relevant interference values indicating the extent of bias at specific index values (alert index values). The HIL results generated by the Vista 1500 system (Siemens Healthcare Diagnostics, USA), Advia 2400 system (Siemens Healthcare Diagnostics), and Modular DPE system (Roche Diagnostics, Switzerland) were analyzed and displayed on physicians' personal computers. Eleven and 29 of the 30 chemical analytes were affected by interference due to hemolysis when measured using the Vista and Modular systems, respectively. The hemolysis alert indices for the Vista and Modular systems were 0.1-25.8% and 0.1-64.7%, respectively. The alert indices for icterus and lipemia were <1.4% and 0.7% in the Vista system and 0.7% and 1.0% in the Modular system, respectively. The HIL alert index values for chemical analytes varied depending on the chemistry analyzer. This integrated HIL reporting system provides an effective screening tool for verifying specimen quality with regard to HIL and simplifies the laboratory workflow.

  14. Procedures for measuring and verifying gastric tube placement in newborns: an integrative review

    Directory of Open Access Journals (Sweden)

    Flávia de Souza Barbosa Dias

    Full Text Available ABSTRACT Objective: to investigate evidence in the literature on procedures for measuring gastric tube insertion in newborns and verifying its placement, using procedures alternative to radiological examination. Method: an integrative review of the literature carried out in the Cochrane, LILACS, CINAHL, EMBASE, MEDLINE and Scopus databases using the descriptors "Intubation, gastrointestinal" and "newborns" in original articles. Results: seventeen publications were included and categorized as "measuring method" or "technique for verifying placement". Regarding measuring methods, measurements of two morphological distances and the application of two formulas, one based on weight and another based on height, were found. Regarding the techniques for verifying placement, the following were found: electromagnetic tracing, diaphragm electrical activity, CO2 detection, indigo carmine solution, epigastrium auscultation, gastric secretion aspiration, color inspection, and evaluation of pH, enzymes and bilirubin. Conclusion: the measuring method based on the distance from the nose to the earlobe to a point midway between the xiphoid process and the umbilicus presents the best evidence. Equations based on weight and height need to be experimentally tested. The return of secretion on tube aspiration, its color assessment and secretion pH are reliable indicators for identifying gastric tube placement, and are the currently indicated techniques.

  15. Alternate approaches to verifying the structural adequacy of the Defense High Level Waste Shipping Cask

    Energy Technology Data Exchange (ETDEWEB)

    Zimmer, A.; Koploy, M.

    1991-12-01

    In the early 1980s, the US Department of Energy/Defense Programs (DOE/DP) initiated a project to develop a safe and efficient transportation system for defense high level waste (DHLW). A long-standing objective of the DHLW transportation project is to develop a truck cask that represents the leading edge of cask technology as well as one that fully complies with all applicable DOE, Nuclear Regulatory Commission (NRC), and Department of Transportation (DOT) regulations. General Atomics (GA) designed the DHLW Truck Shipping Cask using state-of-the-art analytical techniques verified by model testing performed by Sandia National Laboratories (SNL). The analytical techniques comprise two approaches, inelastic analysis and elastic analysis. This topical report presents the results of the two analytical approaches and the model testing results. The purpose of this work is to show that there are two viable analytical alternatives for verifying the structural adequacy of a Type B package and obtaining an NRC license. In addition, these data will help support the future acceptance by the NRC of inelastic analysis as a tool in packaging design and licensing.

  16. MUSE: An Efficient and Accurate Verifiable Privacy-Preserving Multikeyword Text Search over Encrypted Cloud Data

    Directory of Open Access Journals (Sweden)

    Zhu Xiangyang

    2017-01-01

    Full Text Available With the development of cloud computing, outsourcing services to clouds has become a popular business model. However, because data storage and computing are completely outsourced to the cloud service provider, sensitive data of data owners is exposed, which can lead to serious privacy disclosure. In addition, unexpected events such as software bugs and hardware failures can cause incomplete or incorrect results to be returned from clouds. In this paper, we propose an efficient and accurate verifiable privacy-preserving multikeyword text search over encrypted cloud data based on hierarchical agglomerative clustering, named MUSE. In order to improve the efficiency of text searching, we propose a novel index structure, the HAC-tree, which is based on a hierarchical agglomerative clustering method and tends to gather high-relevance documents in clusters. Based on the HAC-tree, a noncandidate-pruning depth-first search algorithm is proposed, which filters out unqualified subtrees and thus accelerates the search process. The secure inner product algorithm is used to encrypt the HAC-tree index and the query vector. Meanwhile, a completeness verification algorithm is given to verify the search results. Experimental results demonstrate that the proposed method outperforms the existing works DMRS and MRSE-HCI in efficiency and accuracy, respectively.
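
    A toy, plaintext version of such a pruning depth-first search over a cluster tree might look as follows; the node structure and scoring are illustrative, not the MUSE definitions, and the real scheme evaluates relevance on encrypted vectors via secure inner products:

```python
import heapq

class Node:
    """Illustrative cluster-tree node: 'bound' is an upper bound on the
    relevance score of any document in this subtree (exact at leaves)."""
    def __init__(self, bound, doc=None, children=()):
        self.bound = bound
        self.doc = doc            # document id at a leaf, else None
        self.children = children

def top_k_search(root, k):
    """Depth-first search that prunes subtrees whose relevance upper
    bound cannot beat the current k-th best candidate."""
    best = []                     # min-heap of (score, doc)

    def dfs(node):
        if len(best) == k and node.bound <= best[0][0]:
            return                # prune: subtree cannot improve top-k
        if node.doc is not None:  # leaf: a candidate document
            heapq.heappush(best, (node.bound, node.doc))
            if len(best) > k:
                heapq.heappop(best)
            return
        # Visit high-bound children first to tighten the pruning threshold.
        for child in sorted(node.children, key=lambda n: -n.bound):
            dfs(child)

    dfs(root)
    return sorted(best, reverse=True)

leaf = lambda score, doc: Node(score, doc=doc)
root = Node(0.9, children=(Node(0.9, children=(leaf(0.9, "d1"), leaf(0.4, "d2"))),
                           Node(0.3, children=(leaf(0.3, "d3"),))))
print(top_k_search(root, 2))      # the 0.3 subtree is pruned entirely
```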

  17. Sample Acquisition and Analytical Chemistry Challenges to Verifying Compliance to Aviators Breathing Oxygen (ABO) Purity Specification

    Science.gov (United States)

    Graf, John

    2015-01-01

    NASA has been developing and testing two different types of oxygen separation systems. One type of oxygen separation system uses pressure swing technology; the other type uses a solid electrolyte electrochemical oxygen separation cell. Both development systems have been subjected to long term testing, and to performance testing under a variety of environmental and operational conditions. Testing these two systems revealed that measuring the product purity of oxygen, and determining whether an oxygen separation device meets Aviator's Breathing Oxygen (ABO) specifications, is a subtle and sometimes difficult analytical chemistry job. Verifying the product purity of cryogenically produced oxygen presents a different set of analytical chemistry challenges. This presentation will describe some of the sample acquisition and analytical chemistry challenges involved in verifying oxygen produced by an oxygen separator, and in verifying oxygen produced by cryogenic separation processes. The primary contaminant that causes gas samples to fail ABO requirements is water: the maximum amount of water vapor allowed is 7 ppmv. The principal challenge of verifying oxygen produced by an oxygen separator is that it is produced relatively slowly, and at comparatively low temperatures. A short term failure that occurs for just a few minutes in the course of a 1 week run could cause an entire tank to be rejected. Continuous monitoring of oxygen purity and water vapor could identify problems as soon as they occur. Long term oxygen separator tests were instrumented with an oxygen analyzer and with a hygrometer, a GE Moisture Monitor Series 35. This hygrometer uses an aluminum oxide sensor. The user's manual does not report this, but long term exposure to pure oxygen causes the aluminum oxide sensor head to bias dry. Oxygen product that exceeded the 7 ppm specification was improperly accepted because the sensor had biased dry. The bias is permanent - exposure to air does not cause the sensor to

  18. Low-dose 4D cone-beam CT via joint spatiotemporal regularization of tensor framelet and nonlocal total variation

    Science.gov (United States)

    Han, Hao; Gao, Hao; Xing, Lei

    2017-08-01

    Excessive radiation exposure is still a major concern in 4D cone-beam computed tomography (4D-CBCT) due to its prolonged scanning duration. Radiation dose can be effectively reduced by either under-sampling the x-ray projections or reducing the x-ray flux. However, 4D-CBCT reconstruction under such low-dose protocols is prone to image artifacts and noise. In this work, we propose a novel joint regularization-based iterative reconstruction method for low-dose 4D-CBCT. To tackle the under-sampling problem, we employ spatiotemporal tensor framelet (STF) regularization to take advantage of the spatiotemporal coherence of the patient anatomy in 4D images. To simultaneously suppress the image noise caused by photon starvation, we also incorporate spatiotemporal nonlocal total variation (SNTV) regularization to make use of the nonlocal self-recursiveness of anatomical structures in the spatial and temporal domains. Under the joint STF-SNTV regularization, the proposed iterative reconstruction approach is evaluated first using two digital phantoms and then using physical experiment data in the low-dose context of both under-sampled and noisy projections. Compared with existing approaches via either STF or SNTV regularization alone, the presented hybrid approach achieves improved image quality, and is particularly effective for the reconstruction of low-dose 4D-CBCT data that are not only sparse but noisy.
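
    In schematic form, the joint reconstruction solves a problem of the shape

    \[
    \min_{x}\ \tfrac{1}{2}\,\|Ax - b\|_2^2 \;+\; \lambda_1\,\|\Psi x\|_1 \;+\; \lambda_2\,\mathrm{SNTV}(x),
    \]

    where \(A\) is the 4D-CBCT system matrix, \(b\) the measured projections, \(\Psi\) the spatiotemporal tensor-framelet transform exploiting coherence across respiratory phases, and \(\mathrm{SNTV}\) the spatiotemporal nonlocal total-variation penalty; the notation and the exact placement of the weights are ours, not necessarily the paper's.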

  19. Charge generation layers for solution processed tandem organic light emitting diodes with regular device architecture.

    Science.gov (United States)

    Höfle, Stefan; Bernhard, Christoph; Bruns, Michael; Kübel, Christian; Scherer, Torsten; Lemmer, Uli; Colsmann, Alexander

    2015-04-22

    Tandem organic light emitting diodes (OLEDs) utilizing fluorescent polymers in both sub-OLEDs and a regular device architecture were fabricated from solution, and their structure and performance characterized. The charge carrier generation layer comprised a zinc oxide layer, modified by a polyethylenimine interface dipole, for electron injection and either MoO3, WO3, or VOx for hole injection into the adjacent sub-OLEDs. ToF-SIMS investigations and STEM-EDX mapping verified the distinct functional layers throughout the layer stack. At a given device current density, the current efficiencies of both sub-OLEDs add up to a maximum of 25 cd/A, indicating a properly working tandem OLED.

  20. Regularized finite element modeling of progressive failure in soils within nonlocal softening plasticity

    Science.gov (United States)

    Huang, Maosong; Qu, Xie; Lü, Xilin

    2017-11-01

    By solving a nonlinear complementarity problem for the consistency condition, an improved implicit stress-return iterative algorithm for a generalized over-nonlocal strain-softening plasticity was proposed, and the consistent tangent matrix was obtained. The proposed algorithm was embedded into existing finite element codes, and it enables the nonlocal regularization of the ill-posed boundary value problems caused by pressure-independent and pressure-dependent strain-softening plasticity. The algorithm was verified by numerical modeling of strain localization in a plane strain compression test. The results showed that fast convergence can be achieved and that the mesh dependency caused by strain softening can be effectively eliminated. The influences of the hardening modulus and the material characteristic length on the simulation were obtained. The proposed algorithm was further used in simulations of the bearing capacity of a strip footing; the results are mesh-independent, and the progressive failure process of the soil was well captured.

  1. ADHD and temporality

    DEFF Research Database (Denmark)

    Nielsen, Mikka

    According to the official diagnostic manual, ADHD is defined by symptoms of inattention, hyperactivity, and impulsivity, and patterns of behaviour are characterized as failure to pay attention to details, excessive talking, fidgeting, or inability to remain seated in appropriate situations (DSM-5). In this paper, however, I will ask if we can understand what we call ADHD in a different way than through the symptom descriptions, and will advocate for a complementary, phenomenological understanding of ADHD as a certain being in the world – more specifically, as a matter of a phenomenological difference in temporal experience and/or rhythm. Inspired by both psychiatry's experiments with people diagnosed with ADHD and their assessment of time, and by phenomenological perspectives on mental disorders and temporal disorientation, I explore the experience of ADHD as a disruption in the phenomenological experience...

  2. Temporal lobe epilepsy semiology.

    Science.gov (United States)

    Blair, Robert D G

    2012-01-01

    Epilepsy represents a multifaceted group of disorders divided into two broad categories, partial and generalized, based on the seizure onset zone. The identification of the neuroanatomic site of seizure onset depends on delineation of seizure semiology by a careful history together with video-EEG, and a variety of neuroimaging technologies such as MRI, fMRI, FDG-PET, MEG, or invasive intracranial EEG recording. Temporal lobe epilepsy (TLE) is the commonest form of focal epilepsy and represents almost 2/3 of cases of intractable epilepsy managed surgically. A history of febrile seizures (especially complex febrile seizures) is common in TLE and is frequently associated with mesial temporal sclerosis (the commonest form of TLE). Seizure auras occur in many TLE patients and often exhibit features that are relatively specific for TLE but few are of lateralizing value. Automatisms, however, often have lateralizing significance. Careful study of seizure semiology remains invaluable in addressing the search for the seizure onset zone.

  3. Optimized Temporal Monitors for SystemC

    Science.gov (United States)

    Tabakov, Deian; Rozier, Kristin Y.; Vardi, Moshe Y.

    2012-01-01

    SystemC is a modeling language built as an extension of C++. Its growing popularity and the increasing complexity of designs have motivated research efforts aimed at the verification of SystemC models using assertion-based verification (ABV), where the designer asserts properties that capture the design intent in a formal language such as PSL or SVA. The model then can be verified against the properties using runtime or formal verification techniques. In this paper we focus on automated generation of runtime monitors from temporal properties. Our focus is on minimizing runtime overhead, rather than monitor size or monitor-generation time. We identify four issues in monitor generation: state minimization, alphabet representation, alphabet minimization, and monitor encoding. We conduct extensive experimentation and identify a combination of settings that offers the best performance in terms of runtime overhead.
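
    For flavor, a runtime monitor for a temporal property is essentially an automaton stepped once per observed event. The following toy Python monitor for the safety property "every req is immediately followed by ack" is our illustration of the idea, not one of the paper's PSL/SVA examples, and is far simpler than a generated SystemC monitor:

```python
def make_monitor():
    """Monitor for G(req -> X ack): after any 'req' event, the very
    next event must be 'ack'; otherwise the property is violated."""
    state = {"awaiting_ack": False}

    def step(event: str) -> bool:
        if state["awaiting_ack"]:
            state["awaiting_ack"] = False
            if event != "ack":
                return False          # violation detected at runtime
        if event == "req":
            state["awaiting_ack"] = True
        return True

    return step

monitor = make_monitor()
trace = ["idle", "req", "ack", "req", "idle"]
print([monitor(e) for e in trace])    # -> [True, True, True, True, False]
```

    Minimizing the number of states and the per-event work of exactly this kind of automaton is what the paper's four monitor-generation issues are about.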

  4. Temporal clauses with conditional value in legal language (contrastive study

    Directory of Open Access Journals (Sweden)

    Iva Svobodová

    2017-12-01

    Full Text Available This study summarizes a qualitative and quantitative linguistic investigation of the Portuguese and Brazilian Penal Codes, with important results pertaining to temporal clauses introduced by the connector quando. We focus on their semantic conditional features, suggesting the name pseudo-temporal clauses for these hypotactic constructions, and observe the relation between the occurrence of different verbal moods and the semantic interpretation. We verify that, in both texts, the temporal clauses introduced by quando have an evident conditional value. On the other hand, the two texts differ in their selection of verbal mood: the Portuguese Penal Code clearly prefers the subjunctive, whereas the Brazilian one prefers the indicative. We explain this divergence by semantic and pragmatic factors.

  5. Food Web Structure in Temporally-Forced Ecosystems.

    Science.gov (United States)

    McMeans, Bailey C; McCann, Kevin S; Humphries, Murray; Rooney, Neil; Fisk, Aaron T

    2015-11-01

    Temporal variation characterizes many of Earth's ecosystems. Despite this, little is known about how food webs respond to regular variation in time, such as occurs broadly with season. We argue that season, and likely any periodicity, structures food webs along a temporal axis in an analogous way to that previously recognized in space; predators shift their diet as different resource compartments and trophic levels become available through time. These characteristics are likely (i) central to ecosystem function and stability based on theory, and (ii) widespread across ecosystem types based on empirical observations. The temporal food web perspective outlined here could provide new insight into the ecosystem-level consequences of altered abiotic and biotic processes that might accompany globally changing environments. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Learning regularization parameters for general-form Tikhonov

    Science.gov (United States)

    Chung, Julianne; Español, Malena I.

    2017-07-01

    Computing regularization parameters for general-form Tikhonov regularization can be an expensive and difficult task, especially if multiple parameters or many solutions need to be computed in real time. In this work, we assume training data is available and describe an efficient learning approach for computing regularization parameters that can be used for a large set of problems. We consider an empirical Bayes risk minimization framework for finding regularization parameters that minimize average errors for the training data. We first extend methods from Chung et al (2011 SIAM J. Sci. Comput. 33 3132-52) to the general-form Tikhonov problem. Then we develop a learning approach for multi-parameter Tikhonov problems, for the case where all involved matrices are simultaneously diagonalizable. For problems where this is not the case, we describe an approach to compute near-optimal regularization parameters by using operator approximations for the original problem. Finally, we propose a new class of regularizing filters, where solutions correspond to multi-parameter Tikhonov solutions, that requires less data than previously proposed optimal error filters, avoids the generalized SVD, and allows flexibility and novelty in the choice of regularization matrices. Numerical results for 1D and 2D examples using different norms on the errors show the effectiveness of our methods.
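
    For reference, general-form Tikhonov regularization computes

    \[
    x_{\lambda} = \arg\min_{x}\ \|Ax - b\|_2^2 + \lambda^2 \|Lx\|_2^2,
    \]

    and the learning approach described in this record chooses \(\lambda\) (or multiple parameters \(\lambda_i\)) to minimize the average error \(\sum_i \|x_{\lambda}(b_i) - x_i^{\mathrm{true}}\|_2^2\) over training pairs \((b_i, x_i^{\mathrm{true}})\); this is our shorthand for the empirical Bayes risk minimization framework, not the paper's exact notation.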

  7. Two hybrid regularization frameworks for solving the electrocardiography inverse problem

    Energy Technology Data Exchange (ETDEWEB)

    Jiang Mingfeng; Xia Ling; Shou Guofa; Liu Feng [Department of Biomedical Engineering, Zhejiang University, Hangzhou 310027 (China); Crozier, Stuart [School of Information Technology and Electrical Engineering, University of Queensland, St. Lucia, Brisbane, Queensland 4072 (Australia)], E-mail: xialing@zju.edu.cn

    2008-09-21

    In this paper, two hybrid regularization frameworks, LSQR-Tik and Tik-LSQR, which integrate the properties of the direct regularization method (Tikhonov) and the iterative regularization method (LSQR), have been proposed and investigated for solving ECG inverse problems. The LSQR-Tik method is based on the Lanczos process, which yields a sequence of small bidiagonal systems to approximate the original ill-posed problem, and then the Tikhonov regularization method is applied to stabilize the projected problem. The Tik-LSQR method is formulated as an iterative LSQR inverse, augmented with a Tikhonov-like prior information term. The performances of these two hybrid methods are evaluated using a realistic heart-torso model simulation protocol, in which the heart surface source method is employed to calculate the simulated epicardial potentials (EPs) from the action potentials (APs), and the acquired EPs are then used to calculate simulated body surface potentials (BSPs). The results show that the regularized solutions obtained by the LSQR-Tik method approximate those of the Tikhonov method, while the computational cost of the LSQR-Tik method is much lower. Moreover, the Tik-LSQR scheme can reconstruct the epicardial potential distribution more accurately, specifically for BSPs in highly noisy cases. This investigation suggests that hybrid regularization methods may be more effective than separate regularization approaches for ECG inverse problems.

  8. Two hybrid regularization frameworks for solving the electrocardiography inverse problem

    Science.gov (United States)

    Jiang, Mingfeng; Xia, Ling; Shou, Guofa; Liu, Feng; Crozier, Stuart

    2008-09-01

    In this paper, two hybrid regularization frameworks, LSQR-Tik and Tik-LSQR, which integrate the properties of the direct regularization method (Tikhonov) and the iterative regularization method (LSQR), have been proposed and investigated for solving ECG inverse problems. The LSQR-Tik method is based on the Lanczos process, which yields a sequence of small bidiagonal systems to approximate the original ill-posed problem, and then the Tikhonov regularization method is applied to stabilize the projected problem. The Tik-LSQR method is formulated as an iterative LSQR inverse, augmented with a Tikhonov-like prior information term. The performances of these two hybrid methods are evaluated using a realistic heart-torso model simulation protocol, in which the heart surface source method is employed to calculate the simulated epicardial potentials (EPs) from the action potentials (APs), and the acquired EPs are then used to calculate simulated body surface potentials (BSPs). The results show that the regularized solutions obtained by the LSQR-Tik method approximate those of the Tikhonov method, while the computational cost of the LSQR-Tik method is much lower. Moreover, the Tik-LSQR scheme can reconstruct the epicardial potential distribution more accurately, specifically for BSPs in highly noisy cases. This investigation suggests that hybrid regularization methods may be more effective than separate regularization approaches for ECG inverse problems.
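
    The simplest concrete relative of these hybrids is LSQR with a Tikhonov damping term, which SciPy exposes directly. A small sketch (standard form only, with L = I, so it illustrates the flavor of LSQR-Tik rather than reproducing the paper's projected scheme; the operator A is a random stand-in, not an ECG transfer matrix):

```python
import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 100))   # stand-in for the ill-posed operator
x_true = rng.standard_normal(100)
b = A @ x_true + 0.05 * rng.standard_normal(200)

# damp=lam solves min ||Ax - b||^2 + lam^2 ||x||^2, i.e. standard-form
# Tikhonov, with the Lanczos/LSQR iteration handling the projection.
x_reg = lsqr(A, b, damp=0.5)[0]
print(np.linalg.norm(x_reg - x_true) / np.linalg.norm(x_true))
```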

  9. Three regularities of recognition memory: the role of bias.

    Science.gov (United States)

    Hilford, Andrew; Maloney, Laurence T; Glanzer, Murray; Kim, Kisok

    2015-12-01

    A basic assumption of Signal Detection Theory is that decisions are made on the basis of likelihood ratios. In a preceding paper, Glanzer, Hilford, and Maloney (Psychonomic Bulletin & Review, 16, 431-455, 2009) showed that the likelihood ratio assumption implies that three regularities will occur in recognition memory: (1) the Mirror Effect, (2) the Variance Effect, (3) the normalized Receiver Operating Characteristic (z-ROC) Length Effect. The paper offered formal proofs and computational demonstrations that decisions based on likelihood ratios produce the three regularities. A survey of data based on group ROCs from 36 studies validated the likelihood ratio assumption by showing that its three implied regularities are ubiquitous. The study noted, however, that bias, another basic factor in Signal Detection Theory, can obscure the Mirror Effect. In this paper we examine how bias affects the regularities at the theoretical level. The theoretical analysis shows: (1) how bias obscures the Mirror Effect, not the other two regularities, and (2) four ways to counter that obscuring. We then report the results of five experiments that support the theoretical analysis. The analyses and the experimental results also demonstrate: (1) that the three regularities govern individual, as well as group, performance, (2) alternative explanations of the regularities are ruled out, and (3) that Signal Detection Theory, correctly applied, gives a simple and unified explanation of recognition memory data.

  10. The cerebellum predicts the temporal consequences of observed motor acts.

    Directory of Open Access Journals (Sweden)

    Laura Avanzino

    Full Text Available It is increasingly clear that we extract patterns of temporal regularity between events to optimize information processing. The ability to extract temporal patterns and regularity of events is referred to as temporal expectation. Temporal expectation activates the same cerebral network usually engaged in action selection, including the cerebellum. However, it is unclear whether the cerebellum is directly involved in temporal expectation when timing information is processed to make predictions on the outcome of a motor act. Healthy volunteers received one session of either active (inhibitory, 1 Hz) or sham repetitive transcranial magnetic stimulation covering the right lateral cerebellum prior to the execution of a temporal expectation task. Subjects were asked to predict the end of a visually perceived human body motion (right-hand handwriting) and of an inanimate object motion (a moving circle reaching a target). Videos representing the movements were shown in full; the actual tasks consisted of watching the same videos, but interrupted after a variable interval from onset by a dark interval of variable duration. During the 'dark' interval, subjects were asked to indicate when the movement represented in the video reached its end by pressing the spacebar of the keyboard. Performance on the timing task was analyzed by measuring the absolute value of the timing error, the coefficient of variability, and the percentage of anticipation responses. The active group exhibited greater absolute timing error than the sham group only in the human body motion task. Our findings suggest that the cerebellum is engaged in cognitive and perceptual domains that are strictly connected to motor control.

  11. Metabolic effects of regular physical exercise in healthy population.

    Science.gov (United States)

    Caro, Juan; Navarro, Inmaculada; Romero, Pedro; Lorente, Rosario I; Priego, María Antonia; Martínez-Hervás, Sergio; Real, Jose T; Ascaso, Juan F

    2013-04-01

    To assess the effect of moderate regular aerobic physical activity, not associated with body weight changes, on insulin resistance and the associated metabolic changes in the general population. A cross-sectional, observational study in an adult population (n=101 subjects aged 30-70 years) with no personal history of disease and with stable weight in the three months prior to the study. The regular exercise group performed 30-60 minutes of moderate regular physical exercise 5 days per week (7.5-15 MET-hours per week), while a control group performed no regular physical exercise and had a sedentary lifestyle. Subjects were age- and sex-matched. Lipids, lipoproteins, and the HOMA index were measured using standard methods. The regular physical activity group consisted of 48 subjects (21 male/27 female), while the group with no regular physical activity included 53 subjects (31 male/22 female). No significant differences were found between the groups in age, sex, BMI, waist circumference, and blood pressure. Significant differences were found between the groups in fasting serum triglyceride, HDL-C, and apoB levels. Fasting plasma insulin levels (12.1 ± 4.13 vs 14.9 ± 4.8 mU/L, P=.004) and the HOMA index (2.8 ± 1.1 vs 3.5 ± 4.1, P=.001) were significantly lower in the regular physical activity group than in the sedentary group. Prevalence rates of metabolic syndrome were 20.7% and 45.8% (P=.01) in the regular physical activity and sedentary groups, respectively. Moderate regular physical activity is associated with higher insulin sensitivity, an improved lipid profile, and a decrease in components of metabolic syndrome, with no change in weight or BMI. Copyright © 2012 SEEN. Published by Elsevier Espana. All rights reserved.
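
    The HOMA index reported here is presumably the standard HOMA-IR estimate (the abstract does not spell out the formula):

    \[
    \mathrm{HOMA\text{-}IR} = \frac{\text{fasting insulin}\ (\mu\mathrm{U/mL}) \times \text{fasting glucose}\ (\mathrm{mmol/L})}{22.5}.
    \]

    As a worked check, the active group's mean insulin of 12.1 mU/L combined with a fasting glucose of about 5.2 mmol/L (a value not reported in the abstract, chosen for illustration) gives 12.1 × 5.2 / 22.5 ≈ 2.8, consistent with the reported HOMA of 2.8.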

  12. Temporal clustering of paleoearthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Bell, J.W.

    1994-12-31

    The character of the earthquake cycle is important in understanding and modeling the behavior of seismogenic sources. Of particular importance is the question of whether the cyclic tectonic loading and release process is periodic or aperiodic, because recurrence data are incorporated into most seismic hazard estimates. Most probabilistic seismic hazard models utilize a periodic recurrence model, in which it is assumed that the average long-term recurrence interval adequately represents real fault behavior. This is essentially the classical Reid concept, in which uniform stress accumulation is used to estimate the timing of regular, periodic earthquakes having uniform stress drops. Within the last decade, however, a growing body of seismic data, largely derived from plate boundary faults, suggests that actual fault behavior is far more complex than that suggested by the original Reid concept, and that the seismic cycle is not so simple. For example, Shimazaki and Nakata proposed two additional earthquake recurrence models based on studies of less-regular Japanese earthquakes: the time-predictable and the slip-predictable models. Thatcher further illustrated that some major plate-boundary faults do not display periodic behavior as commonly assumed, and that such activity is actually the exception rather than the rule.

  13. Model Checking Artificial Intelligence Based Planners: Even the Best Laid Plans Must Be Verified

    Science.gov (United States)

    Smith, Margaret H.; Holzmann, Gerard J.; Cucullu, Gordon C., III; Smith, Benjamin D.

    2005-01-01

    Automated planning systems (APS) are gaining acceptance for use on NASA missions, as evidenced by APS flown on missions such as Orbiter and Deep Space 1, both of which were commanded by onboard planning systems. The planning system takes high level goals and expands them onboard into a detailed plan of action that the spacecraft executes. The system must be verified to ensure that the automatically generated plans achieve the goals as expected and do not generate actions that would harm the spacecraft or mission. These systems are typically tested using empirical methods. Formal methods, such as model checking, offer exhaustive or measurable test coverage, which leads to much greater confidence in correctness. This paper describes a formal method based on the SPIN model checker. This method guarantees that possible plans meet certain desirable properties. We express the input model in Promela, the language of SPIN, and express the properties of desirable plans formally.

  14. [The capacities of current test systems to verify early HIV infection].

    Science.gov (United States)

    Baranova, E N; Sharipova, I N; Denisova, N M; Susekina, M E; Puzyrev, V F; Sarkisian, K A; Vorob'eva, M S; Burkov, A N; Ulanova, T I

    2009-01-01

    The purpose of the present investigation was to comparatively evaluate the performance characteristics of the test systems designed to verify positive results of screening surveys for HIV infection: the solid-phase immunoassay DS-EIA-HIV-AB/AG-SPECTR (Diagnosticheskiye Sistemy (Diagnostic Systems) Research-and-Production Association, Nizhni Novgorod) and tests based on immune blotting (IB). The investigation examined 15 seroconversion panels produced by ZeptoMetrix (USA) and BBI (USA). The DS-EIA-HIV-AB/AG-SPECTR test system determined 88 of the 167 seroconversion panel samples as HIV positive. The IB-based tests revealed only 45 of the 167 samples as positive. Consequently, the application of the DS-EIA-HIV-AB/AG-SPECTR test system is more effective than the IB-based tests in early HIV infection.

  15. A robot for verifying the precision of total reaction time measurement

    Directory of Open Access Journals (Sweden)

    Tânia Brusque Crocetta

    2015-03-01

    Full Text Available The level of variability in psychomotor behavior and the use of several distinct sets of equipment in Reaction Time (RT) assessments might jeopardize the validity and reliability of such measures. This study presents the development and verification of the Emboici Robot, a robot capable of performing accurate RT assessments consisting of responding to a visual stimulus by pressing a button, whose purpose is to measure the accuracy of RT assessments. We evaluated accuracy and precision on four different days, each providing 300 measurements. These assessments yielded a mean RT of 46.95 ms (±6.04). No significant effects were found in the RTs obtained; as a result, there is evidence that the Emboici Robot is stable, reliable, and precise. The robot can be a viable solution for verifying the precision and accuracy of any software performing simple RT assessments with a visual stimulus that requires pressing a button or key as the response.

  16. Verifying the agreed framework between the United States and North Korea

    Energy Technology Data Exchange (ETDEWEB)

    May, M.M. [Center for International Security and Cooperation, Stanford University, Stanford, California (United States)

    2001-07-01

    Under the 1994 Agreed Framework (AF) between the United States and the Democratic People's Republic of Korea (DPRK), the US and its allies will provide two nuclear-power reactors and other benefits to the DPRK in exchange for an agreement by the DPRK to declare how much nuclear-weapon material it has produced; to identify, freeze, and eventually dismantle specified facilities for producing this material; and to remain a party to the nuclear Non-Proliferation Treaty (NPT) and allow the implementation of its safeguards agreement. This study assesses the verifiability of these provisions. The study concludes that verification can be accomplished, given cooperation and openness from the DPRK. Special effort will be needed from the IAEA, as well as support from the US and the Republic of Korea. (author)

  17. How to verify lightning protection efficiency for electrical systems? Testing procedures and practical applications

    Energy Technology Data Exchange (ETDEWEB)

    Birkl, Josef; Zahlmann, Peter [DEHN and SOEHNE, Neumarkt (Germany)], Emails: Josef.Birkl@technik.dehn.de, Peter.Zahlmann@technik.dehn.de

    2007-07-01

    There are increasing numbers of applications in which Surge Protective Devices (SPDs) through which partial lightning currents flow are installed close to the highly sensitive electronic devices they protect, because the design of electrical distribution systems and switchgear installations is becoming more and more compact. In these cases, the protective function of the SPDs has to be coordinated with the individual immunity of the equipment against energetic, conducted impulse voltages and impulse currents. In order to verify the immunity of the complete system against partial lightning currents, laboratory tests on a system level are a suitable approach. The proposed test schemes for complete systems have been successfully performed on various applications. Examples will be presented. (author)

  18. Lightweight ECC based RFID authentication integrated with an ID verifier transfer protocol.

    Science.gov (United States)

    He, Debiao; Kumar, Neeraj; Chilamkurti, Naveen; Lee, Jong-Hyouk

    2014-10-01

    The radio frequency identification (RFID) technology has been widely adopted and is being deployed as a dominant identification technology in the health care domain, for applications such as medical information authentication, patient tracking, and blood transfusion medicine. With increasingly stringent security and privacy requirements for RFID-based authentication schemes, elliptic curve cryptography (ECC) based RFID authentication schemes have been proposed to meet the requirements. However, many recently published ECC-based RFID authentication schemes have serious security weaknesses. In this paper, we propose a new ECC-based RFID authentication scheme integrated with an ID verifier transfer protocol that overcomes the weaknesses of the existing schemes. A comprehensive security analysis shows the strong security properties provided by the proposed authentication scheme. Moreover, the performance of the proposed authentication scheme is analyzed in terms of computational cost, communication cost, and storage requirements.

  19. IDMS: A System to Verify Component Interface Completeness and Compatibility for Product Integration

    Science.gov (United States)

    Areeprayolkij, Wantana; Limpiyakorn, Yachai; Gansawat, Duangrat

    The growing adoption of Component-Based software Development has had a great impact on today's system architectural design. However, the design of subsystems that lack interoperability and reusability can cause problems during product integration. At worst, this may result in project failure. In the literature, it is suggested that the verification of interface descriptions and the management of interface changes are factors essential to the success of the product integration process. This paper thus presents an automated approach to facilitate reviewing component interfaces for completeness and compatibility. The Interface Descriptions Management System (IDMS) has been implemented to ease and speed up the interface review activities, using UML component diagrams as input. Interface compatibility is verified by traversing a component dependency graph called the Component Compatibility Graph (CCG), in which each node represents a component and each edge represents communication between associated components. Three case studies were conducted to subjectively evaluate the correctness and usefulness of IDMS.
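
    A toy version of this kind of compatibility check over a component dependency graph (the component and interface representations are illustrative; IDMS itself works from UML component diagrams):

```python
def check_compatibility(components, edges):
    """components: name -> {'provides': set of signatures, 'requires': set}.
    edges: (consumer, provider) pairs of the compatibility graph.
    Reports every required interface a provider fails to offer."""
    problems = []
    for consumer, provider in edges:
        missing = components[consumer]["requires"] - components[provider]["provides"]
        for signature in missing:
            problems.append((consumer, provider, signature))
    return problems

components = {
    "Billing":  {"provides": {"invoice(id:int)"}, "requires": {"getUser(id:int)"}},
    "Accounts": {"provides": {"getUser(id:str)"}, "requires": set()},
}
print(check_compatibility(components, [("Billing", "Accounts")]))
# -> [('Billing', 'Accounts', 'getUser(id:int)')]  (signature mismatch caught)
```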

  20. Effect of glucocorticosteroid injections in tennis elbow verified on colour Doppler ultrasonography: evidence of inflammation

    DEFF Research Database (Denmark)

    Torp-Pedersen, T.E.; Torp-Pedersen, S.T.; Qvistgaard, E.

    2008-01-01

    …US-guided corticosteroid injection in patients with LE. DESIGN: Case-only, blinded intervention study. SETTING: Secondary care at a government hospital. PATIENTS: 62 patients with LE verified by colour Doppler US. INTERVENTION: One US-guided corticosteroid injection was given into the CEO. MAIN OUTCOME MEASURES: Patients were evaluated at baseline before the injection and at 2 weeks of follow-up. Outcome measures were changes in pain score and US parameters (resistive index (RI) and the amount of colour within the CEO). Prognosticators for outcome were: use of a computer mouse, symptom duration, elbow strain, RI, and colour… injection has a marked short-term effect on pain and Doppler parameters. The reduction in hyperaemia mediated by an anti-inflammatory drug can be interpreted as evidence of an inflammatory component in LE. Publication date: December 2008

  1. 49 CFR 40.162 - What must MROs do with multiple verified results for the same testing event?

    Science.gov (United States)

    2010-10-01

    49 CFR § 40.162, Transportation, Office of the Secretary of ... and the Verification Process: What must MROs do with multiple verified results for the same testing event? ...

  2. The reading ability of good and poor temporal processors among a group of college students.

    Science.gov (United States)

    Au, Agnes; Lovegrove, Bill

    2008-05-01

    In this study, we examined whether good auditory and good visual temporal processors were better than their poor counterparts on certain reading measures. Various visual and auditory temporal tasks were administered to 105 undergraduates. They read some phonologically regular pseudowords and irregular words that were presented sequentially in the same ("word" condition) and in different ("line" condition) locations. Results indicated that auditory temporal acuity was more relevant to reading, whereas visual temporal acuity was more relevant to spelling. Good auditory temporal processors did not have the advantage in processing pseudowords, even though pseudoword reading correlated significantly with auditory temporal processing. These results suggested that some higher cognitive or phonological processes mediated the relationship between auditory temporal processing and pseudoword reading. Good visual temporal processors did not have the advantage in processing irregular words. They also did not process the line condition more accurately than the word condition. The discrepancy might be attributed to the use of normal adults and the unnatural reading situation that did not fully capture the function of the visual temporal processes. The distributions of auditory and visual temporal processing abilities were co-occurring to some degree, but they maintained considerable independence. There was also a lack of a relationship between the type and severity of reading deficits and the type and number of temporal deficits.

  3. Evaluation of wastewater contaminant transport in surface waters using verified Lagrangian sampling

    Science.gov (United States)

    Antweiler, Ronald C.; Writer, Jeffrey H.; Murphy, Sheila F.

    2014-01-01

    Contaminants released from wastewater treatment plants can persist in surface waters for substantial distances. Much research has gone into evaluating the fate and transport of these contaminants, but this work has often assumed constant flow from wastewater treatment plants. However, effluent discharge commonly varies widely over a 24-hour period, and this variation controls contaminant loading and can profoundly influence interpretations of environmental data. We show that methodologies relying on the normalization of downstream data to conservative elements can give spurious results, and should not be used unless it can be verified that the same parcel of water was sampled. Lagrangian sampling, which in theory samples the same water parcel as it moves downstream (the Lagrangian parcel), links hydrologic and chemical transformation processes so that the in-stream fate of wastewater contaminants can be quantitatively evaluated. However, precise Lagrangian sampling is difficult, and small deviations – such as missing the Lagrangian parcel by less than 1 h – can cause large differences in measured concentrations of all dissolved compounds at downstream sites, leading to erroneous conclusions regarding in-stream processes controlling the fate and transport of wastewater contaminants. Therefore, we have developed a method termed “verified Lagrangian” sampling, which can be used to determine if the Lagrangian parcel was actually sampled, and if it was not, a means for correcting the data to reflect the concentrations which would have been obtained had the Lagrangian parcel been sampled. To apply the method, it is necessary to have concentration data for a number of conservative constituents from the upstream, effluent, and downstream sites, along with upstream and effluent concentrations that are constant over the short-term (typically 2–4 h). These corrections can subsequently be applied to all data, including non-conservative constituents. Finally, we

  4. Evaluating MC&A effectiveness to verify the presence of nuclear materials

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, P. G. (Pamela G.); Morzinski, J. A. (Jerome A.); Ostenak, Carl A.; Longmire, V. L. (Victoria L.); Jewell, D. (Don); Williams, J. D. (Joel D.)

    2001-06-01

    Traditional materials accounting is focused exclusively on the material balance area (MBA), and involves periodically closing a material balance based on accountability measurements conducted during a physical inventory. In contrast, the physical inventory for Los Alamos National Laboratory's near-real-time accounting system is established around processes and looks more like an item inventory. That is, the intent is not to measure material for accounting purposes, since materials have already been measured in the normal course of daily operations. A given unit process operates many times over the course of a material balance period. The product of a given unit process may move for processing within another unit process in the same MBA or may be transferred out of the MBA. Since few materials are unmeasured, the physical inventory for a near-real-time process area looks more like an item inventory. Thus, the intent of the physical inventory is to locate the materials on the books and verify information about the materials contained in the books. Closing a materials balance for such an area is a matter of summing all the individual mass balances for the batches processed by all unit processes in the MBA. Additionally, performance parameters are established to measure the program's effectiveness: program effectiveness for verifying the presence of nuclear material is required to be equal to or greater than a prescribed performance level, process measurements must be within established precision and accuracy values, physical inventory results must meet or exceed performance requirements, and inventory differences must be less than a target/goal quantity. This approach exceeds DOE-established accounting and physical inventory program requirements. Hence, LANL is committed to this approach and to seeking opportunities for further improvement through integrated technologies. This paper will provide a detailed description of this evaluation process.

  5. Measuring reporting verifying. A primer on MRV for nationally appropriate mitigation actions

    Energy Technology Data Exchange (ETDEWEB)

    Hinostroza, M. (ed.); Luetken, S.; Holm Olsen, K. (Technical Univ. of Denmark. UNEP Risoe Centre, Roskilde (Denmark)); Aalders, E.; Pretlove, B.; Peters, N. (Det Norske Veritas, Hellerup (Denmark))

    2012-03-15

    The requirements for measurement, reporting and verification (MRV) of nationally appropriate mitigation actions (NAMAs) are one of the crucial topics on the agenda of international negotiations to address climate change mitigation. According to the agreements reached so far, the general guidelines for domestic MRV are to be developed by the Subsidiary Body for Scientific and Technological Advice (SBSTA). Further, the Subsidiary Body for Implementation (SBI) will conduct international consultations and analysis (ICA) of biennial update reports (BUR) to improve the transparency of mitigation actions, which should be measured, reported and verified. What is clear from ongoing discussions at both SBSTA and SBI is that MRV for NAMAs should not be a burden for controlling greenhouse gas (GHG) emissions connected to economic activities. Instead, the MRV process should facilitate mitigation actions, encourage the redirection of investments, and address concerns regarding the carbon content of emission-intensive operations of private and public companies and enterprises worldwide. While MRV requirements are being shaped within the Convention, a number of initiatives support developing countries in moving forward with NAMA development and demonstration activities. How these actions shall be measured, reported and verified, however, remains unanswered. MRV is not new: it is present in most existing policies and frameworks related to climate change mitigation. With an aim to contribute to the international debate and capacity building on this crucial issue, the UNEP Risoe Centre, in cooperation with UNDP, is pleased to present this publication, which, through direct collaboration with Det Norske Veritas (DNV), builds on existing MRV practices in current carbon markets, provides insights on how MRV for NAMAs can be performed, and identifies elements and drivers to be considered when designing adequate MRV systems for NAMAs in developing countries. This primer is the second

  6. Predicted and verified evolution of power-law exponent in product market

    CERN Document Server

    Hisano, Ryohei; Mizuno, Takayuki

    2011-01-01

    Power-law distributions constitute a generic empirical statistical regularity found in many complex systems. A recently developed theory finds that the interplay between one of the most universal ingredients, stochastic proportional growth, and stochastic birth and death processes leads to generic power-law distributions with a non-universal exponent that depends explicitly on the characteristics of growth, birth and death. In particular, the theory rationalizes Zipf's law and explains deviations from it, for instance for the distributions of firm and city sizes. Here, we report the first complete test of the theory, based on empirical analysis of a real-world complex phenomenon, namely the dynamics of market shares in the consumer electronics market. We estimate directly from the data the average growth rate of market shares, their standard deviation, the birth rates, and the "death" hazard rate of products. When plugged into the theory, these quantities predict the power-law exponent of the...
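
    Testing such a prediction requires a direct estimate of the empirical tail exponent to compare against. A standard choice is the continuous maximum-likelihood (Hill-type) estimator; the sketch below, run on synthetic data with a known exponent, is a generic illustration rather than the authors' pipeline:

      import math
      import random

      def power_law_mle(data, x_min):
          # Continuous MLE of alpha for p(x) ~ x**(-alpha), x >= x_min
          # (Clauset, Shalizi & Newman 2009): alpha = 1 + n / sum(ln(x/x_min)).
          tail = [x for x in data if x >= x_min]
          return 1.0 + len(tail) / sum(math.log(x / x_min) for x in tail)

      # Synthetic "market shares" with a known exponent, via inverse transform.
      random.seed(0)
      alpha_true, x_min = 2.0, 1.0
      sample = [x_min * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
                for _ in range(10000)]

      print(f"estimated alpha = {power_law_mle(sample, x_min):.3f} "
            f"(true {alpha_true})")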

  7. Regularization theory for ill-posed problems selected topics

    CERN Document Server

    Lu, Shuai

    2013-01-01

    This monograph is a valuable contribution to the highly topical and extremely productive field of regularisation methods for inverse and ill-posed problems. The author is an internationally outstanding and accepted mathematician in this field. In his book he offers a well-balanced mixture of basic and innovative aspects. He demonstrates new, differentiated viewpoints and important examples for applications. The book demonstrates the current developments in the field of regularization theory, such as multiparameter regularization and regularization in learning theory. The book is written for graduate and PhDs

  8. A regularized GMRES method for inverse blackbody radiation problem

    Directory of Open Access Journals (Sweden)

    Wu Jieer

    2013-01-01

    The inverse blackbody radiation problem consists in determining the temperature distribution of a blackbody from its measured total radiated power spectrum. This amounts to solving a Fredholm integral equation of the first kind, for which many numerical methods have been proposed. In this paper, a regularized GMRES method is presented to solve the linear ill-posed problem arising from the discretization of this integral equation. The method projects the original problem onto a lower-dimensional subspace via the Arnoldi process, and Tikhonov regularization combined with the GCV criterion is applied to stabilize the numerical iteration. Three numerical examples demonstrate the effectiveness of the regularized GMRES method.
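
    The projection-plus-regularization structure is compact enough to sketch. Assuming a square discretized operator, the fragment below runs k Arnoldi steps and then solves the projected Tikhonov least-squares problem with a fixed parameter (the paper selects it by GCV; the matrix, signal, and noise level here are our own stand-ins):

      import numpy as np

      def arnoldi(A, b, k):
          # k steps of the Arnoldi process: builds orthonormal V (n x (k+1))
          # and upper Hessenberg H ((k+1) x k) with A @ V[:, :k] = V @ H.
          n = len(b)
          V = np.zeros((n, k + 1))
          H = np.zeros((k + 1, k))
          beta = np.linalg.norm(b)
          V[:, 0] = b / beta
          for j in range(k):
              w = A @ V[:, j]
              for i in range(j + 1):
                  H[i, j] = V[:, i] @ w
                  w -= H[i, j] * V[:, i]
              H[j + 1, j] = np.linalg.norm(w)
              V[:, j + 1] = w / H[j + 1, j]
          return V, H, beta

      def regularized_gmres(A, b, k=20, lam=1e-3):
          # Solve min_y ||H y - beta e1||^2 + lam^2 ||y||^2 on the Krylov
          # subspace, then map back: x = V[:, :k] @ y.
          V, H, beta = arnoldi(A, b, k)
          M = np.vstack([H, lam * np.eye(k)])       # stacked Tikhonov system
          rhs = np.concatenate([[beta], np.zeros(2 * k)])
          y, *_ = np.linalg.lstsq(M, rhs, rcond=None)
          return V[:, :k] @ y

      # Mildly ill-conditioned stand-in for the discretized integral operator.
      n = 200
      A = np.fromfunction(lambda i, j: 1.0 / (1.0 + abs(i - j)), (n, n))
      x_true = np.exp(-np.linspace(-3, 3, n) ** 2)
      b = A @ x_true + 1e-4 * np.random.default_rng(0).standard_normal(n)
      x = regularized_gmres(A, b)
      print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))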

  9. Closedness type regularity conditions in convex optimization and beyond

    Directory of Open Access Journals (Sweden)

    Sorin-Mihai Grad

    2016-09-01

    The closedness type regularity conditions have proven during the last decade to be viable alternatives to their more restrictive interiority type counterparts, in both convex optimization and the different areas where they have been successfully applied. In this review article we de- and reconstruct some closedness type regularity conditions, formulated by means of epigraphs and subdifferentials respectively, for general optimization problems, in order to stress that they arise naturally when dealing with such problems. The results are then specialized to constrained and unconstrained convex optimization problems. We also point towards other classes of optimization problems where closedness type regularity conditions have been successfully employed, and discuss further possible applications.
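
    As one concrete instance of the kind of condition meant here, a classical closedness type condition from the Fenchel duality literature (a representative textbook example, not necessarily the exact formulation of this article) reads, for proper, convex, lower semicontinuous functions f, g on a Banach space X:

      \[
        \operatorname{epi} f^{*} + \operatorname{epi} g^{*}
        \ \text{ is closed in } (X^{*}, w^{*}) \times \mathbb{R}
        \;\Longrightarrow\;
        (f+g)^{*} = f^{*} \,\square\, g^{*}
        \quad\text{and}\quad
        \partial (f+g)(x) = \partial f(x) + \partial g(x)\ \ \forall x \in X,
      \]

    i.e., weak*-closedness of the sum of the conjugates' epigraphs already yields exact Fenchel duality and the subdifferential sum formula, with no interior-point assumptions.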

  10. Bias-Variance Tradeoff of Graph Laplacian Regularizer

    Science.gov (United States)

    Chen, Pin-Yu; Liu, Sijia

    2017-08-01

    This paper presents a bias-variance tradeoff analysis of the graph Laplacian regularizer, which is widely used in graph signal processing and semi-supervised learning tasks. The scaling law of the optimal regularization parameter is specified in terms of spectral graph properties and a novel signal-to-noise ratio parameter, and it suggests that selecting a mediocre regularization parameter is often suboptimal. The analysis is applied to three types of graph signals: random, band-limited, and multiple-sampled. Experiments on synthetic and real-world graphs demonstrate the near-optimal performance of the established analysis.
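
    The estimator under analysis has a simple closed form, x = (I + lam*L)^(-1) y, which the toy sketch below makes concrete; the graph, signal, and noise level are our own, chosen only to exhibit the tradeoff (small lam keeps the noise, large lam oversmooths):

      import numpy as np

      def laplacian_denoise(y, L, lam):
          # argmin_x ||y - x||^2 + lam * x^T L x  =>  x = (I + lam L)^{-1} y
          return np.linalg.solve(np.eye(len(y)) + lam * L, y)

      # Path graph on 50 nodes; a smooth signal corrupted by Gaussian noise.
      n = 50
      A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
      L = np.diag(A.sum(axis=1)) - A
      x_true = np.sin(np.linspace(0, np.pi, n))
      y = x_true + 0.3 * np.random.default_rng(1).standard_normal(n)

      for lam in (0.01, 1.0, 100.0):   # variance-dominated ... bias-dominated
          err = np.linalg.norm(laplacian_denoise(y, L, lam) - x_true)
          print(f"lam = {lam:g}: error {err:.3f}")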

  11. Factors distinguishing regular readers of breast cancer information in magazines.

    Science.gov (United States)

    Johnson, J D

    1997-01-01

    This study examined the differences between women who were regular readers and women who were occasional readers of breast cancer information in magazines. Based on uses and gratifications theory and the Health Belief Model, it was predicted that women respondents (n = 366) would differentially expose themselves to such information. A discriminant analysis showed that regular readers reported greater fear, perceived vulnerability, general health concern, personal experience, and surveillance need for breast cancer-related information. The results are discussed in terms of the potential positive and negative consequences of regular exposure to breast cancer information in magazines.

  12. Directional Total Generalized Variation Regularization for Impulse Noise Removal

    DEFF Research Database (Denmark)

    Kongskov, Rasmus Dalgas; Dong, Yiqiu

    2017-01-01

    A recently suggested regularization method, which combines directional information with total generalized variation (TGV), has been shown to be successful for restoring images corrupted by Gaussian noise. We extend the use of this regularizer to impulse noise removal and demonstrate that using this regularizer for directional images is highly advantageous. In order to estimate directions in impulse-noise-corrupted images, which is much more challenging than in Gaussian-noise-corrupted images, we introduce a new Fourier transform-based method. Numerical experiments show that this method is more...
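
    The idea behind a Fourier transform-based direction estimate can be illustrated in a few lines: for a strongly directional image, the power spectrum concentrates along the direction of intensity variation, so the strongest off-center spectral peak reveals the orientation. This bare-bones version (noise-free synthetic stripes, our own construction) is a simplification, not the authors' estimator:

      import numpy as np

      def dominant_direction(img):
          # Orientation of intensity variation, read off the location of the
          # strongest off-center peak of the 2-D power spectrum.
          P = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
          cy, cx = P.shape[0] // 2, P.shape[1] // 2
          P[cy, cx] = 0.0                              # suppress the DC component
          i, j = np.unravel_index(np.argmax(P), P.shape)
          return np.arctan2(i - cy, j - cx) % np.pi    # defined modulo pi

      # Synthetic stripes whose intensity varies along a 30-degree direction.
      n = 128
      theta = np.deg2rad(30.0)
      yy, xx = np.mgrid[0:n, 0:n]
      img = np.cos(2 * np.pi * 8 / n * (xx * np.cos(theta) + yy * np.sin(theta)))

      print(f"estimated angle: {np.rad2deg(dominant_direction(img)):.1f} deg")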

  13. Caracterizaciones combinatorias y algebraicas de grafos distancia-regulares

    OpenAIRE

    Fiol Mora, Miquel Àngel

    2013-01-01

    Distance-regular graphs often appear in the study of mathematical structures with a high degree of symmetry and/or regularity. A well-known example of such graphs is given by the skeletons of the Platonic solids. Since they were proposed by Norman Biggs, distance-regular graphs have been characterized by numerous results, of both a combinatorial and an algebraic character. As an example of the former, we know that a graph is distance-regular ...
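
    For reference, the standard combinatorial definition at play (a textbook formulation, added here for context): a connected graph Γ of diameter d is distance-regular when, for every pair of vertices u, v at distance i, the intersection numbers

      \[
        b_i = \lvert \Gamma_{i+1}(u) \cap \Gamma_1(v) \rvert
        \qquad\text{and}\qquad
        c_i = \lvert \Gamma_{i-1}(u) \cap \Gamma_1(v) \rvert,
        \qquad 0 \le i \le d,
      \]

    depend only on i = dist(u, v) and not on the particular pair, where Γ_j(u) denotes the set of vertices at distance j from u.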

  14. Fuzzy Weak Regular, Strong and Preassociative Filters in Residuated Lattices

    Directory of Open Access Journals (Sweden)

    Wei Yang

    2014-06-01

    In this paper, the notions of fuzzy weak regular, strong, and preassociative filters are introduced, and some of their properties are investigated. In particular, in the context of Glivenko algebras, fuzzy weak regular filters and regular ones are equivalent, and characterizations of Glivenko algebras are obtained from characterizations of fuzzy strong filters. Finally, the notion of fuzzy preassociative filters is defined; these are proved to coincide with fuzzy Boolean filters, yielding some new alternative definitions of Boolean algebras through this type of filters.

  15. Homogeneity and EPR metrics for assessment of regular grids used in CW EPR powder simulations

    Science.gov (United States)

    Crăciun, Cora

    2014-08-01

    CW EPR powder spectra may be approximated numerically using a spherical grid and a Voronoi tessellation-based cubature. For a given spin system, the quality of simulated EPR spectra depends on the grid type, size, and orientation in the molecular frame. In previous work, the grids used in CW EPR powder simulations have been compared mainly from a geometric perspective. However, some grids with a similar degree of homogeneity generate simulated spectra of different quality. This paper evaluates the grids from an EPR perspective, by defining two metrics that depend on the spin system characteristics and the grid's Voronoi tessellation. The first metric determines whether the grid points are EPR-centred in their Voronoi cells, based on the resonance magnetic field variations inside these cells. The second metric verifies whether the adjacent Voronoi cells of the tessellation are EPR-overlapping, by computing the common range of their resonance magnetic field intervals. Besides a series of well-known regular grids, the paper investigates a modified ZCW grid and a Fibonacci spherical code, both new in the context of EPR simulations. For the investigated grids, the EPR metrics bring more information than the homogeneity quantities and are better related to the grids' EPR behaviour for different spin system symmetries. The metrics' efficiency and limits are finally verified for grids generated from the initial ones, using the original or magnetic field-constrained variants of the Spherical Centroidal Voronoi Tessellation method.
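
    Of the grids mentioned, the Fibonacci spherical construction is the easiest to reproduce. A common golden-angle variant (a generic construction, not necessarily the exact code used in the paper) is:

      import numpy as np

      def fibonacci_sphere(n):
          # n near-uniform points on the unit sphere via the golden-angle spiral.
          i = np.arange(n)
          phi = i * np.pi * (3.0 - np.sqrt(5.0))   # golden angle increment
          z = 1.0 - (2.0 * i + 1.0) / n            # evenly spaced heights in (-1, 1)
          r = np.sqrt(1.0 - z * z)
          return np.column_stack([r * np.cos(phi), r * np.sin(phi), z])

      pts = fibonacci_sphere(500)
      # All points lie on the unit sphere:
      print(pts.shape, np.allclose(np.linalg.norm(pts, axis=1), 1.0))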

  16. Manifold optimization-based analysis dictionary learning with an ℓ1∕2-norm regularizer.

    Science.gov (United States)

    Li, Zhenni; Ding, Shuxue; Li, Yujie; Yang, Zuyuan; Xie, Shengli; Chen, Wuhui

    2018-02-01

    Recently there has been increasing attention towards analysis dictionary learning, where it is an open problem to obtain strongly sparsity-promoting solutions efficiently while avoiding trivial solutions of the dictionary. In this paper, to obtain strongly sparsity-promoting solutions, we employ the ℓ1/2 norm as a regularizer. Recent work on ℓ1/2-norm regularization theory in compressive sensing shows that its solutions can be sparser than those obtained with the ℓ1 norm. We transform a complex nonconvex optimization into a number of one-dimensional minimization problems, whose closed-form solutions can be obtained efficiently. To avoid trivial solutions, we apply manifold optimization to update the dictionary directly on the manifold satisfying the orthonormality constraint, so that the dictionary avoids trivial solutions while capturing its intrinsic properties. Experiments with synthetic and real-world data verify that the proposed algorithm can not only obtain strongly sparsity-promoting solutions efficiently, but also learn a more accurate dictionary, in terms of dictionary recovery and image processing, than state-of-the-art algorithms.
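
    The one-dimensional subproblems mentioned here have the generic form min_z 0.5*(z - t)^2 + lam*|z|^(1/2). The paper obtains their minimizers in closed form; the brute-force scalar version below (our simplification) already shows the characteristic hard-thresholding behaviour of the ℓ1/2 penalty compared with the ℓ1 (soft-thresholding) one:

      import numpy as np

      def prox_l_half(t, lam, grid=20001):
          # Numerically minimize 0.5*(z - t)**2 + lam*|z|**0.5 on a dense grid
          # (the paper derives this minimizer in closed form).
          z = np.linspace(-abs(t) - 1.0, abs(t) + 1.0, grid)
          obj = 0.5 * (z - t) ** 2 + lam * np.sqrt(np.abs(z))
          return z[np.argmin(obj)]

      for t in (0.5, 1.0, 2.0):
          soft = np.sign(t) * max(abs(t) - 1.0, 0.0)   # l1 prox, lam = 1
          print(f"t = {t}: l1/2 -> {prox_l_half(t, 1.0):+.3f}, l1 -> {soft:+.3f}")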

  17. Neighborhood Regularized Logistic Matrix Factorization for Drug-Target Interaction Prediction.

    Directory of Open Access Journals (Sweden)

    Yong Liu

    2016-02-01

    In pharmaceutical sciences, a crucial step of the drug discovery process is the identification of drug-target interactions. However, only a small portion of drug-target interactions have been experimentally validated, as experimental validation is laborious and costly. To improve drug discovery efficiency, there is a great need for accurate computational approaches that can predict potential drug-target interactions and direct experimental verification. In this paper, we propose a novel drug-target interaction prediction algorithm, namely neighborhood regularized logistic matrix factorization (NRLMF). Specifically, NRLMF models the probability that a drug would interact with a target by logistic matrix factorization, where the properties of drugs and targets are represented by drug-specific and target-specific latent vectors, respectively. Moreover, NRLMF assigns higher importance levels to positive observations (i.e., the observed interacting drug-target pairs) than to negative observations (i.e., the unknown pairs), because the positive observations are already experimentally verified and are usually more trustworthy. Furthermore, the local structure of the drug-target interaction data is exploited via neighborhood regularization to achieve better prediction accuracy. We conducted extensive experiments over four benchmark datasets, and NRLMF demonstrated its effectiveness compared with five state-of-the-art approaches.
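
    At its core, the model scores a drug-target pair through a logistic link on latent vectors. The stripped-down gradient descent below fits plain logistic matrix factorization on a toy interaction matrix; the neighborhood regularization term and the importance weighting of positives, which the full method adds, are omitted, and all sizes and rates are our own:

      import numpy as np

      def sigmoid(x):
          return 1.0 / (1.0 + np.exp(-x))

      rng = np.random.default_rng(0)
      n_drugs, n_targets, k = 30, 40, 5
      Y = (rng.random((n_drugs, n_targets)) < 0.1).astype(float)  # toy interactions
      U = 0.1 * rng.standard_normal((n_drugs, k))     # drug latent vectors
      V = 0.1 * rng.standard_normal((n_targets, k))   # target latent vectors

      lr, reg = 0.05, 0.1
      for _ in range(200):
          P = sigmoid(U @ V.T)    # predicted interaction probabilities
          G = P - Y               # gradient of the logistic loss w.r.t. the scores
          U, V = U - lr * (G @ V + reg * U), V - lr * (G.T @ U + reg * V)

      print("mean probability on known pairs:", sigmoid(U @ V.T)[Y == 1].mean())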

  18. Comparison of VerifyNow P2Y12 and thrombelastography for assessing clopidogrel response in stroke patients in China.

    Science.gov (United States)

    Lv, Hui-Hui; Wu, Shuai; Liu, Xu; Yang, Xiao-Li; Xu, Jian-Feng; Guan, Yang-Tai; Dong, Qiang; Zheng, S Lilly; Jiang, Jian-Ming; Li, Shi-Xu; Luo, Zheng; Li, Li; An, Li-Xian; Han, Yan

    2016-02-01

    Poor response to clopidogrel is often associated with recurrent ischemic events, and reliable platelet function tests are needed to identify clopidogrel low response (CLR). The aim of the study was to compare the consistency of VerifyNow P2Y12 and thrombelastography (TEG) in acute ischemic stroke patients treated with clopidogrel. Patients hospitalized in Changhai Hospital from August 2012 to September 2013 were assigned to treatment with a daily 75-mg dose of clopidogrel. Blood samples were taken on days 5-7 to assess the capability of VerifyNow P2Y12 and TEG to evaluate clopidogrel response, and all instrument parameters were used in the correlation analysis. Patients with CLR were identified using previously published methods and criteria (PRU ≥ 230 assayed by VerifyNow P2Y12, or TEG-Inhib% ≤ 30% measured by TEG). In total, 58 patients were enrolled, and there was wide variation in the parameters of both VerifyNow P2Y12 and TEG. A total of 17 and 9 patients were identified as CLR by VerifyNow P2Y12 and TEG, respectively, but only three patients were detected as clopidogrel low responders by both tests. Kappa consistency analysis showed poor consistency between VerifyNow P2Y12 and TEG results in terms of CLR (Kappa = -0.0349, p = 0.7730). Linear regression also demonstrated poor correlation between VerifyNow-PRU/VerifyNow-Inhib% and TEG-Inhib% (p = 0.07901 and p = 0.3788, respectively). Our study demonstrated poor correlation between VerifyNow P2Y12 and TEG results, with VerifyNow P2Y12 identifying a larger proportion of CLR than TEG.
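
    The agreement statistic quoted here is Cohen's kappa on the 2x2 cross-classification of the two tests (CLR vs. non-CLR). For reference, a minimal implementation, run on hypothetical counts rather than the study's exact table:

      def cohens_kappa(a, b, c, d):
          # 2x2 agreement table: a = positive by both tests, b = positive by
          # test 1 only, c = positive by test 2 only, d = negative by both.
          n = a + b + c + d
          po = (a + d) / n                                     # observed agreement
          pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2  # chance agreement
          return (po - pe) / (1 - pe)

      # Hypothetical counts, for illustration only:
      print(f"kappa = {cohens_kappa(4, 12, 7, 35):.3f}")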

  19. Spherical Deconvolution of Multichannel Diffusion MRI Data with Non-Gaussian Noise Models and Spatial Regularization.

    Science.gov (United States)

    Canales-Rodríguez, Erick J; Daducci, Alessandro; Sotiropoulos, Stamatios N; Caruyer, Emmanuel; Aja-Fernández, Santiago; Radua, Joaquim; Yurramendi Mendizabal, Jesús M; Iturria-Medina, Yasser; Melie-García, Lester; Alemán-Gómez, Yasser; Thiran, Jean-Philippe; Sarró, Salvador; Pomarol-Clotet, Edith; Salvador, Raymond

    2015-01-01

    Spherical deconvolution (SD) methods are widely used to estimate the intra-voxel white-matter fiber orientations from diffusion MRI data. However, while some of these methods assume a zero-mean Gaussian distribution for the underlying noise, its real distribution is known to be non-Gaussian and to depend on many factors, such as the number of coils and the methodology used to combine multichannel MRI signals. Indeed, the two prevailing methods for multichannel signal combination lead to noise patterns better described by Rician and noncentral Chi distributions. Here we develop a Robust and Unbiased Model-BAsed Spherical Deconvolution (RUMBA-SD) technique, intended to deal with realistic MRI noise, based on a Richardson-Lucy (RL) algorithm adapted to Rician and noncentral Chi likelihood models. To quantify the benefits of using proper noise models, RUMBA-SD was compared with dRL-SD, a well-established method based on the RL algorithm for Gaussian noise. Another aim of the study was to quantify the impact of including a total variation (TV) spatial regularization term in the estimation framework. To do this, we developed TV spatially regularized versions of both the RUMBA-SD and dRL-SD algorithms. The evaluation was performed by comparing various quality metrics on 132 three-dimensional synthetic phantoms involving different inter-fiber angles and volume fractions, which were contaminated with noise mimicking patterns generated by data processing in multichannel scanners. The results demonstrate that the inclusion of proper likelihood models leads to an increased ability to resolve fiber crossings with smaller inter-fiber angles and to better detect non-dominant fibers. The inclusion of TV regularization dramatically improved the resolution power of both techniques. The above findings were also verified in human brain data.
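
    The multiplicative Richardson-Lucy fixed point that both compared methods build on is compact enough to sketch. Below is the classical form for a generic nonnegative linear model y ≈ Ax (RUMBA-SD replaces the underlying likelihood with Rician and noncentral Chi models; the toy matrix and signal are our own):

      import numpy as np

      def richardson_lucy(A, y, n_iter=500):
          # Classical multiplicative RL iteration for y ~ A x with x >= 0:
          #   x <- x * (A^T (y / (A x))) / (A^T 1)
          x = np.full(A.shape[1], y.mean() / A.shape[1])
          norm = A.T @ np.ones(len(y))
          for _ in range(n_iter):
              x *= (A.T @ (y / (A @ x + 1e-12))) / norm
          return x

      # Toy nonnegative deconvolution (a stand-in for the SD response matrix).
      rng = np.random.default_rng(2)
      A = np.abs(rng.standard_normal((80, 40)))
      x_true = np.zeros(40)
      x_true[[5, 20]] = 1.0                     # two "fiber" components
      x = richardson_lucy(A, A @ x_true)
      print("two largest components at indices:", np.sort(np.argsort(x)[-2:]))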

  20. Bounds on Threshold of Regular Random $k$-SAT

    CERN Document Server

    Rathi, Vishwambhar; Rasmussen, Lars; Skoglund, Mikael

    2010-01-01

    We consider the regular model of formula generation in conjunctive normal form (CNF) introduced by Boufkhad et al. We derive an upper bound on the satisfiability threshold and the NAE-satisfiability threshold for regular random $k$-SAT for any $k \geq 3$, and show that these bounds match the corresponding bounds for the uniform model of formula generation. We derive a lower bound on the threshold by applying the second moment method to the number of satisfying assignments. For large $k$, the lower bounds obtained on the threshold of a regular random formula converge to the lower bound obtained for the uniform model. Thus, we answer the question posed in \cite{AcM06} regarding the performance of the second moment method for regular random formulas.
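
    For orientation, the first-moment calculation behind upper bounds of this kind, stated for the uniform model (the regular model requires the adapted computation carried out in the paper): the expected number of satisfying assignments of a random k-CNF formula with n variables and m = αn clauses is

      \[
        \mathbb{E}[Z] = 2^{n}\left(1 - 2^{-k}\right)^{\alpha n},
      \]

    which tends to zero whenever α > ln 2 / (−ln(1 − 2^{−k})) ≤ 2^k ln 2; by Markov's inequality, formulas above that clause density are unsatisfiable with high probability, so this ratio is an upper bound on the threshold.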