WorldWideScience

Sample records for lempel-ziv incremental parsing

  1. On the Approximation Ratio of Lempel-Ziv Parsing

    DEFF Research Database (Denmark)

    Gagie, Travis; Navarro, Gonzalo; Prezza, Nicola

    2018-01-01

    Shannon’s entropy is a clear lower bound for statistical compression. The situation is not so well understood for dictionary-based compression. A plausible lower bound is b, the least number of phrases of a general bidirectional parse of a text, where phrases can be copied from anywhere else in t......-length context-free grammar based on a locally consistent parsing of the text. Our lower bound is obtained by relating b with r, the number of equal-letter runs in the Burrows-Wheeler transform of the text. On our way, we prove other relevant bounds between compressibility measures....
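For concreteness, the phrase count z of the greedy left-to-right LZ77 parse is one of the compressibility measures this line of work compares against b. The sketch below is ours, not the paper's (which concerns the optimal bidirectional measure b); it allows copies from anywhere earlier in the text, overlaps included:

```python
def lz77_phrases(s: str) -> int:
    """Number of phrases z in the greedy left-to-right LZ77 parse of s.

    Each phrase is the longest prefix of the remaining suffix that also
    occurs starting at an earlier position (overlapping sources allowed);
    a fresh character becomes a length-1 literal phrase.
    """
    i, z, n = 0, 0, len(s)
    while i < n:
        l = 0
        # grow the phrase while s[i:i+l+1] still occurs starting before i
        while i + l < n and s[i:i+l+1] in s[:i+l]:
            l += 1
        i += max(l, 1)  # literal character when no earlier occurrence exists
        z += 1
    return z
```

On "abababab" this yields z = 3 (a, b, then one long self-overlapping copy). Since the bidirectional parse may copy phrases from either direction, b ≤ z always holds, which is the gap the paper quantifies.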

  2. Multidimensional incremental parsing for universal source coding.

    Science.gov (United States)

    Bae, Soo Hyun; Juang, Biing-Hwang

    2008-10-01

A multidimensional incremental parsing algorithm (MDIP) for multidimensional discrete sources, as a generalization of the Lempel-Ziv coding algorithm, is investigated. It consists of three essential component schemes: maximum decimation matching, a hierarchical structure for multidimensional source coding, and dictionary augmentation. As a counterpart of the longest-match search in the Lempel-Ziv algorithm, two classes of maximum decimation matching are studied. The underlying behavior of the dictionary augmentation scheme for estimating the source statistics is also examined. For an m-dimensional source, m augmentative patches are appended to the dictionary at each coding epoch, which would require transmitting a substantial amount of information to the decoder. The hierarchical structure of the source coding algorithm resolves this issue by successively incorporating lower-dimensional coding procedures in the scheme. For universal lossy source coding, we propose two distortion functions: the local average distortion and the local minimax distortion with a set of threshold levels for each source symbol. For performance evaluation, we implemented three image compression algorithms based upon the MDIP: one lossless and two lossy. The lossless image compression algorithm does not perform better than Lempel-Ziv-Welch coding, but experimentally shows efficiency in capturing the source structure. The two lossy image compression algorithms are implemented using the two distortion functions, respectively. The algorithm based on the local average distortion is efficient at minimizing signal distortion, while the images produced with the local minimax distortion show good perceptual fidelity compared with other compression algorithms. Our insights inspire future research on feature extraction of multidimensional discrete sources.
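As background for the generalization described above, here is a minimal sketch (ours, not from the paper) of the one-dimensional LZ78 incremental parse, in which each coding epoch emits a (dictionary index, new symbol) pair and augments the dictionary by exactly one phrase:

```python
def lz78_parse(s: str):
    """LZ78 incremental parse: each phrase extends the longest matching
    dictionary phrase by one innovative symbol, then joins the dictionary."""
    dictionary = {"": 0}   # phrase -> index; index 0 is the empty phrase
    phrases = []           # (index of matched prefix, innovative symbol)
    w = ""
    for ch in s:
        if w + ch in dictionary:
            w += ch        # keep extending the current match
        else:
            phrases.append((dictionary[w], ch))
            dictionary[w + ch] = len(dictionary)
            w = ""
    if w:                  # input ended in the middle of a match
        phrases.append((dictionary[w], ""))
    return phrases
```

For "aababc" this gives [(0, 'a'), (1, 'b'), (2, 'c')], i.e. the phrases a, ab, abc. MDIP's dictionary augmentation generalizes this single append per epoch to m patches for an m-dimensional source.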

  3. Lempel-Ziv Compression in a Sliding Window

    DEFF Research Database (Denmark)

    Bille, Philip; Cording, Patrick Hagge; Fischer, Johannes

    2017-01-01

We present new algorithms for the sliding window Lempel-Ziv (LZ77) problem and the approximate rightmost LZ77 parsing problem. Our main result is a new and surprisingly simple algorithm that computes the sliding window LZ77 parse in O(w) space and either O(n) expected time or O(n log log w + z log log σ) deterministic time. Here, w is the window size, n is the size of the input string, z is the number of phrases in the parse, and σ is the size of the alphabet. This matches the space and time bounds of previous results while removing constant size restrictions on the alphabet size. To achieve our result, we combine a simple modification and augmentation of the suffix tree with periodicity properties of sliding windows. We also apply this new technique to obtain an algorithm for the approximate rightmost LZ77 problem that uses O(n(log z + log log n)) time and O(n) space and produces a (1 + ϵ...
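The problem being solved has a simple quadratic-time baseline. The sketch below is ours, not the paper's O(w)-space algorithm; it restricts match sources to start within the last w positions:

```python
def lz77_window_parse(s: str, w: int):
    """Naive sliding-window LZ77: each phrase is the longest match whose
    source starts within the last w positions (the copy may overlap the
    phrase itself, as in self-referential LZ77)."""
    i, phrases, n = 0, [], len(s)
    while i < n:
        best_len, best_src = 0, -1
        for src in range(max(0, i - w), i):
            l = 0
            while i + l < n and s[src + l] == s[i + l]:
                l += 1
            if l > best_len:
                best_len, best_src = l, src
        if best_len > 0:
            phrases.append((best_src, best_len))  # (source position, length)
            i += best_len
        else:
            phrases.append(s[i])                  # literal character
            i += 1
    return phrases
```

For example, lz77_window_parse("abcabcabc", 4) returns ['a', 'b', 'c', (0, 6)]: the final phrase copies six characters starting at position 0, overlapping itself. The paper's contribution is obtaining this parse in O(w) space and near-linear time rather than the O(nw) of this baseline.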

  4. On the non-randomness of maximum Lempel Ziv complexity sequences of finite size

    Science.gov (United States)

    Estevez-Rams, E.; Lora Serrano, R.; Aragón Fernández, B.; Brito Reyes, I.

    2013-06-01

Random sequences attain the highest entropy rate. The entropy rate of an ergodic source can be estimated using the Lempel-Ziv complexity measure, yet the exact entropy rate value is only reached in the infinite-length limit. We prove that typical random sequences of finite length fall short of the maximum Lempel-Ziv complexity, contrary to common belief. We discuss that, for a finite length, maximum Lempel-Ziv sequences can be built from a well-defined generating algorithm, which makes them of low Kolmogorov-Chaitin complexity, quite the opposite of randomness. We also discuss that the Lempel-Ziv measure is, in this sense, less general than Kolmogorov-Chaitin complexity, as it can be fooled by a sufficiently intelligent agent; this is shown to be the case for the binary expansion of certain irrational numbers. Maximum Lempel-Ziv sequences induce a normalization that gives good estimates of the entropy rate for several sources, while keeping bounded values for all sequence lengths, making it an alternative to other normalization schemes in use.
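The complexity measure in question counts the phrases in the Lempel-Ziv (1976) exhaustive history of a sequence. A compact quadratic-time sketch, ours rather than the paper's:

```python
def lz76_complexity(s: str) -> int:
    """Lempel-Ziv (1976) complexity: number of phrases in the exhaustive
    history of s. Each phrase is the shortest extension of the parsed
    prefix that cannot be copied from earlier text (overlaps allowed)."""
    c, i, n = 0, 0, len(s)
    while i < n:
        l = 1
        # grow while s[i:i+l] still occurs inside s[:i+l-1]
        while i + l <= n and s[i:i+l] in s[:i+l-1]:
            l += 1
        c += 1
        i += l
    return c
```

The classic example 0001101001000101 parses as 0·001·10·100·1000·101, giving complexity 6. Entropy-rate estimates follow from normalizing this count; the paper's point is that truly random finite strings typically do not attain the maximal count.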

  5. Parsed and fixed block representations of visual information for image retrieval

    Science.gov (United States)

    Bae, Soo Hyun; Juang, Biing-Hwang

    2009-02-01

The theory of linguistics teaches us that there is a hierarchical structure in linguistic expressions, from letters to word roots, and on to words and sentences. By applying syntax and semantics beyond words, one can further recognize the grammatical relationships among words and the meaning of a sequence of words. This layered view of a spoken language is useful for effective analysis and automated processing. It is therefore interesting to ask whether a similar hierarchy of representation exists for visual information. A class of techniques of a similar nature to linguistic parsing is found in the Lempel-Ziv incremental parsing scheme. Based on a new class of multidimensional incremental parsing algorithms extended from Lempel-Ziv incremental parsing, a new framework for image retrieval, which takes advantage of the source characterization property of the incremental parsing algorithm, was proposed recently. With the incremental parsing technique, a given image is decomposed into a number of patches, called a parsed representation. This representation can be thought of as a morphological interface between elementary pixels and a higher-level representation. In this work, we examine the properties of two-dimensional parsed representation in the context of imagery information retrieval, in contrast to vector quantization, i.e., fixed square-block representations with a minimum average distortion criterion. We implemented four image retrieval systems for the comparative study: three, called IPSILON image retrieval systems, use parsed representation with different perceptual distortion thresholds, and one uses conventional vector quantization for visual pattern analysis. We observe that different perceptual distortions in visual pattern matching do not have serious effects on retrieval precision, although allowing looser perceptual thresholds in image compression results in poor reconstruction fidelity. 
We compare the effectiveness of the use of the

  6. Viral genome phylogeny based on Lempel-Ziv complexity and Hausdorff distance.

    Science.gov (United States)

    Yu, Chenglong; He, Rong Lucy; Yau, Stephen S-T

    2014-05-07

In this paper, we develop a novel method to study viral genome phylogeny. We apply Lempel-Ziv complexity to define the distance between two nucleic acid sequences. Then, based on this distance, we use the Hausdorff distance (HD) and a modified Hausdorff distance (MHD) to perform phylogenetic analysis of multi-segmented viral genomes. The results show that the MHD provides more accurate phylogenetic relationships. Our method allows a global comparison of all multi-segmented genomes simultaneously; that is, we treat the multi-segmented viral genome as a whole in the comparative analysis. The method is not affected by the number or order of segments, and each segment contributes to the phylogeny of the whole genome. We have analyzed several groups of real multi-segmented genomes from different viral families. The results show that our method provides a new powerful tool for studying the classification of viral genomes and their phylogenetic relationships. Copyright © 2014 Elsevier Ltd. All rights reserved.
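LZ-complexity distances between sequences S and Q are typically built from the complexities of S, Q, and their concatenations. The sketch below uses a distance in the style of Otu and Sayood; this is an assumption on our part and not necessarily the exact definition used in the paper:

```python
def lz76_complexity(s: str) -> int:
    """Lempel-Ziv (1976) phrase count of s (overlapping copies allowed)."""
    c, i, n = 0, 0, len(s)
    while i < n:
        l = 1
        while i + l <= n and s[i:i+l] in s[:i+l-1]:
            l += 1
        c += 1
        i += l
    return c

def lz_distance(s: str, q: str) -> float:
    """LZ-complexity sequence distance: how much parsing q appended to s
    (and s appended to q) costs beyond parsing each alone, normalized."""
    cs, cq = lz76_complexity(s), lz76_complexity(q)
    return max(lz76_complexity(s + q) - cs,
               lz76_complexity(q + s) - cq) / max(cs, cq)
```

Identical sequences get a distance at or near 0, because their concatenation introduces almost no new phrases; the pairwise distance matrix over genome segments is what the Hausdorff and modified Hausdorff constructions then operate on.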

  7. Watermark Compression in Medical Image Watermarking Using Lempel-Ziv-Welch (LZW) Lossless Compression Technique.

    Science.gov (United States)

    Badshah, Gran; Liew, Siau-Chuin; Zain, Jasni Mohd; Ali, Mushtaq

    2016-04-01

In teleradiology, image contents may be altered by noisy communication channels and hacker manipulation. Medical image data are very sensitive and cannot tolerate any illegal change; analysis based on illegally altered images could result in wrong medical decisions. Digital watermarking can be used to authenticate images and to detect, as well as recover, illegal changes made to teleradiology images. Watermarking medical images with heavy-payload watermarks causes perceptual degradation of the image, which directly affects medical diagnosis. To maintain the perceptual and diagnostic quality of the image during watermarking, the watermark should be losslessly compressed. This paper focuses on watermarking ultrasound medical images with Lempel-Ziv-Welch (LZW) lossless-compressed watermarks. Lossless compression reduces the watermark payload without data loss. In this work, the watermark is the combination of a defined region of interest (ROI) and an image watermarking secret key. The performance of the LZW compression technique was compared with other conventional compression methods on the basis of compression ratio. LZW was found to perform better and was used for lossless watermark compression in ultrasound medical image watermarking. Tabulated results show the reduction in watermark bits and image watermarking with effective tamper detection and lossless recovery.
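For reference, LZW builds its dictionary on the fly, so no code table needs to be transmitted with the compressed watermark. A minimal encoder sketch (ours; byte-oriented, without the bit-packing a real codec would add):

```python
def lzw_encode(data: bytes) -> list:
    """LZW: emit the code of the longest dictionary match, then add that
    match extended by one byte as a new dictionary entry."""
    table = {bytes([i]): i for i in range(256)}   # all single bytes
    out, w = [], b""
    for byte in data:
        wc = w + bytes([byte])
        if wc in table:
            w = wc                                # keep extending the match
        else:
            out.append(table[w])
            table[wc] = len(table)                # next free code: 256, 257, ...
            w = bytes([byte])
    if w:
        out.append(table[w])
    return out
```

The classic input b"TOBEORNOTTOBEORTOBEORNOT" (24 bytes) compresses to 16 codes. Decompression rebuilds the identical table from the code stream alone, which is what makes the recovered watermark bit-exact for tamper detection and lossless recovery.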

  8. Effects of the series length on Lempel-Ziv Complexity during sleep.

    Science.gov (United States)

    Rivolta, Massimo W; Migliorini, Matteo; Aktaruzzaman, Md; Sassi, Roberto; Bianchi, Anna M

    2014-01-01

Lempel-Ziv Complexity (LZC) has been demonstrated to be a powerful complexity measure in several biomedical applications. During sleep, it is still not clear how many samples are required to ensure a robust estimate when LZC is computed on beat-to-beat interval series (RR). The aims of this study were: i) evaluation of the number of samples necessary in different sleep stages for a reliable estimation of LZC; ii) evaluation of the LZC when considering inter-subject variability; and iii) comparison between LZC and Sample Entropy (SampEn). Both synthetic and real data were employed. In particular, synthetic RR signals were generated by means of AR models fitted on real data. The minimum number of samples required by LZC for having no changes in its average value, for both NREM and REM sleep periods, was 10^4, or >1000 when a tolerance of 5% is considered satisfying. The influence of inter-subject variability on the LZC was first assessed on model-generated data, confirming the above finding (>10^4) for both NREM and REM stages. However, on real data, without differentiating between sleep stages, the minimum number of samples required was 1.8×10^4. The linear correlation between LZC and SampEn was computed on a synthetic dataset. We obtained a correlation higher than 0.75 (p < 0.01) when considering sleep stages separately, and higher than 0.90 (p < 0.01) when stages were not differentiated. Summarizing, we suggest using LZC with binary quantization and at least 1000 samples when a variation smaller than 5% is considered satisfying, or at least 10^4 samples for maximal accuracy. The use of more than 2 quantization levels is not recommended.

  9. A Method for the Diagnosis of Rolling Bearings Using Lempel-Ziv Complexity

    Directory of Open Access Journals (Sweden)

    Diego L. Guarín-Lopez

    2011-06-01

    The presence of a fault in a rolling bearing causes the mechanical system to evolve from weakly nonlinear to strongly nonlinear dynamics; hence the common linear methods in the time and frequency domains are not adequate for bearing diagnosis. This article proposes a novel nonlinear methodology for detecting bearing faults, which uses the complexity measure proposed by Lempel and Ziv to characterize vibration signals. The main advantage of this method over other nonlinear analysis techniques is that it does not require the reconstruction of an attractor, making it suitable for real-time analysis. The results obtained show that Lempel-Ziv complexity is an effective tool for bearing diagnosis.

  10. Nonlinear complexity of random visibility graph and Lempel-Ziv on multitype range-intensity interacting financial dynamics

    Science.gov (United States)

    Zhang, Yali; Wang, Jun

    2017-09-01

In an attempt to investigate the nonlinear complex evolution of financial dynamics, a new financial price model, the multitype range-intensity contact (MRIC) financial model, is developed based on the multitype range-intensity interacting contact system, in which the interaction and transmission of different types of investment attitudes in a stock market are simulated as virus spreading. Two new random visibility graph (VG) based analyses and Lempel-Ziv complexity (LZC) are applied to study the complex behaviors of the return time series and the corresponding randomly sorted series. The VG method comes from complex network theory, and LZC is a non-parametric measure of complexity reflecting the rate at which new patterns are generated in a series. In this work, real stock market indices are studied comparatively with simulation data from the proposed model. The numerical empirical study shows similar complexity behaviors between the model and the real markets, confirming that the financial model is reasonable to some extent.
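The visibility-graph step maps a time series to a network. As a reference point, here is the standard natural visibility criterion; the paper's random VG variants add randomized sampling on top of this, which we do not reproduce:

```python
def visibility_edges(series):
    """Natural visibility graph: nodes i < j are linked iff the straight
    line from (i, y_i) to (j, y_j) passes strictly above every
    intermediate sample (k, y_k)."""
    n, edges = len(series), set()
    for i in range(n):
        for j in range(i + 1, n):
            # height of the sight line at position k
            line = lambda k: series[i] + (series[j] - series[i]) * (k - i) / (j - i)
            if all(series[k] < line(k) for k in range(i + 1, j)):
                edges.add((i, j))
    return edges
```

For the toy series [1, 3, 2, 4] the edges are {(0, 1), (1, 2), (2, 3), (1, 3)}: the peak at index 1 blocks the view from index 0 to indices 2 and 3. Network measures of this graph, together with the LZC of the series, are then compared between model output and real index returns.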

  11. A short note on the paper of Liu et al. (2012). A relative Lempel-Ziv complexity: Application to comparing biological sequences. Chemical Physics Letters, volume 530, 19 March 2012, pages 107-112

    Science.gov (United States)

    Arit, Turkan; Keskin, Burak; Firuzan, Esin; Cavas, Cagin Kandemir; Liu, Liwei; Cavas, Levent

    2018-04-01

The report entitled "L. Liu, D. Li, F. Bai, A relative Lempel-Ziv complexity: Application to comparing biological sequences, Chem. Phys. Lett. 530 (2012) 107-112" reports the powerful construction of phylogenetic trees based on a Lempel-Ziv algorithm. However, the method explained in that paper does not give promising results on a data set of the invasive Caulerpa taxifolia in the Mediterranean Sea. In this short note, phylogenetic trees are obtained by the method proposed in the aforementioned paper.

  12. A syntactic language model based on incremental CCG parsing

    NARCIS (Netherlands)

    Hassan, H.; Sima'an, K.; Way, A.

    2008-01-01

Syntactically enriched language models (parsers) constitute a promising component in applications such as machine translation and speech recognition. To maintain a useful level of accuracy, existing parsers are non-incremental and must span a combinatorially growing space of possible structures as

  13. Incremental Learning of Context Free Grammars by Parsing-Based Rule Generation and Rule Set Search

    Science.gov (United States)

    Nakamura, Katsuhiko; Hoshina, Akemi

This paper discusses recent improvements and extensions to the Synapse system for inductive inference of context-free grammars (CFGs) from sample strings. Synapse uses incremental learning, rule generation based on bottom-up parsing, and search over rule sets. The form of production rules in the previous system is extended from Revised Chomsky Normal Form A→βγ to Extended Chomsky Normal Form, which also includes A→B, where each of β and γ is either a terminal or a nonterminal symbol. From the result of bottom-up parsing, a rule generation mechanism synthesizes the minimum production rules required for parsing positive samples. Instead of the inductive CYK algorithm of the previous version of Synapse, the improved version uses a novel rule generation method, called "bridging," which bridges the missing part of the derivation tree for a positive string. The improved version also employs a novel search strategy, called serial search, in addition to minimum rule set search. Synthesis of grammars by the serial search is faster than by the minimum set search in most cases; on the other hand, the generated CFGs are generally larger than those from the minimum set search, and the system can find no appropriate grammar for some CFLs by the serial search. The paper shows experimental results of incremental learning of several fundamental CFGs and compares the rule generation methods and search strategies.
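Since the system builds on bottom-up parsing of grammars in (extended) Chomsky normal form, the classic CYK recognizer is the natural reference point. A minimal sketch (ours; plain CNF with binary rules A→BC and lexical rules A→a, without the A→B extension described above):

```python
def cyk(binary_rules, lexical_rules, s, start="S"):
    """CYK recognition for a CNF grammar.
    binary_rules:  dict A -> list of (B, C) pairs for rules A -> B C
    lexical_rules: dict A -> set of terminals for rules A -> a
    """
    n = len(s)
    # table[i][span] = set of nonterminals deriving s[i:i+span]
    table = [[set() for _ in range(n + 1)] for _ in range(n)]
    for i, ch in enumerate(s):
        table[i][1] = {A for A, ts in lexical_rules.items() if ch in ts}
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            for split in range(1, span):
                for A, rhs in binary_rules.items():
                    for B, C in rhs:
                        if B in table[i][split] and C in table[i + split][span - split]:
                            table[i][span].add(A)
    return start in table[0][n]
```

With binary_rules = {"S": [("A", "B"), ("A", "T")], "T": [("S", "B")]} and lexical_rules = {"A": {"a"}, "B": {"b"}} (the language of a^n b^n), "aabb" is accepted and "abab" is rejected. Synapse's bridging step works over exactly this kind of table, synthesizing the rules that are missing when a positive sample fails to parse.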

  14. Using modified incremental chart parsing to ascribe intentions to animated geometric figures.

    Science.gov (United States)

    Pautler, David; Koenig, Bryan L; Quek, Boon-Kiat; Ortony, Andrew

    2011-09-01

    People spontaneously ascribe intentions on the basis of observed behavior, and research shows that they do this even with simple geometric figures moving in a plane. The latter fact suggests that 2-D animations isolate critical information--object movement--that people use to infer the possible intentions (if any) underlying observed behavior. This article describes an approach to using motion information to model the ascription of intentions to simple figures. Incremental chart parsing is a technique developed in natural-language processing that builds up an understanding as text comes in one word at a time. We modified this technique to develop a system that uses spatiotemporal constraints about simple figures and their observed movements in order to propose candidate intentions or nonagentive causes. Candidates are identified via partial parses using a library of rules, and confidence scores are assigned so that candidates can be ranked. As observations come in, the system revises its candidates and updates the confidence scores. We describe a pilot study demonstrating that people generally perceive a simple animation in a manner consistent with the model.

  15. Dependency Parsing

    CERN Document Server

    Kubler, Sandra; Nivre, Joakim

    2009-01-01

    Dependency-based methods for syntactic parsing have become increasingly popular in natural language processing in recent years. This book gives a thorough introduction to the methods that are most widely used today. After an introduction to dependency grammar and dependency parsing, followed by a formal characterization of the dependency parsing problem, the book surveys the three major classes of parsing models that are in current use: transition-based, graph-based, and grammar-based models. It continues with a chapter on evaluation and one on the comparison of different methods, and it close

  16. Structural parsing

    NARCIS (Netherlands)

    Hoede, C.; Zhang, Lei

    2000-01-01

    Parsing is an essential part of natural language processing. In this paper, structural parsing, which is based on the theory of knowledge graphs, is introduced. Under consideration of the semantic and syntactic features of natural language, both semantic and syntactic word graphs are formed. Grammar

  17. Time-space trade-offs for lempel-ziv compressed indexing

    DEFF Research Database (Denmark)

    Bille, Philip; Ettienne, Mikko Berggren; Gørtz, Inge Li

    2017-01-01

compression scheme. Let n and z denote the size of the input string and of the compressed LZ77 string, respectively. We obtain the following time-space trade-offs. Given a pattern string P of length m, we can solve the problem in (i) O(m + occ lg lg n) time using O(z lg(n/z) lg lg z) space, or (ii) O(m(1 + lg^ϵ z / lg(n/z)) + occ(lg lg n + lg^ϵ z)) time using O(z lg(n/z)) space, for any 0 < ϵ < 1. In particular, (i) improves the leading term in the query time of the previous best solution from O(m lg m) to O(m) at the cost of increasing the space by a factor lg lg z. Alternatively, (ii) matches the previous best space bound, but has a leading term in the query time of O(m(1 + lg^ϵ z / lg(n/z))). However, for any polynomial compression ratio, i.e., z = O(n^(1-δ)) for constant δ > 0, this becomes O(m). Our index also supports extraction of any substring of length ℓ in O(ℓ + lg(n/z)) time. Technically, our...

  18. Relational-realizational parsing

    NARCIS (Netherlands)

    Tsarfaty, R.; Sima'an, K.

    2008-01-01

    State-of-the-art statistical parsing models applied to free word-order languages tend to underperform compared to, e.g., parsing English. Constituency-based models often fail to capture generalizations that cannot be stated in structural terms, and dependency-based models employ a 'single-head'

  19. Faster Scannerless GLR parsing

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); G.R. Economopoulos (Giorgos Robert); P. Klint (Paul)

    2008-01-01

Analysis and renovation of large software portfolios requires syntax analysis of multiple, usually embedded, languages and this is beyond the capabilities of many standard parsing techniques. The traditional separation between lexer and parser falls short due to the limitations of

  20. Faster scannerless GLR parsing

    NARCIS (Netherlands)

    G.R. Economopoulos (Giorgos Robert); P. Klint (Paul); J.J. Vinju (Jurgen); O. de Moor; M.I. Schwartzbach

    2009-01-01

Analysis and renovation of large software portfolios requires syntax analysis of multiple, usually embedded, languages and this is beyond the capabilities of many standard parsing techniques. The traditional separation between lexer and parser falls short due to the limitations of

  1. Passive sentences and structural parsing

    NARCIS (Netherlands)

    Liu, X; Hoede, C.

    2002-01-01

    Traditional language parsing is mainly based on generative grammar in English. As English and Chinese belong to two different families of language, a grammar is not sufficient for Chinese parsing although it is still important. In passive sentences in English and Chinese, there exists some

  2. Parsing Schemata - a framework for specification and analysis of parsing algorithms

    NARCIS (Netherlands)

    Sikkel, Nicolaas

    1997-01-01

    Parsing schemata provide a general framework for specification, analysis and comparison of (sequential and/or parallel) parsing algorithms. A grammar specifies implicitly what the valid parses of a sentence are; a parsing algorithm specifies explicitly how to compute these. Parsing schemata form a

  3. Parsing Heterogeneous Striatal Activity

    Directory of Open Access Journals (Sweden)

    Kae Nakamura

    2017-05-01

The striatum is an input channel of the basal ganglia and is well known to be involved in reward-based decision making and learning. At the macroscopic level, the striatum has been postulated to contain parallel functional modules, each of which includes neurons that perform similar computations to support selection of appropriate actions for different task contexts. At the single-neuron level, however, recent studies in monkeys and rodents have revealed heterogeneity in neuronal activity even within restricted modules of the striatum. Looking for generality in the complex striatal activity patterns, here we briefly survey several types of striatal activity, focusing on their usefulness for mediating behaviors. In particular, we focus on two types of behavioral tasks: reward-based tasks that use salient sensory cues and manipulate outcomes associated with the cues; and perceptual decision tasks that manipulate the quality of noisy sensory cues and associate all correct decisions with the same outcome. Guided by previous insights on the modular organization and general selection-related functions of the basal ganglia, we relate striatal activity patterns on these tasks to two types of computations: implementation of selection and evaluation. We suggest that parsing with the selection/evaluation categories encourages a focus on the functional commonalities revealed by studies with different animal models and behavioral tasks, instead of on aspects of striatal activity that may be specific to a particular task setting. We then highlight several questions in the selection-evaluation framework for future exploration.

  4. Dependency Parsing with Transformed Feature

    Directory of Open Access Journals (Sweden)

    Fuxiang Wu

    2017-01-01

Dependency parsing is an important subtask of natural language processing. In this paper, we propose an embedding-feature transforming method for graph-based parsing, transform-based parsing, which directly utilizes the inner similarity of the features to extract information from all feature strings, including the un-indexed strings, and alleviates the feature sparsity problem. The model transforms the extracted features into transformed features by applying a feature weight matrix, which consists of similarities between the feature strings. Since the matrix is usually rank-deficient because of similar feature strings, it could weaken the strength of the constraints. However, it is proven that duplicate transformed features do not degrade the optimization algorithm, the margin infused relaxed algorithm. Moreover, this problem can be alleviated by reducing the number of nearest transformed features of a feature. In addition, to further improve parsing accuracy, a fusion parser is introduced to integrate transformed and original features. Our experiments verify that both the transform-based and the fusion parser improve parsing accuracy compared to the corresponding feature-based parser.

  5. Learning for Semantic Parsing Using Statistical Syntactic Parsing Techniques

    Science.gov (United States)

    2010-05-01

  6. Parsing Universal Dependencies without training

    DEFF Research Database (Denmark)

    Martinez Alonso, Hector; Agic, Zeljko; Plank, Barbara

    2017-01-01

We present UDP, the first training-free parser for Universal Dependencies (UD). Our algorithm is based on PageRank and a small set of specific dependency head rules. UDP features two-step decoding to guarantee that function words are attached as leaf nodes. The parser requires no training......, and it is competitive with a delexicalized transfer system. UDP offers a linguistically sound unsupervised alternative to cross-lingual parsing for UD. The parser has very few parameters and is distinctly robust to domain change across languages.

  7. Parsing Chinese-Russian Military Exercises

    Science.gov (United States)

    2015-04-01

ASSESSMENT: These joint Russian-Chinese military exercises serve several important national security purposes for both governments.

  8. Context-free parsing with connectionist networks

    Science.gov (United States)

    Fanty, M. A.

    1986-08-01

This paper presents a simple algorithm which converts any context-free grammar into a connectionist network which parses strings (of arbitrary but fixed maximum length) in the language defined by that grammar. The network is fast, O(n), and deterministic. It consists of binary units which compute a simple function of their input. When the grammar is put in Chomsky normal form, O(n^3) units are needed to parse inputs of length up to n.

  9. Two-pass greedy regular expression parsing

    DEFF Research Database (Denmark)

    Grathwohl, Niels Bjørn Bugge; Henglein, Fritz; Nielsen, Lasse

    2013-01-01

We present new algorithms for producing greedy parses for regular expressions (REs) in a semi-streaming fashion. Our lean-log algorithm executes in time O(mn) for REs of size m and input strings of size n and outputs a compact bit-coded parse tree representation. It improves on previous algorithms by: operating in only 2 passes; using only O(m) words of random-access memory (independent of n); requiring only kn bits of sequentially written and read log storage, where k ...

  10. Application development with Parse using iOS SDK

    CERN Document Server

    Birani, Bhanu

    2013-01-01

A practical guide featuring step-by-step instructions showing you how to use Parse iOS and handle your data in the cloud. If you are a developer who wants to build your applications instantly using Parse iOS as a back end for application development, this book is ideal for you. It will help you understand Parse, featuring examples to help you get familiar with the concepts of Parse iOS.

  11. Perceiving Event Dynamics and Parsing Hollywood Films

    Science.gov (United States)

    Cutting, James E.; Brunick, Kaitlin L.; Candan, Ayse

    2012-01-01

    We selected 24 Hollywood movies released from 1940 through 2010 to serve as a film corpus. Eight viewers, three per film, parsed them into events, which are best termed subscenes. While watching a film a second time, viewers scrolled through frames and recorded the frame number where each event began. Viewers agreed about 90% of the time. We then…

  12. A Semantic Constraint on Syntactic Parsing.

    Science.gov (United States)

    Crain, Stephen; Coker, Pamela L.

    This research examines how semantic information influences syntactic parsing decisions during sentence processing. In the first experiment, subjects were presented lexical strings having syntactically identical surface structures but with two possible underlying structures: "The children taught by the Berlitz method," and "The…

  13. Predictive Head-Corner Chart Parsing

    NARCIS (Netherlands)

    Sikkel, Nicolaas; op den Akker, Hendrikus J.A.; Bunt, H.; Tomita, M.

Head-Corner (HC) parsing came up in computational linguistics a few years ago, motivated by linguistic arguments. This idea is a heuristic, rather than a fail-safe principle, hence it is relevant indeed to consider the worst-case behaviour of the HC parser. We define a novel predictive

  14. Predictive Head-Corner Chart Parsing

    NARCIS (Netherlands)

    Sikkel, Nicolaas; op den Akker, Hendrikus J.A.; Bunt, H.; Tomita, M.

    1996-01-01

Head-Corner (HC) parsing came up in computational linguistics a few years ago, motivated by linguistic arguments. This idea is a heuristic, rather than a fail-safe principle, hence it is relevant indeed to consider the worst-case behaviour of the HC parser. We define a novel predictive

  15. Parsing polarization squeezing into Fock layers

    DEFF Research Database (Denmark)

    Mueller, Christian R.; Madsen, Lars Skovgaard; Klimov, Andrei B.

    2016-01-01

… photon number do the methods coincide; when the photon number is indefinite, we parse the state in Fock layers, finding that substantially higher squeezing can be observed in some of the single layers. By capitalizing on the properties of the Husimi Q function, we map this notion onto the Poincaré space, providing a full account of the measured squeezing.

  16. Analyzing holistic parsers: implications for robust parsing and systematicity.

    Science.gov (United States)

    Ho, E K; Chan, L W

    2001-05-01

Holistic parsers offer a viable alternative to traditional algorithmic parsers. They have good generalization performance and are inherently robust. In a holistic parser, parsing is achieved by mapping the connectionist representation of the input sentence to the connectionist representation of the target parse tree directly. Little prior knowledge of the underlying parsing mechanism thus needs to be assumed. However, this also makes holistic parsing difficult to understand. In this article, an analysis is presented for studying the operations of the confluent preorder parser (CPP). In the analysis, the CPP is viewed as a dynamical system, and holistic parsing is perceived as a sequence of state transitions through its state-space. The seemingly one-shot parsing mechanism can thus be elucidated as a step-by-step inference process, with the intermediate parsing decisions being reflected by the states visited during parsing. The study serves two purposes. First, it improves our understanding of how grammatical errors are corrected by the CPP. The occurrence of an error in a sentence will cause the CPP to deviate from the normal track that is followed when the original sentence is parsed. But as the remaining terminals are read, the two trajectories will gradually converge until finally the correct parse tree is produced. Second, it reveals that having systematic parse tree representations alone cannot guarantee good generalization performance in holistic parsing. More importantly, they need to be distributed in certain useful locations of the representational space. Sentences with similar trailing terminals should have their corresponding parse tree representations mapped to nearby locations in the representational space. The study provides concrete evidence that encoding the linearized parse trees as obtained via preorder traversal can satisfy such a requirement.

  17. Telugu dependency parsing using different statistical parsers

    Directory of Open Access Journals (Sweden)

    B. Venkata Seshu Kumari

    2017-01-01

In this paper we explore different statistical dependency parsers for parsing Telugu. We consider five popular dependency parsers, namely MaltParser, MSTParser, TurboParser, ZPar and Easy-First Parser. We experiment with different parser and feature settings and show the impact of different settings. We also provide a detailed analysis of the performance of all the parsers on major dependency labels. We report our results on the test data of the Telugu dependency treebank provided in the ICON 2010 tools contest on Indian languages dependency parsing. We obtain state-of-the-art performance of 91.8% in unlabeled attachment score and 70.0% in labeled attachment score. To the best of our knowledge, ours is the only work that has explored all five popular dependency parsers and compared their performance under different feature settings for Telugu.

  18. Parsing Argumentation Structures in Persuasive Essays

    OpenAIRE

    Stab, Christian; Gurevych, Iryna

    2016-01-01

    In this article, we present a novel approach for parsing argumentation structures. We identify argument components using sequence labeling at the token level and apply a new joint model for detecting argumentation structures. The proposed model globally optimizes argument component types and argumentative relations using integer linear programming. We show that our model considerably improves the performance of base classifiers and significantly outperforms challenging heuristic baselines. Mo...

  19. Prosody and Children's Parsing of Sentences. Technical Report No. 123.

    Science.gov (United States)

    Kleiman, Glenn M.; And Others

    Parsing sentences into meaningful phrases and clauses is an essential step in language comprehension, and parsing difficulty is a common reading problem. Prosody (intonation, stress, and rhythm) provides information about phrase and clause boundaries in spoken language that is not available in written language. In an experiment to test whether…

  20. Design of methodology for incremental compiler construction

    Directory of Open Access Journals (Sweden)

    Pavel Haluza

    2011-01-01

The paper deals with possibilities of incremental compiler construction, covering both languages with a fixed set of lexical units and languages with a variable set of lexical units. The methodology design for incremental compiler construction is based on the known algorithms for standard compiler construction and is derived for both groups of languages. The group of languages with a fixed set of lexical units comprises languages in which each lexical unit has a constant meaning, e.g., common programming languages. For this group the paper addresses the problem of incremental semantic analysis, which is based on incremental parsing. In languages with a variable set of lexical units (e.g., the professional typographic system TeX), the meaning of any character in the input file can be changed arbitrarily at any time during processing. Such a change takes effect immediately, and its validity may be limited in some way or extend to the end of the input. For this group the paper addresses the case where macros temporarily change the category of arbitrary characters.
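To make the variable-lexical-unit case concrete, here is a minimal sketch (our own illustration, not code from the paper) of a lexer whose category table can be remapped mid-stream, in the spirit of TeX's category codes; the stream encoding and category names are hypothetical.

```python
# Hypothetical sketch: a lexer for a language with a variable set of
# lexical units. A ("remap", c, cat) item changes character c's
# category, and the change takes effect immediately.

def lex(stream):
    """Lex a stream of items: ("char", c) emits c with its current
    category; ("remap", c, cat) changes c's category on the spot."""
    catcode = {"%": "comment"}  # initial table: '%' starts out special
    out = []
    for item in stream:
        if item[0] == "remap":
            _, ch, cat = item
            catcode[ch] = cat   # affects every later occurrence of ch
        else:
            _, ch = item
            out.append((ch, catcode.get(ch, "letter")))
    return out
```

Remapping "%" from "comment" to "letter" mid-stream changes how the very next "%" is tokenized, which is exactly the behaviour that forces the incremental treatment discussed above.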

  1. ASF+SDF parsing tools applied to ELAN

    OpenAIRE

    van den Brand, M.G.J.; Ringeissen, C.

    2000-01-01

This paper describes the development of a new ELAN parser using ASF+SDF parsing technology. ASF+SDF and ELAN are two modern rule-based systems. Both systems have their own features and application domains; however, both formalisms have user-defined syntax for defining rewrite rules. The ASF+SDF Meta-Environment uses powerful and efficient generic parsing tools, whereas the ELAN parser is based on an Earley parser. Furthermore, the ELAN syntax is ``hard-wired'' in the parser, which makes ada...

  2. On Collocations and Their Interaction with Parsing and Translation

    Directory of Open Access Journals (Sweden)

    Violeta Seretan

    2013-10-01

We address the problem of automatically processing collocations—a subclass of multi-word expressions characterized by a high degree of morphosyntactic flexibility—in the context of two major applications, namely, syntactic parsing and machine translation. We show that parsing and collocation identification are processes that are interrelated and that benefit from each other, inasmuch as syntactic information is crucial for acquiring collocations from corpora and, vice versa, collocational information can be used to improve parsing performance. Similarly, we focus on the interrelation between collocations and machine translation, highlighting the use of translation information for multilingual collocation identification, as well as the use of collocational knowledge for improving translation. We give a panorama of the existing relevant work, and we parallel the literature surveys with our own experiments involving a symbolic parser and a rule-based translation system. The results show a significant improvement over approaches in which the corresponding tasks are decoupled.

  3. Coalescing the theories of two nurse visionaries: Parse and Watson.

    Science.gov (United States)

    Walker, C A

    1996-11-01

    The theories of two nurse visionaries, Rosemarie Rizzo Parse and Jean Watson, are examined for areas of agreement and notable differences. Watson and Parse reject (or hold seriously suspect) traditional, positivistic methods of studying human behaviour and posit their theories as alternatives to the totality paradigm. Since both of these theories, Parse's theory of human becoming and Watson's theory of transpersonal care, borrow heavily from existential phenomenology, major tenets of this philosophic perspective are outlined. Each theory is then described with emphasis on anchoring motifs, concepts, and principles. Next both theories are analysed and critiqued simultaneously. Finally, the theories are applied to a case study with the intent of maximizing their mutual strengths and diminishing their limitations. Coalescence of compatible theories is recommended as a way of enhancing the application of nursing knowledge in practice.

  4. Recursive Neural Networks Based on PSO for Image Parsing

    Directory of Open Access Journals (Sweden)

    Guo-Rong Cai

    2013-01-01

This paper presents an image parsing algorithm based on Particle Swarm Optimization (PSO) and Recursive Neural Networks (RNNs). State-of-the-art methods such as the traditional RNN-based parsing strategy use L-BFGS over the complete data for learning the parameters. However, this can cause problems due to the nondifferentiable objective function. To solve this problem, the PSO algorithm is employed to tune the weights of the RNN so as to minimize the objective. Experimental results obtained on the Stanford background dataset show that our PSO-based training algorithm outperforms traditional RNN, Pixel CRF, region-based energy, simultaneous MRF, and superpixel MRF.

  5. Treelet Probabilities for HPSG Parsing and Error Correction

    NARCIS (Netherlands)

    Ivanova, Angelina; van Noord, Gerardus; Calzolari, Nicoletta; al, et

    2014-01-01

Most state-of-the-art parsers attempt to produce an analysis for any input, despite errors. However, small grammatical mistakes in a sentence often cause the parser to fail to build a correct syntactic tree. Applications that can identify and correct mistakes during parsing are particularly

  6. Time-Driven Effects on Parsing during Reading

    Science.gov (United States)

    Roll, Mikael; Lindgren, Magnus; Alter, Kai; Horne, Merle

    2012-01-01

    The phonological trace of perceived words starts fading away in short-term memory after a few seconds. Spoken utterances are usually 2-3 s long, possibly to allow the listener to parse the words into coherent prosodic phrases while they still have a clear representation. Results from this brain potential study suggest that even during silent…

  7. Towards Robustness in Parsing -- Fuzzifying Context-Free Language Recognition

    NARCIS (Netherlands)

    Asveld, P.R.J.; Dassow, J.; Rozenberg, G.; Salomaa, A.

    1996-01-01

    We discuss the concept of robustness with respect to parsing or recognizing a context-free language. Our approach is based on the notions of fuzzy language, (generalized) fuzzy context-free grammar, and parser/recognizer for fuzzy languages. As concrete examples we consider a robust version of

  8. ASF+SDF parsing tools applied to ELAN

    NARCIS (Netherlands)

    M.G.J. van den Brand (Mark); C. Ringeissen

    2000-01-01

This paper describes the development of a new ELAN parser using ASF+SDF parsing technology. ASF+SDF and ELAN are two modern rule-based systems. Both systems have their own features and application domains; however, both formalisms have user-defined syntax for defining rewrite rules. The ASF+SDF

  9. Connectionist natural language parsing with BrainC

    Science.gov (United States)

    Mueller, Adrian; Zell, Andreas

    1991-08-01

A close examination of pure neural parsers shows that they either could not guarantee the correctness of their derivations or had to hard-code seriality into the structure of the net. The authors therefore decided to use a hybrid architecture, consisting of a serial parsing algorithm and a trainable net. The system fulfills the following design goals: (1) parsing of sentences without length restriction, (2) soundness and completeness for any context-free language, and (3) learning the applicability of parsing rules with a neural network to increase the efficiency of the whole system. BrainC (backtracking and backpropagation in C) combines the well-known shift-reduce parsing technique, extended with backtracking, with a backpropagation network to learn and represent typical structures of the trained natural-language grammars. The system has been implemented as a subsystem of the Rochester Connectionist Simulator (RCS) on SUN workstations and was tested with several grammars for English and German. The design of the system and the results are discussed.
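As a point of reference for the serial component described above, the following is a minimal sketch of shift-reduce parsing with a greedy reduce-first strategy; the toy grammar, the tag names, and the omission of both backtracking and any neural rule selection are our own simplifications, not BrainC itself.

```python
# Hypothetical sketch of shift-reduce parsing over a toy grammar.
# Greedy reduce-first; no backtracking, no learned rule selection.

GRAMMAR = [
    ("NP", ("Det", "N")),   # noun phrase = determiner + noun
    ("VP", ("V", "NP")),    # verb phrase = verb + noun phrase
    ("S",  ("NP", "VP")),   # sentence = noun phrase + verb phrase
]

def shift_reduce(tags):
    """Parse a sequence of POS tags bottom-up; return the start symbol
    on success, or None when the greedy strategy gets stuck."""
    stack, buf = [], list(tags)
    while buf or len(stack) > 1:
        for lhs, rhs in GRAMMAR:
            if tuple(stack[-len(rhs):]) == rhs:
                stack[-len(rhs):] = [lhs]   # reduce the top of the stack
                break
        else:
            if not buf:
                return None                 # cannot reduce, nothing to shift
            stack.append(buf.pop(0))        # shift the next tag
    return stack[0] if stack else None
```

For example, `shift_reduce(["Det", "N", "V", "Det", "N"])` reduces the stack step by step to `"S"`. Where the greedy strategy would get stuck, BrainC's backtracking (guided by the trained network) would explore alternative rule applications instead of giving up.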

  10. Parsing with Subdomain Instance Weighting from Raw Corpora

    NARCIS (Netherlands)

    Plank, Barbara; Sima'an, Khalil

    2008-01-01

    The treebanks that are used for training statistical parsers consist of hand-parsed sentences from a single source/domain like newspaper text. However, newspaper text concerns different subdomains of language use (e.g. finance, sports, politics, music), which implies that the statistics gathered by

  11. Parsing with subdomain instance weighting from raw corpora

    NARCIS (Netherlands)

    Plank, B.; Sima'an, K.

    2008-01-01

    The treebanks that are used for training statistical parsers consist of hand-parsed sentences from a single source/domain like newspaper text. However, newspaper text concerns different subdomains of language use (e.g. finance, sports, politics, music), which implies that the statistics gathered by

  12. Dependency Parsing with Lattice Structures for Resource-Poor Languages

    Science.gov (United States)

    Sudprasert, Sutee; Kawtrakul, Asanee; Boitet, Christian; Berment, Vincent

In this paper, we present a new dependency parsing method for languages which have a very small annotated corpus and for which methods of segmentation and morphological analysis producing a unique (automatically disambiguated) result are very unreliable. Our method works on a morphosyntactic lattice factorizing all possible segmentation and part-of-speech tagging results. The quality of the input to syntactic analysis is hence much better than that of an unreliable unique sequence of lemmatized and tagged words. We propose an adaptation of Eisner's algorithm for finding the k-best dependency trees in a morphosyntactic lattice structure encoding multiple results of morphosyntactic analysis. Moreover, we show how to use Dependency Insertion Grammar to adjust the scores and filter out invalid trees, how to use a language model to rescore the parse trees, and the k-best extension of our parsing model. The highest parsing accuracy reported in this paper is 74.32%, which represents a 6.31% improvement compared to the model taking its input from the unreliable morphosyntactic analysis tools.

  13. A simple DOP model for constituency parsing of Italian sentences

    NARCIS (Netherlands)

    Sangati, F.

    2009-01-01

    We present a simplified Data-Oriented Parsing (DOP) formalism for learning the constituency structure of Italian sentences. In our approach we try to simplify the original DOP methodology by constraining the number and type of fragments we extract from the training corpus. We provide some examples

  14. Fuzzy Context- Free Languages. Part 2: Recognition and Parsing Algorithms

    NARCIS (Netherlands)

    Asveld, P.R.J.

    2000-01-01

    In a companion paper \\cite{Asv:FCF1} we used fuzzy context-free grammars in order to model grammatical errors resulting in erroneous inputs for robust recognizing and parsing algorithms for fuzzy context-free languages. In particular, this approach enables us to distinguish between small errors

  15. Incremental Risk Vulnerability

    OpenAIRE

    Günter Franke; Richard C. Stapleton; Marti G. Subrahmanyam

    2005-01-01

    We present a necessary and sufficient condition on an agent's utility function for a simple mean preserving spread in an independent background risk to increase the agent's risk aversion (incremental risk vulnerability). Gollier and Pratt (1996) have shown that declining and convex risk aversion as well as standard risk aversion are sufficient for risk vulnerability. We show that these conditions are also sufficient for incremental risk vulnerability. In addition, we present sufficient condit...

  16. Deep Incremental Boosting

    OpenAIRE

    Mosca, Alan; Magoulas, George D

    2017-01-01

    This paper introduces Deep Incremental Boosting, a new technique derived from AdaBoost, specifically adapted to work with Deep Learning methods, that reduces the required training time and improves generalisation. We draw inspiration from Transfer of Learning approaches to reduce the start-up time to training each incremental Ensemble member. We show a set of experiments that outlines some preliminary results on some common Deep Learning datasets and discuss the potential improvements Deep In...

  17. Incremental Refinement of FAÇADE Models with Attribute Grammar from 3d Point Clouds

    Science.gov (United States)

Dehbi, Y.; Staat, C.; Mandtler, L.; Plümer, L.

    2016-06-01

Data acquisition using unmanned aerial vehicles (UAVs) has attracted more and more attention in recent years. Especially in the field of building reconstruction, the incremental interpretation of such data is a demanding task. In this context formal grammars play an important role for the top-down identification and reconstruction of building objects. Up to now, the available approaches expect offline data in order to parse an a priori known grammar. For mapping on demand, an on-the-fly reconstruction based on UAV data is required, and an incremental interpretation of the data stream is inevitable. This paper presents an incremental parser of grammar rules for automatic 3D building reconstruction. The parser enables a model refinement based on new observations with respect to a weighted attribute context-free grammar (WACFG). The falsification or rejection of hypotheses is supported as well. The parser can deal with and adapt available parse trees acquired from previous interpretations or predictions. Parse trees derived so far are updated in an iterative way using transformation rules. A diagnostic step searches for mismatches between current and new nodes. Prior knowledge on façades is incorporated; it is given by probability densities as well as architectural patterns. Since we cannot always assume normal distributions, the derivation of location and shape parameters of building objects is based on a kernel density estimation (KDE). While the level of detail is continuously improved, the geometrical, semantic and topological consistency is ensured.
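Since the façade parameters above are estimated with a kernel density estimate rather than an assumed normal distribution, a one-dimensional Gaussian-kernel KDE can be sketched as follows; this is our own minimal illustration of the standard estimator, not the paper's implementation.

```python
import math

def kde(samples, x, bandwidth):
    """Gaussian-kernel density estimate at point x: the average of a
    Gaussian bump of width `bandwidth` centered at each sample."""
    norm = len(samples) * bandwidth * math.sqrt(2 * math.pi)
    return sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
               for s in samples) / norm
```

With a single sample at 0 and unit bandwidth, `kde([0.0], 0.0, 1.0)` is the standard normal density at 0, about 0.399; with real façade measurements the estimate adapts to multi-modal parameter distributions, which is the point of avoiding the normality assumption.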

  18. Extending TF1: Argument parsing, function composition, and vectorization

    CERN Document Server

    Tsang Mang Kin, Arthur Leonard

    2017-01-01

In this project, we extend the functionality of the TF1 function class in ROOT. We add argument parsing, making it possible to freely pass variables and parameters into pre-defined and user-defined functions. We also introduce a syntax to use certain compositions of functions, namely normalized sums and convolutions, directly in TF1. Finally, we introduce some simple vectorization functionality to TF1 and demonstrate the potential to speed up parallelizable computations.

  19. Creating Parsing Lexicons from Semantic Lexicons Automatically and Its Applications

    National Research Council Canada - National Science Library

    Ayan, Necip F; Dorr, Bonnie

    2002-01-01

    ...). We also present the effects of using such a lexicon on the parser performance. The advantage of automating the process is that the same technique can be applied directly to lexicons we have for other languages, for example, Arabic, Chinese, and Spanish. The results indicate that our method will help us generate parsing lexicons which can be used by a broad-coverage parser that runs on different languages.

  20. Parsing Citations in Biomedical Articles Using Conditional Random Fields

    OpenAIRE

    Zhang, Qing; Cao, Yong-Gang; Yu, Hong

    2011-01-01

    Citations are used ubiquitously in biomedical full-text articles and play an important role for representing both the rhetorical structure and the semantic content of the articles. As a result, text mining systems will significantly benefit from a tool that automatically extracts the content of a citation. In this study, we applied the supervised machine-learning algorithms Conditional Random Fields (CRFs) to automatically parse a citation into its fields (e.g., Author, Title, Journal, and Ye...

  1. Incremental Gaussian Processes

    DEFF Research Database (Denmark)

    Quiñonero-Candela, Joaquin; Winther, Ole

    2002-01-01

    In this paper, we consider Tipping's relevance vector machine (RVM) and formalize an incremental training strategy as a variant of the expectation-maximization (EM) algorithm that we call subspace EM. Working with a subset of active basis functions, the sparsity of the RVM solution will ensure...

  2. Incremental Similarity and Turbulence

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole E.; Hedevang, Emil; Schmiegel, Jürgen

    This paper discusses the mathematical representation of an empirically observed phenomenon, referred to as Incremental Similarity. We discuss this feature from the viewpoint of stochastic processes and present a variety of non-trivial examples, including those that are of relevance for turbulence...

  3. Quantum independent increment processes

    CERN Document Server

    Franz, Uwe

    2005-01-01

This volume is the first of two volumes containing the revised and completed notes of lectures given at the school "Quantum Independent Increment Processes: Structure and Applications to Physics". This school was held at the Alfried-Krupp-Wissenschaftskolleg in Greifswald during the period March 9 – 22, 2003, and supported by the Volkswagen Foundation. The school gave an introduction to current research on quantum independent increment processes aimed at graduate students and non-specialists working in classical and quantum probability, operator algebras, and mathematical physics. The present first volume contains the following lectures: "Lévy Processes in Euclidean Spaces and Groups" by David Applebaum, "Locally Compact Quantum Groups" by Johan Kustermans, "Quantum Stochastic Analysis" by J. Martin Lindsay, and "Dilations, Cocycles and Product Systems" by B.V. Rajarama Bhat.

  4. Quantum independent increment processes

    CERN Document Server

    Franz, Uwe

    2006-01-01

This is the second of two volumes containing the revised and completed notes of lectures given at the school "Quantum Independent Increment Processes: Structure and Applications to Physics". This school was held at the Alfried-Krupp-Wissenschaftskolleg in Greifswald in March, 2003, and supported by the Volkswagen Foundation. The school gave an introduction to current research on quantum independent increment processes aimed at graduate students and non-specialists working in classical and quantum probability, operator algebras, and mathematical physics. The present second volume contains the following lectures: "Random Walks on Finite Quantum Groups" by Uwe Franz and Rolf Gohm, "Quantum Markov Processes and Applications in Physics" by Burkhard Kümmerer, "Classical and Free Infinite Divisibility and Lévy Processes" by Ole E. Barndorff-Nielsen and Steen Thorbjornsen, and "Lévy Processes on Quantum Groups and Dual Groups" by Uwe Franz.

  5. Efficient incremental relaying

    KAUST Repository

    Fareed, Muhammad Mehboob

    2013-07-01

We propose a novel relaying scheme which improves the spectral efficiency of cooperative diversity systems by utilizing limited feedback from the destination. Our scheme capitalizes on the fact that relaying is only required when the direct transmission suffers deep fading. We calculate the packet error rate for the proposed efficient incremental relaying scheme with both amplify-and-forward and decode-and-forward relaying. Numerical results are also presented to verify their analytical counterparts. © 2013 IEEE.
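The feedback-driven idea described above can be reduced to a simple per-packet decision: the relay transmits only after the destination signals that the direct link failed. The sketch below is a hypothetical illustration with made-up names and a bare SNR threshold, not the authors' analytical model.

```python
# Hypothetical sketch of incremental relaying with limited feedback.
# The relay spends its channel use only when the direct link fails,
# which is the source of the spectral-efficiency gain.

def relaying_decision(direct_snr, relay_snr, threshold):
    """Decide the outcome of one packet given link SNRs (linear scale)
    and the SNR threshold required to decode at the target rate."""
    if direct_snr >= threshold:
        return "direct"    # positive feedback: the relay stays silent
    if relay_snr >= threshold:
        return "relayed"   # NACK received: the relay forwards the packet
    return "outage"        # neither link supports the target rate
```

Averaging the "outage" frequency over fading realizations would give an empirical packet error rate to compare against a closed-form analysis.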

  6. Locating and parsing bibliographic references in HTML medical articles.

    Science.gov (United States)

    Zou, Jie; Le, Daniel; Thoma, George R

    2010-06-01

    The set of references that typically appear toward the end of journal articles is sometimes, though not always, a field in bibliographic (citation) databases. But even if references do not constitute such a field, they can be useful as a preprocessing step in the automated extraction of other bibliographic data from articles, as well as in computer-assisted indexing of articles. Automation in data extraction and indexing to minimize human labor is key to the affordable creation and maintenance of large bibliographic databases. Extracting the components of references, such as author names, article title, journal name, publication date and other entities, is therefore a valuable and sometimes necessary task. This paper describes a two-step process using statistical machine learning algorithms, to first locate the references in HTML medical articles and then to parse them. Reference locating identifies the reference section in an article and then decomposes it into individual references. We formulate this step as a two-class classification problem based on text and geometric features. An evaluation conducted on 500 articles drawn from 100 medical journals achieves near-perfect precision and recall rates for locating references. Reference parsing identifies the components of each reference. For this second step, we implement and compare two algorithms. One relies on sequence statistics and trains a Conditional Random Field. The other focuses on local feature statistics and trains a Support Vector Machine to classify each individual word, followed by a search algorithm that systematically corrects low confidence labels if the label sequence violates a set of predefined rules. The overall performance of these two reference-parsing algorithms is about the same: above 99% accuracy at the word level, and over 97% accuracy at the chunk level.

  7. LR-parsing of Extended Context-free Grammars

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann; Kristensen, Bent Bruun

    1976-01-01

To improve the readability of a grammar it is common to use extended context-free grammars (ECFGs), which are context-free grammars (CFGs) extended with the repetition operator (*), the alternation operator (|) and parentheses to express the right-hand sides of the productions. The topic treated here is LR-parsing of ECFGs. The LR(k) concept is generalized to ECFGs, a set of LR-preserving transformations from ECFGs to CFGs is given, and finally it is shown how to construct LR-parsers directly from ECFGs.
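As a flavor of such transformations, the sketch below eliminates the repetition operator by introducing a fresh left-recursive nonterminal (left recursion is unproblematic for LR parsing). This is our own minimal rendering of one standard step, not the paper's full LR-preserving transformation set, and the `("star", X)` encoding is made up.

```python
# Hypothetical sketch: rewrite A -> alpha (X)* beta into plain CFG
# productions. Starred symbols are encoded as ("star", "X").

def expand_star(lhs, rhs):
    """Return a list of (lhs, rhs) CFG productions equivalent to the
    given ECFG production, with each (X)* replaced by X_list."""
    prods, new_rhs = [], []
    for sym in rhs:
        if isinstance(sym, tuple) and sym[0] == "star":
            fresh = sym[1] + "_list"                # fresh nonterminal
            prods.append((fresh, []))               # X_list -> (empty)
            prods.append((fresh, [fresh, sym[1]]))  # X_list -> X_list X
            new_rhs.append(fresh)
        else:
            new_rhs.append(sym)
    prods.append((lhs, new_rhs))
    return prods
```

The left-recursive form `X_list -> X_list X` keeps the LR parser's stack shallow while consuming arbitrarily many repetitions, which is one reason such transformations can be LR-preserving.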

  8. KEGGParser: parsing and editing KEGG pathway maps in Matlab.

    Science.gov (United States)

    Arakelyan, Arsen; Nersisyan, Lilit

    2013-02-15

KEGG pathway database is a collection of manually drawn pathway maps accompanied by KGML format files intended for use in automatic analysis. KGML files, however, do not contain the information required for a complete reproduction of all the events indicated in the static image of a pathway map. Several parsers and editors of KEGG pathways exist for processing KGML files. We introduce KEGGParser, a MATLAB-based tool for KEGG pathway parsing, semiautomatic fixing, editing, visualization and analysis in the MATLAB environment. It also works with Scilab. The source code is available at http://www.mathworks.com/matlabcentral/fileexchange/37561.

  9. Automated vocabulary discovery for geo-parsing online epidemic intelligence.

    Science.gov (United States)

    Keller, Mikaela; Freifeld, Clark C; Brownstein, John S

    2009-11-24

    Automated surveillance of the Internet provides a timely and sensitive method for alerting on global emerging infectious disease threats. HealthMap is part of a new generation of online systems designed to monitor and visualize, on a real-time basis, disease outbreak alerts as reported by online news media and public health sources. HealthMap is of specific interest for national and international public health organizations and international travelers. A particular task that makes such a surveillance useful is the automated discovery of the geographic references contained in the retrieved outbreak alerts. This task is sometimes referred to as "geo-parsing". A typical approach to geo-parsing would demand an expensive training corpus of alerts manually tagged by a human. Given that human readers perform this kind of task by using both their lexical and contextual knowledge, we developed an approach which relies on a relatively small expert-built gazetteer, thus limiting the need of human input, but focuses on learning the context in which geographic references appear. We show in a set of experiments, that this approach exhibits a substantial capacity to discover geographic locations outside of its initial lexicon. The results of this analysis provide a framework for future automated global surveillance efforts that reduce manual input and improve timeliness of reporting.
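The gazetteer-plus-context idea can be illustrated with a deliberately tiny sketch (our own, not HealthMap's code): known locations come from a seed gazetteer, while a few context cue words let the matcher flag capitalized tokens outside the lexicon. In the actual system the cues are learned from context rather than hard-coded as here.

```python
# Hypothetical sketch of gazetteer-plus-context geo-parsing. The cue
# words stand in for context that the real system learns from data.

def find_locations(tokens, gazetteer, cues=("in", "near", "from")):
    """Return tokens that are known locations, or that look like
    locations (capitalized and preceded by a cue word)."""
    hits = []
    for i, tok in enumerate(tokens):
        if tok in gazetteer:
            hits.append(tok)                  # known location
        elif i > 0 and tokens[i - 1] in cues and tok[:1].isupper():
            hits.append(tok)                  # unknown, location-like context
    return hits
```

On `"Cases reported in Gondar near Addis"` with the one-entry gazetteer `{"Addis"}`, the matcher recovers both the known entry and the out-of-lexicon "Gondar", which is the capacity the experiments above measure.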

  10. Automated vocabulary discovery for geo-parsing online epidemic intelligence

    Directory of Open Access Journals (Sweden)

    Freifeld Clark C

    2009-11-01

Background: Automated surveillance of the Internet provides a timely and sensitive method for alerting on global emerging infectious disease threats. HealthMap is part of a new generation of online systems designed to monitor and visualize, on a real-time basis, disease outbreak alerts as reported by online news media and public health sources. HealthMap is of specific interest for national and international public health organizations and international travelers. A particular task that makes such a surveillance useful is the automated discovery of the geographic references contained in the retrieved outbreak alerts. This task is sometimes referred to as "geo-parsing". A typical approach to geo-parsing would demand an expensive training corpus of alerts manually tagged by a human. Results: Given that human readers perform this kind of task by using both their lexical and contextual knowledge, we developed an approach which relies on a relatively small expert-built gazetteer, thus limiting the need of human input, but focuses on learning the context in which geographic references appear. We show in a set of experiments that this approach exhibits a substantial capacity to discover geographic locations outside of its initial lexicon. Conclusion: The results of this analysis provide a framework for future automated global surveillance efforts that reduce manual input and improve timeliness of reporting.

  11. Analysis of Azari Language based on Parsing using Link Gram

    Directory of Open Access Journals (Sweden)

    Maryam Arabzadeh

    2014-09-01

There are different classes of theories for the natural language syntactic parsing problem and for creating the related grammars. This paper presents a syntactic grammar developed in the link grammar formalism for Turkish, which is an agglutinative language. In the link grammar formalism, the words of a sentence are linked with each other depending on their syntactic roles. Turkish has complex derivational and inflectional morphology, and derivational and inflectional morphemes play important syntactic roles in sentences. In order to develop a link grammar for Turkish, the lexical parts in the morphological representations of Turkish words are removed, and the links are created depending on the part-of-speech tags and inflectional morphemes in words. Furthermore, a derived word is separated at the derivational boundaries in order to treat each derivational morpheme as a special distinct word and allow it to be linked with the rest of the sentence. The derivational morphemes of a word are also linked with each other with special links to indicate that they are parts of the same word. The adapted link grammar formalism for Turkish provides flexibility for the linkage construction, and similar methods can be used for other languages with complex morphology. Using the Delphi programming language, the link grammar for the Azeri language was developed and implemented, and then evaluated on 250 randomly selected sentences. For 84.31% of the sentences, the result set of the parser contains the correct parse.

  12. Motion based parsing for video from observational psychology

    Science.gov (United States)

    Kokaram, Anil; Doyle, Erika; Lennon, Daire; Joyeux, Laurent; Fuller, Ray

    2006-01-01

    In Psychology it is common to conduct studies involving the observation of humans undertaking some task. The sessions are typically recorded on video and used for subjective visual analysis. The subjective analysis is tedious and time consuming, not only because much useless video material is recorded but also because subjective measures of human behaviour are not necessarily repeatable. This paper presents tools using content based video analysis that allow automated parsing of video from one such study involving Dyslexia. The tools rely on implicit measures of human motion that can be generalised to other applications in the domain of human observation. Results comparing quantitative assessment of human motion with subjective assessment are also presented, illustrating that the system is a useful scientific tool.

  13. Parsing citations in biomedical articles using conditional random fields.

    Science.gov (United States)

    Zhang, Qing; Cao, Yong-Gang; Yu, Hong

    2011-04-01

    Citations are used ubiquitously in biomedical full-text articles and play an important role in representing both the rhetorical structure and the semantic content of the articles. As a result, text mining systems will significantly benefit from a tool that automatically extracts the content of a citation. In this study, we applied the supervised machine-learning algorithm Conditional Random Fields (CRFs) to automatically parse a citation into its fields (e.g., Author, Title, Journal, and Year). With a subset of HTML-format open-access PubMed Central articles, we report an overall 97.95% F1-score. The citation parser can be accessed at: http://www.cs.uwm.edu/∼qing/projects/cithit/index.html.
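The kind of per-token feature map typically fed to a CRF for citation parsing can be sketched as follows (the feature names and example citation are illustrative, not the cited system's actual feature set):

```python
def token_features(tokens, i):
    """Feature map for token i, in the style of CRF input for citation parsing."""
    tok = tokens[i]
    core = tok.strip("().,")
    return {
        "lower": tok.lower(),
        "is_year": core.isdigit() and len(core) == 4,        # e.g. "(2011)"
        "is_initial": len(tok) == 2 and tok[0].isupper() and tok[1] == ".",
        "prev": tokens[i - 1].lower() if i > 0 else "<s>",   # left context
        "next": tokens[i + 1].lower() if i < len(tokens) - 1 else "</s>",
    }

citation = "Zhang Q. (2011) Parsing citations. J Biomed Inform.".split()
feats = [token_features(citation, i) for i in range(len(citation))]
print(feats[2])
```

A CRF learner would consume one such feature map per token, together with the gold field labels (Author, Year, Title, Journal), to learn the field boundaries.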

  14. Relative clauses as a benchmark for Minimalist parsing

    Directory of Open Access Journals (Sweden)

    Thomas Graf

    2017-07-01

    Full Text Available Minimalist grammars have been used recently in a series of papers to explain well-known contrasts in human sentence processing in terms of subtle structural differences. These proposals combine a top-down parser with complexity metrics that relate parsing difficulty to memory usage. So far, though, there has been no large-scale exploration of the space of viable metrics. Building on this earlier work, we compare the ability of 1600 metrics to derive several processing effects observed with relative clauses, many of which have proven difficult to unify. We show that among those 1600 candidates, a few metrics (and only a few) can provide a unified account of all these contrasts. This is a welcome result for two reasons: First, it provides a novel account of extensively studied psycholinguistic data. Second, it significantly limits the number of viable metrics that may be applied to other phenomena, thus reducing theoretical indeterminacy.

  15. Parsing interindividual drug variability: an emerging role for systems pharmacology

    Science.gov (United States)

    Turner, Richard M; Park, B Kevin; Pirmohamed, Munir

    2015-01-01

    There is notable interindividual heterogeneity in drug response, affecting both drug efficacy and toxicity, resulting in patient harm and the inefficient utilization of limited healthcare resources. Pharmacogenomics is at the forefront of research to understand interindividual drug response variability, but although many genotype-drug response associations have been identified, translation of pharmacogenomic associations into clinical practice has been hampered by inconsistent findings and inadequate predictive values. These limitations are in part due to the complex interplay between drug-specific, human body and environmental factors influencing drug response and therefore pharmacogenomics, whilst intrinsically necessary, is by itself unlikely to adequately parse drug variability. The emergent, interdisciplinary and rapidly developing field of systems pharmacology, which incorporates but goes beyond pharmacogenomics, holds significant potential to further parse interindividual drug variability. Systems pharmacology broadly encompasses two distinct research efforts, pharmacologically-orientated systems biology and pharmacometrics. Pharmacologically-orientated systems biology utilizes high throughput omics technologies, including next-generation sequencing, transcriptomics and proteomics, to identify factors associated with differential drug response within the different levels of biological organization in the hierarchical human body. Increasingly complex pharmacometric models are being developed that quantitatively integrate factors associated with drug response. Although distinct, these research areas complement one another and continual development can be facilitated by iterating between dynamic experimental and computational findings. Ultimately, quantitative data-derived models of sufficient detail will be required to help realize the goal of precision medicine. WIREs Syst Biol Med 2015, 7:221–241. doi: 10.1002/wsbm.1302 PMID:25950758

  16. Machine learning to parse breast pathology reports in Chinese.

    Science.gov (United States)

    Tang, Rong; Ouyang, Lizhi; Li, Clara; He, Yue; Griffin, Molly; Taghian, Alphonse; Smith, Barbara; Yala, Adam; Barzilay, Regina; Hughes, Kevin

    2018-01-29

    Large structured databases of pathology findings are valuable in deriving new clinical insights. However, they are labor intensive to create and generally require manual annotation. There has been some work in the bioinformatics community to support automating this work via machine learning in English. Our contribution is to provide an automated approach to construct such structured databases in Chinese, and to set the stage for extraction from other languages. We collected 2104 de-identified Chinese benign and malignant breast pathology reports from Hunan Cancer Hospital. Physicians with native Chinese proficiency reviewed the reports and annotated a variety of binary and numerical pathologic entities. After excluding 78 cases with a bilateral lesion in the same report, 1216 cases were used as a training set for the algorithm, which was then refined by 405 development cases. The natural language processing algorithm was tested using the remaining 405 cases to evaluate the machine learning outcome. The model was used to extract 13 binary entities and 8 numerical entities. When compared to physicians with native Chinese proficiency, the model showed a per-entity accuracy of 91 to 100% for all common diagnoses on the test set. The overall accuracy was 98% for binary entities and 95% for numerical entities. In a per-report evaluation of binary entities with more than 100 training cases, 85% of all the testing reports were completely correct and 11% had an error in 1 out of 22 entities. We have demonstrated that Chinese breast pathology reports can be automatically parsed into structured data using standard machine learning approaches. The results of our study demonstrate that techniques effective in parsing English reports can be scaled to other languages.

  17. Using machine learning to parse breast pathology reports.

    Science.gov (United States)

    Yala, Adam; Barzilay, Regina; Salama, Laura; Griffin, Molly; Sollender, Grace; Bardia, Aditya; Lehman, Constance; Buckley, Julliette M; Coopey, Suzanne B; Polubriaginof, Fernanda; Garber, Judy E; Smith, Barbara L; Gadd, Michele A; Specht, Michelle C; Gudewicz, Thomas M; Guidi, Anthony J; Taghian, Alphonse; Hughes, Kevin S

    2017-01-01

    Extracting information from electronic medical records is a time-consuming and expensive process when done manually. Rule-based and machine learning techniques are two approaches to solving this problem. In this study, we trained a machine learning model on pathology reports to extract pertinent tumor characteristics, which enabled us to create a large database of attribute-searchable pathology reports. This database can be used to identify cohorts of patients with characteristics of interest. We collected a total of 91,505 breast pathology reports from three Partners hospitals: Massachusetts General Hospital, Brigham and Women's Hospital, and Newton-Wellesley Hospital, covering the period from 1978 to 2016. We trained our system with annotations from two datasets, consisting of 6295 and 10,841 manually annotated reports. The system extracts 20 separate categories of information, including atypia types and various tumor characteristics such as receptors. We also report a learning curve analysis to show how much annotation our model needs to perform reasonably. The model accuracy was tested on 500 reports that did not overlap with the training set. The model achieved accuracy of 90% for correctly parsing all carcinoma and atypia categories for a given patient. The average accuracy for individual categories was 97%. Using this classifier, we created a database of 91,505 parsed pathology reports. Our learning curve analysis shows that the model can achieve reasonable results even when trained on a few annotations. We developed a user-friendly interface to the database that allows physicians to easily identify patients with target characteristics and export the matching cohort. This model has the potential to reduce the effort required for analyzing large amounts of data from medical records, and to minimize the cost and time required to glean scientific insight from these data.

  18. Marginal Space Deep Learning: Efficient Architecture for Volumetric Image Parsing.

    Science.gov (United States)

    Ghesu, Florin C; Krubasik, Edward; Georgescu, Bogdan; Singh, Vivek; Yefeng Zheng; Hornegger, Joachim; Comaniciu, Dorin

    2016-05-01

    Robust and fast solutions for anatomical object detection and segmentation support the entire clinical workflow, from diagnosis and patient stratification to therapy planning, intervention, and follow-up. Current state-of-the-art techniques for parsing volumetric medical image data are typically based on machine learning methods that exploit large annotated image databases. Two main challenges need to be addressed: the efficiency of scanning high-dimensional parametric spaces, and the need for representative image features, which otherwise require significant manual engineering effort. We propose a pipeline for object detection and segmentation in the context of volumetric image parsing, solving a two-step learning problem: anatomical pose estimation and boundary delineation. For this task we introduce Marginal Space Deep Learning (MSDL), a novel framework exploiting both the strengths of efficient object parametrization in hierarchical marginal spaces and the automated feature design of Deep Learning (DL) network architectures. In the 3D context, the application of deep learning systems is limited by the very high complexity of the parametrization. More specifically, 9 parameters are necessary to describe a restricted affine transformation in 3D, resulting in a prohibitive amount of billions of scanning hypotheses. The mechanism of marginal space learning provides excellent run-time performance by learning classifiers in clustered, high-probability regions in spaces of gradually increasing dimensionality. To further increase computational efficiency and robustness, in our system we learn sparse adaptive data sampling patterns that automatically capture the structure of the input. Given the object localization, we propose a DL-based active shape model to estimate the non-rigid object boundary. Experimental results are presented on the aortic valve in ultrasound using an extensive dataset of 2891 volumes from 869 patients, showing significant improvements of up to 45

  19. Incremental implicit learning of bundles of statistical patterns.

    Science.gov (United States)

    Qian, Ting; Jaeger, T Florian; Aslin, Richard N

    2016-12-01

    Forming an accurate representation of a task environment often takes place incrementally as the information relevant to learning the representation only unfolds over time. This incremental nature of learning poses an important problem: it is usually unclear whether a sequence of stimuli consists of only a single pattern, or multiple patterns that are spliced together. In the former case, the learner can directly use each observed stimulus to continuously revise its representation of the task environment. In the latter case, however, the learner must first parse the sequence of stimuli into different bundles, so as to not conflate the multiple patterns. We created a video-game statistical learning paradigm and investigated (1) whether learners without prior knowledge of the existence of multiple "stimulus bundles" - subsequences of stimuli that define locally coherent statistical patterns - could detect their presence in the input and (2) whether learners are capable of constructing a rich representation that encodes the various statistical patterns associated with bundles. By comparing human learning behavior to the predictions of three computational models, we find evidence that learners can handle both tasks successfully. In addition, we discuss the underlying reasons for why the learning of stimulus bundles occurs even when such behavior may seem irrational.

  20. Incremental Aerodynamic Coefficient Database for the USA2

    Science.gov (United States)

    Richardson, Annie Catherine

    2016-01-01

    In March through May of 2016, a wind tunnel test was conducted by the Aerosciences Branch (EV33) to visually study the unsteady aerodynamic behavior over multiple transition geometries for the Universal Stage Adapter 2 (USA2) in the MSFC Aerodynamic Research Facility's Trisonic Wind Tunnel (TWT). The purpose of the test was to make a qualitative comparison of the transonic flow field in order to provide a recommended minimum transition radius for manufacturing. Additionally, 6 Degree of Freedom force and moment data for each configuration tested was acquired in order to determine the geometric effects on the longitudinal aerodynamic coefficients (Normal Force, Axial Force, and Pitching Moment). In order to make a quantitative comparison of the aerodynamic effects of the USA2 transition geometry, the aerodynamic coefficient data collected during the test was parsed and incorporated into a database for each USA2 configuration tested. An incremental aerodynamic coefficient database was then developed using the generated databases for each USA2 geometry as a function of Mach number and angle of attack. The final USA2 coefficient increments will be applied to the aerodynamic coefficients of the baseline geometry to adjust the Space Launch System (SLS) integrated launch vehicle force and moment database based on the transition geometry of the USA2.
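The increment-building step described above can be sketched numerically: increments are differences between a configuration's coefficients and the baseline's at matching Mach number and angle of attack, later added back onto the baseline database (all coefficient values and grid points below are made up for illustration):

```python
# (Mach, alpha) -> normal-force coefficient CN; values are illustrative only.
baseline = {(0.9, 0.0): 0.112, (0.9, 2.0): 0.165}
config_a = {(0.9, 0.0): 0.118, (0.9, 2.0): 0.177}

# Incremental database: configuration minus baseline at each grid point.
increments = {key: round(config_a[key] - baseline[key], 4) for key in baseline}
print(increments)

def adjusted_cn(mach, alpha):
    """Apply the configuration increment to the baseline vehicle database."""
    return baseline[(mach, alpha)] + increments[(mach, alpha)]
```

In practice the increments would be tabulated as functions of Mach number and angle of attack for each transition geometry, exactly as the abstract describes for the SLS integrated database.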

  1. Chomsky-Schützenberger parsing for weighted multiple context-free languages

    Directory of Open Access Journals (Sweden)

    Tobias Denkinger

    2017-07-01

    Full Text Available We prove a Chomsky-Schützenberger representation theorem for multiple context-free languages weighted over complete commutative strong bimonoids. Using this representation we devise a parsing algorithm for a restricted form of those devices.

  2. Toward the Soundness of Sense Structure Definitions in Thesaurus-Dictionaries. Parsing Problems and Solutions

    Directory of Open Access Journals (Sweden)

    Neculai Curteanu

    2012-10-01

    Full Text Available In this paper we point out some difficult problems of thesaurus-dictionary entry parsing, relying on the parsing technology of SCD (Segmentation-Cohesion-Dependency) configurations, successfully applied on six of the largest thesauri: Romanian (2), French, German (2), and Russian. Challenging problems: (a) intricate and/or recursive structures of the lexicographic segments met in the entries of certain thesauri; (b) cyclicity (recursive calls) of some sense marker classes on marker sequences; (c) establishing the hypergraph-driven dependencies between all the atomic and non-atomic sense definitions. A classical approach to solving these parsing problems is hard mainly because of the depth-first search of sense definitions and markers, the substantial complexity of entries, and the dynamic sense tree construction embodied within these parsers. SCD-based parsing solutions: (a) the SCD parsing method is a procedural tool, completely free of formal grammars, handling the recursive structure of the lexicographic segments by procedural non-recursive calls performed on the SCD parsing configurations of the entry structure. (b) For dealing with cyclicity (recursive calls) between secondary sense markers and the sense enumeration markers, we proposed the Enumeration Closing Condition, sometimes coupled with New_Paragraphs typographic markers transformed into numeral sense enumeration. (c) These problems, their lexicographic modeling, and the parsing solutions are addressed both to dictionary parser programmers, to experience the SCD-based parsing method, and to lexicographers and thesauri designers, for tailoring balanced lexical-semantic granularities and sounder sense tree definitions of dictionary entries.

  3. Grammar-Based Specification and Parsing of Binary File Formats

    Directory of Open Access Journals (Sweden)

    William Underwood

    2012-03-01

    Full Text Available The capability to validate and view or play binary file formats, as well as to convert binary file formats to standard or current file formats, is critically important to the preservation of digital data and records. This paper describes the extension of context-free grammars from strings to binary files. Binary files are arrays of data types, such as long and short integers, floating-point numbers and pointers, as well as characters. The concept of an attribute grammar is extended to these context-free array grammars. This attribute grammar has been used to define a number of chunk-based and directory-based binary file formats. A parser generator has been used with some of these grammars to generate syntax checkers (recognizers) for validating binary file formats. Among the potential benefits of an attribute grammar-based approach to specification and parsing of binary file formats is that attribute grammars not only support format validation, but also generation of error messages during validation of format, validation of semantic constraints, attribute value extraction (characterization), generation of viewers or players for file formats, and conversion to current or standard file formats. The significance of these results is that with these extensions to core computer science concepts, traditional parser/compiler technologies can potentially be used as part of a general, cost-effective curation strategy for binary file formats.
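A chunk-based layout of the kind these grammars describe can be recognized with a few lines of code; the sketch below assumes a simple RIFF-like convention (4-byte ASCII tag, 4-byte little-endian length, then the payload) rather than any specific format from the paper:

```python
import struct

def parse_chunks(data):
    """Split a chunk-based binary blob into (tag, payload) pairs.

    Layout assumed per chunk: 4-byte tag, 4-byte little-endian length, payload.
    """
    chunks, offset = [], 0
    while offset < len(data):
        tag, length = struct.unpack_from("<4sI", data, offset)
        payload = data[offset + 8 : offset + 8 + length]
        chunks.append((tag.decode("ascii"), payload))
        offset += 8 + length
    return chunks

# Build a tiny two-chunk blob and parse it back.
blob = (b"HDR\x00" + struct.pack("<I", 2) + b"\x01\x02"
        + b"DATA" + struct.pack("<I", 3) + b"abc")
print(parse_chunks(blob))
```

An attribute grammar adds to this recognizer the semantic checks the paper discusses, e.g. that a declared length actually matches the payload, or that a pointer field stays within the file.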

  4. Detecting modification of biomedical events using a deep parsing approach

    Directory of Open Access Journals (Sweden)

    MacKinlay Andrew

    2012-04-01

    Full Text Available Abstract Background This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data comes from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. Method To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this, we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Results Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Conclusions Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification.
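The shallow bag-of-words features from a sliding context window around the trigger word can be sketched as follows (a simplified stand-in for the paper's feature extraction; the function name is ours):

```python
def window_features(tokens, trigger_index, width=3):
    """Bag-of-words features from a window of `width` tokens on either side
    of the event trigger word (the trigger itself is excluded)."""
    lo = max(0, trigger_index - width)
    hi = min(len(tokens), trigger_index + width + 1)
    return {f"bow:{tokens[j].lower()}" for j in range(lo, hi) if j != trigger_index}

sent = "analysis of IkappaBalpha phosphorylation was not performed".split()
print(window_features(sent, sent.index("phosphorylation")))
```

In the paper these shallow features are combined with Minimal Recursion Semantics features from the deep parser before training the Maximum Entropy classifier; a cue word like "not" inside the window is exactly the kind of signal the shallow model picks up.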

  5. Deep PDF parsing to extract features for detecting embedded malware.

    Energy Technology Data Exchange (ETDEWEB)

    Munson, Miles Arthur; Cross, Jesse S. (Missouri University of Science and Technology, Rolla, MO)

    2011-09-01

    The number of PDF files with embedded malicious code has risen significantly in the past few years. This is due to the portability of the file format, the ways Adobe Reader recovers from corrupt PDF files, the addition of many multimedia and scripting extensions to the file format, and many format properties the malware author may use to disguise the presence of malware. Current research focuses on executable, MS Office, and HTML formats. In this paper, several features and properties of PDF files are identified. Features are extracted using an instrumented open source PDF viewer. The feature descriptions of benign and malicious PDFs can be used to construct a machine learning model for detecting possible malware in future PDF files. The detection rate of PDF malware by current antivirus software is very low. A PDF file is easy to edit and manipulate because it is a text format, providing a low barrier to malware authors. Analyzing PDF files for malware is nonetheless difficult because of (a) the complexity of the formatting language, (b) the parsing idiosyncrasies in Adobe Reader, and (c) undocumented correction techniques employed in Adobe Reader. In May 2011, Esparza demonstrated that PDF malware could be hidden from 42 of 43 antivirus packages by combining multiple obfuscation techniques [4]. One reason current antivirus software fails is the ease of varying byte sequences in PDF malware, thereby rendering conventional signature-based virus detection useless. The compression and encryption functions produce sequences of bytes that are each functions of multiple input bytes. As a result, padding the malware payload with some whitespace before compression/encryption can change many of the bytes in the final payload. In this study we analyzed a corpus of 2591 benign and 87 malicious PDF files. While this corpus is admittedly small, it allowed us to test a system for collecting indicators of embedded PDF malware. We will call these indicators features throughout
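A minimal sketch of keyword-count feature extraction over raw PDF bytes might look like this (the keyword list is illustrative of commonly scrutinized PDF name objects; the study itself extracts its features via an instrumented PDF viewer):

```python
# Illustrative name objects often associated with scripted or auto-run
# behaviour in PDFs; a real feature set would be much richer.
SUSPICIOUS = [b"/JavaScript", b"/JS", b"/OpenAction", b"/Launch", b"/AA"]

def pdf_features(raw):
    """Count occurrences of each suspicious keyword in the raw PDF bytes."""
    return {kw.decode(): raw.count(kw) for kw in SUSPICIOUS}

sample = b"%PDF-1.4 1 0 obj << /OpenAction 2 0 R /JS (app.alert(1)) >>"
print(pdf_features(sample))
```

Such counts, gathered over a labeled corpus like the 2591 benign and 87 malicious files above, would form the rows of a training matrix for a classifier.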

  6. Incremental Visualizer for Visible Objects

    DEFF Research Database (Denmark)

    Bukauskas, Linas; Bøhlen, Michael Hanspeter

    This paper discusses the integration of database back-end and visualizer front-end into one tightly coupled system. The main aim is to reduce the data pipeline from database to visualization by using incremental data extraction of visible objects in fly-through scenarios. We...

  7. Incremental data compression -extended abstract-

    NARCIS (Netherlands)

    Jeuring, J.T.

    1992-01-01

    Data may be compressed using textual substitution. Textual substitution identifies repeated substrings and replaces some or all substrings by pointers to another copy. We construct an incremental algorithm for a specific textual substitution method: coding a text with respect to a dictionary. With
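Coding a text incrementally with respect to a growing dictionary can be illustrated with an LZ78-style parse (a related textual substitution scheme, and the incremental parsing idea this topic page is organized around, though not necessarily the paper's exact method):

```python
def lz78_parse(text):
    """Incremental (LZ78-style) parse: each output phrase extends a previously
    seen phrase by one symbol, so the dictionary grows as the text is read."""
    dictionary = {"": 0}
    phrases, current = [], ""
    for ch in text:
        if current + ch in dictionary:
            current += ch                          # keep extending the match
        else:
            phrases.append((dictionary[current], ch))
            dictionary[current + ch] = len(dictionary)
            current = ""
    if current:                                    # flush a trailing match
        phrases.append((dictionary[current[:-1]], current[-1]))
    return phrases

print(lz78_parse("abab"))
```

Each pair is (index of longest previously seen prefix, next symbol), so the decoder can rebuild the same dictionary incrementally as it reads.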

  8. Robot training through incremental learning

    Science.gov (United States)

    Karlsen, Robert E.; Hunt, Shawn; Witus, Gary

    2011-05-01

    The real world is too complex and variable to directly program an autonomous ground robot's control system to respond to the inputs from its environmental sensors such as LIDAR and video. The need for learning incrementally, discarding prior data, is important because of the vast amount of data that can be generated by these sensors. This is crucial because the system needs to generate and update its internal models in real-time. There should be little difference between the training and execution phases; the system should be continually learning, or engaged in "life-long learning". This paper explores research into incremental learning systems such as nearest neighbor, Bayesian classifiers, and fuzzy c-means clustering.
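Of the incremental learners mentioned, nearest neighbor is the simplest to sketch: each new labeled observation updates the model immediately, with no separate training phase (a toy illustration with invented class names, not the robotic system's implementation):

```python
class IncrementalNN:
    """Nearest-neighbor learner updated one example at a time, so there is
    no distinction between a training phase and an execution phase."""

    def __init__(self):
        self.examples = []              # list of (feature_vector, label)

    def learn(self, x, label):
        """Incremental update: absorb one new labeled observation."""
        self.examples.append((x, label))

    def predict(self, x):
        """Label of the stored example closest to x (squared Euclidean)."""
        dist = lambda a: sum((ai - xi) ** 2 for ai, xi in zip(a, x))
        return min(self.examples, key=lambda e: dist(e[0]))[1]

nn = IncrementalNN()
nn.learn((0.0, 0.0), "clear")           # hypothetical sensor readings
nn.learn((5.0, 5.0), "obstacle")
print(nn.predict((4.0, 4.6)))
```

Plain nearest neighbor stores everything it sees; the paper's point about discarding prior data motivates the bounded variants (clustering, Bayesian summaries) it also surveys.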

  9. The Psychological Reality of Grammar : the Theta Principle in Parsing Performance

    NARCIS (Netherlands)

    Sadeh-Leicht, O.

    2007-01-01

    This dissertation presents evidence for the psychological reality of a grammatical principle, the Theta Principle. It adopts a grammar-derived theory of human natural language processing: the thematic parser. Parsing phenomena such as garden path effects (The horse raced past the barn fell) and

  10. Introduction to special issue on machine learning approaches to shallow parsing

    NARCIS (Netherlands)

    Hammerton, J; Osborne, M; Armstrong, S; Daelemans, W

    2002-01-01

    This article introduces the problem of partial or shallow parsing (assigning partial syntactic structure to sentences) and explains why it is an important natural language processing (NLP) task. The complexity of the task makes Machine Learning an attractive option in comparison to the handcrafting

  11. Neural Semantic Parsing by Character-based Translation: Experiments with Abstract Meaning Representations

    NARCIS (Netherlands)

    van Noord, Rik; Bos, Johannes

    2017-01-01

    We evaluate the character-level translation method for neural semantic parsing on a large corpus of sentences annotated with Abstract Meaning Representations (AMRs). Using a sequence-to-sequence model, and some trivial preprocessing and postprocessing of AMRs, we obtain a baseline accuracy of 53.1

  12. Single-View 3D Scene Reconstruction and Parsing by Attribute Grammar.

    Science.gov (United States)

    Liu, Xiaobai; Zhao, Yibiao; Zhu, Song-Chun

    2018-03-01

    In this paper, we present an attribute grammar for solving two coupled tasks: i) parsing a 2D image into semantic regions; and ii) recovering the 3D scene structures of all regions. The proposed grammar consists of a set of production rules, each describing a kind of spatial relation between planar surfaces in 3D scenes. These production rules are used to decompose an input image into a hierarchical parse graph representation where each graph node indicates a planar surface or a composite surface. Different from other stochastic image grammars, the proposed grammar augments each graph node with a set of attribute variables to depict scene-level global geometry, e.g., camera focal length, or local geometry, e.g., surface normal, contact lines between surfaces. These geometric attributes impose constraints between a node and its offspring in the parse graph. Under a probabilistic framework, we develop a Markov Chain Monte Carlo method to construct a parse graph that optimizes the 2D image recognition and 3D scene reconstruction objectives simultaneously. We evaluated our method on both public benchmarks and newly collected datasets. Experiments demonstrate that the proposed method is capable of achieving state-of-the-art scene reconstruction from a single image.

  13. Generative re-ranking model for dependency parsing of Italian sentences

    NARCIS (Netherlands)

    Sangati, F.

    2009-01-01

    We present a general framework for dependency parsing of Italian sentences based on a combination of discriminative and generative models. We use a state-of-the-art discriminative model to obtain a k-best list of candidate structures for the test sentences, and use the generative model to compute

  14. Fuzzy context-free languages - Part 2: Recognition and parsing algorithms

    NARCIS (Netherlands)

    Asveld, P.R.J.

    2005-01-01

    In a companion paper [P.R.J. Asveld, Fuzzy context-free languages---Part 1: Generalized fuzzy context-free grammars, Theoret. Comp. Sci. (2005)] we used fuzzy context-free grammars in order to model grammatical errors resulting in erroneous inputs for robust recognizing and parsing algorithms for

  15. ParseCNV integrative copy number variation association software with quality tracking.

    Science.gov (United States)

    Glessner, Joseph T; Li, Jin; Hakonarson, Hakon

    2013-03-01

    A number of copy number variation (CNV) calling algorithms exist; however, comprehensive software tools for CNV association studies are lacking. We describe ParseCNV, unique software that takes CNV calls and creates probe-based statistics for CNV occurrence in both case-control designs and in family based studies addressing both de novo and inheritance events, which are then summarized based on CNV regions (CNVRs). CNVRs are defined in a dynamic manner to allow for complex CNV overlap while maintaining a precise association region. Using this approach, we avoid the failure-to-converge and non-monotonic curve-fitting weaknesses of programs such as CNVtools and CNVassoc; and although Plink is easy to use, it only provides combined CNV-state probe-based statistics, not state-specific CNVRs. Existing CNV association methods do not provide any quality tracking information to filter confident associations, a key issue which is fully addressed by ParseCNV. In addition, uncertainty in the CNV calls underlying CNV associations is evaluated to verify significant results, including CNV overlap profiles, genomic context, the number of probes supporting the CNV, and single-probe intensities. When the optimal quality control parameters are followed using ParseCNV, 90% of CNVs validate by polymerase chain reaction, an often problematic stage because of inadequate significant association review. ParseCNV is freely available at http://parsecnv.sourceforge.net.
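The dynamic definition of CNVRs from overlapping calls can be sketched as a simple interval merge (a simplified stand-in for ParseCNV's actual region logic; coordinates are invented):

```python
def merge_cnvrs(calls):
    """Collapse overlapping CNV calls (start, end) into CNV regions (CNVRs)."""
    regions = []
    for start, end in sorted(calls):
        if regions and start <= regions[-1][1]:
            regions[-1][1] = max(regions[-1][1], end)   # extend current region
        else:
            regions.append([start, end])                # open a new region
    return [tuple(r) for r in regions]

print(merge_cnvrs([(100, 250), (200, 300), (500, 650)]))
```

ParseCNV's real region definition is richer (it keeps the association signal precise rather than merging greedily), but the merge above shows the basic collapse from per-sample calls to shared regions.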

  16. Incremental Trust in Grid Computing

    DEFF Research Database (Denmark)

    Brinkløv, Michael Hvalsøe; Sharp, Robin

    2007-01-01

    This paper describes a comparative simulation study of some incremental trust and reputation algorithms for handling behavioural trust in large distributed systems. Two types of reputation algorithm (based on discrete and Bayesian evaluation of ratings) and two ways of combining direct trust and reputation (discrete combination and combination based on fuzzy logic) are considered. The various combinations of these methods are evaluated from the point of view of their ability to respond to changes in behaviour and the ease with which suitable parameters for the algorithms can be found in the context of Grid computing systems.
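A Bayesian evaluation of ratings of the kind compared here is commonly modelled with a Beta distribution whose parameters are updated incrementally after each interaction; a minimal sketch (class and method names are ours, not the paper's):

```python
class BetaReputation:
    """Beta-distribution reputation: alpha/beta count positive/negative
    outcomes, starting from a uniform prior."""

    def __init__(self):
        self.alpha = 1.0    # prior pseudo-count of positive outcomes
        self.beta = 1.0     # prior pseudo-count of negative outcomes

    def rate(self, positive):
        """Incremental update after one observed interaction."""
        if positive:
            self.alpha += 1
        else:
            self.beta += 1

    def trust(self):
        """Expected reliability under the current Beta posterior."""
        return self.alpha / (self.alpha + self.beta)

rep = BetaReputation()
for outcome in [True, True, True, False]:
    rep.rate(outcome)
print(round(rep.trust(), 3))
```

The responsiveness-to-behaviour-change question the paper studies corresponds to how quickly this posterior moves after a run of negative outcomes (often tuned by discounting old counts).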

  17. Incremental deformation: A literature review

    Directory of Open Access Journals (Sweden)

    Nasulea Daniel

    2017-01-01

    Full Text Available Nowadays customer requirements are permanently changing, and accordingly the tendency in modern industry is to implement flexible manufacturing processes. In recent decades, metal forming has gained the attention of researchers and considerable changes have occurred. Because the conventional metal forming processes are expensive and time-consuming to design and prepare for small production runs, manufacturers and researchers have become interested in flexible processes. One of the most investigated flexible processes in metal forming is incremental sheet forming (ISF). ISF is an advanced flexible manufacturing process which allows complex 3D products to be manufactured without expensive dedicated tools. In most cases an ISF process needs only a simple tool, a fixing device for the sheet metal blank, and a universal CNC machine. Using this process, axisymmetric parts can be manufactured, usually on a CNC lathe, as well as complex asymmetric parts using CNC milling machines, robots, or dedicated equipment. This paper aims to present the current status of incremental sheet forming technologies in terms of process parameters and their influences, wall thickness distribution, springback effect, formability, surface quality, and the current main research directions.

  18. Incremental Observer Relative Data Extraction

    DEFF Research Database (Denmark)

    Bukauskas, Linas; Bøhlen, Michael Hanspeter

    2004-01-01

    The visual exploration of large databases calls for a tight coupling of database and visualization systems. Current visualization systems typically fetch all the data and organize it in a scene tree that is then used to render the visible data. For immersive data explorations in a Cave...... or a Panorama, where an observer is in data space, this approach is far from optimal. A more scalable approach is to make the database system observer-aware and to restrict the communication between the database and visualization systems to the relevant data. In this paper VR-tree, an extension of the R......-tree, is used to index visibility ranges of objects. We introduce a new operator for incremental Observer Relative data Extraction (iORDE). We propose the Volatile Access STructure (VAST), a lightweight main memory structure that is created on the fly and is maintained during visual data explorations. VAST...

  19. Fetching and Parsing Data from the Web with OpenRefine

    Directory of Open Access Journals (Sweden)

    Evan Peter Williamson

    2017-08-01

    Full Text Available OpenRefine is a powerful tool for exploring, cleaning, and transforming data. An earlier Programming Historian lesson, “Cleaning Data with OpenRefine”, introduced the basic functionality of Refine to efficiently discover and correct inconsistency in a data set. Building on those essential data wrangling skills, this lesson focuses on Refine’s ability to fetch URLs and parse web content. Examples introduce some of the advanced features to transform and enhance a data set, including:

    - fetch URLs using Refine
    - construct URL queries to retrieve information from a simple web API
    - parse HTML and JSON responses to extract relevant data
    - use array functions to manipulate string values
    - use Jython to extend Refine’s functionality

    It will be helpful to have basic familiarity with OpenRefine, HTML, and programming concepts such as variables and loops to complete this lesson.
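Outside of Refine, the fetch-and-parse pattern the lesson teaches can be sketched in a few lines of Python. The payload below is a hypothetical API response; inside Refine the equivalent steps would be "Add column by fetching URLs" followed by GREL's `parseJson()`:

```python
import json

# A sample JSON payload like one returned by a simple web API
# (hypothetical response; in Refine this text would arrive in a new
# column created via "Add column by fetching URLs").
response_text = '''
{
  "query": "Ada Lovelace",
  "results": [
    {"title": "Ada Lovelace", "year": 1843},
    {"title": "Analytical Engine", "year": 1837}
  ]
}
'''

def extract_titles(text):
    """Parse a JSON response and pull one field out of each result,
    analogous to Refine's value.parseJson().results idiom."""
    data = json.loads(text)
    return [r["title"] for r in data["results"]]

print(extract_titles(response_text))
# ['Ada Lovelace', 'Analytical Engine']
```

The same loop-free extraction is what Refine's array functions do cell by cell across a whole column.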

  20. BlaSTorage: a fast package to parse, manage and store BLAST results.

    Science.gov (United States)

    Orsini, Massimiliano; Carcangiu, Simone

    2013-01-30

    Large-scale sequence studies requiring BLAST-based analysis produce huge amounts of data to be parsed. BLAST parsers are available, but they are often missing some important features, such as keeping all information from the raw BLAST output, allowing direct access to single results, and performing logical operations over them. We implemented BlaSTorage, a Python package that parses multi BLAST results and returns them in a purpose-built object-database format. Unlike other BLAST parsers, BlaSTorage retains and stores all parts of BLAST results, including alignments, without loss of information; a complete API allows access to all the data components. BlaSTorage shows speed comparable to more basic parsers written in compiled languages such as C++ and can be easily integrated into web applications or software pipelines.

  1. BlaSTorage: a fast package to parse, manage and store BLAST results

    Directory of Open Access Journals (Sweden)

    Orsini Massimiliano

    2013-01-01

    Full Text Available Abstract Background Large-scale sequence studies requiring BLAST-based analysis produce huge amounts of data to be parsed. BLAST parsers are available, but they are often missing some important features, such as keeping all information from the raw BLAST output, allowing direct access to single results, and performing logical operations over them. Findings We implemented BlaSTorage, a Python package that parses multi BLAST results and returns them in a purpose-built object-database format. Unlike other BLAST parsers, BlaSTorage retains and stores all parts of BLAST results, including alignments, without loss of information; a complete API allows access to all the data components. Conclusions BlaSTorage shows speed comparable to more basic parsers written in compiled languages such as C++ and can be easily integrated into web applications or software pipelines.
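To see what such parsing involves at its simplest, here is a minimal sketch (not BlaSTorage's API) that reads BLAST's tabular output format (`-outfmt 6`) into named records, keeping a few of the twelve default columns:

```python
from collections import namedtuple

# Illustrative parser for BLAST tabular output (-outfmt 6), whose default
# columns are: qseqid sseqid pident length mismatch gapopen
#              qstart qend sstart send evalue bitscore
Hit = namedtuple("Hit", "query subject identity length evalue bitscore")

def parse_blast_tab(lines):
    """Yield a Hit per non-comment line of tabular BLAST output."""
    hits = []
    for line in lines:
        if not line.strip() or line.startswith("#"):
            continue
        f = line.rstrip("\n").split("\t")
        hits.append(Hit(f[0], f[1], float(f[2]), int(f[3]),
                        float(f[10]), float(f[11])))
    return hits

sample = ["q1\ts1\t98.5\t120\t2\t0\t1\t120\t5\t124\t1e-50\t200.0"]
hits = parse_blast_tab(sample)
print(hits[0].subject, hits[0].evalue)  # s1 1e-50
```

BlaSTorage goes much further, retaining full alignments and supporting logical operations over stored results, which a flat record like this cannot express.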

  2. Learning for Semantic Parsing with Kernels under Various Forms of Supervision

    Science.gov (United States)

    2007-08-01

    Only fragments of this record's text survive extraction; the recoverable passage notes that Turkish has a larger number of unique tokens (36% more than English) due to its complex agglutinative morphology, which makes learning from its data more difficult. The remainder of the record is report documentation-page and table-of-contents residue.

  3. Automatic extraction of syntactic patterns for dependency parsing in noun phrase chunks

    Directory of Open Access Journals (Sweden)

    Mihaela Colhon

    2014-05-01

    Full Text Available In this article we present a method for automatic extraction of syntactic patterns that are used to develop a dependency parsing method. The patterns have been extracted from a corpus automatically annotated for tokens, sentences’ borders, parts of speech and noun phrases, and manually annotated for dependency relations between words. The evaluation shows promising results in the case of an order-free language.

  4. A Grammar Correction Algorithm – Deep Parsing and Minimal Corrections for a Grammar Checker

    OpenAIRE

    Clément, Lionel; Gerdes, Kim; Marlet, Renaud

    2009-01-01

    International audience; This article presents the central algorithm of an open system for grammar checking, based on deep parsing. The grammatical specification is a context-free grammar with flat feature structures. After a shared-forest analysis where feature agreement constraints are relaxed, error detection globally minimizes the number of corrections and alternative correct sentences are automatically proposed in an order of plausibility reflecting the number of changes made to the origi...

  5. On excursion increments in heartbeat dynamics

    International Nuclear Information System (INIS)

    Guzmán-Vargas, L.; Reyes-Ramírez, I.; Hernández-Pérez, R.

    2013-01-01

    We study correlation properties of excursion increments of heartbeat time series from healthy subjects and heart failure patients. We construct the excursion time based on the original heartbeat time series, representing the time employed by the walker to return to the local mean value. Next, the detrended fluctuation analysis and the fractal dimension method are applied to the magnitude and sign of the increments in the time excursions between successive excursions for the mentioned groups. Our results show that for magnitude series of excursion increments both groups display long-range correlations with similar correlation exponents, indicating that large (small) increments (decrements) are more likely to be followed by large (small) increments (decrements). For sign sequences and for both groups, we find that increments are short-range anti-correlated, which is noticeable under heart failure conditions
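The decomposition the authors apply, splitting an increment series into its magnitude and sign series before the correlation analysis, can be illustrated with a toy sketch (variable names and sample values are ours, not the authors'):

```python
# Split a series of excursion times into the increment magnitude and
# sign series that are then fed to DFA / fractal-dimension analysis.
def increment_series(x):
    inc = [b - a for a, b in zip(x, x[1:])]
    magnitude = [abs(d) for d in inc]
    sign = [(d > 0) - (d < 0) for d in inc]   # +1, 0 or -1
    return magnitude, sign

excursions = [3.0, 5.5, 4.0, 4.0, 7.0]        # toy excursion times
mag, sgn = increment_series(excursions)
print(mag)  # [2.5, 1.5, 0.0, 3.0]
print(sgn)  # [1, -1, 0, 1]
```

Long-range correlation in `mag` with anti-correlation in `sgn` is exactly the pattern the paper reports for both subject groups.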

  6. Building Program Models Incrementally from Informal Descriptions.

    Science.gov (United States)

    1979-10-01

    Building Program Models Incrementally from Informal Descriptions, by Brian P. McCune; technical report SCI.ICS.U.79.2, Stanford University, Department of Computer Science, October 1979; research sponsored by the Defense Advanced Research Projects Agency. The remainder of this record is report documentation-page residue.

  7. "gnparser": a powerful parser for scientific names based on Parsing Expression Grammar.

    Science.gov (United States)

    Mozzherin, Dmitry Y; Myltsev, Alexander A; Patterson, David J

    2017-05-26

    Scientific names in biology act as universal links. They allow us to cross-reference information about organisms globally. However, variations in the spelling of scientific names greatly diminish their ability to interconnect data. Such variations may include abbreviations, annotations, misspellings, etc. Authorship is a part of a scientific name and may also differ significantly. To match all possible variations of a name we need to divide them into their elements and classify each element according to its role. We refer to this as 'parsing' the name. Parsing categorizes a name's elements into those that are stable and those that are prone to change. Names are matched first by combining them according to their stable elements. Matches are then refined by examining their varying elements. This two-stage process dramatically improves the number and quality of matches. It is especially useful for automatic data exchange within the context of "Big Data" in biology. We introduce Global Names Parser (gnparser), a tool written in Scala (a language for the Java Virtual Machine) to parse scientific names. It is based on a Parsing Expression Grammar. The parser can be applied to scientific names of any complexity. It assigns a semantic meaning (such as genus name, species epithet, rank, year of publication, authorship, annotations, etc.) to all elements of a name. It is able to work with nested structures as in the names of hybrids. gnparser performs with ≈99% accuracy and processes 30 million name-strings/hour per CPU thread. The gnparser library is compatible with Scala, Java, R, Jython, and JRuby. The parser can be used as a command line application, as a socket server, a web-app or as a RESTful HTTP-service. It is released under an Open Source MIT license. Global Names Parser (gnparser) is a fast, high precision tool for biodiversity informaticians and biologists working with large numbers of scientific names. It can replace expensive and error
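As a toy illustration of what parsing a name into semantic elements means, the sketch below splits a simple binomial with a regular expression. This is not gnparser's grammar: its real PEG covers vastly more (hybrids, annotations, abbreviations, nested authorship):

```python
import re

# Illustrative split of a simple binomial into semantic elements:
# genus and epithet are the "stable" elements, authorship and year the
# "varying" ones that the matching process examines second.
NAME = re.compile(
    r"^(?P<genus>[A-Z][a-z]+)\s+"
    r"(?P<epithet>[a-z]+)"
    r"(?:\s+(?P<authorship>[A-Z][A-Za-z.\s]*?)"
    r"(?:,\s*(?P<year>\d{4}))?)?$"
)

def parse_name(s):
    """Return a dict of name elements, or None if unparseable."""
    m = NAME.match(s.strip())
    return m.groupdict() if m else None

print(parse_name("Homo sapiens Linnaeus, 1758"))
print(parse_name("Homo sapiens"))
```

Matching first on `genus` + `epithet` and only then comparing `authorship`/`year` is the two-stage strategy the abstract describes.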

  8. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic...... incrementalizing a broad range of static analyses....

  9. Syntactic parsing of clinical text: guideline and corpus development with handling ill-formed sentences.

    Science.gov (United States)

    Fan, Jung-wei; Yang, Elly W; Jiang, Min; Prasad, Rashmi; Loomis, Richard M; Zisook, Daniel S; Denny, Josh C; Xu, Hua; Huang, Yang

    2013-01-01

    To develop, evaluate, and share: (1) syntactic parsing guidelines for clinical text, with a new approach to handling ill-formed sentences; and (2) a clinical Treebank annotated according to the guidelines. To document the process and findings for readers with similar interest. Using random samples from a shared natural language processing challenge dataset, we developed a handbook of domain-customized syntactic parsing guidelines based on iterative annotation and adjudication between two institutions. Special considerations were incorporated into the guidelines for handling ill-formed sentences, which are common in clinical text. Intra- and inter-annotator agreement rates were used to evaluate consistency in following the guidelines. Quantitative and qualitative properties of the annotated Treebank, as well as its use to retrain a statistical parser, were reported. A supplement to the Penn Treebank II guidelines was developed for annotating clinical sentences. After three iterations of annotation and adjudication on 450 sentences, the annotators reached an F-measure agreement rate of 0.930 (while intra-annotator rate was 0.948) on a final independent set. A total of 1100 sentences from progress notes were annotated that demonstrated domain-specific linguistic features. A statistical parser retrained with combined general English (mainly news text) annotations and our annotations achieved an accuracy of 0.811 (higher than models trained purely with either general or clinical sentences alone). Both the guidelines and syntactic annotations are made available at https://sourceforge.net/projects/medicaltreebank. We developed guidelines for parsing clinical text and annotated a corpus accordingly. The high intra- and inter-annotator agreement rates showed decent consistency in following the guidelines. The corpus was shown to be useful in retraining a statistical parser that achieved moderate accuracy.

  10. Efficient Incremental Checkpointing of Java Programs

    DEFF Research Database (Denmark)

    Lawall, Julia Laetitia; Muller, Gilles

    2000-01-01

    This paper investigates the optimization of language-level checkpointing of Java programs. First, we describe how to systematically associate incremental checkpoints with Java classes. While being safe, the genericness of this solution induces substantial execution overhead. Second, to solve...

  11. VT Tax Increment Financing (TIF) Districts

    Data.gov (United States)

    Vermont Center for Geographic Information — Tax Increment Financing (TIF) Districts is established by a municipality around an area that requires public infrastructure to encourage public and private real...

  12. Deep Action Parsing in Videos With Large-Scale Synthesized Data.

    Science.gov (United States)

    Liu, Li; Zhou, Yi; Shao, Ling

    2018-06-01

    Action parsing in videos with complex scenes is an interesting but challenging task in computer vision. In this paper, we propose a generic 3D convolutional neural network in a multi-task learning manner for effective Deep Action Parsing (DAP3D-Net) in videos. Particularly, in the training phase, action localization, classification, and attributes learning can be jointly optimized on our appearance-motion data via DAP3D-Net. For an upcoming test video, we can describe each individual action in the video simultaneously as: Where the action occurs, What the action is, and How the action is performed. To well demonstrate the effectiveness of the proposed DAP3D-Net, we also contribute a new Numerous-category Aligned Synthetic Action data set, i.e., NASA, which consists of 200 000 action clips of over 300 categories and with 33 pre-defined action attributes in two hierarchical levels (i.e., low-level attributes of basic body part movements and high-level attributes related to action motion). We learn DAP3D-Net using the NASA data set and then evaluate it on our collected Human Action Understanding data set and the public THUMOS data set. Experimental results show that our approach can accurately localize, categorize, and describe multiple actions in realistic videos.

  13. (In)variability in the Samoan syntax/prosody interface and consequences for syntactic parsing

    Directory of Open Access Journals (Sweden)

    Kristine M. Yu

    2017-10-01

    Full Text Available While it has long been clear that prosody should be part of the grammar influencing the action of the syntactic parser, how to bring prosody into computational models of syntactic parsing has remained unclear. The challenge is that prosodic information in the speech signal is the result of the interaction of a multitude of conditioning factors. From this output, how can we factor out the contribution of syntax to conditioning prosodic events? And if we are able to do that factorization and define a production model from the syntactic grammar to a prosodified utterance, how can we then define a comprehension model based on that production model? In this case study of the Samoan morphosyntax-prosody interface, we show how to factor out the influence of syntax on prosody in empirical work and confirm there is invariable morphosyntactic conditioning of high edge tones. Then, we show how this invariability can be precisely characterized and used by a parsing model that factors the various influences of morphosyntax on tonal events. We expect that models of these kinds can be extended to more comprehensive perspectives on Samoan and to languages where the syntax/prosody coupling is more complex.

  14. Attribute And-Or Grammar for Joint Parsing of Human Pose, Parts and Attributes.

    Science.gov (United States)

    Park, Seyoung; Nie, Xiaohan; Zhu, Song-Chun

    2017-07-25

    This paper presents an attribute and-or grammar (A-AOG) model for jointly inferring human body pose and human attributes in a parse graph with attributes augmented to nodes in the hierarchical representation. In contrast to other popular methods in the current literature that train separate classifiers for poses and individual attributes, our method explicitly represents the decomposition and articulation of body parts, and accounts for the correlations between poses and attributes. The A-AOG model is an amalgamation of three traditional grammar formulations: (i) phrase structure grammar representing the hierarchical decomposition of the human body from whole to parts; (ii) dependency grammar modeling the geometric articulation by a kinematic graph of the body pose; and (iii) attribute grammar accounting for the compatibility relations between different parts in the hierarchy so that their appearances follow a consistent style. The parse graph outputs human detection, pose estimation, and attribute prediction simultaneously, which are intuitive and interpretable. We conduct experiments on two tasks on two datasets, and experimental results demonstrate the advantage of joint modeling in comparison with computing poses and attributes independently. Furthermore, our model obtains better performance over existing methods for both pose estimation and attribute prediction tasks.

  15. Acoustic landmarks drive delta-theta oscillations to enable speech comprehension by facilitating perceptual parsing.

    Science.gov (United States)

    Doelling, Keith B; Arnal, Luc H; Ghitza, Oded; Poeppel, David

    2014-01-15

    A growing body of research suggests that intrinsic neuronal slow (< 10 Hz) oscillations in auditory cortex appear to track incoming speech and other spectro-temporally complex auditory signals. Within this framework, several recent studies have identified critical-band temporal envelopes as the specific acoustic feature being reflected by the phase of these oscillations. However, how this alignment between speech acoustics and neural oscillations might underpin intelligibility is unclear. Here we test the hypothesis that the 'sharpness' of temporal fluctuations in the critical band envelope acts as a temporal cue to speech syllabic rate, driving delta-theta rhythms to track the stimulus and facilitate intelligibility. We interpret our findings as evidence that sharp events in the stimulus cause cortical rhythms to re-align and parse the stimulus into syllable-sized chunks for further decoding. Using magnetoencephalographic recordings, we show that by removing temporal fluctuations that occur at the syllabic rate, envelope-tracking activity is reduced. By artificially reinstating these temporal fluctuations, envelope-tracking activity is regained. These changes in tracking correlate with intelligibility of the stimulus. Together, the results suggest that the sharpness of fluctuations in the stimulus, as reflected in the cochlear output, drive oscillatory activity to track and entrain to the stimulus, at its syllabic rate. This process likely facilitates parsing of the stimulus into meaningful chunks appropriate for subsequent decoding, enhancing perception and intelligibility. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. Parsing partial molar volumes of small molecules: a molecular dynamics study.

    Science.gov (United States)

    Patel, Nisha; Dubins, David N; Pomès, Régis; Chalikian, Tigran V

    2011-04-28

    We used molecular dynamics (MD) simulations in conjunction with the Kirkwood-Buff theory to compute the partial molar volumes for a number of small solutes of various chemical natures. We repeated our computations using modified pair potentials, first, in the absence of the Coulombic term and, second, in the absence of the Coulombic and the attractive Lennard-Jones terms. Comparison of our results with experimental data and the volumetric results of Monte Carlo simulation with hard sphere potentials and scaled particle theory-based computations led us to conclude that, for small solutes, the partial molar volume computed with the Lennard-Jones potential in the absence of the Coulombic term nearly coincides with the cavity volume. On the other hand, MD simulations carried out with the pair interaction potentials containing only the repulsive Lennard-Jones term produce unrealistically large partial molar volumes of solutes that are close to their excluded volumes. Our simulation results are in good agreement with the reported schemes for parsing partial molar volume data on small solutes. In particular, our determined interaction volumes and the thickness of the thermal volume for individual compounds are in good agreement with empirical estimates. This work is the first computational study that supports and lends credence to the practical algorithms of parsing partial molar volume data that are currently in use for molecular interpretations of volumetric data.

  17. Process of 3D wireless decentralized sensor deployment using parsing crossover scheme

    Directory of Open Access Journals (Sweden)

    Albert H.R. Ko

    2015-07-01

    Full Text Available A Wireless Sensor Network (WSN) usually consists of numerous wireless devices deployed in a region of interest, each able to collect and process environmental information and communicate with neighboring devices. It can thus be regarded as a Multi-Agent System for territorial security, where individual agents cooperate with each other to avoid duplication of effort and to exploit other agents' capacities. The problem of sensor deployment becomes non-trivial when we consider environmental factors, such as terrain elevations. Because all sensors are homogeneous, the chromosomes that encode sensor positions are actually interchangeable, and conventional crossover schemes such as uniform crossover cause some redundancy as well as over-concentration in certain specific geographical areas. We propose a Parsing Crossover Scheme that intends to reduce redundancy and ease geographical concentration patterns in an effort to facilitate the search. The proposed parsing crossover method demonstrates better performance than uniform crossover under different terrain irregularities.
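The redundancy problem motivating the parsing crossover can be demonstrated directly: because sensor genes are interchangeable, two parents encoding the same deployment in different orders can produce a child with duplicated positions under uniform crossover. The sketch below uses toy 1-D positions (the paper works in 3-D terrain) and is our illustration, not the paper's algorithm:

```python
import random

# Uniform crossover picks each gene from one parent at random. With
# interchangeable sensor genes, parents that encode the SAME sensor set
# in different orders can yield a child that covers fewer positions.
def uniform_crossover(p1, p2, rng):
    return [g1 if rng.random() < 0.5 else g2 for g1, g2 in zip(p1, p2)]

rng = random.Random(0)
parent1 = [10, 20, 30, 40]
parent2 = [40, 30, 20, 10]      # same sensor set, reversed order

child = uniform_crossover(parent1, parent2, rng)
print(child, "duplicates:", len(child) != len(set(child)))
```

A parsing-style scheme avoids this by reasoning about the set of positions a chromosome encodes rather than treating gene slots independently.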

  18. Cuidado de enfermagem a pessoas com hipertensão fundamentado na teoria de Parse Atención de enfermería a personas con hipertensión basada en la teoría de Parse Nursing care to people with hypertension based on Parse's theory

    Directory of Open Access Journals (Sweden)

    Fabíola Vládia Freire da Silva

    2013-03-01

    Full Text Available This study proposes nursing care, based on Parse's principles, for people with hypertension attending consultations in the Family Health Strategy. It is a descriptive, qualitative study carried out from March to May 2011 with fourteen nurses in the municipality of Itapajé, Ceará. Semi-structured interviews were used to collect the information, and the subjects' discourse was used for the analysis. Three categories based on Parse's principles emerged: Multidimensionality of meanings - the nurse leads the person to report meanings; Synchronization of rhythms - the nurse helps to identify harmony and disharmony; Mobilization of transcendence - the nurse guides the plan of change. The nurses' discourse approached Parse's theory when they reported seeking humanized care, with family participation, appreciation of autonomy, and the use of health education with individual guidance. Implementing nursing care grounded in Parse's theory for people with hypertension proved feasible.

  19. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic...... programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...

  20. Incremental Integrity Checking: Limitations and Possibilities

    DEFF Research Database (Denmark)

    Christiansen, Henning; Martinenghi, Davide

    2005-01-01

    Integrity checking is an essential means for the preservation of the intended semantics of a deductive database. Incrementality is the only feasible approach to checking and can be obtained with respect to given update patterns by exploiting query optimization techniques. By reducing the problem...... to query containment, we show that no procedure exists that always returns the best incremental test (aka simplification of integrity constraints), and this according to any reasonable criterion measuring the checking effort. In spite of this theoretical limitation, we develop an effective procedure...

  1. The Smallest Grammar Problem as Constituents Choice and Minimal Grammar Parsing

    Directory of Open Access Journals (Sweden)

    Gabriel Infante-Lopez

    2011-10-01

    Full Text Available The smallest grammar problem—namely, finding a smallest context-free grammar that generates exactly one sequence—is of practical and theoretical importance in fields such as Kolmogorov complexity, data compression and pattern discovery. We propose a new perspective on this problem by splitting it into two tasks: (1) choosing which words will be the constituents of the grammar and (2) searching for the smallest grammar given this set of constituents. We show how to solve the second task in polynomial time by parsing longer constituents with smaller ones. We propose new algorithms based on classical practical algorithms that use this optimization to find small grammars. Our algorithms consistently find smaller grammars on a classical benchmark, reducing the size by 10% in some cases. Moreover, our formulation allows us to define interesting bounds on the number of small grammars and to empirically compare different grammars of small size.
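The core step of the second task, parsing a string as cheaply as possible over a fixed constituent set, can be sketched as a small dynamic program. This is an illustrative simplification of minimal grammar parsing (cost here is simply the number of symbols used on a right-hand side):

```python
# Given a fixed set of constituents, find the cheapest way to cover a
# string with a sequence of constituents and single characters.
def minimal_parse(s, constituents):
    n = len(s)
    best = [0] + [float("inf")] * n   # best[i]: symbols to cover s[:i]
    for i in range(1, n + 1):
        best[i] = best[i - 1] + 1     # fall back to a lone terminal
        for c in constituents:
            # does s[:i] end with constituent c?
            if s.endswith(c, 0, i):
                best[i] = min(best[i], best[i - len(c)] + 1)
    return best[n]

print(minimal_parse("abab", {"ab"}))         # 2: ab + ab
print(minimal_parse("ababa", {"ab", "ba"}))  # 3: e.g. ab + ab + a
```

Applying this parse to every constituent (longer ones parsed with smaller ones) yields the smallest grammar for the chosen constituent set in polynomial time, which is the paper's key observation.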

  2. PEG parsing in less space using progressive tabling and dynamic analysis

    DEFF Research Database (Denmark)

    Henglein, Fritz; Rasmussen, Ulrik Terp

    2017-01-01

    Tabular top-down parsing and its lazy variant, Packrat, are linear-time execution models for the TDPL family of recursive descent parsers with limited backtracking. Exponential work due to backtracking is avoided by tabulating the result of each (nonterminal, offset)-pair at the expense of always......-case constant and worst-case linear memory use. Furthermore, semantic actions are scheduled before the parser has seen the end of the input. The scheduling is conservative in the sense that no action has to be "undone" in the case of backtracking. The time complexity is O(dmn) where m is the size of the parser...... specification, n is the size of the input string, and d is either a configured constant or the maximum parser stack depth. For common data exchange formats such as JSON, we demonstrate practically constant space usage....
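The tabulation idea is easy to see in miniature: memoizing each (rule, offset) result means backtracking never re-parses the same suffix. The sketch below uses a toy PEG of our own, `S <- 'a' S 'b' / ''`, not an example from the paper:

```python
from functools import lru_cache

# Packrat in miniature: lru_cache plays the role of the (nonterminal,
# offset) table, so each position is parsed at most once per rule.
text = "aaabbb"

@lru_cache(maxsize=None)
def parse_S(i):
    """Return the offset after matching S at position i.
    PEG ordered choice: try 'a' S 'b' first, else the empty alternative,
    which always succeeds and consumes nothing."""
    if i < len(text) and text[i] == "a":
        j = parse_S(i + 1)
        if j < len(text) and text[j] == "b":
            return j + 1
    return i

ok = parse_S(0) == len(text)    # full match iff S consumes all input
print(ok)  # True
```

The paper's contribution is doing this with best-case constant rather than always-linear table space, by discarding entries the dynamic analysis proves unreachable.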

  3. A Python package for parsing, validating, mapping and formatting sequence variants using HGVS nomenclature.

    Science.gov (United States)

    Hart, Reece K; Rico, Rudolph; Hare, Emily; Garcia, John; Westbrook, Jody; Fusaro, Vincent A

    2015-01-15

    Biological sequence variants are commonly represented in scientific literature, clinical reports and databases of variation using the mutation nomenclature guidelines endorsed by the Human Genome Variation Society (HGVS). Despite the widespread use of the standard, no freely available and comprehensive programming libraries are available. Here we report an open-source and easy-to-use Python library that facilitates the parsing, manipulation, formatting and validation of variants according to the HGVS specification. The current implementation focuses on the subset of the HGVS recommendations that precisely describe sequence-level variation relevant to the application of high-throughput sequencing to clinical diagnostics. The package is released under the Apache 2.0 open-source license. Source code, documentation and issue tracking are available at http://bitbucket.org/hgvs/hgvs/. Python packages are available at PyPI (https://pypi.python.org/pypi/hgvs). Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
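For a flavor of the parsing problem the package solves, the sketch below handles only simple coding substitutions (e.g. `NM_000518.4:c.114G>A`) with a regular expression. It is our illustration, not the `hgvs` package's API, which covers the full nomenclature with validation and mapping:

```python
import re

# Toy parser for one corner of HGVS: coding-DNA substitutions of the
# form <accession>:c.<position><ref>><alt>.
SUB = re.compile(
    r"^(?P<ac>[A-Z]+_\d+(?:\.\d+)?):c\.(?P<pos>\d+)"
    r"(?P<ref>[ACGT])>(?P<alt>[ACGT])$"
)

def parse_substitution(hgvs_str):
    """Return the components of a simple c. substitution."""
    m = SUB.match(hgvs_str)
    if not m:
        raise ValueError(f"unsupported or malformed variant: {hgvs_str}")
    d = m.groupdict()
    d["pos"] = int(d["pos"])
    return d

v = parse_substitution("NM_000518.4:c.114G>A")
print(v["ac"], v["pos"], v["ref"], v["alt"])  # NM_000518.4 114 G A
```

Even this tiny subset shows why a shared, validated library matters: insertions, deletions, intronic offsets and protein-level descriptions each add syntax a hand-rolled regex quickly fails to cover.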

  4. [Causality in objective world: Directed Acyclic Graphs-based structural parsing].

    Science.gov (United States)

    Zheng, Y J; Zhao, N Q; He, Y N

    2018-01-10

    The overall details of causality frames in the objective world remain obscure, which poses difficulty for causality research. Based on the temporality of cause and effect, the objective world is divided into three time zones and two time points, in which the causal relationships of the variables are parsed by using Directed Acyclic Graphs (DAGs). The causal DAG of the world (or causal web) is composed of two parts. One is basic or core to the whole DAG, formed by the combination of any one variable originating from each time unit mentioned above; here the cause effect is affected by confounding only. The other is an internal DAG within each time unit representing a parent-child or ancestor-descendant relationship, which exhibits a structure similar to confounding. This paper summarizes the construction of causality frames for objective world research (causal DAGs) and clarifies a structural basis for the control of confounding in effect estimation.

  5. Parsing Heterogeneity in the Brain Connectivity of Depressed and Healthy Adults During Positive Mood.

    Science.gov (United States)

    Price, Rebecca B; Lane, Stephanie; Gates, Kathleen; Kraynak, Thomas E; Horner, Michelle S; Thase, Michael E; Siegle, Greg J

    2017-02-15

    There is well-known heterogeneity in affective mechanisms in depression that may extend to positive affect. We used data-driven parsing of neural connectivity to reveal subgroups present across depressed and healthy individuals during positive processing, informing targets for mechanistic intervention. Ninety-two individuals (68 depressed patients, 24 never-depressed control subjects) completed a sustained positive mood induction during functional magnetic resonance imaging. Directed functional connectivity paths within a depression-relevant network were characterized using Group Iterative Multiple Model Estimation (GIMME), a method shown to accurately recover the direction and presence of connectivity paths in individual participants. During model selection, individuals were clustered using community detection on neural connectivity estimates. Subgroups were externally tested across multiple levels of analysis. Two connectivity-based subgroups emerged: subgroup A, characterized by weaker connectivity overall, and subgroup B, exhibiting hyperconnectivity (relative to subgroup A), particularly among ventral affective regions. Subgroup predicted diagnostic status (subgroup B contained 81% of patients; 50% of control subjects; χ² = 8.6, p = .003) and default mode network connectivity during a separate resting-state task. Among patients, subgroup B members had higher self-reported symptoms, lower sustained positive mood during the induction, and higher negative bias on a reaction-time task. Symptom-based depression subgroups did not predict these external variables. Neural connectivity-based categorization travels with diagnostic category and is clinically predictive, but not clinically deterministic. Both patients and control subjects showed heterogeneous, and overlapping, profiles. The larger and more severely affected patient subgroup was characterized by ventrally driven hyperconnectivity during positive processing. Data-driven parsing suggests heterogeneous

  6. Excemplify: A Flexible Template Based Solution, Parsing and Managing Data in Spreadsheets for Experimentalists

    Directory of Open Access Journals (Sweden)

    Shi Lei

    2013-06-01

    Full Text Available In systems biology, quantitative experimental data are the basis for building mathematical models. In most cases, these data are stored in Excel files and hosted locally. A public database for collecting, retrieving and citing experimental raw data as well as experimental conditions is important for both experimentalists and modelers. However, the great effort needed for data handling and data submission is the crucial limitation keeping experimentalists from contributing to a database, thereby impeding the database from delivering its benefit. Moreover, the manual copy-and-paste operations commonly used in those procedures increase the chance of making mistakes. Excemplify, a web-based application, proposes a flexible and adaptable template-based solution to these problems. Unlike the usual template-based uploading approach supported by some public databases, which predefines a format that is potentially impractical, Excemplify allows users to create their own experiment-specific content templates for the different experiment stages and to build corresponding knowledge bases for parsing. Utilizing the knowledge embedded in the templates used, Excemplify is able to parse experimental data from the initial setup stage and to generate the spreadsheets for the following stages automatically. The proposed solution standardizes the flow of data according to the standard procedures of the experiment, cuts down the amount of manual effort and reduces the chance of mistakes caused by manual data handling. In addition, it maintains the context of metadata from the initial preparation manuscript and improves data consistency. It interoperates with and complements RightField and SEEK.

  7. Theory of Single Point Incremental Forming

    DEFF Research Database (Denmark)

    Martins, P.A.F.; Bay, Niels; Skjødt, Martin

    2008-01-01

    This paper presents a closed-form theoretical analysis modelling the fundamentals of single point incremental forming and explaining the experimental and numerical results available in the literature for the past couple of years. The model is based on membrane analysis with bi-directional in-plan...

  8. Incremental Integrity Checking: Limitations and Possibilities

    DEFF Research Database (Denmark)

    Christiansen, Henning; Martinenghi, Davide

    2005-01-01

    Integrity checking is an essential means for the preservation of the intended semantics of a deductive database. Incrementality is the only feasible approach to checking and can be obtained with respect to given update patterns by exploiting query optimization techniques. By reducing the problem...

  9. The Cognitive Underpinnings of Incremental Rehearsal

    Science.gov (United States)

    Varma, Sashank; Schleisman, Katrina B.

    2014-01-01

    Incremental rehearsal (IR) is a flashcard technique that has been developed and evaluated by school psychologists. We discuss potential learning and memory effects from cognitive psychology that may explain the observed superiority of IR over other flashcard techniques. First, we propose that IR is a form of "spaced practice" that…

  10. The analysis of the possibles through Parse's theory on a person with Alzheimer

    Directory of Open Access Journals (Sweden)

    Virtudes Rodero-Sánchez

    2006-11-01

    Full Text Available When a person comes to the health care system with a health problem, he or she will often be asked to change habits and lifestyle. This demand usually takes the form of a compromise-pact established between the person and the professional. We have observed that, although the professional strives to frame this pact in realistic objectives, it too often ends in frustration, above all in settings of chronic illness. Parse's theory offers a different way to approach change. In Parse's theory, The Human Becoming, the possibles are the expression of power, understood as a unique way of transformation that consists in advancing with the person's hopes, desires and projects. We propose: first, an analysis of the elements of what Parse calls her third principle, co-transcendence with the possibles; second, an analysis of the possibles from this frame of reference through a narrative; and finally, nursing practice.

  11. Evolving effective incremental SAT solvers with GP

    OpenAIRE

    Bader, Mohamed; Poli, R.

    2008-01-01

    Hyper-heuristics can be defined simply as heuristics for choosing other heuristics: a way of combining existing heuristics to generate new ones. We use a hyper-heuristic framework to evolve effective incremental (Inc*) solvers for SAT. We test the evolved heuristics (IncHH) against other known local search heuristics on a variety of benchmark SAT problems.

  12. Sustained mahogany (Swietenia macrophylla) plantation heartwood increment.

    Science.gov (United States)

    Frank H. Wadsworth; Edgardo. Gonzalez

    2008-01-01

    In a search for an increment-based rotation for plantation mahogany (Swietenia macrophylla King), heartwood volume per tree was regressed on DBH (trunk diameter outside bark at 1.4 m above the ground) and merchantable height measurements. We updated a previous study [Wadsworth, F.H., González González, E., Figuera Colón, J.C., Lugo P...

  13. Emotion regulation during threat: Parsing the time course and consequences of safety signal processing

    Science.gov (United States)

    HEFNER, KATHRYN R.; VERONA, EDELYN; CURTIN, JOHN. J.

    2017-01-01

    Improved understanding of fear inhibition processes can inform the etiology and treatment of anxiety disorders. Safety signals can reduce fear to threat, but precise mechanisms remain unclear. Safety signals may acquire attentional salience and affective properties (e.g., relief) independent of the threat; alternatively, safety signals may only hold affective value in the presence of simultaneous threat. To clarify such mechanisms, an experimental paradigm assessed independent processing of threat and safety cues. Participants viewed a series of red and green words from two semantic categories. Shocks were administered following red words (cue+). No shocks followed green words (cue−). Words from one category were defined as safety signals (SS); no shocks were administered on cue+ trials. Words from the other (control) category did not provide information regarding shock administration. Threat (cue+ vs. cue−) and safety (SS+ vs. SS−) were fully crossed. Startle response and ERPs were recorded. Startle response was increased during cue+ versus cue−. Safety signals reduced startle response during cue+, but had no effect on startle response during cue−. ERP analyses (PD130 and P3) suggested that participants parsed threat and safety signal information in parallel. Motivated attention was not associated with safety signals in the absence of threat. Overall, these results confirm that fear can be reduced by safety signals. Furthermore, safety signals do not appear to hold inherent hedonic salience independent of their effect during threat. Instead, safety signals appear to enable participants to engage in effective top-down emotion regulatory processes. PMID:27088643

  14. Parsing heuristic and forward search in first-graders' game-play behavior.

    Science.gov (United States)

    Paz, Luciano; Goldin, Andrea P; Diuk, Carlos; Sigman, Mariano

    2015-07-01

    Seventy-three children between 6 and 7 years of age were presented with a problem having ambiguous subgoal ordering. Performance in this task showed reliable fingerprints: (a) a non-monotonic dependence of performance as a function of the distance between the beginning and the end-states of the problem, (b) very high levels of performance when the first move was correct, and (c) states in which accuracy of the first move was significantly below chance. These features are consistent with a non-Markov planning agent, with an inherently inertial decision process, and that uses heuristics and partial problem knowledge to plan its actions. We applied a statistical framework to fit and test the quality of a proposed planning model (Monte Carlo Tree Search). Our framework allows us to parse out independent contributions to problem-solving based on the construction of the value function and on general mechanisms of the search process in the tree of solutions. We show that the latter are correlated with children's performance on an independent measure of planning, while the former is highly domain specific. Copyright © 2014 Cognitive Science Society, Inc.

  15. Parsing pyrogenic polycyclic aromatic hydrocarbons: forensic chemistry, receptor models, and source control policy.

    Science.gov (United States)

    O'Reilly, Kirk T; Pietari, Jaana; Boehm, Paul D

    2014-04-01

    A realistic understanding of contaminant sources is required to set appropriate control policy. Forensic chemical methods can be powerful tools in source characterization and identification, but they require a multiple-lines-of-evidence approach. Atmospheric receptor models, such as the US Environmental Protection Agency (USEPA)'s chemical mass balance (CMB), are increasingly being used to evaluate sources of pyrogenic polycyclic aromatic hydrocarbons (PAHs) in sediments. This paper describes the assumptions underlying receptor models and discusses challenges in complying with these assumptions in practice. Given the variability within, and the similarity among, pyrogenic PAH source types, model outputs are sensitive to specific inputs, and parsing among some source types may not be possible. Although still useful for identifying potential sources, the technical specialist applying these methods must describe both the results and their inherent uncertainties in a way that is understandable to nontechnical policy makers. The authors present an example case study concerning an investigation of a class of parking-lot sealers as a significant source of PAHs in urban sediment. Principal component analysis is used to evaluate published CMB model inputs and outputs. Targeted analyses of 2 areas where bans have been implemented are included. The results do not support the claim that parking-lot sealers are a significant source of PAHs in urban sediments. © 2013 SETAC.

  16. Systematic Luby Transform codes as incremental redundancy scheme

    CSIR Research Space (South Africa)

    Grobler, TL

    2011-09-01

    Full Text Available Systematic Luby Transform (fountain) codes are investigated as a possible incremental redundancy scheme for EDGE. The convolutional incremental redundancy scheme currently used by EDGE is replaced by the fountain approach. The results...

  17. Incremental product development : four essays on activities, resources, and actors

    OpenAIRE

    Olsen, Nina Veflen

    2006-01-01

    Most innovations are incremental, and incremental innovations play an important role for the firm. In spite of that, traditional NPD studies most often emphasize moderate to highly innovative product development projects. In this dissertation the overall objective is to increase our understanding of incremental innovation. The dissertation is organized around four essays that emphasize different aspects of incremental innovation. NPD in hotels, retailers and food manufact...

  18. Incremental Nonnegative Matrix Factorization for Face Recognition

    Directory of Open Access Journals (Sweden)

    Wen-Sheng Chen

    2008-01-01

    Full Text Available Nonnegative matrix factorization (NMF) is a promising approach for local feature extraction in face recognition tasks. However, there are two major drawbacks in almost all existing NMF-based methods. One shortcoming is that the computational cost is expensive for large matrix decomposition. The other is that they must conduct repetitive learning when the training samples or classes are updated. To overcome these two limitations, this paper proposes a novel incremental nonnegative matrix factorization (INMF) for face representation and recognition. The proposed INMF approach is based on a novel constraint criterion and our previous block strategy. It thus has some good properties, such as low computational complexity and a sparse coefficient matrix. Also, the coefficient column vectors between different classes are orthogonal. In particular, it can be applied to incremental learning. Two face databases, namely the FERET and CMU PIE face databases, are selected for evaluation. Compared with PCA and some state-of-the-art NMF-based methods, our INMF approach gives the best performance.
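The INMF update rules themselves are not given in the abstract. As a generic sketch of the incremental idea — absorbing a new sample without re-factorizing the whole training matrix — here is how a new face vector can be encoded against a fixed nonnegative basis W using the standard NMF multiplicative updates (toy data; the paper's constraint criterion and block strategy are not reproduced):

```python
# Sketch: encode a new sample against a fixed nonnegative basis W via
# standard multiplicative updates, h <- h * (W^T v) / (W^T W h).
# This illustrates the incremental idea only; it is not the INMF
# algorithm of the paper.

def encode(W, v, iters=200, eps=1e-9):
    """Find nonnegative h approximately minimizing ||v - W h||, W fixed."""
    n, r = len(W), len(W[0])
    h = [1.0] * r                       # nonnegative initialization
    for _ in range(iters):
        num = [sum(W[i][k] * v[i] for i in range(n)) for k in range(r)]
        Wh = [sum(W[i][k] * h[k] for k in range(r)) for i in range(n)]
        den = [sum(W[i][k] * Wh[i] for i in range(n)) for k in range(r)]
        h = [h[k] * num[k] / (den[k] + eps) for k in range(r)]
    return h

def reconstruct(W, h):
    return [sum(W[i][k] * h[k] for k in range(len(h))) for i in range(len(W))]

# Hypothetical toy basis with two "parts"; v mixes them.
W = [[1.0, 0.0],
     [1.0, 0.0],
     [0.0, 1.0]]
v = [2.0, 2.0, 3.0]
h = encode(W, v)
print([round(x, 2) for x in reconstruct(W, h)])  # close to v
```

In a full incremental setting the basis W would also be refined as samples arrive; the encoding step above is the part that avoids repeating the whole decomposition.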

  19. Incremental Scheduling Engines: Cost Savings through Automation

    Science.gov (United States)

    Jaap, John; Phillips, Shaun

    2005-01-01

    As humankind embarks on longer space missions farther from home, the requirements and environments for scheduling the activities performed on these missions are changing. As we begin to prepare for these missions it is appropriate to evaluate the merits and applicability of the different types of scheduling engines. Scheduling engines temporally arrange tasks onto a timeline so that all constraints and objectives are met and resources are not over-booked. Scheduling engines used to schedule space missions fall into three general categories: batch, mixed-initiative, and incremental. This paper presents an assessment of the engine types, a discussion of the impact of human exploration of the moon and Mars on planning and scheduling, and the applicability of the different types of scheduling engines. This paper will pursue the hypothesis that incremental scheduling engines may have a place in the new environment; they have the potential to reduce cost, to improve the satisfaction of those who execute or benefit from a particular timeline (the customers), and to allow astronauts to plan their own tasks and those of their companion robots.

  20. ENERGY SYSTEM CONTRIBUTIONS DURING INCREMENTAL EXERCISE TEST

    Directory of Open Access Journals (Sweden)

    Rômulo Bertuzzi

    2013-09-01

    Full Text Available The main purpose of this study was to determine the relative contributions of the aerobic and glycolytic systems during an incremental exercise test (IET). Ten male recreational long-distance runners performed an IET consisting of three-minute incremental stages on a treadmill. The fractions of the contributions of the aerobic and glycolytic systems were calculated for each stage based on the oxygen uptake and on the oxygen energy equivalents derived from blood lactate accumulation, respectively. Total metabolic demand (WTOTAL) was considered as the sum of these two energy systems. The aerobic (WAER) and glycolytic (WGLYCOL) system contributions were expressed as a percentage of WTOTAL. The results indicated that WAER (86-95%) was significantly higher than WGLYCOL (5-14%) throughout the IET (p < 0.05). In addition, there was no evidence of the sudden increase in WGLYCOL that has been previously reported to support the "anaerobic threshold" concept. These data suggest that aerobic metabolism is predominant throughout the IET and that the energy system contributions undergo a slow transition from low to high intensity.
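The partitioning described above reduces to simple arithmetic once both systems are expressed in common oxygen units. A minimal sketch with hypothetical stage values (the 3 ml O2·kg⁻¹ per mmol·L⁻¹ lactate-to-oxygen equivalent is a commonly used approximation; the study's exact computation may differ):

```python
# Sketch of partitioning one stage's energy into aerobic and glycolytic
# fractions. All inputs are hypothetical; the lactate-O2 equivalent
# (3 ml O2/kg per mmol/L of net lactate accumulation) is a commonly
# used approximation from the exercise-physiology literature.

LACTATE_O2_EQUIV = 3.0  # ml O2 per kg, per mmol/L net lactate

def stage_contributions(vo2_ml_per_kg, delta_lactate_mmol):
    w_aer = vo2_ml_per_kg                         # aerobic: measured oxygen uptake
    w_glycol = LACTATE_O2_EQUIV * delta_lactate_mmol  # glycolytic: lactate equivalent
    total = w_aer + w_glycol                      # WTOTAL = WAER + WGLYCOL
    return 100.0 * w_aer / total, 100.0 * w_glycol / total

aer, gly = stage_contributions(vo2_ml_per_kg=120.0, delta_lactate_mmol=2.0)
print(f"aerobic {aer:.1f}%  glycolytic {gly:.1f}%")
```

With these placeholder numbers the aerobic share lands in the 86-95% band reported above.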

  1. Incremental learning for automated knowledge capture

    Energy Technology Data Exchange (ETDEWEB)

    Benz, Zachary O. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Basilico, Justin Derrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Davis, Warren Leon [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dixon, Kevin R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Brian S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martin, Nathaniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wendt, Jeremy Daniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-12-01

    People responding to high-consequence national-security situations need tools to help them make the right decision quickly. The dynamic, time-critical, and ever-changing nature of these situations, especially those involving an adversary, requires models of decision support that can dynamically react as a situation unfolds and changes. Automated knowledge capture is a key part of creating individualized models of decision making in many situations because it has been demonstrated as a very robust way to populate computational models of cognition. However, existing automated knowledge capture techniques only populate a knowledge model with data prior to its use, after which the knowledge model is static and unchanging. In contrast, humans, including our national-security adversaries, continually learn, adapt, and create new knowledge as they make decisions and witness their effect. This artificial dichotomy between creation and use exists because the majority of automated knowledge capture techniques are based on traditional batch machine-learning and statistical algorithms. These algorithms are primarily designed to optimize the accuracy of their predictions and only secondarily, if at all, concerned with issues such as speed, memory use, or ability to be incrementally updated. Thus, when new data arrives, batch algorithms used for automated knowledge capture currently require significant recomputation, frequently from scratch, which makes them ill-suited for use in dynamic, time-critical, high-consequence decision making environments. In this work we seek to explore and expand upon the capabilities of dynamic, incremental models that can adapt to an ever-changing feature space.

  2. Enabling Incremental Query Re-Optimization.

    Science.gov (United States)

    Liu, Mengmeng; Ives, Zachary G; Loo, Boon Thau

    2016-01-01

    As declarative query processing techniques expand to the Web, data streams, network routers, and cloud platforms, there is an increasing need to re-plan execution in the presence of unanticipated performance changes. New runtime information may affect which query plan we prefer to run. Adaptive techniques require innovation both in terms of the algorithms used to estimate costs, and in terms of the search algorithm that finds the best plan. We investigate how to build a cost-based optimizer that recomputes the optimal plan incrementally given new cost information, much as a stream engine constantly updates its outputs given new data. Our implementation especially shows benefits for stream processing workloads. It lays the foundations upon which a variety of novel adaptive optimization algorithms can be built. We start by leveraging the recently proposed approach of formulating query plan enumeration as a set of recursive datalog queries; we develop a variety of novel optimization approaches to ensure effective pruning in both static and incremental cases. We further show that the lessons learned in the declarative implementation can be equally applied to more traditional optimizer implementations.

  3. [Incremental cost effectiveness of multifocal cataract surgery].

    Science.gov (United States)

    Pagel, N; Dick, H B; Krummenauer, F

    2007-02-01

    Supplementation of cataract patients with multifocal intraocular lenses involves an additional financial investment when compared to the corresponding monofocal supplementation, which usually is not funded by German health care insurers. In the context of recent resource allocation discussions, however, the cost effectiveness of multifocal cataract surgery could become an important rationale. Therefore an evidence-based estimation of its cost effectiveness was carried out. Three independent meta-analyses were implemented to estimate the gain in uncorrected near visual acuity and best corrected visual acuity (vision lines) as well as the predictability (fraction of patients without need for reading aids) of multifocal supplementation. Study reports published between 1995 and 2004 (English or German language) were screened for appropriate key words. Meta effects in visual gain and predictability were estimated by means and standard deviations of the reported effect measures. Cost data were estimated from German DRG rates and individual lens costs; the cost effectiveness of multifocal cataract surgery was then computed in terms of its marginal cost effectiveness ratio (MCER) for each clinical benefit endpoint; the incremental costs of multifocal versus monofocal cataract surgery were further estimated by means of their respective incremental cost effectiveness ratio (ICER). An independent meta-analysis estimated the complication profiles to be expected after monofocal and multifocal cataract surgery in order to evaluate expectable complication-associated additional costs of both procedures; the marginal and incremental cost effectiveness estimates were adjusted accordingly. A sensitivity analysis comprised cost variations of +/- 10 % and utility variations alongside the meta effect estimate's 95 % confidence intervals. Total direct costs from the health care insurer's perspective were estimated at 3363 euro, associated with a visual meta benefit in best corrected visual
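The two ratios used here follow their standard definitions (MCER = cost per unit of benefit; ICER = incremental cost per incremental unit of benefit). A minimal sketch, in which all inputs except the 3363-euro figure quoted above are hypothetical placeholders:

```python
# Sketch of the standard cost-effectiveness ratios used in the study.
# The monofocal cost and both effect sizes below are hypothetical
# placeholders, not the study's data.

def mcer(cost, effect):
    """Marginal cost-effectiveness ratio: cost per unit of clinical benefit."""
    return cost / effect

def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio between two interventions."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical: multifocal vs monofocal surgery, benefit in vision lines.
print(mcer(3363.0, 4.0))                # euro per vision line gained
print(icer(3363.0, 2500.0, 4.0, 2.5))   # euro per additional line vs monofocal
```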

  4. Online Object Tracking, Learning and Parsing with And-Or Graphs.

    Science.gov (United States)

    Wu, Tianfu; Lu, Yang; Zhu, Song-Chun

    2017-12-01

    This paper presents a method, called AOGTracker, for simultaneous tracking, learning and parsing (TLP) of unknown objects in video sequences with a hierarchical and compositional And-Or graph (AOG) representation. The TLP method is formulated in the Bayesian framework with spatial and temporal dynamic programming (DP) algorithms inferring object bounding boxes on-the-fly. During online learning, the AOG is discriminatively learned using latent SVM [1] to account for appearance (e.g., lighting and partial occlusion) and structural (e.g., different poses and viewpoints) variations of a tracked object, as well as distractors (e.g., similar objects) in the background. Three key issues in online inference and learning are addressed: (i) maintaining purity of positive and negative examples collected online, (ii) controlling model complexity in latent structure learning, and (iii) identifying critical moments to re-learn the structure of the AOG based on its intrackability. The intrackability measures the uncertainty of an AOG based on its score maps in a frame. In experiments, our AOGTracker is tested on two popular tracking benchmarks with the same parameter setting: the TB-100/50/CVPR2013 benchmarks [3], and the VOT benchmarks [4]: VOT2013, 2014, 2015 and TIR2015 (thermal imagery tracking). In the former, our AOGTracker outperforms state-of-the-art tracking algorithms including two trackers based on deep convolutional networks [5], [6]. In the latter, our AOGTracker outperforms all other trackers in VOT2013 and is comparable to the state-of-the-art methods in VOT2014, 2015 and TIR2015.

  5. Incremental nonlinear dimensionality reduction by manifold learning.

    Science.gov (United States)

    Law, Martin H C; Jain, Anil K

    2006-03-01

    Understanding the structure of multidimensional patterns, especially in unsupervised cases, is of fundamental importance in data mining, pattern recognition, and machine learning. Several algorithms have been proposed to analyze the structure of high-dimensional data based on the notion of manifold learning. These algorithms have been used to extract the intrinsic characteristics of different types of high-dimensional data by performing nonlinear dimensionality reduction. Most of these algorithms operate in a "batch" mode and cannot be efficiently applied when data are collected sequentially. In this paper, we describe an incremental version of ISOMAP, one of the key manifold learning algorithms. Our experiments on synthetic data as well as real world images demonstrate that our modified algorithm can maintain an accurate low-dimensional representation of the data in an efficient manner.

  6. Efficient Incremental Checkpointing of Java Programs

    DEFF Research Database (Denmark)

    Lawall, Julia Laetitia; Muller, Gilles

    2000-01-01

    This paper investigates the optimization of language-level checkpointing of Java programs. First, we describe how to systematically associate incremental checkpoints with Java classes. While being safe, the genericness of this solution induces substantial execution overhead. Second, to solve the dilemma of genericness versus performance, we use automatic program specialization to transform the generic checkpointing methods into highly optimized ones. Specialization exploits two kinds of information: (i) structural properties about the program classes, (ii) knowledge of unmodified data structures in specific program phases. The latter information allows us to generate phase-specific checkpointing methods. We evaluate our approach on two benchmarks, a realistic application which consists of a program analysis engine, and a synthetic program which can serve as a metric. Specialization gives a speedup
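The paper's mechanism is Java- and specialization-specific, but the core idea of incremental checkpointing — saving only state modified since the last checkpoint — can be sketched language-neutrally. A hypothetical dirty-field tracker, not the paper's implementation:

```python
# Sketch of incremental checkpointing via dirty-field tracking: each
# checkpoint records only the attributes modified since the previous
# one. Generic illustration only; the paper instead specializes
# per-class checkpointing methods in Java.

class Checkpointable:
    def __init__(self):
        object.__setattr__(self, "_dirty", set())

    def __setattr__(self, name, value):
        object.__setattr__(self, name, value)
        self._dirty.add(name)           # remember what changed

    def checkpoint(self):
        """Return a delta of fields changed since the last checkpoint."""
        delta = {name: getattr(self, name) for name in self._dirty}
        self._dirty.clear()
        return delta

    def restore(self, delta):
        """Apply a saved delta without marking fields dirty."""
        for name, value in delta.items():
            object.__setattr__(self, name, value)

obj = Checkpointable()
obj.x, obj.y = 1, 2
full = obj.checkpoint()   # first checkpoint: both fields
obj.x = 10
inc = obj.checkpoint()    # incremental checkpoint: only the modified field
print(sorted(full), inc)
```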

  7. Parsing social network survey data from hidden populations using stochastic context-free grammars.

    Directory of Open Access Journals (Sweden)

    Art F Y Poon

    Full Text Available BACKGROUND: Human populations are structured by social networks, in which individuals tend to form relationships based on shared attributes. Certain attributes that are ambiguous, stigmatized or illegal can create a "hidden" population, so-called because its members are difficult to identify. Many hidden populations are also at an elevated risk of exposure to infectious diseases. Consequently, public health agencies are presently adopting modern survey techniques that traverse social networks in hidden populations by soliciting individuals to recruit their peers, e.g., respondent-driven sampling (RDS). The concomitant accumulation of network-based epidemiological data, however, is rapidly outpacing the development of computational methods for analysis. Moreover, current analytical models rely on unrealistic assumptions, e.g., that the traversal of social networks can be modeled by a Markov chain rather than a branching process. METHODOLOGY/PRINCIPAL FINDINGS: Here, we develop a new methodology based on stochastic context-free grammars (SCFGs), which are well-suited to modeling the tree-like structure of the RDS recruitment process. We apply this methodology to an RDS case study of injection drug users (IDUs) in Tijuana, México, a hidden population at high risk of blood-borne and sexually-transmitted infections (i.e., HIV, hepatitis C virus, syphilis). Survey data were encoded as text strings that were parsed using our custom implementation of the inside-outside algorithm in a publicly-available software package (HyPhy), which uses either expectation maximization or direct optimization methods and permits constraints on model parameters for hypothesis testing.
We identified significant latent variability in the recruitment process that violates assumptions of Markov chain-based methods for RDS analysis: firstly, IDUs tended to emulate the recruitment behavior of their own recruiter; and secondly, the recruitment of like peers (homophily was dependent on
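The inside-outside machinery referenced above can be illustrated with the inside pass alone on a toy grammar in Chomsky normal form (a hypothetical grammar for illustration, not the authors' recruitment-tree model or the HyPhy implementation):

```python
# Sketch of the inside algorithm for a stochastic context-free grammar
# in Chomsky normal form: inside[(i, j, A)] is the probability that
# nonterminal A derives the substring s[i:j]. Toy grammar only.

from collections import defaultdict

binary = {("S", ("A", "B")): 1.0}             # A -> B C rules with probabilities
lexical = {("A", "a"): 0.7, ("A", "b"): 0.3,  # A -> terminal rules
           ("B", "b"): 1.0}

def inside_probability(s, start="S"):
    n = len(s)
    inside = defaultdict(float)               # (i, j, nonterminal) -> probability
    for i, ch in enumerate(s):                # length-1 spans from lexical rules
        for (nt, term), p in lexical.items():
            if term == ch:
                inside[(i, i + 1, nt)] += p
    for span in range(2, n + 1):              # longer spans from binary rules
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):         # split point
                for (nt, (left, right)), p in binary.items():
                    inside[(i, j, nt)] += (p * inside[(i, k, left)]
                                             * inside[(k, j, right)])
    return inside[(0, n, start)]

print(inside_probability("ab"))  # 0.7: S -> A B, A -> 'a' (0.7), B -> 'b' (1.0)
```

The outside pass, combined with this one, yields the expected rule counts that expectation maximization uses to re-estimate the rule probabilities.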

  8. ONLINE ARABIC-INDONESIAN DICTIONARY WITH SYLLABLE DECOMPOSITION USING A PARSING METHOD

    Directory of Open Access Journals (Sweden)

    Anny Yuniarti

    2004-01-01

    Full Text Available The need of Indonesian Muslims for tools that support learning Arabic is still not optimally met. Arabic dictionaries on the market are difficult to use because knowledge of Arabic grammar is limited among the Muslim community. In this research, software was developed that translates Arabic words using a parsing method, so that it can also handle words that have been inflected away from their base forms. Because an Arabic word can have a large number of derived forms, and to keep the dictionary efficient, not all derivations are stored in the database. A method is therefore needed to recognize word patterns and to recover the base form of a word. The software is implemented as a web application, making it easy for users to access without installing any software or a particular operating system. Development started with the design of the process and the interface; the design was then implemented as ready-to-use software. The finished software has been tested against the requirements specification

  9. Performance Evaluation of Incremental K-means Clustering Algorithm

    OpenAIRE

    Chakraborty, Sanjay; Nagwani, N. K.

    2014-01-01

    The incremental K-means clustering algorithm has already been proposed and analysed in [Chakraborty and Nagwani, 2011]. It is an innovative approach applicable in periodically incremental environments and to handling bulk updates. In this paper a performance evaluation of this incremental K-means clustering algorithm is carried out using an air pollution database. This paper also compares the performance evaluations of the existing K-means clustering and i...
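The abstract does not spell out the update step, but the generic online variant of K-means gives the flavor of incremental clustering: each arriving point is assigned to its nearest centroid, which is then shifted by a running mean. A minimal sketch (not necessarily the exact algorithm of Chakraborty and Nagwani, 2011):

```python
# Sketch of an incremental (online) K-means step: assign each new point
# to its nearest centroid, then update that centroid with a running mean
# c <- c + (x - c) / n. Generic online variant for illustration only.

def nearest(centroids, point):
    return min(range(len(centroids)),
               key=lambda i: sum((c - p) ** 2
                                 for c, p in zip(centroids[i], point)))

def update(centroids, counts, point):
    i = nearest(centroids, point)
    counts[i] += 1
    centroids[i] = [c + (p - c) / counts[i]          # running-mean update
                    for c, p in zip(centroids[i], point)]
    return centroids, counts

centroids = [[0.0, 0.0], [10.0, 10.0]]   # hypothetical initial centroids
counts = [1, 1]
for point in [[1.0, 1.0], [9.0, 9.0], [0.0, 2.0]]:
    centroids, counts = update(centroids, counts, point)
print(centroids)
```

Because each point touches only one centroid, a periodic batch of updates costs O(batch size × k) rather than a full re-clustering.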

  10. Evolution of cooperation driven by incremental learning

    Science.gov (United States)

    Li, Pei; Duan, Haibin

    2015-02-01

    It has been shown that the details of microscopic rules in structured populations can have a crucial impact on the ultimate outcome in evolutionary games. Alternative formulations of strategies, and of the revision processes by which strategies are actually adopted and spread within the interaction network, therefore need to be studied. In the present work, we formulate the strategy update rule as an incremental learning process, wherein knowledge is refreshed according to one's own experience learned from the past (self-learning) and that gained from social interaction (social-learning). More precisely, we propose a continuous version of strategy update rules, introducing the willingness to cooperate W to better capture the flexibility of decision making behavior. Importantly, the newly gained knowledge, including self-learning and social learning, is weighted by the parameter ω, establishing a strategy update rule involving an innovative element. Moreover, we quantify the macroscopic features of the emerging patterns to inspect the underlying mechanisms of the evolutionary process using six cluster characteristics. To further support our results, we examine the time course of these characteristics. Our results might provide insights for understanding cooperative behaviors and have several important implications for understanding how individuals adjust their strategies under real-life conditions.

  11. Pseudocode Interpreter (Pseudocode Integrated Development Environment with Lexical Analyzer and Syntax Analyzer using Recursive Descent Parsing Algorithm)

    Directory of Open Access Journals (Sweden)

    Christian Lester D. Gimeno

    2017-11-01

    Full Text Available – This research study focused on the development of a software application that helps students design, write, validate and run their pseudocode in a semi Integrated Development Environment (IDE) instead of manually writing it on a piece of paper. Specifically, the study aimed to develop a lexical analyzer or lexer, a syntax analyzer or parser using a recursive descent parsing algorithm, and an interpreter. The lexical analyzer reads the pseudocode source as a sequence of symbols or characters called lexemes. The lexemes are then analyzed by the lexer, which matches patterns for valid tokens and passes them to the syntax analyzer or parser. The syntax analyzer or parser takes those valid tokens and builds meaningful commands using the recursive descent parsing algorithm, in the form of an abstract syntax tree. The generation of the abstract syntax tree is based on the grammar rule specified by the researcher, expressed in Extended Backus-Naur Form. The interpreter takes the generated abstract syntax tree and starts the evaluation or interpretation to produce the pseudocode output. The software was evaluated using white-box testing by several ICT professionals and black-box testing by several computer science students, based on the International Organization for Standardization (ISO 9126) software quality standards. The overall results of the evaluation, for both white-box and black-box testing, were described as “Excellent” in terms of functionality, reliability, usability, efficiency, maintainability and portability.
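The lexer–parser–interpreter pipeline described above can be illustrated with a minimal recursive descent parser for arithmetic expressions. This is an illustrative sketch, not the study's grammar or implementation; the EBNF rules and all names are assumptions:

```python
import re

def tokenize(src):
    """Lexer: split the source into number and operator/parenthesis lexemes."""
    return re.findall(r'\d+|[()+\-*/]', src)

class Parser:
    """Recursive descent parser for the (illustrative) grammar:
         expr   = term   { ("+" | "-") term }
         term   = factor { ("*" | "/") factor }
         factor = NUMBER | "(" expr ")"
       Each grammar rule becomes one method; the AST is nested tuples."""
    def __init__(self, tokens):
        self.tokens, self.pos = tokens, 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else None

    def eat(self):
        tok = self.tokens[self.pos]
        self.pos += 1
        return tok

    def expr(self):
        node = self.term()
        while self.peek() in ('+', '-'):
            node = (self.eat(), node, self.term())
        return node

    def term(self):
        node = self.factor()
        while self.peek() in ('*', '/'):
            node = (self.eat(), node, self.factor())
        return node

    def factor(self):
        if self.peek() == '(':
            self.eat()
            node = self.expr()
            assert self.eat() == ')', "expected closing parenthesis"
            return node
        return int(self.eat())

def evaluate(node):
    """Interpreter: walk the abstract syntax tree bottom-up (integer division)."""
    if isinstance(node, int):
        return node
    op, left, right = node
    l, r = evaluate(left), evaluate(right)
    return {'+': l + r, '-': l - r, '*': l * r, '/': l // r}[op]
```

For example, `evaluate(Parser(tokenize("2+3*(4-1)")).expr())` yields 11, with `*` binding tighter than `+` because `term` sits one level below `expr` in the grammar.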

  12. Empirical and Statistical Evaluation of the Effectiveness of Four ...

    African Journals Online (AJOL)

    The algorithms were implemented in the NetBeans Integrated Development Environment using Java as the programming language. Through statistical analysis performed using boxplots and ANOVA, and a comparison made of the four algorithms, the Lempel-Ziv-Welch algorithm was the most efficient and effective ...
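For reference, the Lempel-Ziv-Welch compressor compared above builds its dictionary incrementally during a single pass over the input. This is the textbook LZW scheme, not the AJOL study's Java implementation:

```python
def lzw_compress(data):
    """Textbook LZW: start from all 256 single bytes, then grow the
    dictionary incrementally (one new phrase per emitted code) so that
    repeated substrings are replaced by single integer codes."""
    dictionary = {bytes([i]): i for i in range(256)}
    w, out = b'', []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc                            # keep extending the current match
        else:
            out.append(dictionary[w])         # emit code for the longest match
            dictionary[wc] = len(dictionary)  # add the new phrase
            w = bytes([byte])
    if w:
        out.append(dictionary[w])
    return out
```

For example, `lzw_compress(b'ABABABA')` yields `[65, 66, 256, 258]`: seven bytes become four codes because the phrases `AB` and `ABA` enter the dictionary as the input is parsed.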

  13. A simple data compression scheme for binary images of bacteria compared with commonly used image data compression schemes

    NARCIS (Netherlands)

    Wilkinson, M.H.F.

    A run-length code compression scheme of extreme simplicity, used for image storage in an automated bacterial morphometry system, is compared with more common compression schemes, such as those used in the tag image file format (TIFF). These schemes are Lempel-Ziv-Welch (LZW), Macintosh PackBits, and
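A run-length code of the kind described is indeed extremely simple: for a binary image, each row can be stored as the value of its first run plus the run lengths, since runs must alternate. A minimal sketch (illustrative, not Wilkinson's exact format):

```python
def rle_encode(bits):
    """Run-length code for a binary sequence (values 0/1): store the value
    of the first run, then only the run lengths, since runs alternate."""
    if not bits:
        return (None, [])
    runs, current, count = [], bits[0], 0
    for b in bits:
        if b == current:
            count += 1
        else:
            runs.append(count)
            current, count = b, 1
    runs.append(count)
    return (bits[0], runs)

def rle_decode(first, runs):
    """Reconstruct the binary sequence by expanding alternating runs."""
    out, value = [], first
    for r in runs:
        out.extend([value] * r)
        value = 1 - value  # binary data: runs alternate between 0 and 1
    return out
```

Binary micrographs of bacteria are mostly background with long uniform runs, which is exactly the case where such a scheme is competitive with general-purpose compressors.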

  14. String graph construction using incremental hashing.

    Science.gov (United States)

    Ben-Bassat, Ilan; Chor, Benny

    2014-12-15

    New sequencing technologies generate ever larger amounts of short read data at decreasing cost. De novo sequence assembly is the problem of combining these reads back into the original genome sequence, without relying on a reference genome. This presents algorithmic and computational challenges, especially for long and repetitive genome sequences. Most existing approaches to the assembly problem operate in the framework of de Bruijn graphs. Yet, a number of recent works use the paradigm of string graphs, with a variety of methods for storing and processing suffixes and prefixes, such as suffix arrays, the Burrows-Wheeler transform or the FM index. Our work is motivated by a search for new approaches to constructing the string graph, using alternative yet simple data structures and algorithmic concepts. We introduce a novel hash-based method for constructing the string graph. We use incremental hashing, specifically a modification of the Karp-Rabin fingerprint, and Bloom filters. Using these probabilistic methods might create false-positive and false-negative edges during the algorithm's execution, but these are all detected and corrected. The advantages of the proposed approach over existing methods are its simplicity and the incorporation of established probabilistic techniques in the context of de novo genome sequencing. Our preliminary implementation is favorably comparable with the first string graph construction of Simpson and Durbin (2010) (but not with subsequent improvements). Further research and optimizations will hopefully enable the algorithm to be incorporated, with noticeable performance improvement, in state-of-the-art string graph-based assemblers. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
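The incremental hashing idea underlying the method — a Karp-Rabin-style fingerprint updated in O(1) per position instead of rehashing each window from scratch — can be sketched as follows (the base and modulus choices are illustrative, not the paper's):

```python
def rolling_hashes(s, k, base=256, mod=(1 << 61) - 1):
    """Karp-Rabin-style incremental hashing: the hash of each k-mer is
    derived from the previous one in O(1) by appending the incoming
    character and removing the outgoing one."""
    h, top = 0, pow(base, k - 1, mod)
    hashes = []
    for i, ch in enumerate(s):
        h = (h * base + ord(ch)) % mod               # append incoming character
        if i >= k - 1:
            hashes.append(h)                         # hash of s[i-k+1 : i+1]
            h = (h - ord(s[i - k + 1]) * top) % mod  # drop outgoing character
    return hashes
```

Equal fingerprints flag candidate prefix/suffix overlaps between reads; because the scheme is probabilistic, matches (and Bloom-filter hits) carry a small false-positive rate, which is why the paper's pipeline explicitly detects and corrects spurious edges.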

  15. Validade incremental da Escala de Abordagens de Aprendizagem (EABAP) / Incremental validity of the Learning Approaches Scale

    Directory of Open Access Journals (Sweden)

    Cristiano Mauro Assis Gomes

    2012-01-01

    Full Text Available Este trabalho investiga a validade incremental da abordagem à aprendizagem (AbA sobre o desempenho escolar, além da inteligência. São analisados dados de 684 estudantes da sexta série ao terceiro ano do ensino médio de uma escola particular de Belo Horizonte, Minas Gerais, Brasil. A inteligência é mensurada por itens marcadores da inteligência fluida da Bateria de Fatores Cognitivos de Alta-Ordem. AbA é medida através da Escala de Abordagens de Aprendizagem. O desempenho acadêmico é medido através das notas escolares em Matemática, Português, Geografia e História. Três hipóteses sobre a relação entre inteligência, abordagem à aprendizagem e proficiência acadêmica são testadas através da modelagem por equação estrutural. O modelo da relação direta foi o mais adequado aos dados e apresentou bom grau de ajuste. Inteligência e AbA apresentam efeito direto sobre o desempenho escolar. AbA possui validade incremental independente da inteligência sobre as diferenças individuais do rendimento acadêmico.This paper investigates the incremental validity of learning approach in academic achievement. Participants were 684 junior and high school students from a private school in Belo Horizonte, Minas Gerais, Brazil. Intelligence is measured by fluid intelligence items of the Higher-Order Cognitive Factors Kit. Learning approach is measured by the Learning Approaches Scale. Academic achievement is measured by annual grades in Mathematics, Portuguese (native language, Geography, and History. Three hypotheses about the relation among intelligence, learning approach and academic achievement are tested through structural equation modeling. The direct relation model was the most adequate and showed good fit. Intelligence and learning approach show direct effect in academic achievement. Learning approach has incremental validity of individual differences in academic achievement, independently of intelligence.

  16. The Time Course of Incremental Word Processing during Chinese Reading

    Science.gov (United States)

    Zhou, Junyi; Ma, Guojie; Li, Xingshan; Taft, Marcus

    2018-01-01

    In the current study, we report two eye movement experiments investigating how Chinese readers process incremental words during reading. These are words where some of the component characters constitute another word (an embedded word). In two experiments, eye movements were monitored while the participants read sentences with incremental words…

  17. Public Key Infrastructure Increment 2 (PKI Inc 2)

    Science.gov (United States)

    2016-03-01

    2016 Major Automated Information System Annual Report: Public Key Infrastructure Increment 2 (PKI Inc 2), Defense Acquisition Management... Date Assigned: July 1, 2015. Program Name: Public Key Infrastructure Increment 2 (PKI Inc 2)... Public Key Infrastructure (PKI) is a critical enabling technology for Information Assurance (IA) services to support seamless secure information flows

  18. Validation of the periodicity of growth increment deposition in ...

    African Journals Online (AJOL)

    Validation of the periodicity of growth increment deposition in otoliths from the larval and early juvenile stages of two cyprinids from the Orange–Vaal river ... Linear regression models were fitted to the known age post-fertilisation and the age estimated using increment counts to test the correspondence between the two for ...

  19. Incrementality in naming and reading complex numerals : Evidence from eyetracking

    NARCIS (Netherlands)

    Korvorst, M.H.W.; Roelofs, A.P.A.; Levelt, W.J.M.

    2006-01-01

    Individuals speak incrementally when they interleave planning and articulation. Eyetracking, along with the measurement of speech onset latencies, can be used to gain more insight into the degree of incrementality adopted by speakers. In the current article, two eyetracking experiments are reported

  20. Incremental Seismic Rehabilitation of School Buildings (K-12).

    Science.gov (United States)

    Krimgold, Frederick; Hattis, David; Green, Melvyn

    Asserting that the strategy of incremental seismic rehabilitation makes it possible for schools to get started now on improving earthquake safety, this manual provides school administrators with the information necessary to assess the seismic vulnerability of their buildings and to implement a program of incremental seismic rehabilitation for…

  1. Incrementality in naming and reading complex numerals: Evidence from eyetracking

    NARCIS (Netherlands)

    Korvorst, M.H.W.; Roelofs, A.P.A.; Levelt, W.J.M.

    2006-01-01

    Individuals speak incrementally when they interleave planning and articulation. Eyetracking, along with the measurement of speech onset latencies, can be used to gain more insight into the degree of incrementality adopted by speakers. In the current article, two eyetracking experiments are reported

  2. Creating Helical Tool Paths for Single Point Incremental Forming

    DEFF Research Database (Denmark)

    Skjødt, Martin; Hancock, Michael H.; Bay, Niels

    2007-01-01

    Single point incremental forming (SPIF) is a relatively new sheet forming process. A sheet is clamped in a rig and formed incrementally using a rotating single point tool in the form of a rod with a spherical end. The process is often performed on a CNC milling machine and the tool movement...

  3. On Incremental Condition Estimators in the 2-norm

    Czech Academy of Sciences Publication Activity Database

    Duintjer Tebbens, Jurjen; Tůma, Miroslav

    2014-01-01

    Roč. 35, č. 1 (2014), s. 174-197 ISSN 0895-4798 R&D Projects: GA ČR GA13-06684S Institutional support: RVO:67985807 Keywords : condition number estimation * matrix inverses * incremental condition estimator * incremental norm estimator Subject RIV: BA - General Mathematics Impact factor: 1.590, year: 2014

  4. Lifetime costs of lung transplantation : Estimation of incremental costs

    NARCIS (Netherlands)

    VanEnckevort, PJ; Koopmanschap, MA; Tenvergert, EM; VanderBij, W; Rutten, FFH

    1997-01-01

    Despite an expanding number of centres that provide lung transplantation, information about the incremental costs of lung transplantation is scarce. From 1991 until 1995, a technology assessment was performed in the Netherlands which provided information about the incremental costs of lung

  5. Power-law confusion: You say incremental, I say differential

    Science.gov (United States)

    Colwell, Joshua E.

    1993-01-01

    Power-law distributions are commonly used to describe the frequency of occurrences of crater diameters, stellar masses, ring particle sizes, planetesimal sizes, and meteoroid masses to name a few. The distributions are simple, and this simplicity has led to a number of misstatements in the literature about the kind of power-law that is being used: differential, cumulative, or incremental. Although differential and cumulative power-laws are mathematically trivial, it is a hybrid incremental distribution that is often used and the relationship between the incremental distribution and the differential or cumulative distributions is not trivial. In many cases the slope of an incremental power-law will be nearly identical to the slope of the cumulative power-law of the same distribution, not the differential slope. The discussion that follows argues for a consistent usage of these terms and against the oft-made implicit claim that incremental and differential distributions are indistinguishable.
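The claim that an incremental slope tracks the cumulative slope rather than the differential one is easy to verify numerically: for a differential law n(D) ∝ D^(−q) binned into octaves [D, 2D), the bin counts scale as D^(−(q−1)). A small demonstration (the function name and binning choices are illustrative):

```python
import math

def octave_bin_slope(q, d0=1.0, nbins=8):
    """For a differential power law n(D) = D**-q (q > 1), count objects in
    octave-wide incremental bins [D, 2D) analytically, then measure the
    log-log slope of bin count versus bin lower edge."""
    edges = [d0 * 2.0 ** i for i in range(nbins + 1)]
    # Analytic integral of D**-q over each bin [edges[i], edges[i+1]):
    counts = [(edges[i] ** (1 - q) - edges[i + 1] ** (1 - q)) / (q - 1)
              for i in range(nbins)]
    # Log-log slope across the bins:
    return ((math.log(counts[-1]) - math.log(counts[0]))
            / (math.log(edges[-2]) - math.log(edges[0])))
```

For q = 3.5 this returns approximately −2.5, i.e. −(q−1): the incremental (log-binned) slope matches the cumulative slope, not the differential slope −q, which is exactly the confusion the article warns against.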

  6. An incremental approach to automated protein localisation

    Directory of Open Access Journals (Sweden)

    Kummert Franz

    2008-10-01

    Full Text Available Abstract Background The subcellular localisation of proteins in intact living cells is an important means for gaining information about protein functions. Even dynamic processes can be captured, which can barely be predicted based on amino acid sequences. Besides increasing our knowledge about intracellular processes, this information facilitates the development of innovative therapies and new diagnostic methods. In order to perform such a localisation, the proteins under analysis are usually fused with a fluorescent protein. So, they can be observed by means of a fluorescence microscope and analysed. In recent years, several automated methods have been proposed for performing such analyses. Here, two different types of approaches can be distinguished: techniques which enable the recognition of a fixed set of protein locations and methods that identify new ones. To our knowledge, a combination of both approaches – i.e. a technique, which enables supervised learning using a known set of protein locations and is able to identify and incorporate new protein locations afterwards – has not been presented yet. Furthermore, associated problems, e.g. the recognition of cells to be analysed, have usually been neglected. Results We introduce a novel approach to automated protein localisation in living cells. In contrast to well-known techniques, the protein localisation technique presented in this article aims at combining the two types of approaches described above: After an automatic identification of unknown protein locations, a potential user is enabled to incorporate them into the pre-trained system. An incremental neural network allows the classification of a fixed set of protein location as well as the detection, clustering and incorporation of additional patterns that occur during an experiment. Here, the proposed technique achieves promising results with respect to both tasks. In addition, the protein localisation procedure has been adapted

  7. Rapid transcriptome characterization and parsing of sequences in a non-model host-pathogen interaction; pea-Sclerotinia sclerotiorum

    Directory of Open Access Journals (Sweden)

    Zhuang Xiaofeng

    2012-11-01

    Full Text Available Abstract Background White mold, caused by Sclerotinia sclerotiorum, is one of the most important diseases of pea (Pisum sativum L.); however, little is known about the genetics and biochemistry of this interaction. Identification of genes underlying resistance in the host or pathogenicity and virulence factors in the pathogen will increase our knowledge of the pea-S. sclerotiorum interaction and facilitate the introgression of new resistance genes into commercial pea varieties. Although the S. sclerotiorum genome sequence is available, no pea genome is available, due in part to its large genome size (~3500 Mb) and extensive repeated motifs. Here we present an EST data set specific to the interaction between S. sclerotiorum and pea, and a method to distinguish pathogen and host sequences without a species-specific reference genome. Results 10,158 contigs were obtained by de novo assembly of 128,720 high-quality reads generated by 454 pyrosequencing of the pea-S. sclerotiorum interactome. A method based on the tBLASTx program was modified to distinguish pea and S. sclerotiorum ESTs. To test this strategy, a mixture of known ESTs (18,490 pea and 17,198 S. sclerotiorum ESTs) from public databases were pooled and parsed; the tBLASTx method successfully separated 90.1% of the artificial EST mix with 99.9% accuracy. The tBLASTx method successfully parsed 89.4% of the 454-derived EST contigs, as validated by PCR, into pea (6,299 contigs) and S. sclerotiorum (2,780 contigs) categories. Two thousand eight hundred and forty pea ESTs and 996 S. sclerotiorum ESTs were predicted to be expressed specifically during the pea-S. sclerotiorum interaction as determined by homology search against 81,449 pea ESTs (from flowers, leaves, cotyledons, epi- and hypocotyl, and etiolated and light treated etiolated seedlings) and 57,751 S. sclerotiorum ESTs (from mycelia at neutral pH, developing apothecia and developing sclerotia). Among those ESTs specifically expressed

  8. Rapid transcriptome characterization and parsing of sequences in a non-model host-pathogen interaction; pea-Sclerotinia sclerotiorum.

    Science.gov (United States)

    Zhuang, Xiaofeng; McPhee, Kevin E; Coram, Tristan E; Peever, Tobin L; Chilvers, Martin I

    2012-11-26

    White mold, caused by Sclerotinia sclerotiorum, is one of the most important diseases of pea (Pisum sativum L.), however, little is known about the genetics and biochemistry of this interaction. Identification of genes underlying resistance in the host or pathogenicity and virulence factors in the pathogen will increase our knowledge of the pea-S. sclerotiorum interaction and facilitate the introgression of new resistance genes into commercial pea varieties. Although the S. sclerotiorum genome sequence is available, no pea genome is available, due in part to its large genome size (~3500 Mb) and extensive repeated motifs. Here we present an EST data set specific to the interaction between S. sclerotiorum and pea, and a method to distinguish pathogen and host sequences without a species-specific reference genome. 10,158 contigs were obtained by de novo assembly of 128,720 high-quality reads generated by 454 pyrosequencing of the pea-S. sclerotiorum interactome. A method based on the tBLASTx program was modified to distinguish pea and S. sclerotiorum ESTs. To test this strategy, a mixture of known ESTs (18,490 pea and 17,198 S. sclerotiorum ESTs) from public databases were pooled and parsed; the tBLASTx method successfully separated 90.1% of the artificial EST mix with 99.9% accuracy. The tBLASTx method successfully parsed 89.4% of the 454-derived EST contigs, as validated by PCR, into pea (6,299 contigs) and S. sclerotiorum (2,780 contigs) categories. Two thousand eight hundred and forty pea ESTs and 996 S. sclerotiorum ESTs were predicted to be expressed specifically during the pea-S. sclerotiorum interaction as determined by homology search against 81,449 pea ESTs (from flowers, leaves, cotyledons, epi- and hypocotyl, and etiolated and light treated etiolated seedlings) and 57,751 S. sclerotiorum ESTs (from mycelia at neutral pH, developing apothecia and developing sclerotia). Among those ESTs specifically expressed, 277 (9.8%) pea ESTs were predicted to be

  9. Height increment of understorey Norway spruces under different tree canopies

    Directory of Open Access Journals (Sweden)

    Olavi Laiho

    2014-02-01

    Full Text Available Background Stands having advance regeneration of spruce are logical places to start continuous cover forestry (CCF) in fertile and mesic boreal forests. However, the development of advance regeneration is poorly known. Methods This study used regression analysis to model the height increment of the spruce understorey as a function of seedling height, site characteristics and canopy structure. Results An admixture of pine and birch in the main canopy improves the height increment of the understorey. When the stand basal area is 20 m² ha⁻¹, height increment is twice as fast under pine and birch canopies as compared to spruce. Height increment of understorey spruce increases with increasing seedling height. Between-stand and within-stand residual variation in the height increment of understorey spruces is high. The increment of the 1/6 fastest-growing seedlings is at least 50% greater than the average. Conclusions The results of this study help forest managers to regulate the density and species composition of the stand so as to obtain sufficient height development of the understorey. In pure and almost pure spruce stands, the stand basal area should be low for good height increment of the understorey.

  10. Unsupervised parsing of gaze data with a beta-process vector auto-regressive hidden Markov model.

    Science.gov (United States)

    Houpt, Joseph W; Frame, Mary E; Blaha, Leslie M

    2017-10-26

    The first stage of analyzing eye-tracking data is commonly to code the data into sequences of fixations and saccades. This process is usually automated using simple, predetermined rules for classifying ranges of the time series into events, such as "if the dispersion of gaze samples is lower than a particular threshold, then code as a fixation; otherwise code as a saccade." More recent approaches incorporate additional eye-movement categories in automated parsing algorithms by using time-varying, data-driven thresholds. We describe an alternative approach using the beta-process vector auto-regressive hidden Markov model (BP-AR-HMM). The BP-AR-HMM offers two main advantages over existing frameworks. First, it provides a statistical model for eye-movement classification rather than a single estimate. Second, the BP-AR-HMM uses a latent process to model the number and nature of the types of eye movements and hence is not constrained to predetermined categories. We applied the BP-AR-HMM both to high-sampling rate gaze data from Andersson et al. (Behavior Research Methods, 49(2), 1-22, 2016) and to low-sampling rate data from the DIEM project (Mital et al., Cognitive Computation, 3(1), 5-24, 2011). Driven by the data properties, the BP-AR-HMM identified over five categories of movements, some of which clearly mapped onto fixations and saccades, while others potentially captured post-saccadic oscillations, smooth pursuit, and various recording errors. The BP-AR-HMM serves as an effective algorithm for data-driven event parsing alone or as an initial step in exploring the characteristics of gaze data sets.
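The simple predetermined rule quoted above — dispersion below a threshold means fixation — corresponds to the classic dispersion-threshold (I-DT) style of parsing, sketched here for contrast with the BP-AR-HMM. Thresholds, window length and names are illustrative assumptions:

```python
def dispersion(window):
    """Dispersion of a window of (x, y) gaze samples: x-range + y-range."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def idt_parse(samples, max_disp=1.0, min_len=3):
    """I-DT-style coding: grow a window while dispersion stays below the
    threshold and label it a fixation; everything else is a saccade."""
    labels = ['saccade'] * len(samples)
    i = 0
    while i < len(samples):
        j = i + min_len
        if j <= len(samples) and dispersion(samples[i:j]) <= max_disp:
            # Extend the fixation window as long as dispersion stays low.
            while j < len(samples) and dispersion(samples[i:j + 1]) <= max_disp:
                j += 1
            for k in range(i, j):
                labels[k] = 'fixation'
            i = j
        else:
            i += 1
    return labels
```

The hard-coded threshold and the fixed two-category output are precisely the limitations that motivate a latent-process model like the BP-AR-HMM, which infers the number of event types from the data.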

  11. Incremental Tensor Principal Component Analysis for Handwritten Digit Recognition

    Directory of Open Access Journals (Sweden)

    Chang Liu

    2014-01-01

    Full Text Available To overcome the shortcomings of traditional dimensionality reduction algorithms, incremental tensor principal component analysis (ITPCA) based on an updated-SVD technique is proposed in this paper. This paper proves the relationship between PCA, 2DPCA, MPCA, and the graph embedding framework theoretically, and derives the incremental learning procedure for adding single samples and multiple samples in detail. The experiments on handwritten digit recognition have demonstrated that ITPCA achieves better recognition performance than vector-based principal component analysis (PCA), incremental principal component analysis (IPCA), and multilinear principal component analysis (MPCA) algorithms. At the same time, ITPCA also has lower time and space complexity.
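The core of any incremental PCA variant is updating sufficient statistics one sample at a time, so components can be re-derived without revisiting old data. A Welford-style running mean and covariance update — a generic building block, not the paper's ITPCA algorithm — looks like:

```python
class IncrementalCov:
    """Running mean and covariance (Welford-style): each sample updates
    the statistics in O(d**2) without storing previous samples.
    Principal components are then eigenvectors of cov()."""
    def __init__(self, dim):
        self.n = 0
        self.mean = [0.0] * dim
        # Accumulated sum of outer products of deviations.
        self.c = [[0.0] * dim for _ in range(dim)]

    def update(self, x):
        self.n += 1
        delta = [xi - mi for xi, mi in zip(x, self.mean)]          # vs old mean
        self.mean = [mi + di / self.n for mi, di in zip(self.mean, delta)]
        delta2 = [xi - mi for xi, mi in zip(x, self.mean)]         # vs new mean
        for i in range(len(x)):
            for j in range(len(x)):
                self.c[i][j] += delta[i] * delta2[j]

    def cov(self):
        """Unbiased sample covariance (requires n >= 2)."""
        return [[v / (self.n - 1) for v in row] for row in self.c]
```

Tensor methods such as ITPCA generalize this idea, updating a low-rank SVD of unfolded tensors instead of a full covariance matrix, but the one-pass bookkeeping principle is the same.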

  12. An Incremental Support Vector Machine based Speech Activity Detection Algorithm.

    Science.gov (United States)

    Xianbo, Xiao; Guangshu, Hu

    2005-01-01

    Traditional voice activity detection algorithms are mostly threshold-based or statistical model-based. All of those methods lack the ability to react quickly to variations in the environment. This paper describes an incremental SVM (Support Vector Machine) method for speech activity detection. The proposed incremental procedure makes the detector adaptive to variations in the environment, and the special construction of the incremental training data set effectively decreases computing consumption. Experimental results demonstrated its higher end-point detection accuracy. Further work will focus on decreasing computing consumption and incorporating multi-class SVM classifiers.

  13. Incremental Sampling Algorithms for Robust Propulsion Control, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Aurora Flight Sciences proposes to develop a system for robust engine control based on incremental sampling, specifically Rapidly-Expanding Random Tree (RRT)...

  14. MRI: Modular reasoning about interference in incremental programming

    OpenAIRE

    Oliveira, Bruno C. D. S; Schrijvers, Tom; Cook, William R

    2012-01-01

    Incremental Programming (IP) is a programming style in which new program components are defined as increments of other components. Examples of IP mechanisms include: Object-oriented programming (OOP) inheritance, aspect-oriented programming (AOP) advice and feature-oriented programming (FOP). A characteristic of IP mechanisms is that, while individual components can be independently defined, the composition of components makes those components become tightly coupled, sh...

  15. On the instability increments of a stationary pinch

    International Nuclear Information System (INIS)

    Bud'ko, A.B.

    1989-01-01

    The stability of a stationary pinch against helical modes is studied numerically. It is shown that when the plasma pressure decreases rather rapidly towards the pinch boundary, for example in an isothermal diffusion pinch with a Gaussian density distribution, the m=0 (sausage) modes grow most quickly. Instability increments are calculated. A simple analytical expression for the maximum growth increment of the sausage instability for self-similar Gaussian profiles is obtained

  16. Entity versus incremental theories predict older adults' memory performance.

    Science.gov (United States)

    Plaks, Jason E; Chasteen, Alison L

    2013-12-01

    The authors examined whether older adults' implicit theories regarding the modifiability of memory in particular (Studies 1 and 3) and abilities in general (Study 2) would predict memory performance. In Study 1, individual differences in older adults' endorsement of the "entity theory" (a belief that one's ability is fixed) or "incremental theory" (a belief that one's ability is malleable) of memory were measured using a version of the Implicit Theories Measure (Dweck, 1999). Memory performance was assessed with a free-recall task. Results indicated that the higher the endorsement of the incremental theory, the better the free recall. In Study 2, older and younger adults' theories were measured using a more general version of the Implicit Theories Measure that focused on the modifiability of abilities in general. Again, for older adults, the higher the incremental endorsement, the better the free recall. Moreover, as predicted, implicit theories did not predict younger adults' memory performance. In Study 3, participants read mock news articles reporting evidence in favor of either the entity or incremental theory. Those in the incremental condition outperformed those in the entity condition on reading span and free-recall tasks. These effects were mediated by pretask worry such that, for those in the entity condition, higher worry was associated with lower performance. Taken together, these studies suggest that variation in entity versus incremental endorsement represents a key predictor of older adults' memory performance. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  17. The Parsing Syllable Envelopes Test for Assessment of Amplitude Modulation Discrimination Skills in Children: Development, Normative Data, and Test-Retest Reliability Studies.

    Science.gov (United States)

    Cameron, Sharon; Chong-White, Nicky; Mealings, Kiri; Beechey, Tim; Dillon, Harvey; Young, Taegan

    2018-02-01

    Intensity peaks and valleys in the acoustic signal are salient cues to syllable structure, which is accepted to be a crucial early step in phonological processing. As such, the ability to detect low-rate (envelope) modulations in signal amplitude is essential to parse an incoming speech signal into smaller phonological units. The Parsing Syllable Envelopes (ParSE) test was developed to quantify the ability of children to recognize syllable boundaries using an amplitude modulation detection paradigm. The envelope of a 750-msec steady-state /a/ vowel is modulated into two or three pseudo-syllables using notches with modulation depths varying between 0% and 100% along an 11-step continuum. In an adaptive three-alternative forced-choice procedure, the participant identified whether one, two, or three pseudo-syllables were heard. Development of the ParSE stimuli and test protocols, and collection of normative and test-retest reliability data. Eleven adults (aged 23 yr 10 mo to 50 yr 9 mo, mean 32 yr 10 mo) and 134 typically developing, primary-school children (aged 6 yr 0 mo to 12 yr 4 mo, mean 9 yr 3 mo). There were 73 males and 72 females. Data were collected using a touchscreen computer. Psychometric functions (PFs) were automatically fit to individual data by the ParSE software. Performance was related to the modulation depth at which syllables can be detected with 88% accuracy (referred to as the upper boundary of the uncertainty region [UBUR]). A shallower PF slope reflected a greater level of uncertainty. Age effects were determined based on raw scores. Z scores were calculated to account for the effect of age on performance. Outliers, and individual data for which the confidence interval of the UBUR exceeded a maximum allowable value, were removed. Nonparametric tests were used as the data were skewed toward negative performance. Across participants, the performance criterion (UBUR) was met with a median modulation depth of 42%. The effect of age on the UBUR was

  18. Three routes forward for biofuels: Incremental, leapfrog, and transitional

    International Nuclear Information System (INIS)

    Morrison, Geoff M.; Witcover, Julie; Parker, Nathan C.; Fulton, Lew

    2016-01-01

    This paper examines three technology routes for lowering the carbon intensity of biofuels: (1) a leapfrog route that focuses on major technological breakthroughs in lignocellulosic pathways at new, stand-alone biorefineries; (2) an incremental route in which improvements are made to existing U.S. corn ethanol and soybean biodiesel biorefineries; and (3) a transitional route in which biotechnology firms gain experience growing, handling, or chemically converting lignocellulosic biomass in a lower-risk fashion than leapfrog biorefineries by leveraging existing capital stock. We find the incremental route is likely to involve the largest production volumes and greenhouse gas benefits until at least the mid-2020s, but transitional and leapfrog biofuels together have far greater long-term potential. We estimate that the Renewable Fuel Standard, California's Low Carbon Fuel Standard, and federal tax credits provided an incentive of roughly $1.5–2.5 per gallon of leapfrog biofuel between 2012 and 2015, but that regulatory elements in these policies mostly incentivize lower-risk incremental investments. Adjustments in policy may be necessary to bring a greater focus on transitional technologies that provide targeted learning and cost reduction opportunities for leapfrog biofuels. - Highlights: • Three technological pathways are compared that lower carbon intensity of biofuels. • Incremental changes lead to faster greenhouse gas reductions. • Leapfrog changes lead to greatest long-term potential. • Two main biofuel policies (RFS and LCFS) are largely incremental in nature. • Transitional biofuels offer medium-risk, medium reward pathway.

  19. Compressing DNA sequence databases with coil

    Directory of Open Access Journals (Sweden)

    Hendy Michael D

    2008-05-01

    Full Text Available Abstract Background Publicly available DNA sequence databases such as GenBank are large, and are growing at an exponential rate. The sheer volume of data being dealt with presents serious storage and data communications problems. Currently, sequence data is usually kept in large "flat files," which are then compressed using standard Lempel-Ziv (gzip) compression – an approach which rarely achieves good compression ratios. While much research has been done on compressing individual DNA sequences, surprisingly little has focused on the compression of entire databases of such sequences. In this study we introduce the sequence database compression software coil. Results We have designed and implemented a portable software package, coil, for compressing and decompressing DNA sequence databases based on the idea of edit-tree coding. coil is geared towards achieving high compression ratios at the expense of execution time and memory usage during compression – the compression time represents a "one-off investment" whose cost is quickly amortised if the resulting compressed file is transmitted many times. Decompression requires little memory and is extremely fast. We demonstrate a 5% improvement in compression ratio over state-of-the-art general-purpose compression tools for a large GenBank database file containing Expressed Sequence Tag (EST) data. Finally, coil can efficiently encode incremental additions to a sequence database. Conclusion coil presents a compelling alternative to conventional compression of flat files for the storage and distribution of DNA sequence databases having a narrow distribution of sequence lengths, such as EST data. Increasing compression levels for databases having a wide distribution of sequence lengths is a direction for future work.

  20. High-content image informatics of the structural nuclear protein NuMA parses trajectories for stem/progenitor cell lineages and oncogenic transformation

    Energy Technology Data Exchange (ETDEWEB)

    Vega, Sebastián L. [Department of Chemical and Biochemical Engineering, Rutgers University, Piscataway, NJ (United States); Liu, Er; Arvind, Varun [Department of Biomedical Engineering, Rutgers University, Piscataway, NJ (United States); Bushman, Jared [Department of Chemistry and Chemical Biology, New Jersey Center for Biomaterials, Piscataway, NJ (United States); School of Pharmacy, University of Wyoming, Laramie, WY (United States); Sung, Hak-Joon [Department of Chemistry and Chemical Biology, New Jersey Center for Biomaterials, Piscataway, NJ (United States); Department of Biomedical Engineering, Vanderbilt University, Nashville, TN (United States); Becker, Matthew L. [Department of Polymer Science and Engineering, University of Akron, Akron, OH (United States); Lelièvre, Sophie [Department of Basic Medical Sciences, Purdue University, West Lafayette, IN (United States); Kohn, Joachim [Department of Chemistry and Chemical Biology, New Jersey Center for Biomaterials, Piscataway, NJ (United States); Vidi, Pierre-Alexandre, E-mail: pvidi@wakehealth.edu [Department of Cancer Biology, Wake Forest School of Medicine, Winston-Salem, NC (United States); Moghe, Prabhas V., E-mail: moghe@rutgers.edu [Department of Chemical and Biochemical Engineering, Rutgers University, Piscataway, NJ (United States); Department of Biomedical Engineering, Rutgers University, Piscataway, NJ (United States)

    2017-02-01

Stem and progenitor cells that exhibit significant regenerative potential and critical roles in cancer initiation and progression remain difficult to characterize. Cell fates are determined by reciprocal signaling between the cell microenvironment and the nucleus; hence parameters derived from nuclear remodeling are ideal candidates for stem/progenitor cell characterization. Here we applied high-content, single cell analysis of nuclear shape and organization to examine stem and progenitor cells destined to distinct differentiation endpoints, yet indistinguishable by conventional methods. Nuclear descriptors defined through image informatics classified mesenchymal stem cells poised to either adipogenic or osteogenic differentiation, and oligodendrocyte precursors isolated from different regions of the brain and destined to distinct astrocyte subtypes. Nuclear descriptors also revealed early changes in stem cells after chemical oncogenesis, allowing the identification of a class of cancer-mitigating biomaterials. To capture the metrology of nuclear changes, we developed a simple and quantitative “imaging-derived” parsing index, which reflects the dynamic evolution of the high-dimensional space of nuclear organizational features. A comparative analysis of parsing outcomes via either nuclear shape or textural metrics of the nuclear structural protein NuMA indicates that nuclear shape alone is a weak phenotypic predictor. In contrast, variations in the NuMA organization parsed emergent cell phenotypes and discerned emergent stages of stem cell transformation, supporting a prognosticating role for this protein in the outcomes of nuclear functions. - Highlights: • High-content analysis of nuclear shape and organization classify stem and progenitor cells poised for distinct lineages. • Early oncogenic changes in mesenchymal stem cells (MSCs) are also detected with nuclear descriptors. • A new class of cancer-mitigating biomaterials was identified based on image

  1. SmilesDrawer: Parsing and Drawing SMILES-Encoded Molecular Structures Using Client-Side JavaScript.

    Science.gov (United States)

    Probst, Daniel; Reymond, Jean-Louis

    2018-01-22

Here we present SmilesDrawer, a dependency-free JavaScript component capable of both parsing and drawing SMILES-encoded molecular structures client-side, developed to be easily integrated into web projects and to display organic molecules in large numbers and fast succession. SmilesDrawer can draw structurally and stereochemically complex structures such as maitotoxin and C60 without using templates, yet has an exceptionally small computational footprint and low memory usage without the requirement for loading images or any other form of client-server communication, making it easy to integrate even in secure (intranet, firewalled) or offline applications. These features allow the rendering of thousands of molecular structure drawings on a single web page within seconds on a wide range of hardware supporting modern browsers. The source code as well as the most recent build of SmilesDrawer is available on Github (http://doc.gdb.tools/smilesDrawer/). Both yarn and npm packages are also available.
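
The parsing half of such a component starts by tokenizing the SMILES string into atoms, bonds, branches, and ring-closure labels. The toy Python tokenizer below illustrates only those lexical categories; it is deliberately minimal (no aromatic lowercase atoms, no stereo descriptors) and is not SmilesDrawer's actual JavaScript grammar:

```python
import re

# Toy SMILES tokenizer: one regex alternative per lexical category.
TOKEN = re.compile(r"""
    Cl|Br               # two-letter organic-subset atoms
  | [BCNOPSFI]          # one-letter organic-subset atoms
  | \[[^\]]+\]          # bracket atoms, e.g. [NH4+]
  | [=\#$/\\~-]         # bond symbols
  | [()]                # branch open / close
  | %\d\d|\d            # ring-closure labels (%nn or a single digit)
""", re.VERBOSE)

def tokenize(smiles):
    tokens, pos = [], 0
    while pos < len(smiles):
        m = TOKEN.match(smiles, pos)
        if not m:
            raise ValueError(f"unexpected character {smiles[pos]!r} at {pos}")
        tokens.append(m.group())
        pos = m.end()
    return tokens

print(tokenize("CC(=O)O"))      # acetic acid
print(tokenize("C1=CC=CC=C1"))  # benzene, Kekulé form
```

A real parser then builds a tree (or graph) from this token stream before layout and drawing.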

  2. High-content image informatics of the structural nuclear protein NuMA parses trajectories for stem/progenitor cell lineages and oncogenic transformation.

    Science.gov (United States)

    Vega, Sebastián L; Liu, Er; Arvind, Varun; Bushman, Jared; Sung, Hak-Joon; Becker, Matthew L; Lelièvre, Sophie; Kohn, Joachim; Vidi, Pierre-Alexandre; Moghe, Prabhas V

    2017-02-01

Stem and progenitor cells that exhibit significant regenerative potential and critical roles in cancer initiation and progression remain difficult to characterize. Cell fates are determined by reciprocal signaling between the cell microenvironment and the nucleus; hence parameters derived from nuclear remodeling are ideal candidates for stem/progenitor cell characterization. Here we applied high-content, single cell analysis of nuclear shape and organization to examine stem and progenitor cells destined to distinct differentiation endpoints, yet indistinguishable by conventional methods. Nuclear descriptors defined through image informatics classified mesenchymal stem cells poised to either adipogenic or osteogenic differentiation, and oligodendrocyte precursors isolated from different regions of the brain and destined to distinct astrocyte subtypes. Nuclear descriptors also revealed early changes in stem cells after chemical oncogenesis, allowing the identification of a class of cancer-mitigating biomaterials. To capture the metrology of nuclear changes, we developed a simple and quantitative "imaging-derived" parsing index, which reflects the dynamic evolution of the high-dimensional space of nuclear organizational features. A comparative analysis of parsing outcomes via either nuclear shape or textural metrics of the nuclear structural protein NuMA indicates that nuclear shape alone is a weak phenotypic predictor. In contrast, variations in the NuMA organization parsed emergent cell phenotypes and discerned emergent stages of stem cell transformation, supporting a prognosticating role for this protein in the outcomes of nuclear functions. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. The interaction of parsing rules and argument – Predicate constructions: implications for the structure of the Grammaticon in FunGramKB

    Directory of Open Access Journals (Sweden)

    María del Carmen Fumero Pérez

    2017-07-01

Full Text Available The Functional Grammar Knowledge Base (FunGramKB; Periñán-Pascual and Arcas-Túnez 2010) is a multipurpose lexico-conceptual knowledge base designed to be used in different Natural Language Processing (NLP) tasks. It is complemented with the ARTEMIS (Automatically Representing Text Meaning via an Interlingua-based System) application, a parsing device linguistically grounded in Role and Reference Grammar (RRG) that transduces natural language fragments into their corresponding grammatical and semantic structures. This paper unveils the different phases involved in its parsing routine, paying special attention to the treatment of argumental constructions. As an illustrative case, we will follow all the steps necessary to effectively parse a For-Benefactive structure within ARTEMIS. This methodology will reveal the necessity of distinguishing between Kernel constructs and L1-constructions, since the latter involve a modification of the lexical template of the verb. Our definition of L1-constructions leads to the reorganization of the catalogue of FunGramKB L1-constructions, formerly based on Levin's (1993) alternations. Accordingly, a rearrangement of the internal configuration of the L1-Constructicon within the Grammaticon is proposed.

  4. A heuristic approach to incremental and reactive scheduling

    Science.gov (United States)

    Odubiyi, Jide B.; Zoch, David R.

    1989-01-01

A heuristic approach to incremental and reactive scheduling is described. Incremental scheduling is the process of modifying an existing schedule if the initial schedule does not meet its stated initial goals. Reactive scheduling occurs in near real-time in response to changes in available resources or the occurrence of targets of opportunity. Only minor changes are made during both incremental and reactive scheduling because a goal of re-scheduling procedures is to minimally impact the schedule. The described heuristic search techniques, which are employed by the Request Oriented Scheduling Engine (ROSE), a prototype generic scheduler, efficiently approximate the cost of reaching a goal from a given state and provide effective mechanisms for controlling search.
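
The "minimally impact the schedule" goal can be made concrete with a toy reactive-scheduling rule: when resource capacity drops, shift only the tasks that no longer fit and leave every other assignment untouched. Everything below (task names, one-slot tasks, a single resource) is invented for illustration and is far simpler than ROSE's heuristic search:

```python
def reactive_reschedule(schedule, capacity):
    """Reassign only the tasks that now exceed capacity, pushing each one
    to the next free slot; earlier tasks keep their original slots.

    schedule: {task: start_slot}; each task occupies one slot and one unit
    of a single shared resource. All names here are hypothetical.
    """
    new = dict(schedule)                      # leave the input untouched
    load = {}                                 # slot -> units currently used
    for task in sorted(new, key=new.get):     # earliest-first, stable order
        t = new[task]
        while load.get(t, 0) >= capacity:     # slot full: push task later
            t += 1
        new[task] = t
        load[t] = load.get(t, 0) + 1
    return new

# Capacity drops from 3 to 2: only one of the three slot-0 tasks must move.
before = {"obs1": 0, "obs2": 0, "obs3": 0, "obs4": 1}
after = reactive_reschedule(before, capacity=2)
print(after)
```

Here obs3 is displaced into slot 1 while obs1, obs2 and obs4 keep their assignments, which is the minimal-change behavior the abstract describes.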

  5. Increment definitions for scale-dependent analysis of stochastic data.

    Science.gov (United States)

    Waechter, Matthias; Kouzmitchev, Alexei; Peinke, Joachim

    2004-11-01

It is common for scale-dependent analysis of stochastic data to use the increment Δ(t,r) = ξ(t+r) − ξ(t) of a data set ξ(t) as a stochastic measure, where r denotes the scale. For joint statistics of Δ(t,r) and Δ(t,r') the question of how to nest the increments on different scales r, r' is investigated. Here we show that in some cases spurious correlations between scales can be introduced by the common left-justified definition, in which increments on all scales share the left endpoint t. The consequences for a Markov process are discussed. These spurious correlations can be avoided by an appropriate nesting of increments. We demonstrate this effect for different data sets and show how it can be detected and quantified. The problem also allows us to propose a unique method to distinguish between experimental data generated by a noise-like or a Langevin-like random-walk process.
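
The two nesting conventions are easy to state in code: a left-justified pair shares the starting time t, while a right-justified (nested) pair shares the end point t+r. A minimal sketch on a synthetic random walk, with all numbers chosen arbitrarily for illustration:

```python
import random

random.seed(1)

# Synthetic data set xi(t): a Gaussian random walk sampled at integer times.
T = 1000
xi = [0.0]
for _ in range(T):
    xi.append(xi[-1] + random.gauss(0.0, 1.0))

def incr(t, r):
    """Increment Delta(t, r) = xi(t + r) - xi(t) at scale r."""
    return xi[t + r] - xi[t]

t, r, rp = 100, 32, 8          # two scales, rp < r

# Left-justified nesting: both increments start at the same time t.
left_pair = (incr(t, r), incr(t, rp))

# Right-justified nesting: the fine increment ends where the coarse one
# ends, avoiding the shared left endpoint that induces spurious correlations.
right_pair = (incr(t, r), incr(t + r - rp, rp))

# Increments over adjacent intervals telescope to the coarse increment:
print(abs(incr(t, rp) + incr(t + rp, r - rp) - incr(t, r)))
```

The telescoping identity printed at the end holds for any data set, which is what makes either nesting a valid decomposition; only their joint statistics differ.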

  6. Making context explicit for explanation and incremental knowledge acquisition

    Energy Technology Data Exchange (ETDEWEB)

Brezillon, P. [Univ. Paris (France)]

    1996-12-31

Intelligent systems may be improved by making context explicit in problem solving. This is a lesson drawn from a study of the reasons why a number of knowledge-based systems (KBSs) failed. We discuss the value of making context explicit in explanation generation and incremental knowledge acquisition, two important aspects of intelligent systems that aim to cooperate with users. We show how context can be used to explain better and to acquire knowledge incrementally. The advantages of using context in explanation and incremental knowledge acquisition are discussed through SEPIT, an expert system for supporting diagnosis and explanation through simulation of power plants. We point out how the limitations of such systems may be overcome by making context explicit.

  7. Springback law study and application in incremental bending process

    Science.gov (United States)

    Zhang, Feifei; He, Kai; Dang, Xiaobing; Du, Ruxu

    2018-02-01

An incremental bending process has been proposed for manufacturing complex and thick ship-hull plates. The accuracy and efficiency of this novel process depend mainly on the loading path, so the unavoidable springback behavior should be considered when determining the loading path. In this paper, firstly, the numerical simulation method is verified by the corresponding experiment; then the springback law during the incremental bending process is investigated based on numerical simulation; and later a loading path based on the springback law and the minimum energy method is obtained for a specific target shape. Comparison between the curve designed on the basis of the springback law and the new simulation results verifies that the springback law obtained by numerical simulation is reliable, so this study provides a new perspective for further research on the incremental bending process.

  8. Motion-Induced Blindness Using Increments and Decrements of Luminance

    Directory of Open Access Journals (Sweden)

    Stine Wm Wren

    2017-10-01

Full Text Available Motion-induced blindness describes the disappearance of stationary elements of a scene when other, perhaps non-overlapping, elements of the scene are in motion. We measured the effects of increment targets (200.0 cd/m²) and decrement targets (15.0 cd/m²) and masks presented on a grey background (108.0 cd/m²), tapping into putative ON- and OFF-channels, on the rate of target disappearance psychophysically. We presented two-frame motion, which has coherent motion energy, and dynamic Glass patterns and dynamic anti-Glass patterns, which do not have coherent motion energy. Using the method of constant stimuli, participants viewed stimuli of varying durations (3.1 s, 4.6 s, 7.0 s, 11 s, or 16 s) in a given trial and then indicated whether or not the targets vanished during that trial. Psychometric function midpoints were used to define the absolute threshold mask duration for the disappearance of the target. 95% confidence intervals for threshold disappearance times were estimated using a bootstrap technique for each of the participants across two experiments. Decrement masks were more effective than increment masks with increment targets. Increment targets were easier to mask than decrement targets. Distinct mask pattern types had no effect, suggesting that perceived coherence contributes to the effectiveness of the mask. The ON/OFF dichotomy clearly carries its influence to the level of perceived motion coherence. Further, the asymmetry in the effects of increment and decrement masks on increment and decrement targets might lead one to speculate that it reflects the ‘importance’ of detecting decrements in the environment.
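
The bootstrap step mentioned for the threshold estimates can be sketched as follows. The response counts, trial numbers, and the linear-interpolation midpoint rule below are invented stand-ins, not the authors' actual data or fitting procedure:

```python
import random

random.seed(0)

durations = [3.1, 4.6, 7.0, 11.0, 16.0]   # mask durations (s), from the abstract
hits      = [2, 5, 11, 17, 19]            # invented "target vanished" counts
n_trials  = 20                            # invented trials per duration

def midpoint(props):
    """Duration where the interpolated proportion first crosses 0.5."""
    pairs = list(zip(durations, props))
    for (d0, p0), (d1, p1) in zip(pairs, pairs[1:]):
        if p0 <= 0.5 <= p1:
            return d0 + (0.5 - p0) * (d1 - d0) / (p1 - p0)
    return durations[-1]                  # fallback: never crossed

obs = midpoint([h / n_trials for h in hits])

# Percentile bootstrap: resample Bernoulli responses at each duration,
# re-estimate the midpoint, and read off the 2.5% / 97.5% quantiles.
boot = []
for _ in range(2000):
    props = [sum(random.random() < h / n_trials for _ in range(n_trials)) / n_trials
             for h in hits]
    boot.append(midpoint(props))
boot.sort()
lo, hi = boot[int(0.025 * len(boot))], boot[int(0.975 * len(boot))]
print(f"midpoint {obs:.2f} s, 95% CI [{lo:.2f}, {hi:.2f}] s")
```

A full psychometric fit (e.g. a logistic function) would replace the interpolation rule, but the resample-refit-percentile structure of the bootstrap is the same.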

  9. Rapid Prototyping of wax foundry models in an incremental process

    Directory of Open Access Journals (Sweden)

    B. Kozik

    2011-04-01

Full Text Available The paper presents an analysis of incremental methods of creating wax foundry models. There are two methods of Rapid Prototyping of wax models in an incremental process which are more and more often used in industrial practice and in scientific research. Applying Rapid Prototyping methods in the process of making casts allows for acceleration of work on preparing prototypes. It is especially important in the case of elements having complicated shapes. The time of making a wax model, depending on the size and the applied RP method, may vary from several to a few dozen hours.

  10. Incremental Learning of Skill Collections based on Intrinsic Motivation

    OpenAIRE

    Jan Hendrik Metzen; Frank eKirchner; Frank eKirchner

    2013-01-01

Life-long learning of reusable, versatile skills is a key prerequisite for embodied agents that act in a complex, dynamic environment and are faced with different tasks over their lifetime. We address the question of how an agent can learn useful skills efficiently during a developmental period, i.e., when no task is imposed on it and no external reward signal is provided. Learning of skills in a developmental period needs to be incremental and self-motivated. We propose a new incremental, task-i...

  11. Bipower variation for Gaussian processes with stationary increments

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Corcuera, José Manuel; Podolskij, Mark

    2009-01-01

    Convergence in probability and central limit laws of bipower variation for Gaussian processes with stationary increments and for integrals with respect to such processes are derived. The main tools of the proofs are some recent powerful techniques of Wiener/Itô/Malliavin calculus for establishing...

  12. 76 FR 73475 - Immigration Benefits Business Transformation, Increment I; Correction

    Science.gov (United States)

    2011-11-29

    ... [CIS No. 2481-09; Docket No. USCIS-2009-0022] RIN 1615-AB83 Immigration Benefits Business Transformation, Increment I; Correction AGENCY: U.S. Citizenship and Immigration Services, DHS. ACTION: Final... to enable U.S. Citizenship and Immigration Services (USCIS) to transform its business processes. The...

  13. Next Generation Diagnostic System (NGDS) Increment 1 Early Fielding Report

    Science.gov (United States)

    2017-06-07

...upgraded the NGDS laptop in order to utilize the Windows 10 operating system required by the Department of Defense. Additional developmental and cybersecurity testing... Director, Operational Test and Evaluation, Next Generation Diagnostic System (NGDS) Increment 1 Early Fielding Report, June 2017

  14. Single-point incremental forming and formability-failure diagrams

    DEFF Research Database (Denmark)

    Silva, M.B.; Skjødt, Martin; Atkins, A.G.

    2008-01-01

    In a recent work [1], the authors constructed a closed-form analytical model that is capable of dealing with the fundamentals of single point incremental forming and explaining the experimental and numerical results published in the literature over the past couple of years. The model is based on ...

  15. Incremental concept learning with few training examples and hierarchical classification

    NARCIS (Netherlands)

    Bouma, Henri; Eendebak, Pieter T.; Schutte, Klamer; Azzopardi, George; Burghouts, Gertjan J.

    2015-01-01

    Object recognition and localization are important to automatically interpret video and allow better querying on its content. We propose a method for object localization that learns incrementally and addresses four key aspects. Firstly, we show that for certain applications, recognition is feasible

  16. Factors for Radical Creativity, Incremental Creativity, and Routine, Noncreative Performance

    Science.gov (United States)

    Madjar, Nora; Greenberg, Ellen; Chen, Zheng

    2011-01-01

    This study extends theory and research by differentiating between routine, noncreative performance and 2 distinct types of creativity: radical and incremental. We also use a sensemaking perspective to examine the interplay of social and personal factors that may influence a person's engagement in a certain level of creative action versus routine,…

  17. Geometry of finite deformations and time-incremental analysis

    Czech Academy of Sciences Publication Activity Database

    Fiala, Zdeněk

    2016-01-01

    Roč. 81, May (2016), s. 230-244 ISSN 0020-7462 Institutional support: RVO:68378297 Keywords : solid mechanics * finite deformations * time-incremental analysis * Lagrangian system * evolution equation of Lie type Subject RIV: BE - Theoretical Physics Impact factor: 2.074, year: 2016 http://www.sciencedirect.com/science/article/pii/S0020746216000330

  18. Respiratory ammonia output and blood ammonia concentration during incremental exercise

    NARCIS (Netherlands)

Ament, W; Huizenga; Kort, E; van der Mark, TW; Grevink, RG; Verkerke, GJ

    The aim of this study was to investigate whether the increase of ammonia concentration and lactate concentration in blood was accompanied by an increased expiration of ammonia during graded exercise. Eleven healthy subjects performed an incremental cycle ergometer test. Blood ammonia, blood lactate

  19. Predicting Robust Vocabulary Growth from Measures of Incremental Learning

    Science.gov (United States)

    Frishkoff, Gwen A.; Perfetti, Charles A.; Collins-Thompson, Kevyn

    2011-01-01

    We report a study of incremental learning of new word meanings over multiple episodes. A new method called MESA (Markov Estimation of Semantic Association) tracked this learning through the automated assessment of learner-generated definitions. The multiple word learning episodes varied in the strength of contextual constraint provided by…

  20. Simultaneous versus incremental learning of multiple skills by modular robots

    NARCIS (Netherlands)

    Rossi, C.; Eiben, A.E.

    2014-01-01

    This paper is concerned with the problem of learning multiple skills by modular robots. The main question we address is whether it is better to learn multiple skills simultaneously (all-at-once) or incrementally (one-by-one). We conduct an experimental study with modular robots of various

  1. Gust Disturbance Alleviation with Incremental Nonlinear Dynamic Inversion

    NARCIS (Netherlands)

    Smeur, E.J.J.; de Croon, G.C.H.E.; Chu, Q.P.

    2016-01-01

    Micro Aerial Vehicles (MAVs) are limited in their operation outdoors near obstacles by their ability to withstand wind gusts. Currently widespread position control methods such as Proportional Integral Derivative control do not perform well under the influence of gusts. Incremental Nonlinear Dynamic

  2. Incremental cryptography and security of public hash functions ...

    African Journals Online (AJOL)

An investigation of incremental algorithms for cryptographic functions was initiated. The problem, for collision-free hashing, is to design a scheme for which there exists an efficient "update" algorithm: this algorithm is given the hash function H, the hash h = H(M) of message M and the "replacement request" (j, m), and outputs ...
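
The update operation described (recompute the hash after replacement request (j, m) without rehashing all of M) can be illustrated with a toy randomize-then-combine construction: hash each (position, block) pair and combine the results with XOR. This only demonstrates the incremental interface; the XOR combiner is known to be cryptographically weak, and real incremental schemes use stronger combiners:

```python
import hashlib

def _h(index, block):
    """Hash a (position, block) pair; binding the position prevents
    undetected block reordering."""
    data = index.to_bytes(8, "big") + block
    return int.from_bytes(hashlib.sha256(data).digest(), "big")

def full_hash(blocks):
    """Toy incremental hash: XOR of per-block hashes (illustration only)."""
    acc = 0
    for j, b in enumerate(blocks):
        acc ^= _h(j, b)
    return acc

def update(old_hash, j, old_block, new_block):
    """Apply replacement request (j, m): two block hashes instead of
    rehashing the whole message."""
    return old_hash ^ _h(j, old_block) ^ _h(j, new_block)

msg = [b"to", b"be", b"or", b"not"]
h0 = full_hash(msg)
h1 = update(h0, 1, b"be", b"eat")   # replace block 1
msg[1] = b"eat"
assert h1 == full_hash(msg)          # incremental update matches recomputation
```

The point is the cost model: an update touches O(1) blocks regardless of message length, which is exactly the efficiency property the abstract asks of the update algorithm.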

  3. Predicting Robust Vocabulary Growth from Measures of Incremental Learning

    NARCIS (Netherlands)

    Frishkoff, G.A.; Perfetti, C.A.; Collins-Thompson, K.

    2011-01-01

    We report a study of incremental learning of new word meanings over multiple episodes. A new method called MESA (Markov Estimation of Semantic Association) tracked this learning through the automated assessment of learner-generated definitions. The multiple word learning episodes varied in the

  4. Flexible Aircraft Gust Load Alleviation with Incremental Nonlinear Dynamic Inversion

    NARCIS (Netherlands)

    Wang, X.; van Kampen, E.; Chu, Q.; De Breuker, R.

    2018-01-01

In this paper, an Incremental Nonlinear Dynamic Inversion (INDI) controller is developed for the flexible aircraft gust load alleviation (GLA) problem. First, a flexible aircraft model captures both inertia and aerodynamic coupling effects between flight dynamics and structural vibration

  5. Revisiting the fundamentals of single point incremental forming by

    DEFF Research Database (Denmark)

    Silva, Beatriz; Skjødt, Martin; Martins, Paulo A.F.

    2008-01-01

    Knowledge of the physics behind the fracture of material at the transition between the inclined wall and the corner radius of the sheet is of great importance for understanding the fundamentals of single point incremental forming (SPIF). How the material fractures, what is the state of strain and...

  6. The Incremental Validity of Positive Emotions in Predicting School Functioning

    Science.gov (United States)

    Lewis, Ashley D.; Huebner, E. Scott; Reschly, Amy L.; Valois, Robert F.

    2009-01-01

    Proponents of positive psychology have argued for more comprehensive assessments incorporating positive measures (e.g., student strengths) as well as negative measures (e.g., psychological symptoms). However, few variable-centered studies have addressed the incremental validity of positive assessment data. The authors investigated the incremental…

  7. Some theoretical aspects of capacity increment in gaseous diffusion

    International Nuclear Information System (INIS)

    Coates, J.H.; Guais, J.C.; Lamorlette, G.

    1975-01-01

Faced with sharply growing demand for enrichment services, the problem of implementing new capacities must be addressed within an optimized scheme spread out over time. In this paper the alternative solutions are studied, first for a single increment decision and then within an optimum schedule. The limits of the analysis are discussed [fr

  8. Incremental validity of a measure of emotional intelligence.

    Science.gov (United States)

    Chapman, Benjamin P; Hayslip, Bert

    2005-10-01

After the Schutte Self-Report Inventory of Emotional Intelligence (SSRI; Schutte et al., 1998) was found to predict college grade point average, subsequent emotional intelligence (EI)-college adjustment research has used inconsistent measures and widely varying criteria, resulting in confusion about the construct's predictive validity. In this study, we assessed the SSRI's incremental validity for a wide range of adjustment criteria, pitting it against a competing trait measure, the NEO Five-Factor Inventory (NEO-FFI; Costa & McCrae, 1992), and tests of fluid and crystallized intelligence. At a broad bandwidth, the SSRI total score significantly and uniquely predicted variance beyond NEO-FFI domain scores in the UCLA Loneliness Scale, Revised (Russell, Peplau, & Cutrona, 1980) scores. Higher fidelity analyses using previously identified SSRI factors and NEO-FFI item clusters revealed that the SSRI's Optimism/Mood Regulation and Emotion Appraisal factors contributed unique variance to self-reported study habits and social stress, respectively. The potential moderation of incremental validity by gender did not reach significance due to loss of power from splitting the sample, and mediational analyses revealed the SSRI Optimism/Mood Regulation factor was both directly and indirectly related to various criteria. We discuss the small magnitude of incremental validity coefficients and the differential incremental validity of SSRI factor and total scores.

  9. A sequential tree approach for incremental sequential pattern mining

    Indian Academy of Sciences (India)

Data mining; STISPM; sequential tree; incremental mining; backward tracking. Abstract. ''Sequential pattern mining'' is a prominent and significant method to explore the knowledge and innovation from the large database. Common sequential pattern mining algorithms handle static databases. Pragmatically, looking into the ...

  10. Cascaded incremental nonlinear dynamic inversion for MAV disturbance rejection

    NARCIS (Netherlands)

    Smeur, E.J.J.; de Croon, G.C.H.E.; Chu, Q.

    2018-01-01

    This paper presents the cascaded integration of Incremental Nonlinear Dynamic Inversion (INDI) for attitude control and INDI for position control of micro air vehicles. Significant improvements over a traditional Proportional Integral Derivative (PID) controller are demonstrated in an experiment

  11. Playing by the rules? Analysing incremental urban developments

    NARCIS (Netherlands)

    Karnenbeek, van Lilian; Janssen-Jansen, Leonie

    2018-01-01

Current urban developments are often considered outdated and static, and the argument follows that they should become more adaptive. In this paper, we argue that existing urban developments are already adaptive and incremental. Given this flexibility in urban development, understanding changes in the

  12. a model for incremental grounding in spoken dialogue systems

    NARCIS (Netherlands)

    Visser, Thomas; Traum, David; DeVault, David; op den Akker, Hendrikus J.A.

    2012-01-01

    Recent advances in incremental language processing for dialogue systems promise to enable more natural conversation between humans and computers. By analyzing the user's utterance while it is still in progress, systems can provide more human-like overlapping and backchannel responses to convey their

  13. A model for incremental grounding in spoken dialogue systems

    NARCIS (Netherlands)

    Visser, Thomas; Traum, David; DeVault, David; op den Akker, Hendrikus J.A.

    2014-01-01

    We present a computational model of incremental grounding, including state updates and action selection. The model is inspired by corpus-based examples of overlapping utterances of several sorts, including backchannels and completions. The model has also been partially implemented within a virtual

  14. Average-case analysis of incremental topological ordering

    DEFF Research Database (Denmark)

    Ajwani, Deepak; Friedrich, Tobias

    2010-01-01

    Many applications like pointer analysis and incremental compilation require maintaining a topological ordering of the nodes of a directed acyclic graph (DAG) under dynamic updates. All known algorithms for this problem are either only analyzed for worst-case insertion sequences or only evaluated...
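
A standard representative of the algorithms analyzed here is local reordering in the style of Pearce and Kelly: when an inserted edge x → y violates the current order, search only the "affected region" between the two endpoints and permute it. The Python sketch below is a simplified version (assumed details: dense position arrays, insertions only, no deletions):

```python
class IncrementalTopo:
    """Topological order of a DAG maintained under edge insertions, via
    local reordering in the style of Pearce and Kelly (simplified sketch)."""

    def __init__(self, n):
        self.adj = [[] for _ in range(n)]    # successors
        self.radj = [[] for _ in range(n)]   # predecessors
        self.pos = list(range(n))            # pos[v] = position of v in the order

    def _dfs(self, start, edges, inside):
        """Iterative DFS over nodes satisfying the position predicate."""
        seen, stack, out = set(), [start], []
        while stack:
            v = stack.pop()
            if v in seen:
                continue
            seen.add(v)
            out.append(v)
            stack.extend(w for w in edges[v] if inside(w) and w not in seen)
        return out

    def add_edge(self, x, y):
        """Insert x -> y; returns False (and inserts nothing) on a cycle."""
        self.adj[x].append(y)
        self.radj[y].append(x)
        lb, ub = self.pos[y], self.pos[x]
        if ub < lb:
            return True                       # order already consistent
        fwd = self._dfs(y, self.adj, lambda w: self.pos[w] <= ub)
        if x in fwd:                          # y already reaches x: cycle
            self.adj[x].pop()
            self.radj[y].pop()
            return False
        back = self._dfs(x, self.radj, lambda w: self.pos[w] >= lb)
        # Permute only the affected region: nodes reaching x come first,
        # then nodes reachable from y, each group keeping its own order.
        nodes = (sorted(back, key=self.pos.__getitem__)
                 + sorted(fwd, key=self.pos.__getitem__))
        slots = sorted(self.pos[v] for v in nodes)
        for v, p in zip(nodes, slots):
            self.pos[v] = p
        return True

t = IncrementalTopo(4)
for x, y in [(3, 2), (2, 1), (1, 0)]:   # each insertion reverses the order a bit
    assert t.add_edge(x, y)
assert t.pos[3] < t.pos[2] < t.pos[1] < t.pos[0]
```

Each insertion touches only the nodes whose positions lie between pos[y] and pos[x], which is what makes the average case much cheaper than recomputing a full topological sort.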

  15. Against the Odds: Academic Underdogs Benefit from Incremental Theories

    Science.gov (United States)

    Davis, Jody L.; Burnette, Jeni L.; Allison, Scott T.; Stone, Heather

    2011-01-01

    An implicit theory of ability approach to motivation argues that students who believe traits to be malleable (incremental theorists), relative to those who believe traits to be fixed (entity theorists), cope more effectively when academic challenges arise. In the current work, we integrated the implicit theory literature with research on top dog…

  16. The Incremental Hybrid Natural Element Method for Elastoplasticity Problems

    Directory of Open Access Journals (Sweden)

    Yongqi Ma

    2014-01-01

Full Text Available An incremental hybrid natural element method (HNEM) is proposed in this paper to solve two-dimensional elasto-plastic problems. The corresponding formulae of this method are obtained by consolidating the hybrid stress element and the incremental Hellinger-Reissner variational principle into the NEM. Using this method, the stress and displacement variables at each node can be directly obtained after the stress and displacement interpolation functions are properly constructed. Numerical examples are given to show the advantages of the proposed HNEM algorithm: its solutions for elasto-plastic problems are better than those of the NEM. In addition, the performance of the proposed algorithm is better than that of the stress recovery method using moving least-squares interpolation.

  17. Power calculation of linear and angular incremental encoders

    Science.gov (United States)

    Prokofev, Aleksandr V.; Timofeev, Aleksandr N.; Mednikov, Sergey V.; Sycheva, Elena A.

    2016-04-01

Automation technology is constantly expanding its role in improving the efficiency of manufacturing and testing processes in all branches of industry. More than ever before, the mechanical movements of linear slides, rotary tables, robot arms, actuators, etc. are numerically controlled. Linear and angular incremental photoelectric encoders measure mechanical motion and transmit the measured values back to the control unit. The capabilities of these systems are undergoing continual development in terms of their resolution, accuracy and reliability, their measuring ranges, and maximum speeds. This article discusses a method of power calculation for linear and angular incremental photoelectric encoders, used to find the optimum parameters of their components, such as light emitters, photodetectors, linear and angular scales, optical components, etc. It analyzes methods and devices that permit high resolutions on the order of 0.001 mm or 0.001°, as well as large measuring lengths of over 100 mm. In linear and angular incremental photoelectric encoders, the optical beam is usually formed by a condenser lens; it passes through the measuring unit and changes its value depending on the movement of a scanning head or measuring raster. The transmitted light beam is converted into an electrical signal by the photodetector block for processing in the electronic block. The starting point of the power calculation is therefore the required value of the optical signal at the input of the photodetector block, which must be reliably recorded and processed in the electronic unit of linear and angular incremental optoelectronic encoders.

  18. Numerical and experimental microscale analysis of the incremental forming process

    Science.gov (United States)

    Szyndler, Joanna; Delannay, Laurent; Muszka, Krzysztof; Madej, Lukasz

    2017-10-01

    Development of a 2D concurrent multiscale numerical model of the novel incremental forming (IF) process is the main aim of the paper. The IF process is used to obtain light and durable integral parts, which are especially useful in the aerospace and automotive industries. Particular attention in the present work is put on numerical investigation of material behavior at both the macro and micro scale levels. The Finite Element Method (FEM) supported by the Digital Material Representation (DMR) concept is used during the investigation. Also, Crystal Plasticity (CP) theory is applied to describe material flow at the grain level. Examples of obtained results from both the macro and micro scales are presented in the form of strain distributions, grain shapes and pole figures at different process stages. Moreover, Electron Backscatter Diffraction (EBSD) analysis is used to obtain detailed information regarding material morphology changes during incremental forming for comparison purposes.

  19. Thermomechanical simulations and experimental validation for high speed incremental forming

    Science.gov (United States)

    Ambrogio, Giuseppina; Gagliardi, Francesco; Filice, Luigino; Romero, Natalia

    2016-10-01

    Incremental sheet forming (ISF) consists of deforming only a small region of the workpiece with a punch driven by an NC machine. The drawback of this process is its slowness. In this study, a high speed variant has been investigated from both numerical and experimental points of view. The aim has been the design of a FEM model able to reproduce the material behavior during the high speed process by defining a thermomechanical model. An experimental campaign has been performed on a CNC lathe at high speed to test process feasibility. The first results have shown that the material presents the same performance as in conventional speed ISF and, in some cases, better material behavior due to the temperature increment. An accurate numerical simulation has been performed to investigate the material behavior during the high speed process, substantially confirming the experimental evidence.

  20. Incremental exposure facilitates adaptation to sensory rearrangement. [vestibular stimulation patterns]

    Science.gov (United States)

    Lackner, J. R.; Lobovits, D. N.

    1978-01-01

    Visual-target pointing experiments were performed on 24 adult volunteers in order to compare the relative effectiveness of incremental (stepwise) and single-step exposure conditions on adaptation to visual rearrangement. The differences between the preexposure and postexposure scores served as an index of the adaptation elicited during the exposure period. It is found that both single-step and stepwise exposure to visual rearrangement elicit compensatory changes in sensorimotor coordination. However, stepwise exposure, when compared to single-step exposure in terms of the average magnitude of visual displacement over the exposure period, clearly enhances the rate of adaptation. It seems possible that the enhancement of adaptation to unusual patterns of sensory stimulation produced by incremental exposure reflects a general principle of sensorimotor function.

  1. On kinematical minimum principles for rates and increments in plasticity

    International Nuclear Information System (INIS)

    Zouain, N.

    1984-01-01

    The optimization approach for elastoplastic analysis is discussed, showing that some minimum principles related to numerical methods can be derived by means of duality and penalization procedures. Three minimum principles for velocity and plastic multiplier rate fields are presented in the framework of perfect plasticity. The first one is the classical Greenberg formulation. The second one, due to Capurso, is developed here with a different motivation, and modified by penalization of constraints so as to arrive at a third principle for rates. The counterparts of these optimization formulations in terms of discrete increments of displacements and plastic multipliers are discussed. The third of these minimum principles for finite increments is recognized to be closely related to Maier's formulation of holonomic plasticity. (Author) [pt

  2. Will Incremental Hemodialysis Preserve Residual Function and Improve Patient Survival?

    Science.gov (United States)

    Davenport, Andrew

    2015-01-01

    The progressive loss of residual renal function in peritoneal dialysis patients is associated with increased mortality. It has been suggested that incremental dialysis may help preserve residual renal function and improve patient survival. Residual renal function depends upon both patient-related and dialysis-associated factors. Maintaining patients in an over-hydrated state may be associated with better preservation of residual renal function, but any benefit comes with a significant risk of cardiovascular consequences. Notably, it is only observational studies that have reported an association between dialysis patient survival and residual renal function; causality has not been established. The tenuous connections between residual renal function and outcomes, and between incremental hemodialysis and residual renal function, should temper our enthusiasm for interventions in this area. PMID:25385441

  3. Creating Helical Tool Paths for Single Point Incremental Forming

    DEFF Research Database (Denmark)

    Skjødt, Martin; Hancock, Michael H.; Bay, Niels

    2007-01-01

    Single point incremental forming (SPIF) is a relatively new sheet forming process. A sheet is clamped in a rig and formed incrementally using a rotating single point tool in the form of a rod with a spherical end. The process is often performed on a CNC milling machine and the tool movement...... is programmed using CAM software intended for surface milling. Often the function called profile milling or contour milling is applied. Using this milling function the tool only has a continuous feed rate in two directions X and Y, which is the plane of the undeformed sheet. The feed in the vertical Z direction...... from the profile milling code and converts them into a helical tool path with continuous feed in all three directions. Using the helical tool path the scarring is removed, the part is otherwise unchanged and a major disadvantage of using milling software for SPIF is removed. The solution...

  4. An online incremental orthogonal component analysis method for dimensionality reduction.

    Science.gov (United States)

    Zhu, Tao; Xu, Ye; Shen, Furao; Zhao, Jinxi

    2017-01-01

    In this paper, we introduce a fast linear dimensionality reduction method named incremental orthogonal component analysis (IOCA). IOCA is designed to automatically extract desired orthogonal components (OCs) in an online environment. The OCs and the low-dimensional representations of the original data are obtained with only one pass through the entire dataset. Without solving a matrix eigenproblem or a matrix inversion problem, IOCA learns incrementally from a continuous data stream with low computational cost. By proposing an adaptive threshold policy, IOCA is able to automatically determine the dimension of the feature subspace. Meanwhile, the quality of the learned OCs is guaranteed. The analysis and experiments demonstrate that IOCA is simple, but efficient and effective. Copyright © 2016 Elsevier Ltd. All rights reserved.
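
    The behavior described (one pass, no eigenproblem, a threshold deciding when to grow the basis) can be sketched with a simple Gram-Schmidt-style update. This is a generic illustration of the idea, not the authors' exact IOCA algorithm, and the fixed threshold stands in for their adaptive threshold policy:

```python
import numpy as np

def incremental_ocs(stream, threshold=0.5):
    """Single pass over a data stream: project each sample onto the current
    orthonormal components; when the residual is large, grow the basis by
    one component (the normalized residual)."""
    comps = []    # orthonormal components, grown adaptively
    codes = []    # low-dimensional representation of each sample
    for x in stream:
        x = np.asarray(x, dtype=float)
        proj = np.array([c @ x for c in comps])
        residual = x - sum(p * c for p, c in zip(proj, comps))
        r = np.linalg.norm(residual)
        if r > threshold:                    # growth step: sample poorly represented
            comps.append(residual / r)
            proj = np.append(proj, r)
        codes.append(proj)
    return comps, codes
```

    Because each new component is a residual orthogonal to all previous ones, the learned basis stays orthonormal by construction, which mirrors the quality guarantee on the OCs mentioned in the abstract.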

  5. The Analysis of Forming Forces in Single Point Incremental Forming

    Directory of Open Access Journals (Sweden)

    Koh Kyung Hee

    2016-01-01

    Full Text Available Incremental forming is a process for producing sheet metal parts quickly. Because there is no need for dedicated dies and molds, the process costs less and takes less time. The purpose of this study is to investigate the forming forces in single point incremental forming. Forming forces are measured with a dynamometer while producing an aluminum cone frustum and are then analyzed. These forces are compared with the cutting forces that arise when machining the same geometrical shapes of the experimental parts. The forming forces in the Z direction are 40 times larger than the machining forces. The spindle and axes of a forming machine should therefore be designed to withstand the forming forces.

  6. Incremental Knowledge Base Construction Using DeepDive.

    Science.gov (United States)

    Shin, Jaeho; Wu, Sen; Wang, Feiran; De Sa, Christopher; Zhang, Ce; Ré, Christopher

    2015-07-01

    Populating a database with unstructured information is a long-standing problem in industry and research that encompasses problems of extraction, cleaning, and integration. Recent names used for this problem include dealing with dark data and knowledge base construction (KBC). In this work, we describe DeepDive, a system that combines database and machine learning ideas to help develop KBC systems, and we present techniques to make the KBC process more efficient. We observe that the KBC process is iterative, and we develop techniques to incrementally produce inference results for KBC systems. We propose two methods for incremental inference, based respectively on sampling and variational techniques. We also study the tradeoff space of these methods and develop a simple rule-based optimizer. DeepDive includes all of these contributions, and we evaluate DeepDive on five KBC systems, showing that it can speed up KBC inference tasks by up to two orders of magnitude with negligible impact on quality.

  7. Final Safety Analysis Report (FSAR) for Building 332, Increment III

    Energy Technology Data Exchange (ETDEWEB)

    Odell, B. N.; Toy, Jr., A. J.

    1977-08-31

    This Final Safety Analysis Report (FSAR) supplements the Preliminary Safety Analysis Report (PSAR), dated January 18, 1974, for Building 332, Increment III of the Plutonium Materials Engineering Facility located at the Lawrence Livermore Laboratory (LLL). The FSAR, in conjunction with the PSAR, shows that the completed increment provides facilities for safely conducting the operations as described. These documents satisfy the requirements of ERDA Manual Appendix 6101, Annex C, dated April 8, 1971. The format and content of this FSAR comply with the basic requirements of the letter of request from ERDA San to LLL, dated March 10, 1972. Included as appendices in support of the FSAR are the Building 332 Operational Safety Procedure and the LLL Disaster Control Plan.

  8. Table incremental slow injection CE-CT in lung cancer

    International Nuclear Information System (INIS)

    Yoshida, Shoji; Maeda, Tomoho; Morita, Masaru

    1988-01-01

    The purpose of this study is to evaluate tumor enhancement in lung cancer using a table incremental study with slow injection of contrast media. Early serial images of 8 slices were obtained during the slow injection (1.5 ml/sec) of contrast media. Following the early images, delayed images of the same 8 slices were taken 2 minutes later. Characteristic enhancement patterns of the primary cancer and metastatic mediastinal lymph nodes were recognized in this study. Enhancement of the primary lesion was classified into 4 patterns: irregular geographic, heterogeneous, homogeneous and rim-enhanced. In metastatic mediastinal lymphadenopathy, three enhancement patterns were observed: heterogeneous, homogeneous and ring-enhanced. Some characteristic enhancement patterns according to the histopathological findings of the lung cancer were obtained. Using this incremental slow injection CE-CT, precise information about the relationship between the lung cancer and adjacent mediastinal structures, as well as clear staining patterns of the tumor and mediastinal lymph nodes, was obtained. (author)

  9. Automobile sheet metal part production with incremental sheet forming

    Directory of Open Access Journals (Sweden)

    İsmail DURGUN

    2016-02-01

    Full Text Available Nowadays, the effect of global warming is increasing drastically, which leads to increased interest in energy efficiency and sustainable production methods. As a result of these adverse conditions, national and international project platforms, OEMs (Original Equipment Manufacturers) and SMEs (Small and Mid-size Manufacturers) perform many studies or improve existing methodologies within the scope of advanced manufacturing techniques. In this study, the advanced and sustainable production method "Incremental Sheet Metal Forming (ISF)" was used for the sheet metal forming process. A vehicle fender was manufactured with and without a die by using different toolpath strategies and die sets. At the end of the study, the results were investigated with respect to the method and parameters used. Keywords: Incremental sheet metal forming, Metal forming

  10. Predicting Robust Vocabulary Growth from Measures of Incremental Learning

    OpenAIRE

    Frishkoff, Gwen A.; Perfetti, Charles A.; Collins-Thompson, Kevyn

    2011-01-01

    We report a study of incremental learning of new word meanings over multiple episodes. A new method called MESA (Markov Estimation of Semantic Association) tracked this learning through the automated assessment of learner-generated definitions. The multiple word learning episodes varied in the strength of contextual constraint provided by sentences, in the consistency of this constraint, and in the spacing of sentences provided for each trained word. Effects of reading skill were also examined...

  11. Nonparametric causal effects based on incremental propensity score interventions

    OpenAIRE

    Kennedy, Edward H.

    2017-01-01

    Most work in causal inference considers deterministic interventions that set each unit's treatment to some fixed value. However, under positivity violations these interventions can lead to non-identification, inefficiency, and effects with little practical relevance. Further, corresponding effects in longitudinal studies are highly sensitive to the curse of dimensionality, resulting in widespread use of unrealistic parametric models. We propose a novel solution to these problems: incremental ...
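
    In this line of work, the incremental intervention does not fix treatment deterministically; it multiplies each unit's odds of treatment by a factor delta, which keeps the shifted propensity inside (0, 1) and sidesteps positivity violations. A minimal sketch of that shift (the paper's estimands and estimators are considerably more involved):

```python
def shifted_propensity(pi, delta):
    """Treatment probability after an incremental intervention that
    multiplies the odds of treatment pi / (1 - pi) by delta;
    delta = 1 leaves the observational treatment process unchanged."""
    return delta * pi / (delta * pi + 1.0 - pi)

# a unit with a 30% chance of treatment, after doubling its odds of treatment:
print(shifted_propensity(0.3, 2.0))   # odds 3/7 -> 6/7, i.e. probability 6/13
```

    Sweeping delta over a range then traces out a curve of effects, one for each degree of encouragement toward (or away from) treatment, rather than a single all-treated vs. all-untreated contrast.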

  12. Diagnosis of small hepatocellular carcinoma by incremental dynamic CT

    International Nuclear Information System (INIS)

    Uchida, Masafumi; Kumabe, Tsutomu; Edamitsu, Osamu

    1993-01-01

    Thirty cases of pathologically confirmed small hepatocellular carcinoma were examined by Incremental Dynamic CT (ICT). ICT scanned the whole liver with a single-breath-hold technique; therefore, effective early contrast enhancement could be obtained for diagnosis. Among the 30 tumors, 26 were detected, a detection rate of 87%. A high detection rate was obtained for tumors more than 20 mm in diameter. Twenty-two of the 26 tumors could be diagnosed correctly. ICT examination was useful for the detection of small hepatocellular carcinoma. (author)

  13. Observers for Systems with Nonlinearities Satisfying an Incremental Quadratic Inequality

    Science.gov (United States)

    Acikmese, Ahmet Behcet; Corless, Martin

    2004-01-01

    We consider the problem of state estimation for nonlinear time-varying systems whose nonlinearities satisfy an incremental quadratic inequality. These observer results unify earlier results in the literature and extend them to some additional classes of nonlinearities. Observers are presented which guarantee that the state estimation error converges exponentially to zero. Observer design involves solving linear matrix inequalities for the observer gain matrices. The results are illustrated by application to a simple model of an underwater vehicle.
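
    A minimal simulation of the observer structure (a copy of the plant plus output-error injection) for a hypothetical discrete-time system with a Lipschitz nonlinearity, one special case of an incremental quadratic constraint. The system matrices, the nonlinearity, and the gain L below are all invented for illustration; in the paper the gain would come from solving linear matrix inequalities:

```python
import numpy as np

# Hypothetical plant x+ = A x + phi(x), output y = C x, with phi Lipschitz.
# Observer: xhat+ = A xhat + phi(xhat) + L (y - C xhat).
A = np.array([[0.9, 0.2],
              [0.0, 0.8]])
C = np.array([[1.0, 0.0]])
L = np.array([[0.5],
              [0.1]])                    # hand-picked gain making A - L C stable
phi = lambda z: 0.05 * np.sin(z)         # Lipschitz constant 0.05

def estimation_error(steps):
    x = np.array([1.0, -1.0])            # true state
    xh = np.zeros(2)                     # observer state, initialized wrong
    for _ in range(steps):
        y = C @ x                        # measured output of the true plant
        x = A @ x + phi(x)
        xh = A @ xh + phi(xh) + L @ (y - C @ xh)
    return float(np.linalg.norm(x - xh))

print(estimation_error(200))             # error decays toward zero
```

    The error dynamics are e+ = (A - LC) e + (phi(x) - phi(xhat)); since A - LC is stable and the Lipschitz constant is small, the estimation error contracts at every step, which is the discrete-time analogue of the exponential convergence guarantee in the abstract.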

  14. Retroactive Operations: On 'increments' in Mandarin Chinese conversations

    OpenAIRE

    Lim, Ni Eng

    2014-01-01

    Conversation Analysis (CA) has established repair (Schegloff, Jefferson & Sacks 1977; Schegloff 1979; Kitzinger 2013) as a conversational mechanism for managing contingencies of talk-in-interaction. In this dissertation, I look at a particular sort of `repair' termed TCU-continuations (otherwise known as increments in other literature) in Mandarin Chinese (henceforth "Chinese"), broadly defined as speakers producing further talk after a possibly complete utterance, which is fashioned not as a...

  15. Incremental validity of emotional intelligence ability in predicting academic achievement.

    Science.gov (United States)

    Lanciano, Tiziana; Curci, Antonietta

    2014-01-01

    We tested the incremental validity of an ability measure of emotional intelligence (EI) in predicting academic achievement in undergraduate students, controlling for cognitive abilities and personality traits. Academic achievement has been conceptualized in terms of the number of exams, grade point average, and study time taken to prepare for each exam. Additionally, gender differences were taken into account in these relationships. Participants filled in the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT), the Raven's Advanced Progressive Matrices, the reduced version of the Eysenck Personality Questionnaire, and academic achievement measures. Results showed that EI abilities were positively related to academic achievement indices, such as the number of exams and grade point average; total EI ability and the Perceiving branch were negatively associated with the study time spent preparing for exams. Furthermore, EI ability adds a percentage of incremental variance with respect to cognitive ability and personality variables in explaining scholastic success. The magnitude of the associations between EI abilities and academic achievement measures was generally higher for men than for women. Jointly considered, the present findings support the incremental validity of the MSCEIT and provide positive indications of the importance of EI in students' academic development. The helpfulness of EI training in the context of academic institutions is discussed.

  16. Health level seven interoperability strategy: big data, incrementally structured.

    Science.gov (United States)

    Dolin, R H; Rogers, B; Jaffe, C

    2015-01-01

    Describe how the HL7 Clinical Document Architecture (CDA), a foundational standard in US Meaningful Use, contributes to a "big data, incrementally structured" interoperability strategy, whereby data structured incrementally gets large amounts of data flowing faster. We present cases showing how this approach is leveraged for big data analysis. To support the assertion that semi-structured narrative in CDA format can be a useful adjunct in an overall big data analytic approach, we present two case studies. The first assesses an organization's ability to generate clinical quality reports using coded data alone vs. coded data supplemented by CDA narrative. The second leverages CDA to construct a network model for referral management, from which additional observations can be gleaned. The first case shows that coded data supplemented by CDA narrative resulted in significant variances in calculated performance scores. In the second case, we found that the constructed network model enables the identification of differences in patient characteristics among different referral work flows. The CDA approach goes after data indirectly, by focusing first on the flow of narrative, which is then incrementally structured. A quantitative assessment of whether this approach will lead to a greater flow of data and ultimately a greater flow of structured data vs. other approaches is planned as a future exercise. Along with growing adoption of CDA, we are now seeing the big data community explore the standard, particularly given its potential to supply analytic engines with volumes of data previously not possible.

  17. Information bottleneck based incremental fuzzy clustering for large biomedical data.

    Science.gov (United States)

    Liu, Yongli; Wan, Xing

    2016-08-01

    Incremental fuzzy clustering combines the advantages of fuzzy clustering and incremental clustering, and is therefore important for classifying large collections of biomedical literature. Conventional algorithms, suffering from data sparsity and high dimensionality, often fail to produce reasonable results and may even assign all the objects to a single cluster. In this paper, we propose two incremental algorithms based on the information bottleneck, Single-Pass fuzzy c-means (spFCM-IB) and Online fuzzy c-means (oFCM-IB). These two algorithms modify the conventional algorithms by considering different weights for each centroid and object and by scoring mutual information loss to measure the distance between centroids and objects. spFCM-IB and oFCM-IB are used to group a collection of biomedical text abstracts from the Medline database. Experimental results show that the clustering performance of our approaches is better than that of such prominent counterparts as spFCM, spHFCM, oFCM and oHFCM in terms of accuracy. Copyright © 2016 Elsevier Inc. All rights reserved.
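
    The single-pass idea behind algorithms of this family can be sketched generically: cluster one chunk at a time, carrying the centroids forward into the next chunk weighted by the amount of data they summarize. The sketch below is a simplified illustration using plain weighted fuzzy c-means with Euclidean distance; spFCM-IB would instead score a mutual-information loss and use the weighting scheme described in the abstract:

```python
import numpy as np

def fcm_chunk(X, w, centers, m=2.0, iters=20):
    """Weighted fuzzy c-means on one chunk (Euclidean distance stands in
    for the information-bottleneck loss used by the paper's variants)."""
    centers = centers.astype(float)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)
        um = (u ** m) * w[:, None]                      # fuzzified, weighted memberships
        centers = (um.T @ X) / um.sum(axis=0)[:, None]  # weighted centroid update
    return centers, u

def single_pass_fcm(chunks, k, m=2.0, seed=0):
    """Process chunks one at a time; previous centroids join the next chunk
    as pseudo-points carrying the weight of the data they summarize."""
    rng = np.random.default_rng(seed)
    centers, cw = None, None
    for X in chunks:
        if centers is None:
            centers = X[rng.choice(len(X), size=k, replace=False)].copy()
            cw = np.zeros(k)
        data = np.vstack([X, centers])
        w = np.concatenate([np.ones(len(X)), cw])
        centers, u = fcm_chunk(data, w, centers, m)
        cw = ((u ** m) * w[:, None]).sum(axis=0)        # weight absorbed per centroid
    return centers
```

    Only the current chunk plus k weighted centroids ever reside in memory, which is what makes the single-pass scheme viable for corpora too large to cluster at once.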

  18. Design and Performance Analysis of Incremental Networked Predictive Control Systems.

    Science.gov (United States)

    Pang, Zhong-Hua; Liu, Guo-Ping; Zhou, Donghua

    2016-06-01

    This paper is concerned with the design and performance analysis of networked control systems with network-induced delay, packet disorder, and packet dropout. Based on the incremental form of the plant input-output model and an incremental error feedback control strategy, an incremental networked predictive control (INPC) scheme is proposed to actively compensate for the round-trip time delay resulting from the above communication constraints. The output tracking performance and closed-loop stability of the resulting INPC system are considered for two cases: 1) plant-model match case and 2) plant-model mismatch case. For the former case, the INPC system can achieve the same output tracking performance and closed-loop stability as those of the corresponding local control system. For the latter case, a sufficient condition for the stability of the closed-loop INPC system is derived using the switched system theory. Furthermore, for both cases, the INPC system can achieve a zero steady-state output tracking error for step commands. Finally, both numerical simulations and practical experiments on an Internet-based servo motor system illustrate the effectiveness of the proposed method.

  19. Incremental Support Vector Machine Framework for Visual Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yuichi Motai

    2007-01-01

    Full Text Available Motivated by the emerging requirements of surveillance networks, we present in this paper an incremental multiclassification support vector machine (SVM technique as a new framework for action classification based on real-time multivideo collected by homogeneous sites. The technique is based on an adaptation of least square SVM (LS-SVM formulation but extends beyond the static image-based learning of current SVM methodologies. In applying the technique, an initial supervised offline learning phase is followed by a visual behavior data acquisition and an online learning phase during which the cluster head performs an ensemble of model aggregations based on the sensor nodes inputs. The cluster head then selectively switches on designated sensor nodes for future incremental learning. Combining sensor data offers an improvement over single camera sensing especially when the latter has an occluded view of the target object. The optimization involved alleviates the burdens of power consumption and communication bandwidth requirements. The resulting misclassification error rate, the iterative error reduction rate of the proposed incremental learning, and the decision fusion technique prove its validity when applied to visual sensor networks. Furthermore, the enabled online learning allows an adaptive domain knowledge insertion and offers the advantage of reducing both the model training time and the information storage requirements of the overall system which makes it even more attractive for distributed sensor networks communication.

  20. Parallel Algorithm for Incremental Betweenness Centrality on Large Graphs

    KAUST Repository

    Jamour, Fuad Tarek

    2017-10-17

    Betweenness centrality quantifies the importance of nodes in a graph in many applications, including network analysis, community detection and identification of influential users. Typically, graphs in such applications evolve over time. Thus, the computation of betweenness centrality should be performed incrementally. This is challenging because updating even a single edge may trigger the computation of all-pairs shortest paths in the entire graph. Existing approaches cannot scale to large graphs: they either require excessive memory (i.e., quadratic to the size of the input graph) or perform unnecessary computations rendering them prohibitively slow. We propose iCentral, a novel incremental algorithm for computing betweenness centrality in evolving graphs. We decompose the graph into biconnected components and prove that processing can be localized within the affected components. iCentral is the first algorithm to support incremental betweenness centrality computation within a graph component. This is done efficiently, in linear space; consequently, iCentral scales to large graphs. We demonstrate with real datasets that the serial implementation of iCentral is up to 3.7 times faster than existing serial methods. Our parallel implementation, which scales to large graphs, is an order of magnitude faster than the state-of-the-art parallel algorithm, while using an order of magnitude less computational resources.
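
    For context, the from-scratch computation that iCentral avoids repeating on every edge update is Brandes' algorithm. A minimal pure-Python version for unweighted, undirected graphs (this is the baseline, not the incremental algorithm itself):

```python
from collections import deque

def brandes_betweenness(adj):
    """Brandes' betweenness centrality for an unweighted, undirected graph
    given as {node: list of neighbors}. iCentral localizes the re-execution
    of exactly this kind of computation to one biconnected component."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        stack = []
        pred = {v: [] for v in adj}              # shortest-path predecessors
        sigma = {v: 0 for v in adj}; sigma[s] = 1  # shortest-path counts
        dist = {v: -1 for v in adj}; dist[s] = 0
        queue = deque([s])
        while queue:                             # BFS from the source s
            v = queue.popleft()
            stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    pred[w].append(v)
        delta = {v: 0.0 for v in adj}
        while stack:                             # back-propagate dependencies
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1.0 + delta[w])
            if w != s:
                bc[w] += delta[w] / 2.0          # undirected: each pair counted twice
    return bc
```

    Running this over the whole graph costs O(nm) time per update; iCentral's contribution is showing that, after an edge change, the re-computation can be confined to the biconnected component containing that edge.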

  1. Context-dependent incremental timing cells in the primate hippocampus.

    Science.gov (United States)

    Sakon, John J; Naya, Yuji; Wirth, Sylvia; Suzuki, Wendy A

    2014-12-23

    We examined timing-related signals in primate hippocampal cells as animals performed an object-place (OP) associative learning task. We found hippocampal cells with firing rates that incrementally increased or decreased across the memory delay interval of the task, which we refer to as incremental timing cells (ITCs). Three distinct categories of ITCs were identified. Agnostic ITCs did not distinguish between different trial types. The remaining two categories of cells signaled time and trial context together: One category of cells tracked time depending on the behavioral action required for a correct response (i.e., early vs. late release), whereas the other category of cells tracked time only for those trials cued with a specific OP combination. The context-sensitive ITCs were observed more often during sessions where behavioral learning was observed and exhibited reduced incremental firing on incorrect trials. Thus, single primate hippocampal cells signal information about trial timing, which can be linked with trial type/context in a learning-dependent manner.

  2. Product Quality Modelling Based on Incremental Support Vector Machine

    International Nuclear Information System (INIS)

    Wang, J; Zhang, W; Qin, B; Shi, W

    2012-01-01

    Incremental support vector machine (ISVM) is a new learning method developed in recent years based on the foundations of statistical learning theory. It is suitable for problems with sequentially arriving field data and has been widely used for product quality prediction and production process optimization. However, traditional ISVM learning does not consider the quality of the incremental data, which may contain noise and redundant data; this affects the learning speed and accuracy to a great extent. In order to improve SVM training speed and accuracy, a modified incremental support vector machine (MISVM) is proposed in this paper. Firstly, the margin vectors are extracted according to the Karush-Kuhn-Tucker (KKT) condition; then the distance from the margin vectors to the final decision hyperplane is calculated to evaluate the importance of the margin vectors, and margin vectors are removed when their distance exceeds the specified value; finally, the original SVs and the remaining margin vectors are used to update the SVM. The proposed MISVM can not only eliminate unimportant samples such as noise samples, but can also preserve the important samples. The MISVM has been tested on two public datasets and one field dataset of zinc coating weight in strip hot-dip galvanizing, and the results show that the proposed method can improve the prediction accuracy and the training speed effectively. Furthermore, it can provide the necessary decision support and analysis tools for automatic control of product quality, and can also be extended to other process industries, such as chemical and manufacturing processes.
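
    The pruning step, computing each margin vector's distance to the decision hyperplane and discarding those beyond a specified value, can be sketched in a few lines. The hyperplane and the sample points below are invented for illustration, not taken from the paper:

```python
import numpy as np

def prune_margin_vectors(X, w, b, max_dist):
    """Distance of each sample to the hyperplane w.x + b = 0; samples farther
    than max_dist are dropped, mirroring MISVM's removal of margin vectors
    whose distance to the decision hyperplane exceeds the specified value."""
    dist = np.abs(X @ w + b) / np.linalg.norm(w)
    return X[dist <= max_dist], dist

# hypothetical margin vectors and a hypothetical trained hyperplane
X = np.array([[0.2, 0.1], [1.5, 1.5], [0.0, -0.3], [3.0, 2.0]])
w, b = np.array([1.0, 1.0]), -0.5
kept, dist = prune_margin_vectors(X, w, b, max_dist=1.0)
print(len(kept), "of", len(X), "margin vectors kept")
```

    The retained points, together with the original support vectors, would then feed the next SVM update; the threshold trades retraining speed against the risk of discarding samples that would later become support vectors.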

  4. Incremental Closed-loop Identification of Linear Parameter Varying Systems

    DEFF Research Database (Denmark)

    Bendtsen, Jan Dimon; Trangbæk, Klaus

    2011-01-01

    This paper deals with system identification for control of linear parameter varying systems. In practical applications, it is often important to be able to identify small plant changes in an incremental manner without shutting down the system and/or disconnecting the controller; unfortunately, closed-loop system identification is more difficult than open-loop identification. In this paper we prove that the so-called Hansen Scheme, a technique known from linear time-invariant systems theory for transforming closed-loop system identification problems into open-loop-like problems, can be extended to accommodate linear parameter varying systems as well.

  5. Minimizing System Modification in an Incremental Design Approach

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Pop, Traian

    2001-01-01

    In this paper we present an approach to mapping and scheduling of distributed embedded systems for hard real-time applications, aiming at minimizing the system modification cost. We consider an incremental design process that starts from an already existing system running a set of applications. We are interested in implementing new functionality so that the already running applications are disturbed as little as possible and there is a good chance that, later, new functionality can easily be added to the resulting system. The mapping and scheduling problem are considered in the context of a realistic

  6. An Approach to Incremental Design of Distributed Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Pop, Traian

    2001-01-01

    In this paper we present an approach to incremental design of distributed embedded systems for hard real-time applications. We start from an already existing system running a set of applications, and the design problem is to implement new functionality on this system. Thus, we propose mapping strategies for the new functionality so that the already running functionality is not disturbed and there is a good chance that, later, new functionality can easily be mapped onto the resulting system. The mapping and scheduling for hard real-time embedded systems are considered in the context of a realistic communication…

  7. Spectral evolution with incremental nanocoating of long period fiber gratings

    Science.gov (United States)

    Del Villar, Ignacio; Corres, Jesus M.; Achaerandio, Miguel; Arregui, Francisco J.; Matias, Ignacio R.

    2006-12-01

    The incremental deposition of a thin overlay on the cladding of a long-period fiber grating (LPFG) induces important resonance wavelength shifts in the transmission spectrum. The phenomenon is proved theoretically with a vectorial method based on hybrid modes and coupled-mode theory, and experimentally with an electrostatic self-assembled monolayer process. The phenomenon repeats periodically for specific overlay thickness values, with the particularity that the shape of the resonance wavelength shift depends on the thickness of the overlay. The main applications are the design of wide optical filters and multiparameter sensing devices.

  8. Resolution enhancement of terahertz imaging by incremental Wiener filtering

    Science.gov (United States)

    Hong, Zhi; Xiao, Wenhua; Chen, Haibin

    2009-07-01

    A two-dimensional raster-scanning terahertz imaging system based on a continuous-wave backward-wave oscillator (BWO) terahertz source is built. To improve the spatial resolution of the system, a scanning step smaller than the focused terahertz spot size is used to acquire the transmission image. After pre-processing the image with wavelet filtering and applying incremental Wiener filtering for image restoration, a terahertz image with resolution higher than the diffraction limit of the system can be obtained. Our imaging experiments show that this method both enhances spatial resolution and suppresses imaging noise effectively.

  9. Testing single point incremental forming molds for thermoforming operations

    Science.gov (United States)

    Afonso, Daniel; de Sousa, Ricardo Alves; Torcato, Ricardo

    2016-10-01

    Low-pressure polymer processing operations such as thermoforming or rotational molding use much simpler molds than high-pressure processes like injection molding. However, despite the low forces involved, mold manufacturing for these operations is still a very material-, energy- and time-consuming activity. The goal of this research is to develop and validate a method for manufacturing plastically formed sheet metal molds by single point incremental forming (SPIF) for thermoforming operations. Stewart-platform-based SPIF machines allow the forming of thick metal sheets, granting the required structural stiffness for the mold surface while keeping short manufacturing lead times and low thermal inertia.

  10. Incremental Beliefs About Ability Ameliorate Self-Doubt Effects

    Directory of Open Access Journals (Sweden)

    Qin Zhao

    2015-12-01

    Past research has typically shown negative effects of self-doubt on performance and psychological well-being. We suggest that these self-doubt effects may largely be due to an underlying assumption that ability is innate and fixed. The present research investigated the main hypothesis that incremental beliefs about ability might ameliorate the negative effects of self-doubt. We examined our hypotheses using two lab tasks: verbal reasoning and anagrams. Participants' self-doubt was measured, and beliefs about ability were measured after participants read articles advocating either incremental or entity theories of ability. American College Testing (ACT) scores were obtained to index actual ability level. Consistent with our hypothesis, for participants who believed ability was relatively fixed, higher self-doubt was associated with increased negative affect and lower task performance and engagement. In contrast, for participants who believed that ability was malleable, negative self-doubt effects were ameliorated; self-doubt was even associated with better task performance. These effects were further moderated by participants' academic ability. These findings suggest that mind-sets about ability moderate self-doubt effects. Self-doubt may have negative effects only when it is interpreted as signaling that ability is immutably low.

  11. Incremental Scheduling Engines for Human Exploration of the Cosmos

    Science.gov (United States)

    Jaap, John; Phillips, Shaun

    2005-01-01

    As humankind embarks on longer space missions farther from home, the requirements and environments for scheduling the activities performed on these missions are changing. As we begin to prepare for these missions it is appropriate to evaluate the merits and applicability of the different types of scheduling engines. Scheduling engines temporally arrange tasks onto a timeline so that all constraints and objectives are met and resources are not overbooked. Scheduling engines used to schedule space missions fall into three general categories: batch, mixed-initiative, and incremental. This paper presents an assessment of the engine types, a discussion of the impact of human exploration of the moon and Mars on planning and scheduling, and the applicability of the different types of scheduling engines. This paper will pursue the hypothesis that incremental scheduling engines may have a place in the new environment; they have the potential to reduce cost, to improve the satisfaction of those who execute or benefit from a particular timeline (the customers), and to allow astronauts to plan their own tasks and those of their companion robots.

  12. Communication: Phase incremented echo train acquisition in NMR spectroscopy.

    Science.gov (United States)

    Baltisberger, Jay H; Walder, Brennan J; Keeler, Eric G; Kaseman, Derrick C; Sanders, Kevin J; Grandinetti, Philip J

    2012-06-07

    We present an improved and general approach for implementing echo train acquisition (ETA) in magnetic resonance spectroscopy, particularly where the conventional approach of Carr-Purcell-Meiboom-Gill (CPMG) acquisition would produce numerous artifacts. Generally, adding ETA to any N-dimensional experiment creates an N + 1 dimensional experiment, with an additional dimension associated with the echo count, n, or an evolution time that is an integer multiple of the spacing between echo maxima. Here we present a modified approach, called phase incremented echo train acquisition (PIETA), where the phase of the mixing pulse and every other refocusing pulse, φ(P), is incremented as a single variable, creating an additional phase dimension in what becomes an N + 2 dimensional experiment. A Fourier transform with respect to the PIETA phase, φ(P), converts the φ(P) dimension into a Δp dimension where desired signals can be easily separated from undesired coherence transfer pathway signals, thereby avoiding cumbersome or intractable phase cycling schemes where the receiver phase must follow a master equation. This simple modification eliminates numerous artifacts present in NMR experiments employing CPMG acquisition and allows "single-scan" measurements of transverse relaxation and J-couplings. Additionally, unlike CPMG, we show how PIETA can be appended to experiments with phase modulated signals after the mixing pulse.
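
    The separation step, a Fourier transform with respect to the PIETA phase that maps the φ(P) dimension onto a Δp dimension, can be sketched numerically. The pathway model below (two synthetic signals tagged by exp(1j·Δp·φ)) is a toy assumption for illustration, not an NMR simulation:

```python
import numpy as np

n_phi = 16                                 # number of PIETA phase increments
phi = 2 * np.pi * np.arange(n_phi) / n_phi

# Toy model: two overlapping signals that followed different coherence
# transfer pathways; each is tagged by the pulse phase as exp(1j*dp*phi).
t = np.arange(64) / 64.0
s_wanted = np.exp(2j * np.pi * 5 * t) * np.exp(-t / 0.5)   # pathway dp = -2
s_artifact = np.exp(2j * np.pi * 12 * t)                   # pathway dp = 0
data = np.exp(1j * (-2) * phi)[:, None] * s_wanted + s_artifact

# A Fourier transform along the phase dimension converts phi into a
# coherence-order-change (delta-p) dimension.
dp_spectrum = np.fft.fft(data, axis=0) / n_phi
dp_axis = np.fft.fftfreq(n_phi, d=1.0 / n_phi).astype(int)

# The desired pathway sits in the dp = -2 slice, free of the dp = 0 artifact.
idx = int(np.where(dp_axis == -2)[0][0])
recovered = dp_spectrum[idx]
print(np.allclose(recovered, s_wanted))   # → True
```

    The dp = 0 slice contains only the artifact signal, which is exactly the separation of desired from undesired coherence transfer pathways that otherwise requires phase cycling.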

  13. Incremental concept learning with few training examples and hierarchical classification

    Science.gov (United States)

    Bouma, Henri; Eendebak, Pieter T.; Schutte, Klamer; Azzopardi, George; Burghouts, Gertjan J.

    2015-10-01

    Object recognition and localization are important to automatically interpret video and allow better querying on its content. We propose a method for object localization that learns incrementally and addresses four key aspects. Firstly, we show that for certain applications, recognition is feasible with only a few training samples. Secondly, we show that novel objects can be added incrementally without retraining existing objects, which is important for fast interaction. Thirdly, we show that an unbalanced number of positive training samples leads to biased classifier scores that can be corrected by modifying weights. Fourthly, we show that the detector performance can deteriorate due to hard-negative mining for similar or closely related classes (e.g., for Barbie and dress, because the doll is wearing a dress). This can be solved by our hierarchical classification. We introduce a new dataset, which we call TOSO, and use it to demonstrate the effectiveness of the proposed method for the localization and recognition of multiple objects in images.

  14. Efficient incremental relaying for packet transmission over fading channels

    KAUST Repository

    Fareed, Muhammad Mehboob

    2014-07-01

    In this paper, we propose a novel relaying scheme for packet transmission over fading channels, which improves the spectral efficiency of cooperative diversity systems by utilizing limited feedback from the destination. Our scheme capitalizes on the fact that relaying is only required when the direct transmission suffers a deep fade. We calculate the packet error rate for the proposed efficient incremental relaying (EIR) scheme with both amplify-and-forward and decode-and-forward relaying. We compare the performance of the EIR scheme with the threshold-based incremental relaying (TIR) scheme. It is shown that the efficiency of the TIR scheme is better for lower values of the threshold; for higher threshold values, however, the TIR scheme is outperformed by the EIR. In addition, three new threshold-based adaptive EIR schemes are devised to further improve the efficiency of the EIR scheme. We calculate the packet error rate and the efficiency of these new schemes to provide analytical insight. © 2014 IEEE.
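
    The basic trade-off of incremental relaying, where the relay spends a second channel use only when the destination reports a deep fade, can be illustrated with a small Monte Carlo sketch. The SNR values, outage threshold, and the simple SNR-addition model of combining are assumptions for illustration, not the paper's analysis:

```python
import numpy as np

rng = np.random.default_rng(42)
n_packets = 100_000
snr_direct, snr_relay = 4.0, 4.0     # mean link SNRs (linear), assumed
gamma_th = 1.0                       # outage threshold, assumed

# Rayleigh fading -> exponentially distributed instantaneous SNR.
g_sd = rng.exponential(snr_direct, n_packets)   # source -> destination
g_rd = rng.exponential(snr_relay, n_packets)    # relay  -> destination

# Incremental relaying: the relay transmits only when the destination
# feeds back that the direct packet fell below the outage threshold.
direct_ok = g_sd > gamma_th
relay_used = ~direct_ok
relayed_ok = relay_used & (g_sd + g_rd > gamma_th)   # combining (sketch)

per = 1 - (direct_ok | relayed_ok).mean()
channel_uses = 1 + relay_used.mean()     # spectral-efficiency cost
print(f"PER={per:.4f}, avg channel uses={channel_uses:.3f}")
```

    The packet error rate drops well below that of direct transmission alone, while the average number of channel uses stays close to one because the relay is idle whenever the direct link succeeds.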

  15. Incremental parameter estimation of kinetic metabolic network models

    Directory of Open Access Journals (Sweden)

    Jia Gengjie

    2012-11-01

    Background: An efficient and reliable parameter estimation method is essential for the creation of biological models using ordinary differential equations (ODEs). Most existing estimation methods involve finding the global minimum of data-fitting residuals over the entire parameter space simultaneously. Unfortunately, the associated computational requirement often becomes prohibitively high due to the large number of parameters and the lack of complete parameter identifiability (i.e. not all parameters can be uniquely identified). Results: In this work, an incremental approach was applied to the parameter estimation of ODE models from concentration time profiles. In particular, the method was developed to address a commonly encountered circumstance in the modeling of metabolic networks, where the number of metabolic fluxes (reaction rates) exceeds that of metabolites (chemical species). Here, the minimization of model residuals was performed over a subset of the parameter space associated with the degrees of freedom in the dynamic flux estimation from the concentration time-slopes. The efficacy of this method was demonstrated using two generalized mass action (GMA) models, where the method significantly outperformed single-step estimations. In addition, an extension of the estimation method to handle missing data is also presented. Conclusions: The proposed incremental estimation method is able to tackle the issue of the lack of complete parameter identifiability and to significantly reduce the computational effort in estimating model parameters, which will facilitate kinetic modeling of genome-scale cellular metabolism in the future.
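
    The two-step idea, estimating time-slopes first and then fitting fluxes to the slopes rather than integrating the ODEs, can be sketched on a toy network. The stoichiometry and flux values below are invented for the example, and the minimum-norm pseudoinverse merely stands in for the paper's search over the remaining degrees of freedom:

```python
import numpy as np

# Toy network: 2 metabolites, 3 fluxes (more fluxes than metabolites),
# with dynamics dC/dt = S @ v.  All values are illustrative.
S = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])

t = np.linspace(0.0, 2.0, 21)
v_true = np.array([2.0, 1.5, 1.0])                    # constant fluxes
C = np.outer(t, S @ v_true) + np.array([1.0, 0.5])    # concentration profiles

# Step 1 (incremental approach): estimate time-slopes dC/dt from the data.
slopes = np.gradient(C, t, axis=0)

# Step 2: fit fluxes to the slopes.  With more fluxes than metabolites the
# system is underdetermined; pinv returns the minimum-norm solution.
v_est = np.linalg.pinv(S) @ slopes.mean(axis=0)
print(S @ v_est, S @ v_true)   # the fitted fluxes reproduce the slopes
```

    Note that `v_est` reproduces the measured slopes exactly yet differs from `v_true`, which is precisely the lack of complete identifiability the abstract refers to: the data constrain only a subspace of the flux space.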

  16. Incremental learning of skill collections based on intrinsic motivation

    Science.gov (United States)

    Metzen, Jan H.; Kirchner, Frank

    2013-01-01

    Life-long learning of reusable, versatile skills is a key prerequisite for embodied agents that act in a complex, dynamic environment and are faced with different tasks over their lifetime. We address the question of how an agent can learn useful skills efficiently during a developmental period, i.e., when no task is imposed on him and no external reward signal is provided. Learning of skills in a developmental period needs to be incremental and self-motivated. We propose a new incremental, task-independent skill discovery approach that is suited for continuous domains. Furthermore, the agent learns specific skills based on intrinsic motivation mechanisms that determine on which skills learning is focused at a given point in time. We evaluate the approach in a reinforcement learning setup in two continuous domains with complex dynamics. We show that an intrinsically motivated, skill learning agent outperforms an agent which learns task solutions from scratch. Furthermore, we compare different intrinsic motivation mechanisms and how efficiently they make use of the agent's developmental period. PMID:23898265

  17. Incremental Learning of Skill Collections based on Intrinsic Motivation

    Directory of Open Access Journals (Sweden)

    Jan Hendrik Metzen

    2013-07-01

    Life-long learning of reusable, versatile skills is a key prerequisite for embodied agents that act in a complex, dynamic environment and are faced with different tasks over their lifetime. We address the question of how an agent can learn useful skills efficiently during a developmental period, i.e., when no task is imposed on him and no external reward signal is provided. Learning of skills in a developmental period needs to be incremental and self-motivated. We propose a new incremental, task-independent skill discovery approach that is suited for continuous domains. Furthermore, the agent learns specific skills based on intrinsic motivation mechanisms that determine on which skills learning is focused at a given point in time. We evaluate the approach in a reinforcement learning setup in two continuous domains with complex dynamics. We show that an intrinsically motivated, skill learning agent outperforms an agent which learns task solutions from scratch. Furthermore, we compare different intrinsic motivation mechanisms and how efficiently they make use of the agent's developmental period.

  18. Identifying the Academic Rising Stars via Pairwise Citation Increment Ranking

    KAUST Repository

    Zhang, Chuxu

    2017-08-02

    Predicting fast-rising young researchers (the Academic Rising Stars) provides useful guidance to the research community, e.g., offering competitive candidates to universities for young faculty hiring, as they are expected to have successful academic careers. In this work, given a set of young researchers who have recently published their first first-author paper, we solve the problem of how to effectively predict the top k% of researchers who achieve the highest citation increment in Δt years. We explore a series of factors that can drive an author to be fast-rising and design a novel pairwise citation increment ranking (PCIR) method that leverages those factors to predict the academic rising stars. Experimental results on the large ArnetMiner dataset with over 1.7 million authors demonstrate the effectiveness of PCIR. Specifically, it outperforms all given benchmark methods, with over 8% average improvement. Further analysis demonstrates that temporal features are the best indicators for rising star prediction, while venue features are less relevant.
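
    The pairwise construction at the heart of a method like PCIR can be sketched as follows. The features, the synthetic citation increments, and the logistic-regression ranker are all assumptions for illustration, not the paper's actual factor set or model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Toy authors: feature vectors (standing in for temporal, venue, ... factors)
# and their citation increment over the next Delta-t years (synthetic).
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = np.array([1.5, 1.0, 0.5, 0.0, 0.0])
increment = X @ w_true + rng.normal(scale=0.3, size=n)

# Pairwise construction: for authors (i, j) the example is x_i - x_j and
# the label says whether i's citation increment beats j's.
pairs = rng.integers(0, n, size=(2000, 2))
pairs = pairs[pairs[:, 0] != pairs[:, 1]]
Xp = X[pairs[:, 0]] - X[pairs[:, 1]]
yp = (increment[pairs[:, 0]] > increment[pairs[:, 1]]).astype(int)

model = LogisticRegression().fit(Xp, yp)

# Rank authors by the learned scoring function and compare the predicted
# top 10% with the true top 10% rising stars.
score = X @ model.coef_.ravel()
k = n // 10
top_pred = set(np.argsort(-score)[:k])
top_true = set(np.argsort(-increment)[:k])
print(len(top_pred & top_true) / k)
```

    Training on differences of feature vectors turns top-k% prediction into a binary comparison problem, which is the design choice that gives pairwise ranking its robustness to the absolute scale of citation counts.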

  19. Incremental learning of skill collections based on intrinsic motivation.

    Science.gov (United States)

    Metzen, Jan H; Kirchner, Frank

    2013-01-01

    Life-long learning of reusable, versatile skills is a key prerequisite for embodied agents that act in a complex, dynamic environment and are faced with different tasks over their lifetime. We address the question of how an agent can learn useful skills efficiently during a developmental period, i.e., when no task is imposed on him and no external reward signal is provided. Learning of skills in a developmental period needs to be incremental and self-motivated. We propose a new incremental, task-independent skill discovery approach that is suited for continuous domains. Furthermore, the agent learns specific skills based on intrinsic motivation mechanisms that determine on which skills learning is focused at a given point in time. We evaluate the approach in a reinforcement learning setup in two continuous domains with complex dynamics. We show that an intrinsically motivated, skill learning agent outperforms an agent which learns task solutions from scratch. Furthermore, we compare different intrinsic motivation mechanisms and how efficiently they make use of the agent's developmental period.

  20. Failure mechanisms in single-point incremental forming of metals

    DEFF Research Database (Denmark)

    Silva, Maria B.; Nielsen, Peter Søe; Bay, Niels

    2011-01-01

    The last years saw the development of two different views on how failure develops in single-point incremental forming (SPIF). Today, researchers are split between those claiming that fracture is always preceded by necking and those considering that fracture occurs with suppression of necking. Each … on formability limits and development of fracture. The unified view conciliates the aforementioned different explanations on the role of necking in fracture and is consistent with the experimental observations that have been reported in the past years. The work is performed on aluminium AA1050-H111 sheets and involves independent determination of formability limits by necking and fracture using tensile and hydraulic bulge tests in conjunction with SPIF of benchmark shapes under laboratory conditions.

  1. Incremental support vector machines for fast reliable image recognition

    Energy Technology Data Exchange (ETDEWEB)

    Makili, L., E-mail: makili_le@yahoo.com [Instituto Superior Politécnico da Universidade Katyavala Bwila, Benguela (Angola); Vega, J. [Asociación EURATOM/CIEMAT para Fusión, Madrid (Spain); Dormido-Canto, S. [Dpto. Informática y Automática – UNED, Madrid (Spain)

    2013-10-15

    Highlights: ► A conformal predictor using SVM as the underlying algorithm was implemented. ► It was applied to image recognition in the TJ-II's Thomson Scattering Diagnostic. ► To improve time efficiency an approach to incremental SVM training has been used. ► Accuracy is similar to the one reached when standard SVM is used. ► Computational time saving is significant for large training sets. -- Abstract: This paper addresses the reliable classification of images in a 5-class problem. To this end, an automatic recognition system, based on conformal predictors and using Support Vector Machines (SVM) as the underlying algorithm, has been developed and applied to the recognition of images in the Thomson Scattering Diagnostic of the TJ-II fusion device. Using such a conformal-predictor-based classifier is a computationally intensive task, since it implies training several SVM models to classify a single example, and performing this training from scratch takes a significant amount of time. In order to improve classification time efficiency, an approach to incremental training of SVM has been used as the underlying algorithm. Experimental results show that the overall performance of the new classifier is high, comparable to that obtained when standard SVM is used as the underlying algorithm, and there is a significant improvement in time efficiency.

  2. Incremental learning of concept drift in nonstationary environments.

    Science.gov (United States)

    Elwell, Ryan; Polikar, Robi

    2011-10-01

    We introduce an ensemble-of-classifiers approach for incremental learning of concept drift, characterized by nonstationary environments (NSEs), where the underlying data distributions change over time. The proposed algorithm, named Learn++.NSE, learns from consecutive batches of data without making any assumptions on the nature or rate of drift; it can learn from environments that experience constant or variable rates of drift, addition or deletion of concept classes, as well as cyclical drift. The algorithm learns incrementally, as do other members of the Learn++ family of algorithms, that is, without requiring access to previously seen data. Learn++.NSE trains one new classifier for each batch of data it receives, and combines these classifiers using dynamically weighted majority voting. The novelty of the approach is in determining the voting weights based on each classifier's time-adjusted accuracy on current and past environments. This approach allows the algorithm to recognize, and act accordingly to, changes in the underlying data distributions, as well as to a possible reoccurrence of an earlier distribution. We evaluate the algorithm on several synthetic datasets designed to simulate a variety of nonstationary environments, as well as on a real-world weather prediction dataset. Comparisons with several other approaches are also included. Results indicate that Learn++.NSE can track the changing environments very closely, regardless of the type of concept drift. To allow future use, comparison and benchmarking by interested researchers, we also release the data used in this paper. © 2011 IEEE
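
    A stripped-down sketch of the batch-wise ensemble idea is given below: one classifier per batch, members re-weighted by their accuracy on the current batch, and a weighted majority vote. The Gaussian drift data and the simple log-odds weighting are assumptions for illustration; the actual Learn++.NSE weighting is time-adjusted over current and past environments.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def make_batch(shift, n=200):
    """Two Gaussian classes whose centres drift with `shift` over time."""
    X0 = rng.normal(loc=[shift, 0.0], scale=0.7, size=(n, 2))
    X1 = rng.normal(loc=[shift + 2.0, 0.0], scale=0.7, size=(n, 2))
    return np.r_[X0, X1], np.r_[np.zeros(n), np.ones(n)]

def log_odds(acc):
    """Voting weight from accuracy; members below chance get zero weight."""
    acc = float(np.clip(acc, 1e-3, 1 - 1e-3))
    return max(np.log(acc / (1 - acc)), 0.0)

ensemble, weights = [], []
for b in range(5):                        # consecutive batches, drifting data
    X, y = make_batch(shift=0.5 * b)
    clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    ensemble.append(clf)
    # Re-weight every member by its accuracy on the *current* batch.
    weights = [log_odds(c.score(X, y)) for c in ensemble]

def predict(Xq):
    """Dynamically weighted majority vote over all ensemble members."""
    votes = np.zeros((len(Xq), 2))
    for w, c in zip(weights, ensemble):
        votes[np.arange(len(Xq)), c.predict(Xq).astype(int)] += w
    return votes.argmax(axis=1)

X_new, y_new = make_batch(shift=2.0)      # fresh data from the latest regime
acc_ensemble = (predict(X_new) == y_new).mean()
print(acc_ensemble)
```

    Because the weights are recomputed on every batch, members trained on outdated distributions lose voting power, yet they are never discarded and regain weight if an earlier distribution reoccurs.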

  3. Incremental first pass technique to measure left ventricular ejection fraction

    International Nuclear Information System (INIS)

    Kocak, R.; Gulliford, P.; Hoggard, C.; Critchley, M.

    1980-01-01

    An incremental first pass technique was devised to assess the acute effects of any drug on left ventricular ejection fraction (LVEF), with or without a physiological stress. In particular, the effects of the vasodilator isosorbide dinitrate on LVEF before and after exercise were studied in 11 patients who had suffered cardiac failure. This was achieved by recording the passage of 99mTc pertechnetate through the heart at each stage of the study using a gamma camera computer system. Consistent values for four consecutive first pass measurements without exercise or drug in normal subjects illustrated the reproducibility of the technique. There was no significant difference between LVEF values obtained at rest and exercise before or after oral isosorbide dinitrate, with the exception of one patient with gross mitral regurgitation. The advantages of the incremental first pass technique are that the patient need not be in sinus rhythm, the effects of physiological intervention may be studied, and tests may be repeated at various intervals during long-term follow-up of patients. A disadvantage of the method is the limitation on the number of sequential measurements which can be carried out, due to the amount of radioactivity injected. (U.K.)

  4. Optimal Output of Distributed Generation Based On Complex Power Increment

    Science.gov (United States)

    Wu, D.; Bao, H.

    2017-12-01

    In order to meet the growing demand for electricity and improve the cleanliness of power generation, new energy generation, represented by wind and photovoltaic power, has been widely adopted. This new energy generation is connected to the distribution network in the form of distributed generation and consumed by local load. However, as the scale of distributed generation connected to the network grows, the optimization of its power output becomes more and more prominent and needs further study. Classical optimization methods often use the extended sensitivity method to obtain the relationship between different generators, but ignoring the coupling parameters between nodes makes the results inaccurate; heuristic algorithms also have defects such as slow calculation speed and uncertain outcomes. This article proposes a method called complex power increment, the essence of which is the analysis of the power grid under steady-state power flow. After analyzing the results we can obtain the complex scaling function equation between the power supplies; the coefficients of the equation are based on the impedance parameters of the network, so the description of the relation of variables to the coefficients is more precise. Thus, the method can accurately describe the power increment relationship, and can obtain the power optimization scheme more accurately and quickly than the extended sensitivity method and heuristic methods.

  5. Demonstration/Validation of Incremental Sampling at Two Diverse Military Ranges and Development of an Incremental Sampling Tool

    Science.gov (United States)

    2010-06-01

    Only slide fragments of this record's summary are recoverable: the study validates the EVC Soil Stick incremental sampling tool, which was used at Fort Lewis to collect eight replicate samples (0-2.5 cm depth) of 100 increments each from a single decision unit; live-fire laboratory replicate NG results (mg/kg) are reported as mean, standard deviation, and %RSD, along with costs for characterizing ten decision units.

  6. Most Recent Match Queries in On-Line Suffix Trees

    DEFF Research Database (Denmark)

    Larsson, N. Jesper

    2014-01-01

    A suffix tree is able to efficiently locate a pattern in an indexed string, but not in general the most recent copy of the pattern in an online stream, which is desirable in some applications. We study the most general version of the problem of locating a most recent match: supporting queries … sliding-window indexing, and sketch a possible optimization for use in the special case of Lempel-Ziv compression.

  7. Corrections of the NIST Statistical Test Suite for Randomness

    OpenAIRE

    Kim, Song-Ju; Umeno, Ken; Hasegawa, Akio

    2004-01-01

    It is well known that the NIST statistical test suite was used for the evaluation of AES candidate algorithms. We have found that the test settings of the Discrete Fourier Transform test and the Lempel-Ziv test of this suite are wrong. We give four corrections of mistakes in the test settings. This suggests that re-evaluation of the test results is needed.
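
    The Lempel-Ziv test mentioned above is built on the number of words produced by an incremental (LZ78-style) parse of the bit sequence: random sequences produce many short novel words, regular ones far fewer. A minimal sketch of that word-count statistic (not the NIST implementation) is:

```python
import random

def lz78_word_count(bits: str) -> int:
    """Number of phrases in the LZ78 incremental parse of a 0/1 string.

    Each phrase is the longest prefix already in the dictionary extended
    by one more symbol, exactly the incremental-parsing rule of LZ78.
    """
    dictionary, phrase, count = set(), "", 0
    for b in bits:
        phrase += b
        if phrase not in dictionary:      # novel phrase: record and restart
            dictionary.add(phrase)
            count += 1
            phrase = ""
    return count + (1 if phrase else 0)   # count a trailing partial phrase

# A highly regular sequence parses into far fewer words than a
# pseudo-random one of the same length.
random.seed(0)
n = 4096
regular = "01" * (n // 2)
noisy = "".join(random.choice("01") for _ in range(n))
print(lz78_word_count(regular), lz78_word_count(noisy))
```

    A randomness test of this kind compares the observed word count against the distribution expected for a truly random sequence of the same length.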

  8. Incremental Innovation and Competitive Pressure in the Presence of Discrete Innovation

    DEFF Research Database (Denmark)

    Ghosh, Arghya; Kato, Takao; Morita, Hodaka

    2017-01-01

    Technical progress consists of improvements made upon existing technology (incremental innovation) and innovative activities aiming at entirely new technology (discrete innovation). Incremental innovation is often of limited relevance to the new technology invented by successful discrete innovation. Previous theoretical studies have indicated that higher competitive pressure, measured by product substitutability, increases incremental innovation. In contrast, we find that intensified competition can decrease incremental innovation. A firm's market share upon its failure in discrete innovation decreases as competition intensifies. This effect decreases firms' incentives for incremental innovation because the innovation outcome can be applied to a smaller number of units.

  9. Incremental Volumetric Remapping Method: Analysis and Error Evaluation

    International Nuclear Information System (INIS)

    Baptista, A. J.; Oliveira, M. C.; Rodrigues, D. M.; Menezes, L. F.; Alves, J. L.

    2007-01-01

    In this paper the error associated with the remapping problem is analyzed. A range of numerical results that assess the performance of three different remapping strategies, applied to FE meshes typically used in sheet metal forming simulation, is evaluated. One of the selected strategies is the previously presented Incremental Volumetric Remapping (IVR) method, which was implemented in the in-house code DD3TRIM. The IVR method rests on the premise that the state variables at all points associated with the Gauss volume of a given element are equal to the state-variable quantities at the corresponding Gauss point. Hence, given a typical remapping procedure between a donor and a target mesh, the variables to be associated with a target Gauss volume (and point) are determined by a weighted average. The weight function is the percentage of the Gauss volume of each donor element that is located inside the target Gauss volume. The calculation of the intersecting volumes between the donor and target Gauss volumes is carried out incrementally, for each target Gauss volume, by means of a discrete approach. The other two remapping strategies selected are based on the interpolation/extrapolation of variables using the finite element shape functions or moving least squares interpolants. The performance of the three remapping strategies is assessed with two tests. The first remapping test, taken from the literature, consists of successively remapping a rotating symmetrical mesh, through N increments, over an angular span of 90 deg. The second test consists of remapping an irregular-element-shape target mesh from a given regular-element-shape donor mesh and then proceeding with the inverse operation; in this second test the computational effort is also measured. The results showed that the error level associated with IVR can be very low, with a stable evolution along the number of remapping procedures when compared with the…
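
    The weighted-average rule is easiest to see in one dimension, where "Gauss volumes" become intervals and the weights are overlap lengths. The mesh and field below are invented for the example; a real IVR implementation computes 3D intersecting volumes incrementally:

```python
import numpy as np

def overlap(a0, a1, b0, b1):
    """Length of the intersection of intervals [a0, a1] and [b0, b1]."""
    return max(0.0, min(a1, b1) - max(a0, b0))

def remap(donor_edges, donor_vals, target_edges):
    """1D analogue of volumetric remapping: each target cell receives the
    overlap-weighted average of the donor-cell state variables."""
    out = np.zeros(len(target_edges) - 1)
    for i, (t0, t1) in enumerate(zip(target_edges[:-1], target_edges[1:])):
        w = np.array([overlap(d0, d1, t0, t1)
                      for d0, d1 in zip(donor_edges[:-1], donor_edges[1:])])
        out[i] = w @ donor_vals / w.sum()
    return out

donor_edges = np.linspace(0.0, 1.0, 11)                   # 10 uniform cells
donor_vals = 0.5 * (donor_edges[:-1] + donor_edges[1:])   # field f(x) = x
target_edges = np.array([0.0, 0.3, 0.55, 1.0])            # irregular cells
result = remap(donor_edges, donor_vals, target_edges)
print(result)
```

    For the linear field the remapped values land close to the target-cell midpoints; the small bias comes from treating the field as piecewise constant per donor cell, which is exactly the kind of remapping error the paper quantifies.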

  10. Improving process performance in Incremental Sheet Forming (ISF)

    International Nuclear Information System (INIS)

    Ambrogio, G.; Filice, L.; Manco, G. L.

    2011-01-01

    Incremental Sheet Forming (ISF) is a relatively new process in which a sheet clamped along its borders is progressively deformed by a hemispherical tool. The tool motion is CNC controlled and the path is designed using a CAD-CAM approach, with the aim of reproducing the final shape contour as in surface milling. The absence of a dedicated setup and the related high flexibility are the main strengths of the process and the reason why several researchers have focused their attention on ISF. On the other hand, the slowness of the process is its most relevant drawback, and it hinders wider industrial application. In this paper, a first attempt to overcome this limitation is presented, taking into account a relevant speed increase with respect to the values currently used.

  11. Scalable Prediction of Energy Consumption using Incremental Time Series Clustering

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Noor, Muhammad Usman

    2013-10-09

    Time series datasets are a canonical form of high velocity Big Data, and often generated by pervasive sensors, such as found in smart infrastructure. Performing predictive analytics on time series data can be computationally complex, and requires approximation techniques. In this paper, we motivate this problem using a real application from the smart grid domain. We propose an incremental clustering technique, along with a novel affinity score for determining cluster similarity, which help reduce the prediction error for cumulative time series within a cluster. We evaluate this technique, along with optimizations, using real datasets from smart meters, totaling ~700,000 data points, and show the efficacy of our techniques in improving the prediction error of time series data within polynomial time.
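
    The incremental assignment step can be sketched as follows. The cosine-style affinity, the threshold value, and the synthetic smart-meter load shapes are assumptions for illustration; the paper's affinity score and datasets differ:

```python
import numpy as np

def affinity(a, b):
    """Cosine-style affinity between two mean-removed series (an assumed
    stand-in for the paper's affinity score)."""
    a, b = a - a.mean(), b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

class IncrementalClusterer:
    def __init__(self, threshold=0.9):
        self.threshold = threshold
        self.centroids, self.sizes = [], []

    def add(self, series):
        """Assign an incoming series to the most similar cluster, or open a
        new one; centroids are updated incrementally as running means."""
        if self.centroids:
            scores = [affinity(series, c) for c in self.centroids]
            k = int(np.argmax(scores))
            if scores[k] >= self.threshold:
                n = self.sizes[k]
                self.centroids[k] = (self.centroids[k] * n + series) / (n + 1)
                self.sizes[k] += 1
                return k
        self.centroids.append(series.astype(float))
        self.sizes.append(1)
        return len(self.centroids) - 1

rng = np.random.default_rng(3)
t = np.arange(96)                        # one day of 15-minute readings
morning = np.sin(2 * np.pi * t / 96)     # two underlying load shapes
evening = np.cos(2 * np.pi * t / 96)
clus = IncrementalClusterer(threshold=0.9)
labels = [clus.add(base + rng.normal(scale=0.1, size=96))
          for base in [morning, evening] * 20]
print(len(clus.centroids))
```

    Prediction within a cluster can then be performed on the centroid (the cumulative series) rather than on every individual meter, which is where the computational saving for large datasets comes from.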

  12. ARIES: Acquisition of Requirements and Incremental Evolution of Specifications

    Science.gov (United States)

    Roberts, Nancy A.

    1993-01-01

    This paper describes a requirements/specification environment specifically designed for large-scale software systems. This environment is called ARIES (Acquisition of Requirements and Incremental Evolution of Specifications). ARIES provides assistance to requirements analysts for developing operational specifications of systems. This development begins with the acquisition of informal system requirements. The requirements are then formalized and gradually elaborated (transformed) into formal and complete specifications. ARIES provides guidance to the user in validating formal requirements by translating them into natural language representations and graphical diagrams. ARIES also provides ways of analyzing the specification to ensure that it is correct, e.g., testing the specification against a running simulation of the system to be built. Another important ARIES feature, especially when developing large systems, is the sharing and reuse of requirements knowledge. This leads to much less duplication of effort. ARIES combines all of its features in a single environment that makes the process of capturing a formal specification quicker and easier.

  13. Incremental and developmental perspectives for general-purpose learning systems

    Directory of Open Access Journals (Sweden)

    Fernando Martínez-Plumed

    2017-02-01

    Full Text Available The stupefying success of Artificial Intelligence (AI) for specific problems, from recommender systems to self-driving cars, has not yet been matched by similar progress in general AI systems, which must cope with a variety of different problems. This dissertation deals with the long-standing problem of creating more general AI systems, through the analysis of their development and the evaluation of their cognitive abilities. It presents a declarative general-purpose learning system and a developmental and lifelong approach for knowledge acquisition, consolidation and forgetting. It also analyses the use of more ability-oriented evaluation techniques for AI evaluation and provides further insight for the understanding of the concepts of development and incremental learning in AI systems.

  14. Transferring the Incremental Capacity Analysis to Lithium-Sulfur Batteries

    DEFF Research Database (Denmark)

    Knap, Vaclav; Kalogiannis, Theodoros; Purkayastha, Rajlakshmi

    2017-01-01

    In order to investigate battery degradation and to estimate battery health, various techniques can be applied. One of them, which is widely used for Lithium-ion batteries, is incremental capacity analysis (ICA). In this work, we apply the ICA to Lithium-Sulfur batteries, which differ in many aspects from Lithium-ion batteries and possess unique behavior. One of the challenges of applying the ICA to Lithium-Sulfur batteries is the representation of the IC curves, as their voltage profiles are often non-monotonic, resulting in more complex IC curves. The ICA is at first applied to charge and discharge processes at various temperature levels, and afterwards the technique is applied to a cell undergoing cycling degradation. It is shown that the ageing processes are trackable from the IC curves, which opens up the possibility of utilizing them for state-of-health estimation.

  15. Single Point Incremental Forming using a Dummy Sheet

    DEFF Research Database (Denmark)

    Skjødt, Martin; Silva, Beatriz; Bay, Niels

    2007-01-01

    A new version of single point incremental forming (SPIF) is presented. This version includes a dummy sheet on top of the work piece, thus forming two sheets instead of one. The dummy sheet, which is in contact with the rotating tool pin, is discarded after forming. The new set-up influences the process and furthermore offers a number of new possibilities for solving some of the problems appearing in SPIF. The influence of the dummy sheet on formability, wear, surface quality and bulging of the planar sides is investigated by forming two test shapes: a hyperboloid and a truncated pyramid. The possible influence of friction between the two sheets is furthermore investigated. The results show that the use of a dummy sheet reduces wear of the work piece to almost zero, but also causes a decrease in formability. Bulging of the planar sides of the pyramid is reduced and surface roughness...

  16. Compiler-Enhanced Incremental Checkpointing for OpenMP Applications

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G; Marques, D; Pingali, K; Rugina, R; McKee, S A

    2008-01-21

    As modern supercomputing systems reach the peta-flop performance range, they grow in both size and complexity. This makes them increasingly vulnerable to failures from a variety of causes. Checkpointing is a popular technique for tolerating such failures, enabling applications to periodically save their state and restart computation after a failure. Although a variety of automated system-level checkpointing solutions are currently available to HPC users, manual application-level checkpointing remains more popular due to its superior performance. This paper improves performance of automated checkpointing via a compiler analysis for incremental checkpointing. This analysis, which works with both sequential and OpenMP applications, reduces checkpoint sizes by as much as 80% and enables asynchronous checkpointing.
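    The payoff of incremental checkpointing can be illustrated at the application level by hashing fixed-size chunks of the serialized state and writing only the chunks that changed since the last checkpoint. This is a runtime sketch of the general idea only; the paper's contribution is a compiler analysis that identifies unmodified state statically, without hashing at runtime:

```python
import hashlib
import pickle

def incremental_checkpoint(state, prev_hashes, chunk=4096):
    """Serialize state and keep only the chunks whose content hash
    changed since the previous checkpoint. Returns the delta to write
    and the new hash table to carry into the next checkpoint."""
    blob = pickle.dumps(state)
    delta, hashes = {}, {}
    for off in range(0, len(blob), chunk):
        piece = blob[off:off + chunk]
        h = hashlib.sha256(piece).hexdigest()
        hashes[off] = h
        if prev_hashes.get(off) != h:
            delta[off] = piece  # only changed chunks are written out
    return delta, hashes
```

The first checkpoint writes every chunk; subsequent checkpoints of mostly-unchanged state write only a small delta, which is where the reported size reductions come from.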

  17. Automating the Incremental Evolution of Controllers for Physical Robots

    DEFF Research Database (Denmark)

    Faina, Andres; Jacobsen, Lars Toft; Risi, Sebastian

    2017-01-01

    Evolutionary robotics is challenged with some key problems that must be solved, or at least mitigated extensively, before it can fulfill some of its promises to deliver highly autonomous and adaptive robots. The reality gap and the ability to transfer phenotypes from simulation to reality constitute one such problem; another lies in the embodiment of the evolutionary process, which, as Eiben, Kernbach, & Haasdijk [2012, p. 261] put it, means going toward "the evolution of things, rather than just the evolution of digital objects.…" The work presented here investigates how fully autonomous evolution of robot controllers can be realized in hardware, using an industrial robot and a marker-based computer vision system. In particular, this article presents an approach to automate the reconfiguration of the test environment and shows that it is possible, for the first time, to incrementally evolve a neural robot controller for different obstacle avoidance tasks with no human intervention. Importantly, the system offers a high level of robustness and precision that could potentially open up the range...

  18. Incremental peritoneal dialysis: a 10 year single-centre experience.

    Science.gov (United States)

    Sandrini, Massimo; Vizzardi, Valerio; Valerio, Francesca; Ravera, Sara; Manili, Luigi; Zubani, Roberto; Lucca, Bernardo J A; Cancarini, Giovanni

    2016-12-01

    Incremental dialysis consists of prescribing a dialysis dose aimed at maintaining total solute clearance (renal + dialysis) near the targets set by guidelines. Incremental peritoneal dialysis (incrPD) is defined as one or two dwell-times per day on CAPD, whereas standard peritoneal dialysis (stPD) consists of three or four dwell-times per day. Single-centre cohort study. Enrolment period: January 2002-December 2007; end of follow-up (FU): December 2012. Inclusion criteria: incident patients with FU ≥6 months, initial residual renal function (RRF) 3-10 ml/min/1.73 sqm BSA, and a renal indication for PD. Median incrPD duration was 17 months (I-III Q: 10; 30). There were no statistically significant differences between 29 patients on incrPD and 76 on stPD regarding: clinical, demographic and anthropometric characteristics at the beginning of treatment, adequacy indices, peritonitis-free survival (peritonitis incidence: 1/135 patient-months in incrPD vs. 1/52 patient-months in stPD) and patient survival. During the first 6 months, RRF remained stable in incrPD (6.20 ± 2.02 vs. 6.08 ± 1.47 ml/min/1.73 sqm BSA; p = 0.792) whereas it decreased in stPD (4.48 ± 2.12 vs. 5.61 ± 1.49; p …) … peritonitis incidence and slower reduction of renal function.

  19. Entropy estimation of very short symbolic sequences

    Science.gov (United States)

    Lesne, Annick; Blanc, Jean-Luc; Pezard, Laurent

    2009-04-01

    While entropy per unit time is a meaningful index to quantify the dynamic features of experimental time series, its estimation is often hampered in practice by the finite length of the data. We here investigate the performance of entropy estimation procedures, relying either on block entropies or Lempel-Ziv complexity, when only very short symbolic sequences are available. Heuristic analytical arguments point at the influence of temporal correlations on the bias and statistical fluctuations, and put forward a reduced effective sequence length suitable for error estimation. Numerical studies are conducted using, as benchmarks, the wealth of different dynamic regimes generated by the family of logistic maps and stochastic evolutions generated by a Markov chain of tunable correlation time. Practical guidelines and validity criteria are proposed. For instance, block entropy leads to a dramatic overestimation for sequences of low entropy, whereas it outperforms Lempel-Ziv complexity at high entropy. As a general result, the quality of entropy estimation is sensitive to the sequence temporal correlation hence self-consistently depends on the entropy value itself, thus promoting a two-step procedure. Lempel-Ziv complexity is to be preferred in the first step and remains the best estimator for highly correlated sequences.
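    The Lempel-Ziv complexity used as an estimator above counts the phrases of an exhaustive parsing of the sequence; a minimal LZ76-style sketch over a string of symbols (illustrative, not the authors' implementation):

```python
def lz76_complexity(s: str) -> int:
    """Number of phrases in the exhaustive Lempel-Ziv (1976) parsing:
    each phrase is the shortest prefix of the remaining sequence that
    has not yet occurred in the previously seen history."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # extend the phrase while s[i:i+l] still occurs earlier
        # (reproducibility window includes all but the last character)
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c
```

On the classic example 0001101001000101 the parsing is 0.001.10.100.1000.101, i.e. six phrases. Normalizing this count (e.g. by n / log n) yields the entropy-rate estimator whose small-sample behavior the paper analyzes.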

  20. Global Combat Support System - Army Increment 2 (GCSS-A Inc 2)

    Science.gov (United States)

    2016-03-01

    2016 Major Automated Information System Annual Report. Program Name: Global Combat Support System - Army Increment 2 (GCSS-A Inc 2). DoD Component: Army. … conducted on GCSS-Army Increment 2 by OSD Cost Assessment and Program Evaluation in advance of MS B. Certification of Business Case Alignment…

  1. The relations of growth and increment in thrace oak forests

    Directory of Open Access Journals (Sweden)

    Gafura Aylak Özdemir

    2016-01-01

    Full Text Available In this study, the increment and growth relationships of the oak forests of Thrace at different ages, densities and site indexes were examined. For this purpose, a double-entry tree volume table, a site quality table and a density-dependent yield table were created with the help of data from 101 sample plots. The density-dependent yield table was programmed using the VBA macro feature of MS Excel 2010, so that the yield table can be produced on the computer for any desired age, density and site index. Trends of stand volume and volume elements given by the density-dependent yield table for the oak forests of Thrace as a function of age, under different site conditions and densities, are presented comparatively. Values obtained from the density-dependent yield table were compared with the values of the yield tables generated by Eraslan (1954) and Eraslan-Evcimen (1967) for oak forests. The same comparison was also made with the values of the yield table generated by Carus (1998) for broad-leaved beech forests.

  2. Incremental Sampling Methodology: Applications for Background Screening Assessments.

    Science.gov (United States)

    Pooler, Penelope S; Goodrum, Philip E; Crumbling, Deana; Stuchal, Leah D; Roberts, Stephen M

    2018-01-01

    This article presents the findings from a numerical simulation study that was conducted to evaluate the performance of alternative statistical analysis methods for background screening assessments when data sets are generated with incremental sampling methods (ISMs). A wide range of background and site conditions are represented in order to test different ISM sampling designs. Both hypothesis tests and upper tolerance limit (UTL) screening methods were implemented following U.S. Environmental Protection Agency (USEPA) guidance for specifying error rates. The simulations show that hypothesis testing using two-sample t-tests can meet standard performance criteria under a wide range of conditions, even with relatively small sample sizes. Key factors that affect the performance include unequal population variances and small absolute differences in population means. UTL methods are generally not recommended due to conceptual limitations in the technique when applied to ISM data sets from single decision units and due to insufficient power given standard statistical sample sizes from ISM. © 2017 Society for Risk Analysis.
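    The background-vs-site comparison evaluated in the simulations can be sketched with a Welch two-sample t statistic, which is robust to the unequal population variances the article identifies as a key factor (plain-Python illustration; the USEPA decision rules and error-rate specifications are not reproduced here):

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic and its Satterthwaite degrees of
    freedom, for comparing a background data set a against a site data
    set b without assuming equal variances."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se2 = va / na + vb / nb
    t = (mb - ma) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df
```

The statistic is then compared against the t distribution with df degrees of freedom at the specified error rate; with ISM, each observation is already an incremental-composite mean, which is what lets small sample sizes meet the performance criteria.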

  3. Robust, Causal, and Incremental Approaches to Investigating Linguistic Adaptation

    Science.gov (United States)

    Roberts, Seán G.

    2018-01-01

    This paper discusses the maximum robustness approach for studying cases of adaptation in language. We live in an age where we have more data on more languages than ever before, and more data to link it with from other domains. This should make it easier to test hypotheses involving adaptation, and also to spot new patterns that might be explained by adaptation. However, there is not much discussion of the overall approach to research in this area. There are outstanding questions about how to formalize theories, what the criteria are for directing research and how to integrate results from different methods into a clear assessment of a hypothesis. This paper addresses some of those issues by suggesting an approach which is causal, incremental and robust. It illustrates the approach with reference to a recent claim that dry environments select against the use of precise contrasts in pitch. Study 1 replicates a previous analysis of the link between humidity and lexical tone with an alternative dataset and finds that it is not robust. Study 2 performs an analysis with a continuous measure of tone and finds no significant correlation. Study 3 addresses a more recent analysis of the link between humidity and vowel use and finds that it is robust, though the effect size is small and the robustness of the measurement of vowel use is low. Methodological robustness of the general theory is addressed by suggesting additional approaches including iterated learning, a historical case study, corpus studies, and studying individual speech. PMID:29515487

  4. A novel instrument for generating angular increments of 1 nanoradian

    Science.gov (United States)

    Alcock, Simon G.; Bugnar, Alex; Nistea, Ioana; Sawhney, Kawal; Scott, Stewart; Hillman, Michael; Grindrod, Jamie; Johnson, Iain

    2015-12-01

    Accurate generation of small angles is of vital importance for calibrating angle-based metrology instruments used in a broad spectrum of industries including mechatronics, nano-positioning, and optic fabrication. We present a novel, piezo-driven, flexure device capable of reliably generating micro- and nanoradian angles. Unlike many such instruments, Diamond Light Source's nano-angle generator (Diamond-NANGO) does not rely on two separate actuators or rotation stages to provide coarse and fine motion. Instead, a single Physik Instrumente NEXLINE "PiezoWalk" actuator provides millimetres of travel with nanometre resolution. A cartwheel flexure efficiently converts displacement from the linear actuator into rotary motion with minimal parasitic errors. Rotation of the flexure is directly measured via a Magnescale "Laserscale" angle encoder. Closed-loop operation of the PiezoWalk actuator, using high-speed feedback from the angle encoder, ensures that the Diamond-NANGO's output drifts by only ˜0.3 nrad rms over ˜30 min. We show that the Diamond-NANGO can reliably move with unprecedented 1 nrad (˜57 ndeg) angular increments over a range of >7000 μrad. An autocollimator, interferometer, and capacitive displacement sensor are used to independently confirm the Diamond-NANGO's performance by simultaneously measuring the rotation of a reflective cube.

  5. Incremental cost of PACS in a medical intensive care unit

    Science.gov (United States)

    Langlotz, Curtis P.; Cleff, Bridget; Even-Shoshan, Orit; Bozzo, Mary T.; Redfern, Regina O.; Brikman, Inna; Seshadri, Sridhar B.; Horii, Steven C.; Kundel, Harold L.

    1995-05-01

    Our purpose is to determine the incremental costs (or savings) due to the introduction of picture archiving and communication systems (PACS) and computed radiology (CR) in a medical intensive care unit (MICU). Our economic analysis consists of three measurement methods. The first method is an assessment of the direct costs to the radiology department, implemented in a spreadsheet model. The second method consists of a series of brief observational studies to measure potential changes in personnel costs that might not be reflected in administrative claims. The third method (results not reported here) is a multivariate modeling technique which estimates the independent effect of PACS/CR on the cost of care (estimated from administrative claims data), while controlling for clinical case-mix variables. Our direct cost model shows no cost savings to the radiology department after the introduction of PACS in the medical intensive care unit. Savings in film supplies and film library personnel are offset by increases in capital equipment costs and PACS operation personnel. The results of observational studies to date demonstrate significant savings in clinician film-search time, but no significant change in technologist time or lost films. Our model suggests that direct radiology costs will increase after the limited introduction of PACS/CR in the MICU. Our observational studies show a small but significant effect on clinician film-search time from the introduction of PACS/CR in the MICU, but no significant effect on other variables.
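    The direct-cost balance captured by the spreadsheet model reduces to a simple offset of savings against new costs; a sketch with illustrative figures (the study's actual cost categories map as named below, but the numbers are hypothetical):

```python
def incremental_pacs_cost(film_savings, librarian_savings,
                          capital_cost, pacs_staff_cost):
    """Incremental direct cost to the radiology department:
    positive means PACS/CR increases costs, negative means net savings.
    Categories follow the abstract; all values are illustrative."""
    return (capital_cost + pacs_staff_cost) - (film_savings + librarian_savings)

# Hypothetical annual amounts in the same currency unit: film-supply and
# film-library savings offset by capital equipment and PACS operations.
net = incremental_pacs_cost(film_savings=100, librarian_savings=50,
                            capital_cost=200, pacs_staff_cost=30)
```

A positive net, as here, mirrors the study's finding that savings in film supplies and film-library personnel were outweighed by capital and PACS-operation costs.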

  6. Numerical Simulation of Incremental Sheet Forming by Simplified Approach

    Science.gov (United States)

    Delamézière, A.; Yu, Y.; Robert, C.; Ayed, L. Ben; Nouari, M.; Batoz, J. L.

    2011-01-01

    Incremental Sheet Forming (ISF) is a process which can transform a flat metal sheet into a complex 3D part using a hemispherical tool. The final geometry of the product is obtained by the relative movement between this tool and the blank. The main advantage of the process is that the cost of the tooling is very low compared to deep drawing with rigid tools. The main disadvantage is the very low velocity of the tool and thus the large amount of time needed to form the part. Classical contact algorithms give good agreement with experimental results, but are time consuming. A Simplified Approach to the contact management between the tool and the blank in ISF is presented here. The general principle of this approach is to impose the displacement of the nodes in contact with the tool at a given position. On a benchmark part, the CPU time of the present Simplified Approach is significantly reduced compared with a classical simulation performed with Abaqus implicit.

  7. Business Collaboration in Food Networks: Incremental Solution Development

    Directory of Open Access Journals (Sweden)

    Harald Sundmaeker

    2014-10-01

    Full Text Available The paper presents an approach for incremental solution development based on the use of the Internet-based FIspace business collaboration platform currently under development. The key element is the clear segmentation of infrastructures that are either internal or external to the collaborating business entity in the food network. On the one hand, the approach makes it possible to differentiate between specific centralised as well as decentralised ways of storing data and hosting IT-based functionalities, and facilitates the selection of specific data-exchange protocols and data models. On the other hand, the supported solution design and subsequent development focus on reusable "software Apps" that can be used on their own and incorporate a clear added value for the business actors. It is outlined how to push the development and introduction of Apps that do not require basic changes to the existing infrastructure. The paper presents an example based on the development of a set of Apps for the exchange of product quality related information in food networks, specifically addressing fresh fruits and vegetables. It combines workflow support for data exchange from farm to retail with quality feedback information to facilitate business process improvement. Finally, the latest status of the FIspace platform development is outlined, together with key features and potential ways for real users and software developers to use the platform, which has been initiated by science and industry.

  8. Distribution of incremental static stress caused by earthquakes

    Directory of Open Access Journals (Sweden)

    Y. Y. Kagan

    1994-01-01

    Full Text Available Theoretical calculations, simulations and measurements of the rotation of earthquake focal mechanisms suggest that the stress in earthquake focal zones follows the Cauchy distribution, which is one of the stable probability distributions (with the value of the exponent α equal to 1). We review the properties of the stable distributions and show that the Cauchy distribution is expected to approximate the stress caused by earthquakes occurring over geologically long intervals of a fault zone development. However, the stress caused by recent earthquakes recorded in instrumental catalogues should follow symmetric stable distributions with the value of α significantly less than one. This is explained by a fractal distribution of earthquake hypocentres: the dimension of a hypocentre set, δ, is close to zero for short-term earthquake catalogues and asymptotically approaches 2¼ for long time intervals. We use the Harvard catalogue of seismic moment tensor solutions to investigate the distribution of incremental static stress caused by earthquakes. The stress measured in the focal zone of each event is approximated by stable distributions. In agreement with theoretical considerations, the exponent value of the distribution approaches zero as the time span of an earthquake catalogue (ΔT) decreases. For large stress values α increases. We surmise that this is caused by the increase of δ for small inter-earthquake distances due to location errors.
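    The heavy-tail property of the Cauchy (stable, α = 1) distribution invoked above is easy to verify numerically: its tail probability decays like 2/(πt) rather than exponentially, so large stress values remain common and the sample mean never stabilizes (illustrative simulation only, unrelated to the Harvard catalogue data):

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.standard_cauchy(100_000)  # standard Cauchy samples

# For a standard Cauchy, P(|X| > t) = 1 - (2/pi) * arctan(t);
# at t = 10 this is about 0.063 -- far heavier than any Gaussian tail,
# and the reason the distribution has no finite mean or variance.
tail_frac = np.mean(np.abs(x) > 10)

# The median, unlike the mean, is a stable location estimate for
# Cauchy data and concentrates near the true value 0.
med = np.median(x)
```

This is why stress fields built from many earthquake contributions are summarized by stable-law exponents rather than by means and variances.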

  9. Automated Dimension Determination for NMF-based Incremental Collaborative Filtering

    Directory of Open Access Journals (Sweden)

    Xiwei Wang

    2015-12-01

    Full Text Available Nonnegative matrix factorization (NMF) based collaborative filtering techniques have achieved great success in product recommendations. It is well known that in NMF, the dimensions of the factor matrices have to be determined in advance. Moreover, data is growing fast; thus in some cases, the dimensions need to be changed to reduce the approximation error. Recommender systems should be capable of incorporating new data in a timely manner without sacrificing prediction accuracy. In this paper, we propose an NMF-based data update approach with automated dimension determination for collaborative filtering purposes. The approach can determine the dimensions of the factor matrices and update them automatically. It exploits a nearest-neighborhood-based clustering algorithm to cluster users and items according to their auxiliary information, and uses the clusters as constraints in NMF. The dimensions of the factor matrices are associated with the cluster quantities. When new data becomes available, the incremental clustering algorithm determines whether to increase the number of clusters or merge the existing clusters. Experiments on three different datasets (MovieLens, Sushi, and LibimSeTi) were conducted to examine the proposed approach. The results show that our approach can update the data quickly and provide encouraging prediction accuracy.
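    The NMF core that such systems build on can be sketched with the classic Lee-Seung multiplicative updates; here k is fixed by hand, playing the role of the dimension that the paper's clustering step chooses automatically (a minimal sketch, not the paper's constrained variant):

```python
import numpy as np

def nmf(V, k, iters=500, eps=1e-9, seed=0):
    """Factor a nonnegative matrix V ~= W @ H (W: m x k, H: k x n)
    by Lee-Seung multiplicative updates minimizing Frobenius error."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(iters):
        # each update keeps the factors nonnegative by construction
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

In a recommender, V holds user-item ratings and predictions are read off W @ H; the incremental part of the paper updates W and H as rows/columns arrive instead of refactoring from scratch.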

  10. Incremental Dynamic Analysis of Koyna Dam under Repeated Ground Motions

    Science.gov (United States)

    Zainab Nik Azizan, Nik; Majid, Taksiah A.; Nazri, Fadzli Mohamed; Maity, Damodar; Abdullah, Junaidah

    2018-03-01

    This paper presents an incremental dynamic analysis (IDA) of a concrete gravity dam under single and repeated earthquake loadings, in order to identify the limit state of the dam. Seven ground motions with horizontal and vertical components, based on real repeated earthquakes worldwide, were considered as seismic input in the nonlinear dynamic analysis. All the ground motions were converted to response spectra and scaled according to the developed elastic response spectrum in order to match the characteristics of the ground motion to the soil type. The scaling depends on the fundamental period, T1, of the dam. The Koyna dam, assumed to have a rigid foundation and no sliding, has been selected as the case study for the analysis. IDA curves for the Koyna dam were developed for single and repeated ground motions, and the performance level of the dam was identified. The IDA curves for repeated ground motions are stiffer than those for single ground motions. The ultimate state displacement for a single event is 45.59 mm, decreasing to 39.33 mm under repeated events, a reduction of about 14%. This shows that the performance level of the dam under seismic loading depends on the ground motion pattern.

  11. Incremental Frequent Subgraph Mining on Large Evolving Graphs

    KAUST Repository

    Abdelhamid, Ehab

    2017-08-22

    Frequent subgraph mining is a core graph operation used in many domains, such as graph data management and knowledge exploration, bioinformatics and security. Most existing techniques target static graphs. However, modern applications, such as social networks, utilize large evolving graphs. Mining these graphs using existing techniques is infeasible, due to the high computational cost. In this paper, we propose IncGM+, a fast incremental approach for the continuous frequent subgraph mining problem on a single large evolving graph. We adapt the notion of "fringe" to the graph context, that is, the set of subgraphs on the border between frequent and infrequent subgraphs. IncGM+ maintains fringe subgraphs and exploits them to prune the search space. To boost the efficiency, we propose an efficient index structure to maintain selected embeddings with minimal memory overhead. These embeddings are utilized to avoid redundant expensive subgraph isomorphism operations. Moreover, the proposed system supports batch updates. Using large real-world graphs, we experimentally verify that IncGM+ outperforms existing methods by up to three orders of magnitude, scales to much larger graphs and consumes less memory.

  12. Validation of daily increments periodicity in otoliths of spotted gar

    Science.gov (United States)

    Snow, Richard A.; Long, James M.; Frenette, Bryan D.

    2017-01-01

    Accurate age and growth information is essential for the successful management of fish populations and for understanding early life history. We validated daily increment deposition, including the timing of first ring formation, for spotted gar (Lepisosteus oculatus) through 127 days post hatch. Fry were produced from hatchery-spawned specimens, and up to 10 individuals per week were sacrificed and their otoliths (sagitta, lapillus, and asteriscus) removed for daily age estimation. Daily age estimates for all three otolith pairs were significantly related to known age. The strongest relationships existed for measurements from the sagitta (r2 = 0.98) and the lapillus (r2 = 0.99), with the asteriscus (r2 = 0.95) the lowest. All age prediction models resulted in a slope near unity, indicating that ring deposition occurred approximately daily. Initiation of ring formation varied among otolith types, with deposition beginning 3, 7, and 9 days post hatch for the sagitta, lapillus, and asteriscus, respectively. Results of this study suggest that otoliths are useful for estimating the daily age of spotted gar juveniles; these data may be used to back-calculate hatch dates, estimate early growth rates, and correlate with environmental factors that influence spawning in wild populations. This early life history information will be valuable for better understanding the ecology of this species.
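    The "slope near unity" validation described above amounts to a linear regression of ring counts against known age; a minimal illustration with synthetic data, assuming (as reported for the sagitta) that rings begin forming 3 days post hatch, one per day thereafter (not the study's measurements):

```python
import numpy as np

# Synthetic sagitta data: one ring per day starting 3 days post hatch.
known_age = np.arange(4, 128)      # days post hatch
ring_count = known_age - 3         # daily rings counted on the otolith

# Fit ring_count = slope * known_age + intercept.
slope, intercept = np.polyfit(known_age, ring_count, 1)

# slope ~ 1 confirms daily deposition; the intercept recovers the delay
# before first ring formation, so a hatch date back-calculates as
# capture_date - (ring_count - intercept), i.e. capture_date - (rings + 3).
```

With field-caught juveniles, the same regression (fit to validated lab data) is what licenses back-calculating hatch dates and early growth rates from ring counts.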

  13. An incremental anomaly detection model for virtual machines

    Science.gov (United States)

    Zhang, Hancui; Chen, Shuyu; Liu, Jun; Zhou, Zhen; Wu, Tianshu

    2017-01-01

    The Self-Organizing Map (SOM) algorithm, an unsupervised learning method, has been applied in anomaly detection due to its capabilities of self-organizing and automatic anomaly prediction. However, because the algorithm is initialized at random, it takes a long time to train a detection model. Besides, Cloud platforms with large-scale virtual machines are prone to performance anomalies due to their highly dynamic and resource-sharing characteristics, which leaves the algorithm with low accuracy and low scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to accelerate the detection time by taking into account the large scale and highly dynamic features of virtual machines on cloud platforms. To demonstrate the effectiveness, experiments on the common benchmark KDD Cup dataset and a real dataset have been performed. Results suggest that IISOM has advantages in accuracy and convergence velocity of anomaly detection for virtual machines on cloud platforms. PMID:29117245
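    The SOM-with-WED core can be sketched as follows. The feature weights, grid size, and decay schedules below are illustrative assumptions, not the paper's heuristic initialization or neighborhood-search optimizations; anomalies are flagged by a large quantization error at the best-matching unit:

```python
import numpy as np

def wed(x, w, feat_weights):
    # Weighted Euclidean Distance: features deemed more indicative of
    # anomalies get larger weights (the weighting scheme is assumed here).
    return np.sqrt(np.sum(feat_weights * (x - w) ** 2, axis=-1))

def train_som(data, grid=(5, 5), epochs=20, lr0=0.5, sigma0=2.0,
              feat_weights=None, seed=0):
    """Minimal SOM trainer: find the best-matching unit (BMU) under WED
    and pull its grid neighborhood toward each sample."""
    rng = np.random.default_rng(seed)
    d = data.shape[1]
    if feat_weights is None:
        feat_weights = np.ones(d)
    W = rng.random(grid + (d,))          # randomly initialized codebook
    coords = np.stack(np.meshgrid(*map(np.arange, grid), indexing="ij"),
                      axis=-1)           # grid coordinates of each unit
    n_steps, step = epochs * len(data), 0
    for _ in range(epochs):
        for x in data:
            frac = step / n_steps
            lr = lr0 * (1 - frac)                    # decaying learning rate
            sigma = sigma0 * (1 - frac) + 1e-3       # shrinking neighborhood
            bmu = np.unravel_index(np.argmin(wed(x, W, feat_weights)), grid)
            g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1)
                       / (2 * sigma ** 2))
            W += lr * g[..., None] * (x - W)
            step += 1
    return W

def anomaly_score(x, W, feat_weights):
    # Quantization error at the BMU: high values suggest an anomaly.
    return wed(x, W, feat_weights).min()
```

At detection time a VM metric vector is scored against the trained map, and a score above a calibrated threshold is reported as anomalous.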

  14. Automating the Incremental Evolution of Controllers for Physical Robots.

    Science.gov (United States)

    Faíña, Andrés; Jacobsen, Lars Toft; Risi, Sebastian

    2017-01-01

    Evolutionary robotics is challenged with some key problems that must be solved, or at least mitigated extensively, before it can fulfill some of its promises to deliver highly autonomous and adaptive robots. The reality gap and the ability to transfer phenotypes from simulation to reality constitute one such problem. Another lies in the embodiment of the evolutionary processes, which links to the first, but focuses on how evolution can act on real agents and occur independently from simulation, that is, going from being, as Eiben, Kernbach, & Haasdijk [2012, p. 261] put it, "the evolution of things, rather than just the evolution of digital objects.…" The work presented here investigates how fully autonomous evolution of robot controllers can be realized in hardware, using an industrial robot and a marker-based computer vision system. In particular, this article presents an approach to automate the reconfiguration of the test environment and shows that it is possible, for the first time, to incrementally evolve a neural robot controller for different obstacle avoidance tasks with no human intervention. Importantly, the system offers a high level of robustness and precision that could potentially open up the range of problems amenable to embodied evolution.

  15. Robust, Causal, and Incremental Approaches to Investigating Linguistic Adaptation.

    Science.gov (United States)

    Roberts, Seán G

    2018-01-01

    This paper discusses the maximum robustness approach for studying cases of adaptation in language. We live in an age where we have more data on more languages than ever before, and more data to link it with from other domains. This should make it easier to test hypotheses involving adaptation, and also to spot new patterns that might be explained by adaptation. However, there is not much discussion of the overall approach to research in this area. There are outstanding questions about how to formalize theories, what the criteria are for directing research and how to integrate results from different methods into a clear assessment of a hypothesis. This paper addresses some of those issues by suggesting an approach which is causal, incremental and robust. It illustrates the approach with reference to a recent claim that dry environments select against the use of precise contrasts in pitch. Study 1 replicates a previous analysis of the link between humidity and lexical tone with an alternative dataset and finds that it is not robust. Study 2 performs an analysis with a continuous measure of tone and finds no significant correlation. Study 3 addresses a more recent analysis of the link between humidity and vowel use and finds that it is robust, though the effect size is small and the robustness of the measurement of vowel use is low. Methodological robustness of the general theory is addressed by suggesting additional approaches including iterated learning, a historical case study, corpus studies, and studying individual speech.

  16. Inspiratory muscle activation increases with COPD severity as confirmed by non-invasive mechanomyographic analysis.

    Directory of Open Access Journals (Sweden)

    Leonardo Sarlabous

    Full Text Available There is a lack of instruments for assessing respiratory muscle activation during the breathing cycle in clinical conditions. The aim of the present study was to evaluate the usefulness of the respiratory muscle mechanomyogram (MMG) for non-invasively assessing the mechanical activation of the inspiratory muscles of the lower chest wall in both patients with chronic obstructive pulmonary disease (COPD) and healthy subjects, and to investigate the relationship between inspiratory muscle activation and pulmonary function parameters. Both inspiratory mouth pressure and respiratory muscle MMG were simultaneously recorded under two different respiratory conditions, quiet breathing and incremental ventilatory effort, in 13 COPD patients and 7 healthy subjects. The mechanical activation of the inspiratory muscles was characterised by the non-linear multistate Lempel-Ziv index (MLZ) calculated over the inspiratory time of the MMG signal. Subsequently, the efficiency of the inspiratory muscle mechanical activation was expressed as the ratio of the peak inspiratory mouth pressure to the amplitude of the mechanical activation. This activation estimated using the MLZ index correlated strongly with peak inspiratory mouth pressure throughout the respiratory protocol in both COPD patients (r = 0.80, p<0.001) and healthy subjects (r = 0.82, p<0.001). Moreover, the greater the COPD severity in patients, the greater the level of muscle activation (r = -0.68, p = 0.001, between muscle activation at incremental ventilatory effort and FEV1). Furthermore, the efficiency of the mechanical activation of inspiratory muscle was lower in COPD patients than healthy subjects (7.61±2.06 vs 20.42±10.81, respectively, p = 0.0002), and decreased with increasing COPD severity (r = 0.78, p<0.001, between efficiency of the mechanical activation at incremental ventilatory effort and FEV1). These results suggest that the respiratory muscle mechanomyogram is a good reflection of inspiratory
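
The multistate Lempel-Ziv index is, in essence, the Lempel-Ziv (1976) phrase-counting complexity of the amplitude-quantized signal. Below is a minimal sketch, assuming a simple equal-width quantizer; the authors' exact MLZ construction and normalization are not reproduced here.

```python
import numpy as np

def quantize(signal, n_states=4):
    """Map an amplitude signal onto n_states discrete symbols (equal-width bins)."""
    lo, hi = float(np.min(signal)), float(np.max(signal))
    if hi == lo:  # constant signal: a single state
        return '0' * len(signal)
    edges = np.linspace(lo, hi, n_states + 1)[1:-1]
    return ''.join(str(s) for s in np.digitize(signal, edges))

def lz_complexity(seq):
    """Lempel-Ziv (1976) complexity: number of distinct phrases found by
    the Kaspar-Schuster scanning procedure."""
    n = len(seq)
    i, k, l = 0, 1, 1
    c, k_max = 1, 1
    while True:
        if seq[i + k - 1] == seq[l + k - 1]:
            k += 1
            if l + k > n:       # reached the end inside a match
                c += 1
                break
        else:
            if k > k_max:
                k_max = k
            i += 1
            if i == l:          # no earlier copy: start a new phrase
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

# A noisy signal should yield a higher index than a constant one.
rng = np.random.default_rng(0)
flat = lz_complexity(quantize(np.zeros(64)))
burst = lz_complexity(quantize(rng.standard_normal(64)))
```

Normalizing the phrase count by its expected value for a random sequence of the same length and alphabet is a common follow-up step, so that indices from signals of different durations are comparable.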

  17. Maximum Power Point Tracking With Improved Incremental Conductance Method for Fast Changing Solar Irradiation Level

    Science.gov (United States)

    Tey Kok Soon, Saad Mekhilef,

    2013-06-01

    This paper proposes an improved incremental conductance method to track the maximum power point (MPP) of a PV panel under rapidly changing solar irradiation. When the solar irradiation level increases, the conventional incremental conductance method becomes confused and responds incorrectly. The proposed method responds correctly and exhibits no steady-state oscillation, unlike the conventional method. Matlab simulations were carried out for both the improved and the conventional incremental conductance methods under rapidly changing solar irradiation. The simulation results showed that the system is able to track the MPP faster than the conventional method.
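
The incremental conductance criterion compares dI/dV with −I/V: at the MPP, dP/dV = I + V·dI/dV = 0. The sketch below shows the conventional decision rule against an invented linear PV model; it is not the paper's improved method or its simulation setup.

```python
def pv_current(v, isc=8.0, k=0.4):
    """Toy linear PV model: I = Isc - k*V, so the MPP is at V = Isc/(2k) = 10 V."""
    return max(isc - k * v, 0.0)

def inc_cond_step(v, i, v_prev, i_prev, step=0.1):
    """One step of the conventional incremental conductance rule."""
    dv, di = v - v_prev, i - i_prev
    if dv == 0:
        if di == 0:
            return v                      # no change: hold
        return v + step if di > 0 else v - step
    g = di / dv                           # incremental conductance dI/dV
    if abs(g + i / v) < 1e-9:             # dI/dV == -I/V: at the MPP
        return v
    return v + step if g > -i / v else v - step

# Perturb once, then let the rule walk toward the MPP.
v_prev, i_prev = 5.0, pv_current(5.0)
v = v_prev + 0.1
for _ in range(200):
    i = pv_current(v)
    v_new = inc_cond_step(v, i, v_prev, i_prev)
    v_prev, i_prev, v = v, i, v_new
```

Under a sudden irradiation step, dI changes even at fixed V, which is exactly the situation where this conventional rule misreads the sign, the failure mode the paper's improvement addresses.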

  18. Mean annual increment of Dysoxylum parasiticum (Osbeck) Kosterm. in “Eka Karya” Botanical Garden, Bali

    Directory of Open Access Journals (Sweden)

    NI KADEK EROSI UNDAHARTA

    2008-10-01

    Full Text Available Dysoxylum parasiticum (Osbeck) Kosterm. belongs to the family Meliaceae. In Bali this plant is called majegau; it has a special character for the Balinese and plays a role in their religion. In Bali it is better known as the divine tree (kayu dewa), whose wood can be used for purified buildings. The objective of this research was to measure the mean annual increment of height, diameter and volume of D. parasiticum. Recognizing these characteristics is an important step towards information about the growth of majegau, the estimation of harvest age, and conservation efforts to safeguard its continuity. The results showed the highest height increment in garden bed XIVA and the highest diameter increment in garden bed XVIIIA. Height increment was more optimal in the area with high light intensity (bed XIVA) and diameter increment was more optimal in the low light intensity area (bed XVIIIA). Height increment had the stronger influence on volume increment: the model relating height to volume increment had a higher adjusted R2 value than that relating diameter to volume (0.645 and 0.132, respectively).

  19. Influence of Rotation Increments on Imaging Performance for a Rotatory Dual-Head PET System

    Directory of Open Access Journals (Sweden)

    Fanzhen Meng

    2017-01-01

    Full Text Available For a rotatory dual-head positron emission tomography (PET) system, how to determine the rotation increments is an open problem. In this study, we simulated the characteristics of a rotatory dual-head PET system. The influences of different rotation increments were compared and analyzed. Based on this simulation, the imaging performance of a prototype system was verified. A reconstruction flowchart was proposed based on a precalculated system response matrix (SRM). The SRM made the relationships between the voxels and lines of response (LORs) fixed; therefore, we added the interpolation method into the flowchart. Five metrics, including spatial resolution, normalized mean squared error (NMSE), peak signal-to-noise ratio (PSNR), contrast-to-noise ratio (CNR), and structure similarity (SSIM), were applied to assess the reconstructed image quality. The results indicated that the 60° rotation increments with the bilinear interpolation had advantages in resolution, PSNR, NMSE, and SSIM. In terms of CNR, the 90° rotation increments were better than other increments. In addition, the reconstructed images of 90° rotation increments were also flatter than those of 60° increments. Therefore, both the 60° and 90° rotation increments could be used in the real experiments, and which one to choose may depend on the application requirement.
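
Reconstruction from a precalculated SRM typically iterates a standard update such as MLEM. The sketch below runs a generic MLEM loop on an invented two-voxel, three-LOR system; it is not the prototype's actual flowchart or its interpolation step.

```python
import numpy as np

def mlem(srm, counts, n_iter=500):
    """Standard MLEM reconstruction.

    srm has one row per line of response (LOR) and one column per voxel;
    each update multiplies the image by the back-projected ratio of measured
    to estimated counts, normalized by the voxel sensitivities.
    """
    img = np.ones(srm.shape[1])
    sens = srm.sum(axis=0)                    # sensitivity image: srm^T @ 1
    for _ in range(n_iter):
        proj = srm @ img                      # forward projection
        ratio = np.divide(counts, proj, out=np.zeros_like(proj), where=proj > 0)
        img *= (srm.T @ ratio) / sens         # multiplicative update
    return img

# toy system: 3 LORs, 2 voxels; noiseless counts from a known activity image
srm = np.array([[1.0, 0.0],
                [0.0, 1.0],
                [0.5, 0.5]])
true_img = np.array([2.0, 3.0])
counts = srm @ true_img
recon = mlem(srm, counts)
```

Because the SRM is fixed for a given set of rotation increments, the expensive geometric modeling happens once, and each rotation-increment choice simply produces a different set of SRM rows.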

  20. Word decoding development in incremental phonics instruction in a transparent orthography

    NARCIS (Netherlands)

    Schaars, M.M.H.; Segers, P.C.J.; Verhoeven, L.T.W.

    2017-01-01

    The present longitudinal study aimed to investigate the development of word decoding skills during incremental phonics instruction in Dutch as a transparent orthography. A representative sample of 973 Dutch children in the first grade (M age  = 6;1, SD = 0;5) was exposed to incremental subsets of

  1. Lactate and ammonia concentration in blood and sweat during incremental cycle ergometer exercise

    NARCIS (Netherlands)

    Ament, W; Huizenga; Mook, GA; Gips, CH; Verkerke, GJ

    It is known that the concentrations of ammonia and lactate in blood increase during incremental exercise. Sweat also contains lactate and ammonia. The aim of the present study was to investigate the physiological response of lactate and ammonia in plasma and sweat during a stepwise incremental cycle

  2. On the Design of a Knowledge Management System for Incremental Process Improvement for Software Product Management

    NARCIS (Netherlands)

    Vlaanderen, K.; Brinkkemper, S.; van de Weerd, I.

    2012-01-01

    Incremental software process improvement deals with the challenges of step-wise process improvement in a time where resources are scarce and many organizations are struggling with the challenges of effective management of software products. Effective knowledge sharing and incremental approaches are

  3. Incremental validity of mindfulness skills in relation to emotional dysregulation among a young adult community sample.

    Science.gov (United States)

    Vujanovic, Anka A; Bonn-Miller, Marcel O; Bernstein, Amit; McKee, Laura G; Zvolensky, Michael J

    2010-01-01

    The present investigation examined the incremental predictive validity of mindfulness skills, as measured by the Kentucky Inventory of Mindfulness Skills (KIMS), in relation to multiple facets of emotional dysregulation, as indexed by the Difficulties in Emotion Regulation Scale (DERS), above and beyond variance explained by negative affectivity, anxiety sensitivity, and distress tolerance. Participants were a nonclinical community sample of 193 young adults (106 women, 87 men; M(age) = 23.91 years). The KIMS Accepting without Judgment subscale was incrementally negatively predictive of all facets of emotional dysregulation, as measured by the DERS. Furthermore, KIMS Acting with Awareness was incrementally negatively related to difficulties engaging in goal-directed behavior. Additionally, both observing and describing mindfulness skills were incrementally negatively related to lack of emotional awareness, and describing skills also were incrementally negatively related to lack of emotional clarity. Findings are discussed in relation to advancing scientific understanding of emotional dysregulation from a mindfulness skills-based framework.

  4. An Empirical Examination of the Incremental Contribution of Stock Characteristics in UK Stock Returns

    Directory of Open Access Journals (Sweden)

    Jonathan Fletcher

    2017-10-01

    Full Text Available This study uses the Bayesian approach to examine the incremental contribution of stock characteristics to the investment opportunity set in U.K. stock returns. The paper finds that size, book-to-market (BM) ratio, and momentum characteristics all make a significant incremental contribution to the investment opportunity set when there is unrestricted short selling. However, imposing no-short-selling constraints eliminates the incremental contribution of the size and BM characteristics, but not of the momentum characteristic. The use of additional stock characteristics such as stock issues, accruals, profitability, and asset growth leads to a significant incremental contribution beyond the size, BM, and momentum characteristics when there is unrestricted short selling, but no-short-selling constraints largely eliminate the incremental contribution of the additional characteristics.

  5. Neuromuscular responses to incremental caffeine doses: performance and side effects.

    Science.gov (United States)

    Pallarés, Jesús G; Fernández-Elías, Valentín E; Ortega, Juan F; Muñoz, Gloria; Muñoz-Guerra, Jesús; Mora-Rodríguez, Ricardo

    2013-11-01

    The purpose of this study was to determine the oral dose of caffeine needed to increase muscle force and power output during all-out single multijoint movements. Thirteen resistance-trained men underwent a battery of muscle strength and power tests in a randomized, double-blind, crossover design, under four different conditions: (a) placebo ingestion (PLAC) or with caffeine ingestion at doses of (b) 3 mg · kg(-1) body weight (CAFF 3mg), (c) 6 mg · kg(-1) (CAFF 6mg), and (d) 9 mg · kg(-1) (CAFF 9mg). The muscle strength and power tests consisted of the measurement of bar displacement velocity and muscle power output during free-weight full-squat (SQ) and bench press (BP) exercises against four incremental loads (25%, 50%, 75%, and 90% one-repetition maximum [1RM]). Cycling peak power output was measured using a 4-s inertial load test. Caffeine side effects were evaluated at the end of each trial and 24 h later. Mean propulsive velocity at light loads (25%-50% 1RM) increased significantly above PLAC for all caffeine doses (5.4%-8.5%, P = 0.039-0.003). At the medium load (75% 1RM), CAFF 3mg did not improve SQ or BP muscle power or BP velocity. CAFF 9mg was needed to enhance BP velocity and SQ power at the heaviest load (90% 1RM) and cycling peak power output (6.8%-11.7%, P = 0.03-0.05). The CAFF 9mg trial drastically increased the frequency of the adverse side effects (15%-62%). The ergogenic dose of caffeine required to enhance neuromuscular performance during a single all-out contraction depends on the magnitude of load used. A dose of 3 mg · kg(-1) is enough to improve high-velocity muscle actions against low loads, whereas a higher caffeine dose (9 mg · kg(-1)) is necessary against high loads, despite the appearance of adverse side effects.

  6. Local and Global Parsing with Functional (FX-bar Theory and SCD Linguistic Strategy (I. Part I. FX-bar Schemes and Theory. Local and Global FX-bar Projections

    Directory of Open Access Journals (Sweden)

    Neculai Curteanu

    2006-05-01

    Full Text Available This paper surveys the latest developments of the SCD (Segmentation-Cohesion-Dependency) linguistic strategy, with its basic components: FX-bar theory with local and (in two extensions) global structures, the hierarchy graph of SCD marker classes, and improved versions of SCD algorithms for segmentation and parsing of local and global text structures. Briefly, Part I brings theoretical support (predicational feature and semantic diathesis) for handing down the predication from the syntactic to the lexical level, introduces the new local / global FX-bar schemes (graphs) for clause-level and discourse-level, the (global) extension of the dependency graph for SCD marker classes, the problem of (direct and inverse) local FX-bar projection of the verbal group (verbal complex), and the FX-bar global projections, with the special case of sub-clausal discourse segments. Part II discusses the implications of the functional generativity concept for local and global markers, with a novel understanding of the taxonomy of text parsing algorithms, specifies the SCD marker classes, both at clause and discourse level, and presents (variants of) SCD local and global segmentation / parsing algorithms, along with their latest running results.

  7. Do otolith increments allow correct inferences about age and growth of coral reef fishes?

    Science.gov (United States)

    Booth, D. J.

    2014-03-01

    Otolith increment structure is widely used to estimate the age and growth of marine fishes. Here, I test the accuracy of long-term otolith increment analysis of the lemon damselfish Pomacentrus moluccensis in describing age and growth characteristics. I compare the number of putative annual otolith increments (as a proxy for actual age) and the widths of these increments (as proxies for somatic growth) with actual tagged fish-length data, based on a 6-year dataset, the longest time course for a coral reef fish. Estimated age from otoliths corresponded closely with actual age in all cases, confirming annual increment formation. However, otolith increment widths were poor proxies for actual growth in length [linear regression r² = 0.44-0.90, n = 6 fish] and were clearly of limited value in estimating annual growth. Up to 60 % of the annual growth variation was missed using otolith increments, suggesting that long-term back calculations of otolith growth characteristics of reef fish populations should be interpreted with caution.

  8. On the validity of the incremental approach to estimate the impact of cities on air quality

    Science.gov (United States)

    Thunis, Philippe

    2018-01-01

    The question of how much cities are the sources of their own air pollution is not only theoretical as it is critical to the design of effective strategies for urban air quality planning. In this work, we assess the validity of the commonly used incremental approach to estimate the likely impact of cities on their air pollution. With the incremental approach, the city impact (i.e. the concentration change generated by the city emissions) is estimated as the concentration difference between a rural background and an urban background location, also known as the urban increment. We show that the city impact is in reality made up of the urban increment and two additional components and consequently two assumptions need to be fulfilled for the urban increment to be representative of the urban impact. The first assumption is that the rural background location is not influenced by emissions from within the city whereas the second requires that background concentration levels, obtained with zero city emissions, are equal at both locations. Because the urban impact is not measurable, the SHERPA modelling approach, based on a full air quality modelling system, is used in this work to assess the validity of these assumptions for some European cities. Results indicate that for PM2.5, these two assumptions are far from being fulfilled for many large or medium city sizes. For this type of cities, urban increments are largely underestimating city impacts. Although results are in better agreement for NO2, similar issues are met. In many situations the incremental approach is therefore not an adequate estimate of the urban impact on air pollution. This poses issues in terms of interpretation when these increments are used to define strategic options in terms of air quality planning. We finally illustrate the interest of comparing modelled and measured increments to improve our confidence in the model results.
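
The decomposition described above can be made concrete: writing the city impact as the concentration change when city emissions are switched off, it equals the urban increment plus two correction terms that vanish exactly when the two stated assumptions hold. A worked example with invented PM2.5 numbers (not SHERPA output):

```python
# Hypothetical PM2.5 concentrations in ug/m3 (illustrative only).
c_urban_full = 18.0    # urban background, all emissions on
c_rural_full = 11.0    # rural background, all emissions on
c_urban_nocity = 9.0   # urban background, city emissions switched off
c_rural_nocity = 8.5   # rural background, city emissions switched off

increment = c_urban_full - c_rural_full    # what is measured: 7.0
impact = c_urban_full - c_urban_nocity     # what is wanted:   9.0

# The two correction terms behind the approach's assumptions:
# (1) the city's influence on the rural site (assumed zero),
# (2) the gap in no-city background levels between the sites (assumed zero).
city_on_rural = c_rural_full - c_rural_nocity      # here: 2.5, not 0
background_gap = c_rural_nocity - c_urban_nocity   # here: -0.5, not 0

# Exact identity: impact = increment + correction terms.
assert impact == increment + city_on_rural + background_gap
```

With these toy numbers the urban increment (7.0) underestimates the city impact (9.0) mainly because city emissions also raise the rural background, which mirrors the underestimation the abstract reports for PM2.5.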

  9. Annual increments, specific gravity and energy of Eucalyptus grandis by gamma-ray attenuation technique

    International Nuclear Information System (INIS)

    Rezende, M.A.; Guerrini, I.A.; Ferraz, E.S.B.

    1990-01-01

    Annual increments in volume, mass and energy, and the specific gravity of Eucalyptus grandis at thirteen years of age were determined, taking into account measurements of the calorific value of the wood. It was observed that the calorific value of the wood decreases slightly, while the specific gravity increases significantly, with age. The so-called culmination age for the annual volume increment was determined to be around the fourth year of growth, while for the annual mass and energy increments it was around the eighth year. These results show that a tree at a particular age may no longer show significant growth in volume, yet still gain mass and energy. (author)
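
The culmination age mentioned above is the age at which the mean annual increment (MAI, cumulative yield divided by age) peaks and crosses the current annual increment (CAI). A minimal sketch with invented volume data, not the paper's Eucalyptus measurements:

```python
def mai(volumes):
    """Mean annual increment: cumulative volume divided by age (volumes[t] is yield at age t+1)."""
    return [v / (age + 1) for age, v in enumerate(volumes)]

def cai(volumes):
    """Current annual increment: year-on-year volume gain."""
    return [volumes[0]] + [volumes[t] - volumes[t - 1] for t in range(1, len(volumes))]

# hypothetical cumulative volumes (m3) at ages 1..7
vols = [1.0, 3.0, 6.0, 10.0, 13.0, 15.0, 16.0]
mai_series = mai(vols)                                   # peaks at age 5 (MAI = 2.6)
culmination_age = mai_series.index(max(mai_series)) + 1  # -> 5
```

The same bookkeeping applies to mass or energy series: replacing cumulative volume with cumulative mass (volume times specific gravity) shifts the peak later when specific gravity rises with age, as the abstract reports.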

  10. Robust flight control using incremental nonlinear dynamic inversion and angular acceleration prediction

    NARCIS (Netherlands)

    Sieberling, S.; Chu, Q.P.; Mulder, J.A.

    2010-01-01

    This paper presents a flight control strategy based on nonlinear dynamic inversion. The approach presented, called incremental nonlinear dynamic inversion, uses properties of general mechanical systems and nonlinear dynamic inversion by feeding back angular accelerations. Theoretically, feedback of

  11. MUNIX and incremental stimulation MUNE in ALS patients and control subjects

    DEFF Research Database (Denmark)

    Furtula, Jasna; Johnsen, Birger; Christensen, Peter Broegger

    2013-01-01

    This study compares the new Motor Unit Number Estimation (MUNE) technique, MUNIX, with the more common incremental stimulation MUNE (IS-MUNE) with respect to reproducibility in healthy subjects and as potential biomarker of disease progression in patients with ALS....

  12. Observers for a class of systems with nonlinearities satisfying an incremental quadratic inequality

    Science.gov (United States)

    Acikmese, Ahmet Behcet; Martin, Corless

    2004-01-01

    We consider the problem of state estimation for nonlinear time-varying systems whose nonlinearities satisfy an incremental quadratic inequality. Observers are presented which guarantee that the state estimation error converges exponentially to zero.

  13. Incremental Evolution of a 10/250 NLV into a 20/450 NMSLV, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The technical innovation proposed here is the continued functional evolution and concept refinement of an incremental series of test vehicles that will ultimately...

  14. Deliberate and Crisis Action Planning and Execution Segments Increment 2A (DCAPES Inc 2A)

    Science.gov (United States)

    2016-03-01

    2016 Major Automated Information System Annual Report Deliberate and Crisis Action Planning and Execution Segments Increment 2A (DCAPES Inc 2A...Program Name Deliberate and Crisis Action Planning and Execution Segments Increment 2A (DCAPES Inc 2A) DoD Component Air Force Responsible Office Program...present, plan, source, mobilize, deploy, account for, sustain, redeploy and reconstitute forces to conduct National Command Authority authorized

  15. Incremental Identification of Reaction and Mass-Transfer Kinetics Using the Concept of Extents

    OpenAIRE

    Bhatt, Nirav; Amrhein, Michael; Bonvin, Dominique

    2011-01-01

    This paper proposes a variation of the incremental approach to identify reaction and mass-transfer kinetics (rate expressions and the corresponding rate parameters) from concentration measurements for both homogeneous and gas-liquid reaction systems. This incremental approach proceeds in two steps: (i) computation of the extents of reaction and mass transfer from concentration measurements without explicit knowledge of the reaction and mass-transfer rate expressions, and (ii) estimation of ...

  16. Application of incremental algorithms to CT image reconstruction for sparse-view, noisy data

    DEFF Research Database (Denmark)

    Rose, Sean; Andersen, Martin Skovgaard; Sidky, Emil Y.

    2014-01-01

    This conference contribution adapts an incremental framework for solving optimization problems of interest for sparse-view CT. From the incremental framework two algorithms are derived: one that combines a damped form of the algebraic reconstruction technique (ART) with a total-variation (TV) projection, and one that employs a modified damped ART, accounting for a weighted-quadratic data fidelity term, combined with TV projection. The algorithms are demonstrated on simulated, noisy, sparse-view CT data.
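
The damped ART component is a relaxed Kaczmarz sweep over the measurement rows. The sketch below shows that core step on an invented toy system; the TV projection and the weighted-quadratic data fidelity of the adapted algorithms are omitted.

```python
import numpy as np

def damped_art_sweep(A, b, x, relax=0.9):
    """One damped ART (Kaczmarz) sweep: project x toward each row's hyperplane,
    moving only a fraction `relax` of the full step (the damping)."""
    for a_i, b_i in zip(A, b):
        x = x + relax * (b_i - a_i @ x) / (a_i @ a_i) * a_i
    return x

# toy consistent system standing in for a tiny CT geometry (3 rays, 2 pixels)
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
x_true = np.array([1.0, 2.0])
b = A @ x_true
x = np.zeros(2)
for _ in range(200):
    x = damped_art_sweep(A, b, x)
```

In the full algorithms, each sweep would be followed by a projection onto a total-variation ball, which is what suppresses the streaking typical of sparse-view data.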

  17. Incremental Reconstruction of Urban Environments by Edge-Points Delaunay Triangulation

    OpenAIRE

    Romanoni, Andrea; Matteucci, Matteo

    2016-01-01

    Urban reconstruction from a video captured by a surveying vehicle constitutes a core module of automated mapping. When computational power is a limited resource and a detailed map is not the primary goal, the reconstruction can be performed incrementally, from a monocular video, carving a 3D Delaunay triangulation of sparse points; this allows online incremental mapping for tasks such as traversability analysis or obstacle avoidance. To exploit the sharp edges of the urban landscape, we ...

  18. 24 CFR 982.102 - Allocation of budget authority for renewal of expiring consolidated ACC funding increments.

    Science.gov (United States)

    2010-04-01

    ... renewal of expiring consolidated ACC funding increments. 982.102 Section 982.102 Housing and Urban... budget authority for renewal of expiring consolidated ACC funding increments. (a) Applicability. This section applies to the renewal of consolidated ACC funding increments in the program (as described in...

  19. A System to Derive Optimal Tree Diameter Increment Models from the Eastwide Forest Inventory Data Base (EFIDB)

    Science.gov (United States)

    Don C. Bragg

    2002-01-01

    This article is an introduction to the computer software used by the Potential Relative Increment (PRI) approach to optimal tree diameter growth modeling. These DOS programs extract qualified tree and plot data from the Eastwide Forest Inventory Data Base (EFIDB), calculate relative tree increment, sort for the highest relative increments by diameter class, and...

  20. Split-increment technique: an alternative approach for large cervical composite resin restorations.

    Science.gov (United States)

    Hassan, Khamis A; Khier, Salwa E

    2007-02-01

    This article proposes and describes the split-increment technique as an alternative for placement of composite resin in large cervical carious lesions which extend onto the root surface. Two flat 1.5 mm thick composite resin increments were used to restore these cervical carious lesions. Prior to light-curing, two diagonal cuts were made in each increment in order to split it into four triangular-shaped flat portions. The first increment was applied to cover the entire axial wall and portions of the four surrounding walls. The second increment was applied to fill the cavity completely, covering the first one and the rest of the four surrounding walls as well as sealing all cavity margins. This technique results in the reduction of the C-factor and the generated shrinkage stresses by directing the shrinking composite resin during curing towards the free, unbonded areas created by the two diagonal cuts. The proposed technique would also produce a more natural-looking restoration by inserting flat dentin and enamel increments of composite resin of a uniform thickness which closely resembles the arrangement of natural tooth structure.

  1. Broad Learning System: An Effective and Efficient Incremental Learning System Without the Need for Deep Architecture.

    Science.gov (United States)

    Chen, C L Philip; Liu, Zhulin

    2018-01-01

    Broad Learning System (BLS), which aims to offer an alternative way of learning in deep structure, is proposed in this paper. Deep structures and their learning suffer from a time-consuming training process because of the large number of connecting parameters in filters and layers, and they require a complete retraining process if the structure is not sufficient to model the system. The BLS is established in the form of a flat network, where the original inputs are transferred and placed as "mapped features" in feature nodes and the structure is expanded in the wide sense by the "enhancement nodes." Incremental learning algorithms are developed for fast remodeling in broad expansion without a retraining process when the network needs to be expanded. Two incremental learning algorithms are given, for the increment of the feature nodes (or filters in a deep structure) and for the increment of the enhancement nodes. The designed model and algorithms are very versatile for selecting a model rapidly. In addition, a further incremental learning algorithm is developed for the case in which an already-modeled system encounters a new incoming input; the system can then be remodeled incrementally without retraining from the beginning. A satisfactory result for model reduction using singular value decomposition is also obtained to simplify the final structure. Compared with existing deep neural networks, experimental results on the Modified National Institute of Standards and Technology (MNIST) database and the NYU NORB object recognition dataset demonstrate the effectiveness of the proposed BLS.
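
The flat architecture can be sketched as mapped features plus randomly generated enhancement nodes feeding a single least-squares output layer. This toy version uses an identity feature mapping and a full re-solve instead of the paper's incremental pseudoinverse updates, so it illustrates only the structure, not the incremental algorithms:

```python
import numpy as np

rng = np.random.default_rng(42)

def enhancement_nodes(Z, n_nodes, rng):
    """Random nonlinear expansion of the mapped features ("enhancement nodes")."""
    W = rng.standard_normal((Z.shape[1], n_nodes))
    return np.tanh(Z @ W)

# toy regression data
X = rng.standard_normal((200, 5))
y = X @ rng.standard_normal(5) + 0.1 * rng.standard_normal(200)

Z = X                                    # "mapped features" (identity map in this sketch)
H = enhancement_nodes(Z, 20, rng)        # wide expansion, no extra layers
A = np.hstack([Z, H])                    # broad, flat layer: features side by side
w, *_ = np.linalg.lstsq(A, y, rcond=None)  # output weights in one linear solve
mse = float(np.mean((A @ w - y) ** 2))
```

Adding more enhancement nodes just appends columns to A; the BLS incremental algorithms exploit this by updating the pseudoinverse for the new columns rather than re-solving from scratch, which is where the speedup over retraining comes from.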

  2. A Rapidly-Incremented Tethered-Swimming Test for Defining Domain-Specific Training Zones

    Directory of Open Access Journals (Sweden)

    Pessôa Filho Dalton M.

    2017-06-01

    Full Text Available The purpose of this study was to investigate whether a tethered-swimming incremental test comprising small increases in resistive force applied every 60 seconds could delineate the isocapnic region during rapidly-incremented exercise. Sixteen competitive swimmers (male, n = 11; female, n = 5) performed: (a) a test to determine the highest force during 30 seconds of all-out tethered swimming (Favg) and the ΔF, which represented the difference between Favg and the force required to maintain body alignment (Fbase), and (b) an incremental test beginning with 60 seconds of tethered swimming against a load that exceeded Fbase by 30% of ΔF followed by increments of 5% of ΔF every 60 seconds. This incremental test was continued until the limit of tolerance, with pulmonary gas exchange (rates of oxygen uptake and carbon dioxide production) and ventilatory (rate of minute ventilation) data collected breath by breath. These data were subsequently analyzed to determine whether two breakpoints defining the isocapnic region (i.e., gas exchange threshold and respiratory compensation point) were present. We also determined the peak rate of O2 uptake and exercise economy during the incremental test. The gas exchange threshold and respiratory compensation point were observed for each test such that the associated metabolic rates, which bound the heavy-intensity domain during constant-work-rate exercise, could be determined. Significant correlations (Spearman’s) were observed for exercise economy along with (a) peak rate of oxygen uptake (ρ = .562; p < 0.025), and (b) metabolic rate at gas exchange threshold (ρ = −.759; p < 0.005). A rapidly-incremented tethered-swimming test allows for determination of the metabolic rates that define zones for domain-specific constant-work-rate training.

  3. Stem analysis program (GOAP) for evaluating increment and growth data of individual trees

    Directory of Open Access Journals (Sweden)

    Gafura Aylak Özdemir

    2016-07-01

    Full Text Available Stem analysis is a method for evaluating, in detail, the increment and growth data of an individual tree over past periods, and it is widely used in various forestry disciplines. The raw data of stem analysis consist of annual ring counts and measurements performed on cross-sections taken from an individual tree by the section method. Evaluating these raw data takes considerable time. Thus, computer software was developed in this study to perform stem analysis quickly and efficiently. This software, which evaluates the raw stem analysis data both numerically and graphically, was programmed as a macro using the Visual Basic for Applications feature of MS Excel 2013, currently the most widely used spreadsheet program. In the software, the height growth model is formed from two different approaches; individual tree volume based on the section method, cross-sectional area, increments of diameter, height and volume, volume increment percent, and stem form factor at breast height are calculated for the desired period lengths. The calculated values are given as tables. The development of diameter, height, volume, the increments of these variables, volume increment percent, and stem form factor at breast height according to periodic age is given as charts. A stem model showing the development of the diameter, height and shape of the individual tree in past periods can also be obtained from the software as a chart.
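
Sectional volume in stem analysis is commonly computed with Smalian's formula, V = L·(A₁ + A₂)/2 per section. The sketch below assumes that formula and metric inputs; the abstract does not state which sectional formula GOAP actually uses.

```python
import math

def section_volume(d1_cm, d2_cm, length_m):
    """Smalian's formula: section volume (m3) from end diameters (cm) and length (m)."""
    a1 = math.pi * (d1_cm / 200.0) ** 2   # cross-sectional area at one end, m2
    a2 = math.pi * (d2_cm / 200.0) ** 2   # cross-sectional area at the other end, m2
    return length_m * (a1 + a2) / 2.0

def stem_volume(diameters_cm, section_len_m):
    """Total stem volume from diameters measured at successive section ends."""
    return sum(section_volume(d1, d2, section_len_m)
               for d1, d2 in zip(diameters_cm, diameters_cm[1:]))

# sanity check: a perfect 10 m "cylinder" of 20 cm diameter, in 1 m sections
vol = stem_volume([20.0] * 11, 1.0)   # should equal area x length = pi * 0.1^2 * 10
```

Periodic volume increments then fall out as differences between stem volumes reconstructed at successive ages, and the breast-height form factor as V divided by the product of breast-height basal area and total height.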

  4. Parsing statistical machine translation output

    NARCIS (Netherlands)

    Carter, S.; Monz, C.; Vetulani, Z.

    2009-01-01

    Despite increasing research into the use of syntax during statistical machine translation, the incorporation of syntax into language models has seen limited success. We present a study of the discriminative abilities of generative syntax-based language models, over and above standard n-gram models,

  5. The balanced scorecard: an incremental approach model to health care management.

    Science.gov (United States)

    Pineno, Charles J

    2002-01-01

    The balanced scorecard represents a technique used in strategic management to translate an organization's mission and strategy into a comprehensive set of performance measures that provide the framework for implementation of strategic management. This article develops an incremental approach for decision making by formulating a specific balanced scorecard model with an index of nonfinancial as well as financial measures. The incremental approach to costs, including profit contribution analysis and probabilities, allows decisionmakers to assess, for example, how their desire to meet different health care needs will cause changes in service design. This incremental approach to the balanced scorecard may prove to be useful in evaluating the existence of causality relationships between different objective and subjective measures to be included within the balanced scorecard.

  6. Ethical leadership: meta-analytic evidence of criterion-related and incremental validity.

    Science.gov (United States)

    Ng, Thomas W H; Feldman, Daniel C

    2015-05-01

    This study examines the criterion-related and incremental validity of ethical leadership (EL) with meta-analytic data. Across 101 samples published over the last 15 years (N = 29,620), we observed that EL demonstrated acceptable criterion-related validity with variables that tap followers' job attitudes, job performance, and evaluations of their leaders. Further, followers' trust in the leader mediated the relationships of EL with job attitudes and performance. In terms of incremental validity, we found that EL significantly, albeit weakly in some cases, predicted task performance, citizenship behavior, and counterproductive work behavior-even after controlling for the effects of such variables as transformational leadership, use of contingent rewards, management by exception, interactional fairness, and destructive leadership. The article concludes with a discussion of ways to strengthen the incremental validity of EL. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  7. Hypoxia affects tissue oxygenation differently in the thigh and calf muscles during incremental running.

    Science.gov (United States)

    Osawa, Takuya; Arimitsu, Takuma; Takahashi, Hideyuki

    2017-10-01

    The present study was performed to determine the impact of hypoxia on working muscle oxygenation during incremental running, and to compare tissue oxygenation between the thigh and calf muscles. Nine distance runners and triathletes performed incremental running tests to exhaustion under normoxic and hypoxic conditions (fraction of inspired oxygen = 0.15). Peak pulmonary oxygen uptake ([Formula: see text]) and tissue oxygen saturation (StO 2 ) were measured simultaneously in both the vastus lateralis and medial gastrocnemius. Hypoxia significantly decreased peak running speed and [Formula: see text] (p muscles was significantly decreased under hypoxic compared with normoxic conditions at all running speeds (p calf under hypoxic conditions, and that the effects of hypoxia on tissue oxygenation differ between these two muscles during incremental running.

  8. EFFECT OF COST INCREMENT DISTRIBUTION PATTERNS ON THE PERFORMANCE OF JIT SUPPLY CHAIN

    Directory of Open Access Journals (Sweden)

    Ayu Bidiawati J.R

    2008-01-01

    Full Text Available Cost is an important consideration in supply chain (SC) optimisation, owing to the emphasis placed on cost reduction in order to optimise profit. Some researchers use cost as one of their performance measures, and others propose ways of accurately calculating cost. As a product moves across the SC, its cost also increases. This paper studied the effect of cost increment distribution patterns on the performance of a JIT supply chain. In particular, it is necessary to know whether inventory allocation across the SC needs to be modified to accommodate different cost increment distribution patterns. It was found that the funnel is still the best card distribution pattern for a JIT-SC regardless of the cost increment distribution pattern used.

  9. Radical and Incremental Innovation Preferences in Information Technology: An Empirical Study in an Emerging Economy

    Directory of Open Access Journals (Sweden)

    Tarun K. Sen

    2011-11-01

    Full Text Available Innovation in information technology is a primary driver for growth in developed economies. Research indicates that countries go through three stages in the adoption of innovation strategies: buying innovation through global trade, incremental innovation from other countries by enhancing efficiency, and, at the most developed stage, radically innovating independently for competitive advantage. The first two stages of innovation maturity depend more on cross-border trade than the third stage. In this paper, we find that IT professionals in an emerging economy such as India believe in radical innovation over incremental innovation (adaptation) as a growth strategy, even though competitive advantage may rest in adaptation. The results of the study report the preference for innovation strategies among IT professionals in India and its implications for other rapidly growing emerging economies.

  10. The role of local framework luminance range in the simultaneous lightness contrast illusion with double increments

    Directory of Open Access Journals (Sweden)

    Economou Elias

    2015-01-01

    Full Text Available In double increment Simultaneous Lightness Contrast two equiluminant squares rest on darker backgrounds that differ in luminance. Research on such displays has produced conflicting results as to whether an illusion is observed. Anchoring theory of lightness predicts no illusion with double increment displays. Here we test the hypothesis that an illusion can be predicted if the framework containing the target and the darkest background carries more weight in lightness computations. We tested two displays, one with a general white and one with a general black background. An illusion was obtained only in the first display. These results suggest that an illusion can occur with double increment displays when the global value of the targets differs from their local value, as these conditions allow different local framework weighting to affect the targets’ final lightness. We propose that this parameter be added to the Anchoring Theory.

  11. Incremental impact of adding boys to current human papillomavirus vaccination programs: role of herd immunity.

    Science.gov (United States)

    Brisson, Marc; van de Velde, Nicolas; Franco, Eduardo L; Drolet, Mélanie; Boily, Marie-Claude

    2011-08-01

    Our aim was to examine the potential incremental impact of vaccinating boys against human papillomavirus (HPV) on vaccine-type infection in females and males, using an individual-based HPV transmission-dynamic model. Under base assumptions (vaccine efficacy = 99%, duration of protection = 20 years, coverage = 70%), vaccinating 12-year-old boys, in addition to girls, resulted in an incremental reduction in HPV-16/18 (HPV-6/11) incidence over 70 years of 16% (3%) in females and 23% (4%) in males. The benefit of vaccinating boys decreased with improved vaccination coverage in girls. Given the important predicted herd immunity impact of vaccinating girls under moderate to high vaccine coverage, the potential incremental gains of vaccinating boys are limited.

  12. A Rapidly-Incremented Tethered-Swimming test for Defining Domain-Specific Training Zones.

    Science.gov (United States)

    Pessôa Filho, Dalton M; Siqueira, Leandro O C; Simionato, Astor R; Espada, Mário A C; Pestana, Daniel S; DiMenna, Fred J

    2017-06-01

    The purpose of this study was to investigate whether a tethered-swimming incremental test comprising small increases in resistive force applied every 60 seconds could delineate the isocapnic region during rapidly-incremented exercise. Sixteen competitive swimmers (male, n = 11; female, n = 5) performed: (a) a test to determine highest force during 30 seconds of all-out tethered swimming (F avg ) and the ΔF, which represented the difference between F avg and the force required to maintain body alignment (F base ), and (b) an incremental test beginning with 60 seconds of tethered swimming against a load that exceeded F base by 30% of ΔF followed by increments of 5% of ΔF every 60 seconds. This incremental test was continued until the limit of tolerance with pulmonary gas exchange (rates of oxygen uptake and carbon dioxide production) and ventilatory (rate of minute ventilation) data collected breath by breath. These data were subsequently analyzed to determine whether two breakpoints defining the isocapnic region (i.e., gas exchange threshold and respiratory compensation point) were present. We also determined the peak rate of O 2 uptake and exercise economy during the incremental test. The gas exchange threshold and respiratory compensation point were observed for each test such that the associated metabolic rates, which bound the heavy-intensity domain during constant-work-rate exercise, could be determined. Significant correlations (Spearman's) were observed for exercise economy along with (a) peak rate of oxygen uptake (ρ = .562; p rate at gas exchange threshold (ρ = -.759; p rates that define zones for domain-specific constant-work-rate training.

  13. Predicting success of methotrexate treatment by pretreatment HCG level and 24-hour HCG increment.

    Science.gov (United States)

    Levin, Gabriel; Saleh, Narjes A; Haj-Yahya, Rani; Matan, Liat S; Avi, Benshushan

    2018-04-01

    To evaluate β-human chorionic gonadotropin (β-HCG) level and its 24-hour increment as predictors of successful methotrexate treatment for ectopic pregnancy. Data were retrospectively reviewed from women with ectopic pregnancy who were treated by single-dose methotrexate (50 mg/m 2 ) at a university hospital in Jerusalem, Israel, between January 1, 2000, and June 30, 2015. Serum β-HCG before treatment and its percentage increment in the 24 hours before treatment were compared between treatment success and failure groups. Sixty-nine women were included in the study. Single-dose methotrexate treatment was successful for 44 (63.8%) women. Both mean β-HCG level and its 24-hour increment were lower for women with successful treatment than for those with failed treatment (respectively, 1224 IU/L vs 2362 IU/L, P=0.018; and 13.5% vs 29.6%, P=0.009). Receiver operator characteristic curve analysis yielded cutoff values of 1600 IU/L and a 14% increment, with positive predictive values of 75% and 82%, respectively, for treatment success. β-HCG level and its 24-hour increment were independent predictors of treatment outcome by logistic regression (both P values significant). A β-HCG increment of less than 14% in the 24 hours before single-dose methotrexate and a serum β-HCG of less than 1600 IU/L were found to be good predictors of treatment success. © 2017 International Federation of Gynecology and Obstetrics.
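The cutoff logic above can be made concrete with a small helper; the numbers below are invented for illustration (they are not the study's data) and `ppv_at_cutoff` is a hypothetical function name:

```python
def ppv_at_cutoff(values, successes, cutoff):
    """PPV of the rule 'value < cutoff predicts success': among cases
    flagged positive, the fraction that truly succeeded."""
    flagged = [s for v, s in zip(values, successes) if v < cutoff]
    return sum(flagged) / len(flagged) if flagged else float("nan")

# Invented example: pretreatment beta-HCG (IU/L) and treatment success (1/0).
hcg = [1000, 1500, 2000, 800, 1200, 3100]
success = [1, 0, 0, 1, 1, 0]
ppv = ppv_at_cutoff(hcg, success, cutoff=1600)  # 3 of 4 flagged succeeded -> 0.75
```

The same helper applied to the increment series with a 14% cutoff would yield the second PPV the abstract reports.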

  14. BMI and BMI SDS in childhood: annual increments and conditional change

    OpenAIRE

    Brannsether-Ellingsen, Bente; Eide, Geir Egil; Roelants, Mathieu; Bjerknes, Robert; Juliusson, Petur Benedikt

    2016-01-01

    Background: Early detection of abnormal weight gain in childhood may be important for preventive purposes. It is still debated which annual changes in BMI should warrant attention. Aim: To analyse 1-year increments of Body Mass Index (BMI) and standardised BMI (BMI SDS) in childhood and explore conditional change in BMI SDS as an alternative method to evaluate 1-year changes in BMI. Subjects and methods: The distributions of 1-year increments of BMI (kg/m2) and BMI SDS are summarised by...

  15. A power-driven increment borer for sampling high-density tropical wood

    OpenAIRE

    Stefan Krottenthaler; Philipp Pitsch; G. Helle; Giuliano Maselli Locosselli; Gregório Ceccantini; Jan Altman; Miroslav Svoboda; Jiri Dolezal; Gerhard Schleser; Dieter Anhuf

    2015-01-01

    High-density hardwood trees with large diameters have been found to damage manually operated increment borers, thus limiting their use in the tropics. Therefore, we herein report a new, low-cost gasoline-powered sampling system for high-density tropical hardwood trees with large diameters. This system provides increment cores 15 mm in diameter and up to 1.35 m in length, allowing minimally invasive sampling of tropical hardwood tree species, which, up to the present, could not be collected by...

  16. A program for the numerical control of a pulse increment system

    Energy Technology Data Exchange (ETDEWEB)

    Gray, D.C.

    1963-08-21

    This report will describe the important features of the development of magnetic tapes for the numerical control of a pulse-increment system consisting of a modified Gorton lathe and its associated control unit developed by L. E. Foley of Equipment Development Service, Engineering Services, General Electric Co., Schenectady, N.Y. Included is a description of CUPID (Control and Utilization of Pulse Increment Devices), a FORTRAN program for the design of these tapes on the IBM 7090 computer, and instructions for its operation.

  17. Incremental electrohydraulic forming - A new approach for the manufacture of structured multifunctional sheet metal blanks

    Science.gov (United States)

    Djakow, Eugen; Springer, Robert; Homberg, Werner; Piper, Mark; Tran, Julian; Zibart, Alexander; Kenig, Eugeny

    2017-10-01

    Electrohydraulic Forming (EHF) processes permit the production of complex, sharp-edged geometries even when high-strength materials are used. Unfortunately, the forming zone is often limited as compared to other sheet metal forming processes. The use of a special industrial-robot-based tool setup and an incremental process strategy could provide a promising solution for this problem. This paper describes such an innovative approach using an electrohydraulic incremental forming machine, which can be employed to manufacture the large multifunctional and complex part geometries in steel, aluminium, magnesium and reinforced plastic that are employed in lightweight constructions or heating elements.

  18. Maximal power output during incremental exercise by resistance and endurance trained athletes.

    Science.gov (United States)

    Sakthivelavan, D S; Sumathilatha, S

    2010-01-01

    This study was aimed at comparing the maximal power output by resistance trained and endurance trained athletes during incremental exercise. Thirty male athletes who received resistance training (Group I) and thirty male athletes of similar age group who received endurance training (Group II) for a period of more than 1 year were chosen for the study. Physical parameters were measured and exercise stress testing was done on a cycle ergometer with a portable gas analyzing system. The maximal progressive incremental cycle ergometer power output at peak exercise and carbon dioxide production at VO2max were measured. Highly significant (P biofeedback and perk up the athlete's performance.

  19. The period adding and incrementing bifurcations: from rotation theory to applications

    DEFF Research Database (Denmark)

    Granados, Albert; Alseda, Lluis; Krupa, Maciej

    2017-01-01

    for maps on the circle. In the second scenario, symbolic sequences are obtained by consecutive attachment of a given symbolic block and the periods of periodic orbits are incremented by a constant term. It is called the period incrementing bifurcation, and its proof relies on results for maps...... on the interval. We also discuss the expanding cases, as some of the partial results found in the literature also hold when these maps lose contractiveness. The higher dimensional case is also discussed by means of quasi-contractions. We also provide applied examples in control theory, power electronics...

  20. Wake up and smell the ginseng: International trade and the rise of incremental innovation in low-wage countries

    OpenAIRE

    Diego Puga; Daniel Trefler

    2009-01-01

    Increasingly, a small number of low-wage countries such as China, India and Mexico are involved in incremental innovation. That is, they are responsible for resolving production-line bugs and suggesting product improvements. We provide evidence of this new phenomenon and develop a model in which there is a transition from old-style product-cycle trade to trade involving incremental innovation in low-wage countries. The model explains why levels of involvement in incremental innovation vary across ...

  1. BMI and BMI SDS in childhood: annual increments and conditional change.

    Science.gov (United States)

    Brannsether, Bente; Eide, Geir Egil; Roelants, Mathieu; Bjerknes, Robert; Júlíusson, Pétur Benedikt

    2017-02-01

    Background: Early detection of abnormal weight gain in childhood may be important for preventive purposes. It is still debated which annual changes in BMI should warrant attention. Aim: To analyse 1-year increments of Body Mass Index (BMI) and standardised BMI (BMI SDS) in childhood and explore conditional change in BMI SDS as an alternative method to evaluate 1-year changes in BMI. Subjects and methods: The distributions of 1-year increments of BMI (kg/m²) and BMI SDS are summarised by percentiles. Differences according to sex, age, height, weight, initial BMI and weight status on the BMI and BMI SDS increments were assessed with multiple linear regression. Conditional change in BMI SDS was based on the correlation between annual BMI measurements converted to SDS. Results: BMI increments depended significantly on sex, height, weight and initial BMI. Changes in BMI SDS depended significantly only on the initial BMI SDS. The distribution of conditional change in BMI SDS using a two-correlation model was close to normal (mean = 0.11, SD = 1.02, n = 1167), with 3.2% (2.3-4.4%) of the observations below -2 SD and 2.8% (2.0-4.0%) above +2 SD. Conclusion: Conditional change in BMI SDS can be used to detect unexpectedly large changes in BMI SDS. Although this method requires the use of a computer, it may be clinically useful to detect aberrant weight development.
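The abstract does not spell out the two-correlation model; as a minimal sketch under the standard conditional-gain assumption (the second z-score is regressed on the first with year-to-year correlation r), the conditional change can be computed as:

```python
import math

def conditional_change(z1: float, z2: float, r: float) -> float:
    """Conditional change in BMI SDS: standardises the second measurement
    against what the first predicts, given the year-to-year correlation r."""
    return (z2 - r * z1) / math.sqrt(1.0 - r * r)

# Illustrative values (not from the study): a child at +1.0 SDS moves to
# +1.8 SDS a year later; with r = 0.9 the conditional change is about +2.06,
# i.e. beyond +2 SD, flagging an unexpectedly large gain.
delta = conditional_change(1.0, 1.8, 0.9)
```

A raw 1-year change of +0.8 SDS looks modest, but conditioning on the strong year-to-year tracking of BMI SDS reveals it as extreme, which is the point of the method.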

  2. Even Highly Correlated Measures Can Add Incrementally to Predicting Recidivism among Sex Offenders

    Science.gov (United States)

    Babchishin, Kelly M.; Hanson, R. Karl; Helmus, Leslie

    2012-01-01

    Criterion-referenced measures, such as those used in the assessment of crime and violence, prioritize predictive accuracy (discrimination) at the expense of construct validity. In this article, we compared the discrimination and incremental validity of three commonly used criterion-referenced measures for sex offenders (Rapid Risk Assessment for…

  3. Incremental Validity of Thinking Styles in Predicting Academic Achievements: An Experimental Study in Hypermedia Learning Environments

    Science.gov (United States)

    Fan, Weiqiao; Zhang, Li-Fang; Watkins, David

    2010-01-01

    The study examined the incremental validity of thinking styles in predicting academic achievement after controlling for personality and achievement motivation in the hypermedia-based learning environment. Seventy-two Chinese college students from Shanghai, the People's Republic of China, took part in this instructional experiment. The…

  4. Characterization of the finite variation property for a class of stationary increment infinitely divisible processes

    DEFF Research Database (Denmark)

    Basse-O'Connor, Andreas; Rosiński, Jan

    2013-01-01

    We characterize the finite variation property for stationary increment mixed moving averages driven by infinitely divisible random measures. Such processes include fractional and moving average processes driven by Levy processes, and also their mixtures. We establish two types of zero-one laws...

  5. Lead 210 and moss-increment dating of two Finnish Sphagnum hummocks

    International Nuclear Information System (INIS)

    El-Daoushy, F.

    1982-01-01

    A comparison is presented of 210Pb dating data with moss-increment dates of selected peat material from Finland. The measurements of 210Pb were carried out by determining the granddaughter product 210Po by means of isotope dilution. The ages in 210Pb yr were calculated using the constant initial concentration and the constant rate of supply models. (U.K.)

  6. 40 CFR Table 3 to Subpart Ggg of... - Generic Compliance Schedule and Increments of Progress a

    Science.gov (United States)

    2010-07-01

    Table 3 to Subpart GGG of Part 62 (Protection of Environment)—Generic Compliance Schedule and Increments of Progress. Table 3 of subpart GGG applies to landfills with design capacities ≥ 2.5 ... and NMOC emissions ≥ 50 Mg/yr.

  7. Electronic music effects on neuromuscular and cardiovascular systems and psychophysiological parameters during exhaustive incremental test

    Directory of Open Access Journals (Sweden)

    B.P.C. Smirmaul

    2011-01-01

    Full Text Available The aim of this study was to analyze the effects of music on physiological and psychophysiological responses, as well as on the maximum power output attained during an incremental test. A sample of 10 healthy individuals (20.8 ± 1.4 years, 77.0 ± 12.0 kg, 179.2 ± 6.3 cm) participated in this study. The electromyographic activity (Rectus Femoris, RF, and Vastus Lateralis, VL, muscles), heart rate (HR), rating of perceived exertion (RPE), ratings of perceived time (RPT) and the maximum power output attained (PMax) were recorded under music (WM) and without-music (WTM) conditions. The individuals completed four maximal incremental tests (MIT, ramp-like) on a cycle simulator with an initial load of 100 W and increments of 10 W·min-1. The mean values of PMax under the WTM (260.5 ± 27.7 W) and WM (263.2 ± 17.2 W) conditions were not statistically different. The comparison of the rates of increase of the values expressed as root-mean-square (RMS) and median frequency (MF) for both muscles (RF and VL) also showed no statistical difference, nor did HR, RPE and RPT. It is concluded that the use of electronic music during an incremental test to exhaustion had no effect on the analyzed variables for the investigated group.

  9. Compositional Temporal Analysis Model for Incremental Hard Real-Time System Design

    NARCIS (Netherlands)

    Hausmans, J.P.H.M.; Geuns, S.J.; Wiggers, M.H.; Bekooij, Marco Jan Gerrit

    2012-01-01

    The incremental design and analysis of parallel hard real-time stream processing applications is hampered by the lack of an intuitive compositional temporal analysis model that supports arbitrary cyclic dependencies between tasks. This paper introduces a temporal analysis model for hard real-time

  10. The Interpersonal Measure of Psychopathy: Construct and Incremental Validity in Male Prisoners

    Science.gov (United States)

    Zolondek, Stacey; Lilienfeld, Scott O.; Patrick, Christopher J.; Fowler, Katherine A.

    2006-01-01

    The authors examined the construct and incremental validity of the Interpersonal Measure of Psychopathy (IM-P), a relatively new instrument designed to detect interpersonal behaviors associated with psychopathy. Observers of videotaped Psychopathy Checklist-Revised (PCL-R) interviews rated male prisoners (N = 93) on the IM-P. The IM-P correlated…

  11. Robust flight control using incremental nonlinear dynamic inversion and angular acceleration prediction

    OpenAIRE

    Sieberling, S.; Chu, Q.P.; Mulder, J.A.

    2010-01-01

    This paper presents a flight control strategy based on nonlinear dynamic inversion. The approach presented, called incremental nonlinear dynamic inversion, uses properties of general mechanical systems and nonlinear dynamic inversion by feeding back angular accelerations. Theoretically, feedback of angular accelerations eliminates sensitivity to model mismatch, greatly increasing the robust performance of the system compared with conventional nonlinear dynamic inversion. However, angular acce...

  12. A power-driven increment borer for sampling high-density tropical wood

    Czech Academy of Sciences Publication Activity Database

    Krottenthaler, S.; Pitsch, P.; Helle, G.; Locosselli, G. M.; Ceccantini, G.; Altman, Jan; Svoboda, M.; Doležal, Jiří; Schleser, G.; Anhuf, D.

    2015-01-01

    Roč. 36, November (2015), s. 40-44 ISSN 1125-7865 R&D Projects: GA ČR GAP504/12/1952; GA ČR(CZ) GA14-12262S Institutional support: RVO:67985939 Keywords : tropical dendrochronology * tree sampling methods * increment cores Subject RIV: EF - Botanics Impact factor: 2.107, year: 2015

  13. Estimating the variance and integral scale of the transmissivity field using head residual increments

    Science.gov (United States)

    Zheng, Lingyun; Silliman, S.E.

    2000-01-01

    A modification of previously published solutions regarding the spatial variation of hydraulic heads is discussed, whereby the semivariogram of increments of head residuals (termed head residual increments, HRIs) is related to the variance and integral scale of the transmissivity field. A first-order solution is developed for the case of a transmissivity field which is isotropic and whose second-order behavior can be characterized by an exponential covariance structure. The estimates of the variance σ_Y² and the integral scale λ of the log transmissivity field are then obtained by fitting a theoretical semivariogram for the HRI to its sample semivariogram. This approach is applied to head data sampled from a series of two-dimensional, simulated aquifers with isotropic, exponential covariance structures and varying degrees of heterogeneity (σ_Y² = 0.25, 0.5, 1.0, 2.0, and 5.0). The results show that this method provided reliable estimates for both λ and σ_Y² in aquifers with values of σ_Y² up to 2.0, but the errors in those estimates were higher for σ_Y² equal to 5.0. It is also demonstrated through numerical experiments and theoretical arguments that the head residual increments will provide a sample semivariogram with a lower variance than will the use of the head residuals without calculation of increments.
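The paper's first-order HRI semivariogram model is not reproduced here; the sketch below (NumPy and SciPy assumed available; `exp_model` is a generic exponential semivariogram, not necessarily the authors' exact form) shows the two ingredients of the approach: a sample semivariogram of increments, and a least-squares fit that recovers the sill and integral scale:

```python
import numpy as np
from scipy.optimize import curve_fit

def sample_semivariogram(z, lags):
    """Empirical semivariogram of a 1-D series z at the given integer lags."""
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])

def exp_model(h, sill, lam):
    """Exponential semivariogram: sill * (1 - exp(-h/lam)); lam plays the
    role of the integral scale."""
    return sill * (1.0 - np.exp(-h / lam))

lags = np.arange(1, 31)

# Step 1: semivariogram of increments of a hypothetical head-residual series.
rng = np.random.default_rng(1)
heads = np.cumsum(rng.normal(size=400))   # stand-in for head residuals
hri = np.diff(heads)                      # head residual increments (HRI)
gamma_hri = sample_semivariogram(hri, lags)

# Step 2: recover (sill, lam) by least-squares fit of the exponential model;
# fitted here to an exact model curve so the estimation step is unambiguous.
gamma_ref = exp_model(lags, 1.5, 8.0)
(sill_hat, lam_hat), _ = curve_fit(exp_model, lags, gamma_ref, p0=[1.0, 5.0])
```

On real HRI data the fitted sill and λ would carry estimation error, growing with heterogeneity, as the abstract reports for σ_Y² = 5.0.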

  14. A Geostatistical Scaling Approach for the Generation of Non Gaussian Random Variables and Increments

    Science.gov (United States)

    Guadagnini, Alberto; Neuman, Shlomo P.; Riva, Monica; Panzeri, Marco

    2016-04-01

    We address manifestations of non-Gaussian statistical scaling displayed by many variables, Y, and their (spatial or temporal) increments. Evidence of such behavior includes symmetry of increment distributions at all separation distances (or lags) with sharp peaks and heavy tails which tend to decay asymptotically as lag increases. Variables reported to exhibit such distributions include quantities of direct relevance to hydrogeological sciences, e.g. porosity, log permeability, electrical resistivity, soil and sediment texture, sediment transport rate, rainfall, measured and simulated turbulent fluid velocity, and other. No model known to us captures all of the documented statistical scaling behaviors in a unique and consistent manner. We recently proposed a generalized sub-Gaussian model (GSG) which reconciles within a unique theoretical framework the probability distributions of a target variable and its increments. We presented an algorithm to generate unconditional random realizations of statistically isotropic or anisotropic GSG functions and illustrated it in two dimensions. In this context, we demonstrated the feasibility of estimating all key parameters of a GSG model underlying a single realization of Y by analyzing jointly spatial moments of Y data and corresponding increments. Here, we extend our GSG model to account for noisy measurements of Y at a discrete set of points in space (or time), present an algorithm to generate conditional realizations of corresponding isotropic or anisotropic random field, and explore them on one- and two-dimensional synthetic test cases.

  15. Comparison of Perturb and Observe and Incremental Conductance MPPT Based Solar Tracking System

    OpenAIRE

    Nilam Rajendra Deshmukh

    2015-01-01

    This paper presents a detailed analysis of the two most well-known hill-climbing maximum power point tracking (MPPT) algorithms: the perturb-and-observe (P&O) and incremental conductance (INC). The purpose of the analysis is to clarify some common misconceptions in the literature regarding these two trackers, therefore helping the selection process of MPPT
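For reference, the incremental-conductance rule that the abstract compares against P&O tracks the condition dI/dV = -I/V, which holds at the maximum power point. A minimal sketch, using a toy linear PV model (i = 21 - v, not a real panel curve) and a hypothetical step size:

```python
def inc_mppt_step(v, i, v_prev, i_prev, v_ref, step=0.1):
    """One incremental-conductance (INC) MPPT update.

    At the MPP dP/dV = 0, i.e. dI/dV = -I/V. Left of the MPP
    dI/dV > -I/V (raise the voltage reference); right of it, lower it.
    """
    dv, di = v - v_prev, i - i_prev
    if dv == 0.0:
        if di > 0.0:
            v_ref += step
        elif di < 0.0:
            v_ref -= step
    else:
        g = di / dv          # incremental conductance dI/dV
        if g > -i / v:       # left of the MPP
            v_ref += step
        elif g < -i / v:     # right of the MPP
            v_ref -= step
    return v_ref

# Toy model: i = 21 - v, so p = v * i peaks at v = 10.5.
v_ref, v_prev = 5.0, 4.9
i_prev = 21.0 - v_prev
for _ in range(120):
    v = v_ref
    i = 21.0 - v
    v_ref = inc_mppt_step(v, i, v_prev, i_prev, v_ref)
    v_prev, i_prev = v, i
```

P&O would instead perturb the operating point and keep whichever direction increased power; INC's sign test tells it on which side of the MPP it sits without that power comparison, which is one of the trade-offs the paper analyzes.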

  16. A height increment equation for young ponderosa pine plantations using precipitation and soil factors

    Science.gov (United States)

    Fabian C.C. Uzoh

    2001-01-01

    A height increment equation was used to determine the effects of site quality and competing herbaceous vegetation on the development of ponderosa pine seedlings (Pinus ponderosa var. scopulorum Engelm.). Study areas were established in 36 plantations across northwest and west-central Montana on Champion International Corporation's timberland (...

  17. Periodic Annual Diameter Increment After Overstory Removal in Mixed Conifer Stands

    Science.gov (United States)

    Fabian C.C. Uzoh; K. Leroy Dolph; John R. Anstead

    1998-01-01

    Diameter growth rates of understory trees were measured for periods both before and after overstory removal on six study areas in northern California. All the species responded with increased diameter growth after adjusting to their new environments. Linear regression equations that predict periodic annual increment of the diameters of the residual trees after...

  18. Joint Space Operations Center (JSpOC) Mission System Increment 3 (JMS Inc 3)

    Science.gov (United States)

    2016-03-01

    Component Command (JFCC) Space, to make rapid, responsive decisions for the protection of space assets from proliferating threats (adversary as well as... orbiting debris). JMS Increment-1 provided the foundational infrastructure, service oriented architecture, and user-defined operational picture. JMS

  19. Gradient nanostructured surface of a Cu plate processed by incremental frictional sliding

    DEFF Research Database (Denmark)

    Hong, Chuanshi; Huang, Xiaoxu; Hansen, Niels

    2015-01-01

    The flat surface of a Cu plate was processed by incremental frictional sliding at liquid nitrogen temperature. The surface treatment results in a hardened gradient surface layer as thick as 1 mm in the Cu plate, which contains a nanostructured layer on the top with a boundary spacing of the order...

  20. Analogical reasoning: An incremental or insightful process? What cognitive and cortical evidence suggests.

    Science.gov (United States)

    Antonietti, Alessandro; Balconi, Michela

    2010-06-01

    The step-by-step, incremental nature of analogical reasoning can be questioned, since analogy making appears to be an insight-like process. This alternative view of analogical thinking can be integrated into Speed's model, even though the alleged role played by dopaminergic subcortical circuits needs further supporting evidence.

  1. Incrementally Detecting Change Types of Spatial Area Object: A Hierarchical Matching Method Considering Change Process

    Directory of Open Access Journals (Sweden)

    Yanhui Wang

    2018-01-01

    Full Text Available Detecting and extracting the change types of spatial area objects can track area objects' spatiotemporal change patterns and provide a change backtracking mechanism for incrementally updating spatial datasets. To address the problems of the high complexity of detection methods, the high redundancy rate of detection factors, and the low degree of automation in the incremental update process, we consider the change process of area objects in an integrated way and propose a hierarchical matching method to detect nine types of changes of area objects, while minimizing the complexity of the algorithm and the redundancy rate of detection factors. We illustrate in detail the identification, extraction, and database entry of change types, and how we achieve a close connection and organic coupling of incremental information extraction and object type-of-change detection so as to characterize the whole change process. The experimental results show that this method can successfully detect incremental information about area objects in practical applications, with overall accuracy above 90%, much higher than the existing weighted matching method, making it feasible and applicable. It helps establish the correspondence between new-version and old-version objects, and facilitates linked update processing and quality control of spatial data.
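The abstract does not give the matching rules themselves; as a purely hypothetical sketch (function names, thresholds, and the reduced set of change labels are all invented here), a matcher might classify a change type from overlap relations between the old- and new-version areas, with sets of grid cells standing in for polygons:

```python
def jaccard(a: set, b: set) -> float:
    """Overlap ratio of two areas given as sets of grid cells."""
    return len(a & b) / len(a | b) if a | b else 0.0

def change_type(old: set, new: set, thresh: float = 0.8) -> str:
    """Classify the change between two versions of an area object.
    Labels and threshold are illustrative, not the paper's taxonomy."""
    if not old and new:
        return "created"
    if old and not new:
        return "deleted"
    j = jaccard(old, new)
    if j == 1.0:
        return "unchanged"
    if j >= thresh:
        return "slightly modified"
    if j > 0.0:
        return "strongly modified"
    return "replaced"

# A 10x10 area loses one row of cells between versions.
cells_old = {(x, y) for x in range(10) for y in range(10)}
cells_new = {(x, y) for x in range(10) for y in range(9)}
kind = change_type(cells_old, cells_new)
```

A full implementation in the paper's spirit would also handle one-to-many overlaps (splits and merges) and record each detected type for the backtracking mechanism.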

  2. How to Perform Precise Soil and Sediment Sampling? One solution: The Fine Increment Soil Collector (FISC)

    International Nuclear Information System (INIS)

    Mabit, L.; Toloza, A.; Meusburger, K.; Alewell, C.; Iurian, A-R.; Owens, P.N.

    2014-01-01

    Soil and sediment related research for terrestrial agri-environmental assessments requires accurate depth-incremental sampling to permit detailed analysis of the physical, geochemical and biological properties of soil and exposed sediment profiles. Existing equipment does not allow soil/sediment increments to be collected at millimetre resolution. The Fine Increment Soil Collector (FISC), developed by the SWMCN Laboratory, allows much greater precision in incremental soil/sediment sampling and facilitates easy recovery of the collected material through a simple screw-thread extraction system (see Figure 1). The FISC has been designed specifically to enable standardized scientific investigation of shallow soil/sediment samples. In particular, applications have been developed in two IAEA Coordinated Research Projects (CRPs): CRP D1.20.11 on “Integrated Isotopic Approaches for an Area-wide Precision Conservation to Control the Impacts of Agricultural Practices on Land Degradation and Soil Erosion” and CRP D1.50.15 on “Response to Nuclear Emergencies Affecting Food and Agriculture.”

  3. Feasibility of Incremental 2-Times Weekly Hemodialysis in Incident Patients With Residual Kidney Function

    Directory of Open Access Journals (Sweden)

    Andrew I. Chin

    2017-09-01

    Discussion: More than 50% of incident HD patients with RKF have adequate kidney urea clearance to be considered for 2-times weekly HD. When ultrafiltration volume and blood pressure stability are additionally taken into account, more than one-fourth of the total cohort could optimally start HD in an incremental fashion.

  4. Statistical Discriminability Estimation for Pattern Classification Based on Neural Incremental Attribute Learning

    DEFF Research Database (Denmark)

    Wang, Ting; Guan, Sheng-Uei; Puthusserypady, Sadasivan

    2014-01-01

    Feature ordering is a significant data preprocessing method in Incremental Attribute Learning (IAL), a novel machine learning approach which gradually trains features according to a given order. Previous research has shown that, similar to feature selection, feature ordering is also important based...

  5. Literature Review of Data on the Incremental Costs to Design and Build Low-Energy Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, W. D.

    2008-05-14

    This document summarizes findings from a literature review into the incremental costs associated with low-energy buildings. The goal of this work is to help establish as firm an analytical foundation as possible for the Building Technology Program's cost-effective net-zero energy goal in the year 2025.

  6. The effects of the pine processionary moth on the increment of ...

    African Journals Online (AJOL)

    STORAGESEVER

    2009-05-18

    … sycophanta L. (Coleoptera: Carabidae) used against the pine processionary moth (Thaumetopoea pityocampa Den. & Schiff.) (Lepidoptera: Thaumetopoeidae) in biological control. T. J. Zool. 30:181-185. Kanat M, Sivrikaya F (2005). Effect of the pine processionary moth on diameter increment of Calabrian ...

  7. Differentiating Major and Incremental New Product Development: The Effects of Functional and Numerical Workforce Flexibility

    NARCIS (Netherlands)

    Kok, R.A.W.; Ligthart, P.E.M.

    2014-01-01

    This study seeks to explain the differential effects of workforce flexibility on incremental and major new product development (NPD). Drawing on the resource-based theory of the firm, human resource management research, and innovation management literature, the authors distinguish two types of workforce flexibility: functional and numerical.

  8. Time-incremental creep–fatigue damage rule for single crystal Ni-base superalloys

    NARCIS (Netherlands)

    W.A.M. Brekelmans; T. Tinga; M.G.D. Geers

    2009-01-01

    In the present paper a damage model for single crystal Ni-base superalloys is proposed that integrates time-dependent and cyclic damage into a generally applicable time-incremental damage rule. A criterion based on the Orowan stress is introduced to detect slip reversal on the microscopic level.

  9. Time-incremental creep–fatigue damage rule for single crystal Ni-base superalloys

    NARCIS (Netherlands)

    Tinga, Tiedo; Brekelmans, W.A.M.; Geers, M.G.D.

    2009-01-01

    In the present paper a damage model for single crystal Ni-base superalloys is proposed that integrates time-dependent and cyclic damage into a generally applicable time-incremental damage rule. A criterion based on the Orowan stress is introduced to detect slip reversal on the microscopic level.

  10. Between structures and norms : Assessing tax increment financing for the Dutch spatial planning toolkit

    NARCIS (Netherlands)

    Root, Liz; Van Der Krabben, Erwin; Spit, Tejo

    2015-01-01

    The aim of the paper is to assess the institutional (mis)fit of tax increment financing for the Dutch spatial planning financial toolkit. By applying an institutionally oriented assessment framework, we analyse the interconnectivity of Dutch municipal finance and spatial planning structures.

  11. The Trait Emotional Intelligence Questionnaire: Internal Structure, Convergent, Criterion, and Incremental Validity in an Italian Sample

    Science.gov (United States)

    Andrei, Federica; Smith, Martin M.; Surcinelli, Paola; Baldaro, Bruno; Saklofske, Donald H.

    2016-01-01

    This study investigated the structure and validity of the Italian translation of the Trait Emotional Intelligence Questionnaire. Data were self-reported from 227 participants. Confirmatory factor analysis supported the four-factor structure of the scale. Hierarchical regressions also demonstrated its incremental validity beyond demographics, the…
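    The incremental-validity logic reported here (hierarchical regression: enter the scale after demographics and test the gain in explained variance) can be sketched on synthetic data; the variable names and effect sizes below are made up for illustration:

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(0)
n = 500
demographics = rng.normal(size=(n, 2))   # step-1 predictors
trait_scale = rng.normal(size=n)         # step-2 predictor (the questionnaire)
y = demographics @ np.array([0.3, 0.2]) + 0.5 * trait_scale + rng.normal(size=n)

r2_step1 = r_squared(demographics, y)
r2_step2 = r_squared(np.column_stack([demographics, trait_scale]), y)
delta_r2 = r2_step2 - r2_step1   # incremental validity: variance explained beyond step 1
```

    A significant, non-trivial `delta_r2` is what "incremental validity beyond demographics" means operationally.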

  12. Equity and Entrepreneurialism: The Impact of Tax Increment Financing on School Finance.

    Science.gov (United States)

    Weber, Rachel

    2003-01-01

    Describes tax increment financing (TIF), an entrepreneurial strategy with significant fiscal implications for overlapping taxing jurisdictions that provide these functions. Statistical analysis of TIF's impact on the finances of one Illinois county's school districts indicates that municipal use of TIF depletes the property tax revenues of schools…

  13. A Self-Organizing Incremental Neural Network based on local distribution learning.

    Science.gov (United States)

    Xing, Youlu; Shi, Xiaofeng; Shen, Furao; Zhou, Ke; Zhao, Jinxi

    2016-12-01

    In this paper, we propose an unsupervised incremental learning neural network based on local distribution learning, called the Local Distribution Self-Organizing Incremental Neural Network (LD-SOINN). The LD-SOINN combines the advantages of incremental learning and matrix learning. It can automatically discover suitable nodes to fit the learning data in an incremental way, without a priori knowledge such as the structure of the network. The nodes of the network store rich local information regarding the learning data. An adaptive vigilance parameter guarantees that LD-SOINN is able to add new nodes for new knowledge automatically, while the number of nodes does not grow without limit. As learning continues, nodes that are close to each other and have similar principal components are merged to obtain a concise local representation, which we call a relaxation data representation. A density-based denoising process is designed to reduce the influence of noise. Experiments show that LD-SOINN performs well on both artificial and real-world data.
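    The core node-insertion idea (vigilance test, then add-or-refine) can be sketched in a few lines. This is not the full LD-SOINN algorithm: the vigilance threshold and learning rate are fixed rather than adaptive, and node merging and denoising are omitted.

```python
import math

class IncrementalNodes:
    """SOINN-style incremental prototype learning (minimal sketch):
    add a node when an input is farther than the vigilance threshold
    from every existing node; otherwise nudge the nearest node."""
    def __init__(self, vigilance=1.0, lr=0.1):
        self.vigilance = vigilance
        self.lr = lr
        self.nodes = []   # prototype vectors

    def learn(self, x):
        if not self.nodes:
            self.nodes.append(list(x))
            return
        dists = [math.dist(x, n) for n in self.nodes]
        i = min(range(len(self.nodes)), key=dists.__getitem__)
        if dists[i] > self.vigilance:
            self.nodes.append(list(x))          # new knowledge: add a node
        else:
            for d in range(len(x)):             # refine the existing node
                self.nodes[i][d] += self.lr * (x[d] - self.nodes[i][d])

net = IncrementalNodes(vigilance=1.0)
for p in [(0, 0), (0.1, 0.1), (5, 5), (5.1, 4.9), (0.05, 0)]:
    net.learn(p)   # two well-separated clusters -> two nodes
```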

  14. Incremental Reactivity Effects of Anthropogenic and Biogenic Volatile Organic Compounds on Secondary Organic Aerosol Formation

    Science.gov (United States)

    Kacarab, M.; Li, L.; Carter, W. P. L.; Cocker, D. R., III

    2015-12-01

    Two surrogate reactive organic gas (ROG) mixtures were developed to create a controlled reactivity environment simulating different urban atmospheres with varying levels of anthropogenic (e.g., Los Angeles reactivity) and biogenic (e.g., Atlanta reactivity) influences. Traditional chamber experiments focus on the oxidation of one or two volatile organic compound (VOC) precursors, allowing the reactivity of the system to be dictated by those compounds. Surrogate ROG mixtures instead control the overall reactivity of the system, allowing the incremental aerosol formation from an added VOC to be observed. The surrogate ROG mixtures were developed based on those used to determine maximum incremental reactivity (MIR) scales for O3 formation from VOC precursors in a Los Angeles smog environment. Environmental chamber experiments were designed to highlight the incremental aerosol formation in the simulated environment due to the addition of an anthropogenic (aromatic) or biogenic (terpene) VOC. All experiments were conducted in the UC Riverside/CE-CERT dual 90 m3 environmental chambers. The aerosol precursors behaved differently under the two altered reactivity conditions, with more incremental aerosol formed in the anthropogenic ROG system than in the biogenic ROG system. Further, the biogenic reactivity condition inhibited the oxidation of added anthropogenic aerosol precursors, such as m-xylene. Data will be presented on aerosol properties (density, volatility, hygroscopicity) and bulk chemical composition in the gas and particle phases (from a SYFT Technologies selected ion flow tube mass spectrometer, SIFT-MS, and an Aerodyne high-resolution time-of-flight aerosol mass spectrometer, HR-ToF-AMS, respectively), comparing the two controlled reactivity systems and single-precursor VOC/NOx studies. Incremental aerosol yield data at different controlled reactivities provide a novel and valuable insight in the attempt to extrapolate environmental chamber…
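    The incremental aerosol yield invoked above can be written as a one-line formula, by analogy with the MIR concept for ozone; the numbers in the example are hypothetical:

```python
def incremental_aerosol_yield(soa_with_voc, soa_base, added_voc):
    """Incremental aerosol yield of an added VOC: extra secondary
    organic aerosol formed relative to the base surrogate mixture,
    per unit mass of VOC added (all masses in ug/m^3).
    Illustrative definition, analogous to maximum incremental
    reactivity (MIR) for ozone."""
    return (soa_with_voc - soa_base) / added_voc

# Hypothetical numbers: the base mixture alone forms 12 ug/m^3 of SOA;
# adding 100 ug/m^3 of an aromatic VOC raises that to 20 ug/m^3.
yield_aromatic = incremental_aerosol_yield(20.0, 12.0, 100.0)  # 0.08
```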

  15. Endogenous-cue prospective memory involving incremental updating of working memory: an fMRI study.

    Science.gov (United States)

    Halahalli, Harsha N; John, John P; Lukose, Ammu; Jain, Sanjeev; Kutty, Bindu M

    2015-11-01

    Prospective memory paradigms are conventionally classified on the basis of event-, time-, or activity-based intention retrieval. In the vast majority of such paradigms, intention retrieval is provoked by some kind of external event. However, prospective memory retrieval cues that prompt intention retrieval in everyday life are commonly endogenous, i.e., linked to a specific imagined retrieval context. We describe herein a novel prospective memory paradigm wherein the endogenous cue is generated by incremental updating of working memory, and investigated the hemodynamic correlates of this task. Eighteen healthy adult volunteers underwent functional magnetic resonance imaging while they performed a prospective memory task where the delayed intention was triggered by an endogenous cue generated by incremental updating of working memory. Working memory and ongoing task control conditions were also administered. The 'endogenous-cue prospective memory condition' with incremental working memory updating was associated with maximum activations in the right rostral prefrontal cortex, and additional activations in the brain regions that constitute the bilateral fronto-parietal network, central and dorsal salience networks as well as cerebellum. In the working memory control condition, maximal activations were noted in the left dorsal anterior insula. Activation of the bilateral dorsal anterior insula, a component of the central salience network, was found to be unique to this 'endogenous-cue prospective memory task' in comparison to previously reported exogenous- and endogenous-cue prospective memory tasks without incremental working memory updating. Thus, the findings of the present study highlight the important role played by the dorsal anterior insula in incremental working memory updating that is integral to our endogenous-cue prospective memory task.

  16. Influence of increment thickness on dentin bond strength and light transmission of composite base materials.

    Science.gov (United States)

    Omran, Tarek A; Garoushi, Sufyan; Abdulmajeed, Aous A; Lassila, Lippo V; Vallittu, Pekka K

    2017-06-01

    Bulk-fill resin composites (BFCs) are gaining popularity in restorative dentistry due to reduced chair time and ease of application. This study aimed to evaluate the influence of increment thickness on dentin bond strength and light transmission of different BFCs and a new discontinuous fiber-reinforced composite. One hundred eighty extracted sound human molars were prepared for a shear bond strength (SBS) test. The teeth were divided into four groups (n = 45) according to the resin composite used: regular particulate-filler resin composite: (1) G-ænial Anterior [GA] (control); bulk-fill resin composites: (2) Tetric EvoCeram Bulk Fill [TEBF] and (3) SDR; and discontinuous fiber-reinforced composite: (4) everX Posterior [EXP]. Each group was subdivided according to increment thickness (2, 4, and 6 mm). The irradiance power through the material of all groups/subgroups was quantified (MARC® Resin Calibrator; BlueLight Analytics Inc.). Data were analyzed using two-way ANOVA followed by Tukey's post hoc test. SBS and light irradiance decreased as increment height increased (p < 0.05), regardless of the composite used. EXP presented the highest SBS in 2- and 4-mm-thick increments when compared to the other composites, although the differences were not statistically significant (p > 0.05). Light irradiance mean values differed significantly (p < 0.05) among the composites. The discontinuous fiber-reinforced composite showed the highest curing light transmission, which was reflected in improved bonding strength to the underlying dentin surface. It can be applied safely in bulk increments of 4 mm, like other bulk-fill composites, although in 2-mm thickness the investigated composites showed better performance.

  17. 78 FR 46611 - Certain Digital Models, Digital Data, and Treatment Plans for Use in Making Incremental Dental...

    Science.gov (United States)

    2013-08-01

    ..., and Treatment Plans for Use in Making Incremental Dental Appliances, the Appliances Made Therefrom... models, digital data, and treatment plans for use in making incremental dental appliances, the appliances...), which was previously performed in an analog manner, the type of advance which does not render the...

  18. 78 FR 46610 - Certain Digital Models, Digital Data, and Treatment Plans for Use in Making Incremental Dental...

    Science.gov (United States)

    2013-08-01

    ..., and Treatment Plans for Use in Making Incremental Dental Appliances, the Appliances Made Therefrom..., digital data, and treatment plans for use in making incremental dental appliances, the appliances made...), which was previously performed in an analog manner, the type of advance which does not render the...

  19. 77 FR 20648 - Certain Digital Models, Digital Data, and Treatment Plans for Use in Making Incremental Dental...

    Science.gov (United States)

    2012-04-05

    ... COMMISSION Certain Digital Models, Digital Data, and Treatment Plans for Use in Making Incremental Dental... of certain digital models, digital data, and treatment plans for use in making incremental dental... importation, or the sale within the United States after importation of certain digital models, digital data...

  20. Job Embeddedness Demonstrates Incremental Validity When Predicting Turnover Intentions for Australian University Employees

    Science.gov (United States)

    Heritage, Brody; Gilbert, Jessica M.; Roberts, Lynne D.

    2016-01-01

    Job embeddedness is a construct that describes the manner in which employees can be enmeshed in their jobs, reducing their turnover intentions. Recent questions regarding the properties of quantitative job embeddedness measures, and their predictive utility, have been raised. Our study compared two competing reflective measures of job embeddedness, examining their convergent, criterion, and incremental validity, as a means of addressing these questions. Cross-sectional quantitative data from 246 Australian university employees (146 academic; 100 professional) was gathered. Our findings indicated that the two compared measures of job embeddedness were convergent when total scale scores were examined. Additionally, job embeddedness was capable of demonstrating criterion and incremental validity, predicting unique variance in turnover intention. However, this finding was not readily apparent with one of the compared job embeddedness measures, which demonstrated comparatively weaker evidence of validity. We discuss the theoretical and applied implications of these findings, noting that job embeddedness has a complementary place among established determinants of turnover intention. PMID:27199817

  1. Improved incremental conductance method for maximum power point tracking using cuk converter

    Directory of Open Access Journals (Sweden)

    M. Saad Saoud

    2014-03-01

    The Algerian government relies on a strategy focused on the development of inexhaustible resources such as solar energy, used to diversify energy sources and prepare the Algeria of tomorrow: about 40% of the electricity produced for domestic consumption will come from renewable sources by 2030. It is therefore necessary to concentrate efforts on reducing application costs and increasing performance. This paper presents a simulation of an improved incremental conductance method for maximum power point tracking (MPPT) using a DC-DC Cuk converter. The improved algorithm is used to track MPPs because it performs precise control under rapidly changing atmospheric conditions; its performance is evaluated and compared through theoretical analysis and digital simulation in Matlab/Simulink.
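    The incremental conductance rule referred to here is the textbook one: since P = VI, dP/dV = 0 at the maximum power point, which is equivalent to dI/dV = -I/V. A minimal sketch follows; the fixed step size and the toy I-V curve are illustrative assumptions, not details from the paper:

```python
def inc_cond_step(v, i, v_prev, i_prev, v_step=0.2):
    """One iteration of the textbook incremental conductance MPPT rule:
    compare the incremental conductance dI/dV with the negative
    instantaneous conductance -I/V and perturb the operating voltage
    toward the point where they match (dP/dV = 0)."""
    dv, di = v - v_prev, i - i_prev
    if dv == 0:
        if di == 0:
            return v                      # nothing changed: hold
        return v + v_step if di > 0 else v - v_step
    if abs(di / dv + i / v) < 1e-6:
        return v                          # dP/dV ~ 0: at the MPP
    if di / dv > -i / v:                  # dP/dV > 0: left of the MPP
        return v + v_step
    return v - v_step                     # dP/dV < 0: right of the MPP

# Toy PV curve I(V) = 8 - 0.02 V^2, whose power peak is at V = sqrt(8/0.06):
def pv_current(v):
    return 8 - 0.02 * v * v

v_prev, i_prev = 10.0, pv_current(10.0)
v = 10.5
for _ in range(60):
    i = pv_current(v)
    v, v_prev, i_prev = inc_cond_step(v, i, v_prev, i_prev), v, i
```

    With a fixed step the operating point converges to, and then oscillates in a small band around, the power peak.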

  2. Are Fearless Dominance Traits Superfluous in Operationalizing Psychopathy? Incremental Validity and Sex Differences

    Science.gov (United States)

    Murphy, Brett; Lilienfeld, Scott; Skeem, Jennifer; Edens, John

    2016-01-01

    Researchers are vigorously debating whether psychopathic personality includes seemingly adaptive traits, especially social and physical boldness. In a large sample (N=1565) of adult offenders, we examined the incremental validity of two operationalizations of boldness (Fearless Dominance traits in the Psychopathy Personality Inventory, Lilienfeld & Andrews, 1996; Boldness traits in the Triarchic Model of Psychopathy, Patrick et al., 2009), above and beyond other characteristics of psychopathy, in statistically predicting scores on four psychopathy-related measures, including the Psychopathy Checklist-Revised (PCL-R). The incremental validity added by boldness traits in predicting the PCL-R's representation of psychopathy was especially pronounced for interpersonal traits (e.g., superficial charm, deceitfulness). Our analyses, however, revealed unexpected sex differences in the relevance of these traits to psychopathy, with boldness traits exhibiting reduced importance for psychopathy in women. We discuss the implications of these findings for measurement models of psychopathy. PMID:26866795

  3. IMPROVED VARIABLE STEP SIZE INCREMENTAL CONDUCTANCE MPPT METHOD WITH HIGH CONVERGENCE SPEED FOR PV SYSTEMS

    Directory of Open Access Journals (Sweden)

    BEHZAD AZIZIAN ISALOO

    2016-04-01

    Maximum power point tracking (MPPT) algorithms are employed in photovoltaic (PV) systems to provide full utilization of PV array output power. Among MPPT algorithms, the Incremental Conductance (INC) algorithm is widely used in PV systems due to its high tracking speed and accuracy. In this paper, an improved variable step size algorithm based on incremental conductance is proposed that adjusts the step size according to the PV output current. The result of this adaptation is an algorithm suited to practical operating conditions over a wider range of irradiation changes. Simulation results confirm that the proposed algorithm increases convergence speed and efficiency in comparison with conventional fixed and variable step size INC algorithms.
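    A fixed perturbation trades tracking speed against steady-state ripple; variable-step INC resolves this by shrinking the step near the peak. The widely used variant scales the step by |dP/dV| (the paper's improvement instead adjusts it by the PV output current); the sketch below shows the |dP/dV| variant, with illustrative gains and bounds:

```python
def variable_step_size(v, i, v_prev, i_prev, scale=0.05,
                       step_min=0.01, step_max=1.0):
    """Variable step size for incremental conductance MPPT: perturb in
    proportion to |dP/dV|, so the tracker moves quickly far from the
    maximum power point and settles with little ripple near it.
    'scale' and the step bounds are illustrative tuning values."""
    dv = v - v_prev
    if dv == 0:
        return step_min
    dp = v * i - v_prev * i_prev
    return max(step_min, min(step_max, scale * abs(dp / dv)))

# With the toy curve I(V) = 8 - 0.02 V^2 (power peak near V = 11.55),
# the step shrinks as the operating point approaches the peak:
def pv_current(v):
    return 8 - 0.02 * v * v

step_far = variable_step_size(5.0, pv_current(5.0), 4.8, pv_current(4.8))
step_near = variable_step_size(11.5, pv_current(11.5), 11.3, pv_current(11.3))
```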

  4. Ductility, strength and hardness relation after prior incremental deformation (ratcheting) of austenitic steel

    International Nuclear Information System (INIS)

    Kussmaul, K.; Diem, H.K.; Wachter, O.

    1993-01-01

    Experimental investigations into the stress/strain behavior of the niobium-stabilized austenitic material with the German designation X6 CrNiNb 18 10 showed that a limited, incrementally applied prior deformation reduces the total deformation capability only by the amount of the prior deformation. In particular, the small changes in the reduction of area showed that the basically ductile deformation behavior is not changed by the type of prior loading. There is a correlation between the amount of deformation and the increase in hardness, and it is possible to correlate the changes in hardness with the material properties. In low cycle fatigue tests with alternating temperature, an incremental increase in total strain (ratcheting) was noted, depending on the strain range applied.

  5. Bigger is Better, but at What Cost? Estimating the Economic Value of Incremental Data Assets.

    Science.gov (United States)

    Dalessandro, Brian; Perlich, Claudia; Raeder, Troy

    2014-06-01

    Many firms depend on third-party vendors to supply data for commercial predictive modeling applications. An issue that has received very little attention in the prior research literature is the estimation of a fair price for purchased data. In this work we present a methodology for estimating the economic value of adding incremental data to predictive modeling applications and present two case studies. The methodology starts with estimating the effect that incremental data has on model performance in terms of common classification evaluation metrics. This effect is then translated into economic units, which gives the expected economic value that the firm might realize with the acquisition of a particular data asset. With this estimate a firm can set a data acquisition price that targets a particular return on investment. This article presents the methodology in full detail and illustrates it in the context of two marketing case studies.
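    The methodology's key step, translating a model-performance increment into economic units, can be illustrated with a toy targeting campaign (all figures below are hypothetical, not from the case studies):

```python
def incremental_data_value(base_rate, n_customers, value_per_hit,
                           lift_base, lift_augmented, targeted_frac=0.1):
    """Back-of-the-envelope version of the valuation idea: convert a
    model-performance improvement (here, lift in the targeted decile)
    into expected extra revenue."""
    hits_base = n_customers * targeted_frac * base_rate * lift_base
    hits_aug = n_customers * targeted_frac * base_rate * lift_augmented
    return (hits_aug - hits_base) * value_per_hit

# Hypothetical campaign: 1M customers, 1% base response rate, $50 per
# response; third-party data raises top-decile lift from 3.0 to 3.4.
value = incremental_data_value(0.01, 1_000_000, 50.0, 3.0, 3.4)
```

    A fair data price would then target some fraction of this expected value, depending on the desired return on investment.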

  6. Classifier-ensemble incremental-learning procedure for nuclear transient identification at different operational conditions

    International Nuclear Information System (INIS)

    Baraldi, Piero; Razavi-Far, Roozbeh; Zio, Enrico

    2011-01-01

    An important requirement for the practical implementation of empirical diagnostic systems is the capability of classifying transients in all plant operational conditions. The present paper proposes an approach based on an ensemble of classifiers for incrementally learning transients under different operational conditions. New classifiers are added to the ensemble where transients occurring in new operational conditions are not satisfactorily classified. The construction of the ensemble is made by bagging; the base classifier is a supervised Fuzzy C Means (FCM) classifier whose outcomes are combined by majority voting. The incremental learning procedure is applied to the identification of simulated transients in the feedwater system of a Boiling Water Reactor (BWR) under different reactor power levels.
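    The add-a-classifier-only-when-needed scheme can be sketched as follows. A nearest-centroid learner stands in for the paper's bagged FCM classifiers, and the accuracy threshold is an illustrative assumption:

```python
import math

class NearestCentroid:
    """Toy base learner standing in for the bagged FCM classifiers."""
    def fit(self, X, y):
        groups = {}
        for xi, yi in zip(X, y):
            groups.setdefault(yi, []).append(xi)
        self.centroids = {label: [sum(c) / len(pts) for c in zip(*pts)]
                          for label, pts in groups.items()}
        return self
    def predict(self, x):
        return min(self.centroids, key=lambda l: math.dist(x, self.centroids[l]))

class IncrementalEnsemble:
    """Sketch of the incremental scheme described above: keep an ensemble,
    and add a new classifier trained on a new operational condition only
    when the current ensemble classifies that condition unsatisfactorily.
    Majority voting combines the members."""
    def __init__(self, min_accuracy=0.9):
        self.members = []
        self.min_accuracy = min_accuracy
    def predict(self, x):
        votes = [m.predict(x) for m in self.members]
        return max(set(votes), key=votes.count)
    def learn_condition(self, X, y):
        if self.members:
            acc = sum(self.predict(xi) == yi for xi, yi in zip(X, y)) / len(y)
            if acc >= self.min_accuracy:
                return False           # existing ensemble already suffices
        self.members.append(NearestCentroid().fit(X, y))
        return True                    # new classifier added for this condition
```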

  7. Combining Compact Representation and Incremental Generation in Large Games with Sequential Strategies

    DEFF Research Database (Denmark)

    Bosansky, Branislav; Xin Jiang, Albert; Tambe, Milind

    2015-01-01

    Many search and security games played on a graph can be modeled as normal-form zero-sum games with strategies consisting of sequences of actions. The size of the strategy space provides a computational challenge when solving these games. This complexity is tackled either by using the compact...... with incremental strategy generation. We experimentally compare CS-DO with the standard approaches and analyze the impact of the size of the support on the performance of the algorithms. Results show that CS-DO dramatically improves the convergence rate in games with non-trivial support...... representation of sequential strategies and linear programming, or by incremental strategy generation of iterative double-oracle methods. In this paper, we present novel hybrid of these two approaches: compact-strategy double-oracle (CS-DO) algorithm that combines the advantages of the compact representation...

  8. Accounting for between-study variation in incremental net benefit in value of information methodology.

    Science.gov (United States)

    Willan, Andrew R; Eckermann, Simon

    2012-10-01

    Previous applications of value of information methods for determining optimal sample size in randomized clinical trials have assumed no between-study variation in mean incremental net benefit. By adopting a hierarchical model, we provide a solution for determining optimal sample size with this assumption relaxed. The solution is illustrated with two examples from the literature. Expected net gain increases with increasing between-study variation, reflecting the increased uncertainty in incremental net benefit and the reduced extent to which data are borrowed from previous evidence. Hence, a trial can become optimal even where current evidence would be sufficient under an assumption of no between-study variation. Despite the expected net gain increasing, however, the optimal sample size in the illustrated examples is relatively insensitive to the amount of between-study variation, and further percentage losses in expected net gain were small even when choosing sample sizes that reflected widely different between-study variation.
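    The quantity being maximized can be sketched by Monte Carlo: simulate a true mean incremental net benefit (INB) from the prior, add a between-study shift, simulate the trial estimate, and value the change in the adopt/reject decision. The normal-normal updating and all numbers below are illustrative assumptions, not the authors' model:

```python
import random

def expected_value_of_trial(n_per_arm, prior_mean, prior_sd, between_sd,
                            within_sd, population=10_000, sims=2000, seed=1):
    """Monte Carlo sketch of the expected value of sample information for
    a future trial, with between-study variation in mean incremental net
    benefit layered on top of the prior (in the spirit of a hierarchical
    model). INB in $ per patient; 'population' patients affected."""
    rng = random.Random(seed)
    gain = 0.0
    for _ in range(sims):
        mu = rng.gauss(prior_mean, prior_sd)       # true mean INB
        study_mean = rng.gauss(mu, between_sd)     # study-specific mean
        xbar = rng.gauss(study_mean, within_sd / n_per_arm ** 0.5)
        # Normal-normal posterior, counting between-study and sampling
        # noise together in the likelihood variance:
        like_var = between_sd ** 2 + within_sd ** 2 / n_per_arm
        post_var = 1 / (1 / prior_sd ** 2 + 1 / like_var)
        post_mean = post_var * (prior_mean / prior_sd ** 2 + xbar / like_var)
        # Adopt the new treatment iff the relevant mean INB is positive;
        # value the decision change caused by the trial data:
        gain += population * mu * ((post_mean > 0) - (prior_mean > 0))
    return gain / sims

evsi = expected_value_of_trial(100, prior_mean=5.0, prior_sd=50.0,
                               between_sd=20.0, within_sd=200.0)
```

    Subtracting the trial's cost from this value and searching over `n_per_arm` gives the expected-net-gain curve whose maximum is the optimal sample size.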

  9. The scope of application of incremental rapid prototyping methods in foundry engineering

    Directory of Open Access Journals (Sweden)

    M. Stankiewicz

    2010-01-01

    The article presents the scope of application of selected incremental Rapid Prototyping methods in the process of manufacturing casting models, casting moulds and casts. The Rapid Prototyping methods (SL, SLA, FDM, 3DP, JS) are predominantly used for the production of models and model sets for casting moulds. The Rapid Tooling methods, such as ZCast-3DP, ProMetalRCT and VoxelJet, enable the fabrication of casting moulds in an incremental process. The application of the RP methods in cast production makes it possible to speed up the prototype preparation process. This is particularly vital for elements of complex shapes. The time required for the manufacture of the model, the mould and the cast proper may vary from a few to several dozen hours.

  10. Sufficient conditions for a period incrementing big bang bifurcation in one-dimensional maps

    International Nuclear Information System (INIS)

    Avrutin, V; Granados, A; Schanz, M

    2011-01-01

    Typically, big bang bifurcation occurs for one (or higher)-dimensional piecewise-defined discontinuous systems whenever two border collision bifurcation curves collide transversely in the parameter space. At that point, two (feasible) fixed points collide with one boundary in state space and become virtual, and, in the one-dimensional case, the map becomes continuous. Depending on the properties of the map near the codimension-two bifurcation point, there exist different scenarios regarding how the infinite number of periodic orbits are born, mainly the so-called period adding and period incrementing. In our work we prove that, in order to undergo a big bang bifurcation of the period incrementing type, it is sufficient for a piecewise-defined one-dimensional map that the colliding fixed points are attractive and with associated eigenvalues of different signs.

  11. Sufficient conditions for a period incrementing big bang bifurcation in one-dimensional maps

    Science.gov (United States)

    Avrutin, V.; Granados, A.; Schanz, M.

    2011-09-01

    Typically, big bang bifurcation occurs for one (or higher)-dimensional piecewise-defined discontinuous systems whenever two border collision bifurcation curves collide transversely in the parameter space. At that point, two (feasible) fixed points collide with one boundary in state space and become virtual, and, in the one-dimensional case, the map becomes continuous. Depending on the properties of the map near the codimension-two bifurcation point, there exist different scenarios regarding how the infinite number of periodic orbits are born, mainly the so-called period adding and period incrementing. In our work we prove that, in order to undergo a big bang bifurcation of the period incrementing type, it is sufficient for a piecewise-defined one-dimensional map that the colliding fixed points are attractive and with associated eigenvalues of different signs.
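    The period-incrementing scenario is easy to observe numerically in a piecewise-linear discontinuous map with two contracting branches (an illustrative example, not the specific family analyzed in these papers): as the offset parameter decreases toward the boundary collision, the attracting orbit spends one extra step on the left branch per parameter window, so its period goes up by one.

```python
def f(x, mu, lam=0.5):
    # Piecewise-linear discontinuous map: two contracting branches
    # separated by a jump at the boundary x = 0.
    return lam * x + mu if x < 0 else lam * x + mu - 1

def attractor_period(mu, lam=0.5, max_period=64):
    x = 0.1
    for _ in range(2000):              # settle onto the attracting orbit
        x = f(x, mu, lam)
    x0 = x
    for p in range(1, max_period + 1): # smallest recurrence = period
        x = f(x, mu, lam)
        if abs(x - x0) < 1e-9:
            return p
    return None

# Sweeping the offset downward increments the period one step at a time:
periods = [attractor_period(mu) for mu in (0.4, 0.2, 0.1, 0.05)]
```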

  12. A Batch-Incremental Video Background Estimation Model using Weighted Low-Rank Approximation of Matrices

    KAUST Repository

    Dutta, Aritra

    2017-07-02

    Principal component pursuit (PCP) is a state-of-the-art approach for background estimation problems. Due to their higher computational cost, PCP algorithms, such as robust principal component analysis (RPCA) and its variants, are not feasible in processing high definition videos. To avoid the curse of dimensionality in those algorithms, several methods have been proposed to solve the background estimation problem in an incremental manner. We propose a batch-incremental background estimation model using a special weighted low-rank approximation of matrices. Through experiments with real and synthetic video sequences, we demonstrate that our method is superior to the state-of-the-art background estimation algorithms such as GRASTA, ReProCS, incPCP, and GFL.
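    A much-simplified sketch of batch-incremental background estimation (a plain truncated SVD with a scalar forgetting factor, not the authors' weighted low-rank model): keep a low-rank basis of the background and, for each new batch of vectorized frames, re-factor the down-weighted old basis stacked with the new frames.

```python
import numpy as np

def update_background(basis, batch, rank=1, forget=0.9):
    """One batch-incremental update: truncated SVD of the (down-weighted)
    previous basis stacked with the new batch. Columns are vectorized
    frames; the returned columns span the current background estimate."""
    stacked = batch if basis is None else np.column_stack([forget * basis, batch])
    u, s, _ = np.linalg.svd(stacked, full_matrices=False)
    return u[:, :rank] * s[:rank]

rng = np.random.default_rng(0)
true_bg = rng.uniform(size=100)          # static background "image" (100 pixels)
basis = None
for _ in range(5):                       # five batches of 20 noisy frames
    frames = true_bg[:, None] + 0.05 * rng.normal(size=(100, 20))
    basis = update_background(basis, frames, rank=1)

background = basis[:, 0]
background *= np.sign(background @ true_bg)   # fix the SVD sign ambiguity
background /= np.linalg.norm(background)      # unit-norm background direction
```

    Real foreground (moving objects) would be handled by the weighting; here the point is only that the basis tracks the background incrementally, batch by batch.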

  13. Open Source Tools for Remote Incremental Backups on Linux: An Experimental Evaluation

    Directory of Open Access Journals (Sweden)

    Aurélio Santos

    2014-07-01

    Computer data has become one of the most valuable assets that individuals, organizations and enterprises own today. Most people agree that losing their data (programs, data sets, documentation files, email addresses, photos, customer data, etc.) would be a disaster. The reason most individuals avoid performing data backups, though, is that they feel the process is complicated, tedious and expensive. This is not always true: with the right tool, it is easy and affordable. In this paper we compare the performance and system resource usage of five remote incremental backup open source tools for Linux: Rsync, Rdiff-backup, Duplicity, Areca and Link-Backup. These tools are tested using three distinct remote backup operations: full backup, incremental backup and data restoration. The advantages of each tool are described, and we select the most efficient backup tool for simple replication operations and the overall best backup tool.
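    The core of any incremental backup is change detection: copy only what changed since the last run. A minimal content-hash sketch follows (real tools are far more capable: rsync transfers deltas within files and, with --link-dest, hard-links unchanged files into each snapshot; rdiff-backup keeps reverse increments):

```python
import hashlib
import os
import shutil
import tempfile

def incremental_backup(src, dest, manifest):
    """Copy to 'dest' only the files under 'src' whose content hash
    differs from the previous run. 'manifest' maps relative paths to
    SHA-256 digests from the last backup; returns the new manifest and
    the list of copied paths."""
    new_manifest, copied = {}, []
    for root, _, files in os.walk(src):
        for name in files:
            path = os.path.join(root, name)
            rel = os.path.relpath(path, src)
            with open(path, "rb") as fh:
                digest = hashlib.sha256(fh.read()).hexdigest()
            new_manifest[rel] = digest
            if manifest.get(rel) != digest:      # new or changed file
                target = os.path.join(dest, rel)
                os.makedirs(os.path.dirname(target), exist_ok=True)
                shutil.copy2(path, target)
                copied.append(rel)
    return new_manifest, copied

# Tiny demonstration in throwaway directories:
src, dst = tempfile.mkdtemp(), tempfile.mkdtemp()
with open(os.path.join(src, "notes.txt"), "w") as fh:
    fh.write("first version")
manifest, copied_first = incremental_backup(src, dst, {})
manifest, copied_second = incremental_backup(src, dst, manifest)  # nothing changed
```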

  14. Job Embeddedness Demonstrates Incremental Validity When Predicting Turnover Intentions for Australian University Employees.

    Science.gov (United States)

    Heritage, Brody; Gilbert, Jessica M; Roberts, Lynne D

    2016-01-01

    Job embeddedness is a construct that describes the manner in which employees can be enmeshed in their jobs, reducing their turnover intentions. Recent questions regarding the properties of quantitative job embeddedness measures, and their predictive utility, have been raised. Our study compared two competing reflective measures of job embeddedness, examining their convergent, criterion, and incremental validity, as a means of addressing these questions. Cross-sectional quantitative data from 246 Australian university employees (146 academic; 100 professional) was gathered. Our findings indicated that the two compared measures of job embeddedness were convergent when total scale scores were examined. Additionally, job embeddedness was capable of demonstrating criterion and incremental validity, predicting unique variance in turnover intention. However, this finding was not readily apparent with one of the compared job embeddedness measures, which demonstrated comparatively weaker evidence of validity. We discuss the theoretical and applied implications of these findings, noting that job embeddedness has a complementary place among established determinants of turnover intention.

  15. Incremental validity of the Zulliger and Pfister in the context of drug addiction

    Directory of Open Access Journals (Sweden)

    Renata da Rocha Campos Franco

    2012-04-01

    Full Text Available The aim of this study was to verify the incremental validity of two projective techniques through the personality assessment of 20 drug-dependent patients. The personalities of ten Brazilian alcohol addicts and ten French heroin addicts, who were undergoing drug detoxification treatment in specialized centers in Brazil or France, were assessed from the perspective of phenomeno-structural psychopathology, a theoretical approach of French origin that understands an individual's mental functioning through the way he or she experiences time and space. The space-time relationship and the personality were understood through the way the subjects expressed themselves verbally and through how they saw and constructed the images of the Zulliger and Pfister projective techniques. The results showed coherence between the information generated by the two instruments, demonstrating that the Zulliger and Pfister, when applied together and interpreted by the phenomeno-structural method, were effective in revealing the subjects' experiences of space and time.

  16. Detection of milled 100Cr6 steel surface by eddy current and incremental permeance methods

    Czech Academy of Sciences Publication Activity Database

    Perevertov, Oleksiy; Neslušan, M.; Stupakov, Alexandr

    2017-01-01

    Roč. 87, Apr (2017), s. 15-23 ISSN 0963-8695 R&D Projects: GA ČR GB14-36566G; GA ČR GA13-18993S Institutional support: RVO:68378271 Keywords: Eddy currents * hard milling * incremental permeance * magnetic materials * surface characterization Subject RIV: JB - Sensors, Measurement, Regulation OBOR OECD: Electrical and electronic engineering Impact factor: 2.726, year: 2016

  17. Inspiratory Muscle Performance of Former Smokers and Nonsmokers Using the Test of Incremental Respiratory Endurance.

    Science.gov (United States)

    Formiga, Magno F; Campos, Michael A; Cahalin, Lawrence P

    2018-01-01

    Smoking has potential deleterious effects on respiratory muscle function. Smokers may present with reduced inspiratory muscle strength and endurance. We compared inspiratory muscle performance of nonsmokers with that of former smokers without overt respiratory problems via the Test of Incremental Respiratory Endurance. This study was performed on 42 healthy subjects between the ages of 30 and 79 y (mean ± SD of 56.5 ± 14.4 y). Fourteen male and 7 female former smokers were matched to nonsmokers based on sex, age, height, and weight. Subjects completed a questionnaire about their health and current smoking status. Testing included the best of 3 or more consistent trials. The Test of Incremental Respiratory Endurance measurements included maximal inspiratory pressure measured from residual volume as well as sustained maximal inspiratory pressure and inspiratory duration measured from residual volume to total lung capacity during a maximal sustained inhalation. No significant difference in inspiratory performance of the entire group of former smokers compared with nonsmokers was found. However, separate sex analyses found a significant difference in sustained maximal inspiratory pressure between male former smokers and nonsmokers (518.7 ± 205.0 pressure time units vs 676.5 ± 255.2 pressure time units, P = .041). We found similar maximal inspiratory pressure between former smokers and nonsmokers via the Test of Incremental Respiratory Endurance, but the significant difference in sustained maximal inspiratory pressure between male former smokers and nonsmokers suggests that the sustained maximal inspiratory pressure may have greater discriminatory ability in assessing the effects of smoking on inspiratory muscle performance. Further investigation of the effects of smoking on inspiratory performance via the Test of Incremental Respiratory Endurance is warranted. Copyright © 2018 by Daedalus Enterprises.

  18. Just-in-Time Technology to Encourage Incremental, Dietary Behavior Change

    Science.gov (United States)

    Intille, Stephen S.; Kukla, Charles; Farzanfar, Ramesh; Bakr, Waseem

    2003-01-01

    Our multi-disciplinary team is developing mobile computing software that uses “just-in-time” presentation of information to motivate behavior change. Using a participatory design process, preliminary interviews have helped us to establish 10 design goals. We have employed some to create a prototype of a tool that encourages better dietary decision making through incremental, just-in-time motivation at the point of purchase. PMID:14728379

  19. Top-down attention based on object representation and incremental memory for knowledge building and inference.

    Science.gov (United States)

    Kim, Bumhwi; Ban, Sang-Woo; Lee, Minho

    2013-10-01

    Humans can efficiently perceive arbitrary visual objects based on an incremental learning mechanism with selective attention. This paper proposes a new task specific top-down attention model to locate a target object based on its form and color representation along with a bottom-up saliency based on relativity of primitive visual features and some memory modules. In the proposed model top-down bias signals corresponding to the target form and color features are generated, which draw the preferential attention to the desired object by the proposed selective attention model in concomitance with the bottom-up saliency process. The object form and color representation and memory modules have an incremental learning mechanism together with a proper object feature representation scheme. The proposed model includes a Growing Fuzzy Topology Adaptive Resonance Theory (GFTART) network which plays two important roles in object color and form biased attention; one is to incrementally learn and memorize color and form features of various objects, and the other is to generate a top-down bias signal to localize a target object by focusing on the candidate local areas. Moreover, the GFTART network can be utilized for knowledge inference which enables the perception of new unknown objects on the basis of the object form and color features stored in the memory during training. Experimental results show that the proposed model is successful in focusing on the specified target objects, in addition to the incremental representation and memorization of various objects in natural scenes. In addition, the proposed model properly infers new unknown objects based on the form and color features of previously trained objects. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. A modular design of incremental Lyapunov functions for microgrid control with power sharing

    OpenAIRE

    Persis, Claudio De; Monshizadeh, Nima

    2015-01-01

    In this paper we contribute a theoretical framework that sheds new light on the problem of microgrid analysis and control. The starting point is an energy function comprising the kinetic energy associated with the elements that emulate the rotating machinery and terms taking into account the reactive power stored in the lines and dissipated on shunt elements. We then shape this energy function with the addition of an adjustable voltage-dependent term, and construct incremental storage functions ...

  1. A High-Performance Adaptive Incremental Conductance MPPT Algorithm for Photovoltaic Systems

    OpenAIRE

    Chendi Li; Yuanrui Chen; Dongbao Zhou; Junfeng Liu; Jun Zeng

    2016-01-01

    The output characteristics of photovoltaic (PV) arrays vary with the change of environment, and maximum power point (MPP) tracking (MPPT) techniques are thus employed to extract the peak power from PV arrays. Based on an analysis of existing MPPT methods, a novel incremental conductance (INC) MPPT algorithm is proposed with an adaptive variable step size. The proposed algorithm automatically regulates the step size to track the MPP through a step size adjustment coefficient, and a user-predefined ...
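
The incremental conductance rule — at the MPP dI/dV = −I/V, to its left dI/dV > −I/V — can be sketched in Python with an adaptive step. The abstract does not give the paper's exact adjustment coefficient, so scaling the step by |dP/dV| here is a common variant used only for illustration; the PV model and all parameter names are assumptions.

```python
import math

def pv_current(v, i_sc=8.0, v_oc=40.0, v_t=2.0):
    """Illustrative PV I-V curve (single-diode-like shape, not a real array)."""
    return max(0.0, i_sc * (1.0 - math.exp((v - v_oc) / v_t)))

def inc_mppt(v0=20.0, n_scale=0.05, step_max=1.0, iters=300):
    """Incremental conductance MPPT with an adaptive step size.

    At the MPP, dI/dV = -I/V; left of it dI/dV > -I/V (dP/dV > 0).
    The step is scaled by |dP/dV| and clamped to step_max.
    """
    v_prev, i_prev = v0, pv_current(v0)
    v = v0 + 0.5  # initial perturbation so dV != 0
    for _ in range(iters):
        i = pv_current(v)
        dv, di = v - v_prev, i - i_prev
        dp = v * i - v_prev * i_prev
        step = step_max if dv == 0 else min(step_max, n_scale * abs(dp / dv))
        v_prev, i_prev = v, i
        if dv == 0:
            # voltage unchanged: react to the current change alone
            v += step if di > 0 else (-step if di < 0 else 0.0)
        elif di / dv > -i / v:
            v += step   # left of the MPP: raise the operating voltage
        else:
            v -= step   # right of the MPP: lower the operating voltage
    return v, v * pv_current(v)
```

Far from the MPP, |dP/dV| is large and the tracker takes big steps; near the MPP it shrinks the step automatically, which is the point of the adaptive variant.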

  2. Integrated Personnel and Pay System-Army Increment 1 (IPPS-A Inc 1)

    Science.gov (United States)

    2016-03-01

    ... Alternatives dated January 8, 2003, to use a Commercial-off-the-Shelf (COTS) Enterprise Resource Planning (ERP) product to develop and implement IPPS-A. The Milestone Decision Authority directed the continued use of the COTS ERP product in the DIMHRS Capability Way Ahead ADM dated September 8, 2009. ... Feasibility: The determination of the IPPS-A Increment 1 development and implementation contract type was based on cost risk associated with the ...

  3. Incremental stiffness and electrical contact conductance in the contact of rough finite bodies

    Science.gov (United States)

    Barber, J. R.

    2013-01-01

    If two half spaces are in contact, there exists a formal mathematical relation between the electrical contact resistance and the incremental elastic compliance. Here, this relation is extended to the contact of finite bodies. In particular, it is shown that the additional resistance due to roughness of the contacting surfaces (the interface resistance) bears a similar relation to the additional compliance as that obtained for the total resistance in the half-space problem.

  4. Cost and Performance Report of Incremental Sampling Methodology for Soil Containing Metallic Residues

    Science.gov (United States)

    2013-09-01

    ... Map showing location of Fort Eustis, VA, and the 1000-in. Rifle Range ... Map of Fort Wainwright, AK, and the ... BRAC = Base Realignment and Closure; CEC = Cation Exchange Capacity; CMIST = CRREL Multi-Increment Sampling Tool; CRREL = Cold Regions Research and Engineering Laboratory ... sieved, and then mechanically pulverized. Table 2 summarizes the proposed changes to the sampling processing procedures for Method 3050B.

  5. Just-in-Time Technology to Encourage Incremental, Dietary Behavior Change

    OpenAIRE

    Intille, Stephen S.; Kukla, Charles; Farzanfar, Ramesh; Bakr, Waseem

    2003-01-01

    Our multi-disciplinary team is developing mobile computing software that uses “just-in-time” presentation of information to motivate behavior change. Using a participatory design process, preliminary interviews have helped us to establish 10 design goals. We have employed some to create a prototype of a tool that encourages better dietary decision making through incremental, just-in-time motivation at the point of purchase.

  6. A Variable Incremental Conductance MPPT Algorithm Applied to Photovoltaic Water Pumping System

    OpenAIRE

    S. Abdourraziq; R. El Bachtiri

    2015-01-01

    The use of solar energy as a source for pumping water is one of the promising areas of photovoltaic (PV) application. The performance of photovoltaic pumping systems (PVPS) can be greatly improved by employing an MPPT algorithm, which consequently maximizes the electric motor speed of the system. This paper presents a modified incremental conductance (IncCond) MPPT algorithm with a direct control method applied to a standalone PV pumping system. The influence of ...

  7. Sleep quality and duration are associated with performance in maximal incremental test.

    Science.gov (United States)

    Antunes, B M; Campos, E Z; Parmezzani, S S; Santos, R V; Franchini, E; Lira, F S

    2017-08-01

    Inadequate sleep patterns may be considered a trigger for the development of several metabolic diseases. Additionally, sleep deprivation and poor sleep quality can negatively impact performance in exercise training. However, the impact of sleep duration and sleep quality on performance during a maximal incremental test performed by healthy men is unclear. Therefore, the purpose of the study was to analyze the association between sleep pattern (duration and quality) and performance during a maximal incremental test in healthy male individuals. A total of 28 healthy males volunteered to take part in the study. Sleep quality, sleep duration, and physical activity were subjectively assessed by questionnaires. Sleep pattern was classified by sleep duration (more or fewer than 7 h of sleep per night) and by sleep quality according to the Pittsburgh Sleep Quality Index (PSQI) score. The incremental exercise test started at 35 watts for untrained subjects, 70 watts for physically active subjects, and 105 watts for well-trained subjects. HRmax was correlated with sleep quality (r=0.411, p=0.030) and sleep duration (r=-0.430, p=0.022). Participants reporting good sleep quality presented higher values of Wmax and VO2max and lower values of HRmax when compared with participants with altered sleep. Regarding sleep duration, only Wmax was influenced by the number of sleeping hours per night, and this association remained significant even after adjustment for VO2max. Sleep duration and quality are associated, at least in part, with performance during a maximal incremental test among healthy men, with losses in Wmax and HRmax. In addition, our results suggest that the relationship between sleep patterns and performance, mainly in Wmax, is independent of fitness condition. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Mechanical ventilatory constraints during incremental exercise in healthy and cystic fibrosis children.

    Science.gov (United States)

    Borel, Benoit; Leclair, Erwan; Thevenet, Delphine; Beghin, Laurent; Gottrand, Frédéric; Fabre, Claudine

    2014-03-01

    To analyze breathing pattern and mechanical ventilatory constraints during incremental exercise in healthy children and children with cystic fibrosis (CF). Thirteen healthy children and 6 children with cystic fibrosis volunteered to perform an incremental test on a treadmill. Exercise tidal flow/volume loops were plotted every minute within a maximal flow/volume loop (MFVL). Expiratory flow limitation (expFL, expressed in %Vt) was evaluated, and end-expiratory and end-inspiratory lung volumes (EELV and EILV) were estimated from the expiratory reserve volume relative to vital capacity (ERV/FVC) and from the inspiratory reserve volume relative to vital capacity (IRV/FVC). During the incremental exercise, expFL was first observed at 40% of maximal aerobic speed in both groups. At maximal exercise, 46% of healthy children and 83% of CF children presented expFL, without a significant effect of cystic fibrosis on the severity of expFL. According to the two-way ANOVA results, both groups adopted similar breathing patterns and breathing strategies, as no significant effect of CF was revealed. However, according to the one-way ANOVA results, a significant increase in ERV/FVC associated with a significant decrease in IRV/FVC from resting values was observed in healthy children at maximal exercise, but not in CF children. The hypothesis of this study was based on the assumption that mild cystic fibrosis could induce more frequent and more severe mechanical ventilatory constraints due to pulmonary impairment and breathing pattern disturbances. However, this study did not succeed in highlighting an effect of mild cystic fibrosis on the mechanical ventilatory constraints (expFL and dynamic hyperinflation) that occur during incremental exercise. This absence of effect could be due to the absence of an impact of the disease on spirometric data, breathing pattern regulation during exercise, and breathing strategy. © 2013 Wiley Periodicals, Inc.

  9. The value increment of mass-customized products: An empirical assessment

    OpenAIRE

    Schreier, Martin

    2006-01-01

    The primary argument in favor of mass customization is the delivery of superior customer value. Using willingness-to-pay (WTP) measurements, Franke & Piller (2004) have recently shown that customers designing their own watches with design toolkits are willing to pay premiums of more than 100% (DWTP). In the course of three studies, we found that this type of value increment is not a singular occurrence but might rather be a general phenomenon, as we again found average DWTPs of...

  10. LED session prior to incremental step test enhances VO2max in running.

    Science.gov (United States)

    Mezzaroba, Paulo V; Pessôa Filho, Dalton M; Zagatto, Alessandro M; Machado, Fabiana Andrade

    2018-03-15

    This study aimed to investigate the effect of prior LED sessions on cardiorespiratory responses during a running incremental step test. Twenty-six healthy, physically active young men, aged between 20 and 30 years, took part in this study. Participants performed two incremental load tests after placebo (PLA) and light-emitting diode application (LED), and had their gas exchange, heart rate (HR), blood lactate, and rating of perceived exertion (RPE) monitored during all tests. The PLA and LED conditions were compared using the dependent Student t test with significance set at 5%. The t test showed higher maximum oxygen uptake (VO2max) (PLA = 47.2 ± 5.7; LED = 48.0 ± 5.4 mL kg-1 min-1, trivial effect size), higher peak velocity (Vpeak) (PLA = 13.4 ± 1.2; LED = 13.6 ± 1.2 km h-1, trivial effect size), and lower maximum HR (PLA = 195.3 ± 3.4; LED = 193.3 ± 3.9 beats min-1, moderate effect size) for LED compared with PLA conditions. Furthermore, submaximal values of HR and RPE were lower, and submaximal VO2 values were higher, when LED sessions were applied prior to the incremental step test. A positive effect of prior LED application on blood lactate disappearance was also demonstrated, especially 13 and 15 min after the test. It is concluded that LED sessions prior to exercise modify the cardiorespiratory response by affecting running tolerance during the incremental step test, metabolite clearance, and RPE. Therefore, LED could be used as a prior-exercise strategy to acutely modulate the oxidative response in targeted muscle and enhance exercise tolerance.

  11. Incremental test design, peak 'aerobic' running speed and endurance performance in runners.

    Science.gov (United States)

    Machado, Fabiana A; Kravchychyn, Ana Claudia P; Peserico, Cecilia S; da Silva, Danilo F; Mezzaroba, Paulo V

    2013-11-01

    Peak running speed obtained during an incremental treadmill test (Vpeak) is a good predictor of endurance running performance. However, the best-designed protocol for Vpeak determination and the best Vpeak definition remain unknown. Therefore, this study examined the influence of stage duration and Vpeak definition on the relationship between Vpeak and endurance running performance. Twenty-seven male, recreational, endurance-trained runners (10-km running pace: 10-17 km h-1) performed, in counterbalanced order, three continuous incremental treadmill tests of different stage durations (1, 2, or 3 min) to determine Vpeak, and two 5-km and two 10-km time trials on a 400-m track to obtain their 5-km and 10-km running performances. Vpeak was defined as either (a) the highest speed that could be maintained for a complete minute (Vpeak-60s), (b) the speed of the last complete stage (Vpeak-C), or (c) the speed of the last complete stage plus the speed increment multiplied by the completed fraction of the incomplete stage (Vpeak-P). The Vpeak determined during the 3-min stage duration protocol was the most highly correlated with both the 5-km (r=0.95) and 10-km (r=0.92) running performances, and these relationships were minimally influenced by the Vpeak definition. However, independent of stage duration, Vpeak-P provided the highest correlation with both running performances. Incremental treadmill tests comprising 3-min stage durations are therefore preferred over 1-min and 2-min protocols for determining Vpeak to accurately predict 5-km and 10-km running performances. Further, Vpeak-P should be used as the standard definition of Vpeak. Copyright © 2013 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
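
Definition (c), Vpeak-P, is a simple proration and can be written out directly; the numbers in the usage comment are hypothetical, not from the study.

```python
def vpeak_p(last_complete_speed, increment, time_in_stage, stage_duration):
    """Vpeak-P: speed of the last complete stage plus the speed increment
    weighted by the completed fraction of the final, incomplete stage."""
    return last_complete_speed + increment * (time_in_stage / stage_duration)

# Hypothetical example: a runner completes the 14 km/h stage of a 3-min
# protocol with 1 km/h increments, then stops 90 s into the next stage.
v = vpeak_p(last_complete_speed=14.0, increment=1.0,
            time_in_stage=90.0, stage_duration=180.0)  # 14.5 km/h
```

Vpeak-C would simply be `last_complete_speed` (14 km/h here), which is why Vpeak-P discriminates better between runners who fail early versus late in the final stage.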

  12. Technological aspects regarding machining the titanium alloys by means of incremental forming

    Directory of Open Access Journals (Sweden)

    Bologa Octavian

    2017-01-01

    Full Text Available Titanium alloys are materials with reduced formability, due to their low plasticity. However, today there are high demands regarding their use in the automotive industry and in bio-medical industry, for prosthetic devices. This paper presents some technological aspects regarding the machinability of titanium alloys by means of incremental forming. The research presented in this paper aimed to demonstrate that the parts made from these materials could be machined at room temperature, in certain technological conditions.

  13. Oral creatine supplementation's decrease of blood lactate during exhaustive, incremental cycling.

    Science.gov (United States)

    Oliver, Jonathan M; Joubert, Dustin P; Martin, Steven E; Crouse, Stephen F

    2013-06-01

    To determine the effects of creatine supplementation on blood lactate during incremental cycling exercise. Thirteen male subjects (M ± SD 23 ± 2 yr, 178.0 ± 8.1 cm, 86.3 ± 16.0 kg, 24% ± 9% body fat) performed a maximal, incremental cycling test to exhaustion before (Pre) and after (Post) 6 d of creatine supplementation (4 doses/d of 5 g creatine + 15 g glucose). Blood lactate was measured at the end of each exercise stage during the protocol, and the lactate threshold was determined as the stage before achieving 4 mmol/L. Lactate concentrations during the incremental test were analyzed using a 2 (condition) × 6 (exercise stage) repeated-measures ANOVA. Differences in power at lactate threshold, power at exhaustion, and total exercise time were determined by paired t tests and are presented as M ± SD. Lactate concentrations were reduced during exercise after supplementation, demonstrating a significant condition effect (p = .041). There was a tendency for increased power at the lactate threshold (Pre 128 ± 45 W, Post 143 ± 26 W; p = .11). Total time to fatigue approached significant increases (Pre 22.6 ± 3.2 min, Post 23.3 ± 3.3 min; p = .056), as did maximal power output (Pre 212.5 ± 32.5 W, Post 220 ± 34.6 W; p = .082). Our findings demonstrate that creatine supplementation decreases lactate during incremental cycling exercise and tends to raise lactate threshold. Therefore, creatine supplementation could potentially benefit endurance athletes.

  14. Diagnostics of blood flow rate in breast tumors during an increment of blood pressure

    International Nuclear Information System (INIS)

    Pohlodek, K.; Sohn, Ch.

    1998-01-01

    54 patients with sonographically evident tumors of the mammary glands (14 patients with benign tumors and 40 patients with carcinomas) were examined by angiography to assess tumor blood flow rate during an increment of blood pressure. On evaluation of the findings, 4 characteristic curves were obtained: the first type was typical of malignant tumors; the second type was characteristic of benign findings; and the third and fourth types were non-specific. (authors)

  15. Understanding treatment effect mechanisms of the CAMBRA randomized trial in reducing caries increment

    OpenAIRE

    Cheng, J; Chaffee, BW; Cheng, NF; Gansky, SA; Featherstone, JDB

    2015-01-01

    © International & American Associations for Dental Research 2014. The Caries Management By Risk Assessment (CAMBRA) randomized controlled trial showed that an intervention featuring combined antibacterial and fluoride therapy significantly reduced bacterial load and suggested reduced caries increment in adults with 1 to 7 baseline cavitated teeth. While trial results speak to the overall effectiveness of an intervention, insight can be gained from understanding the mechanism by which an intervention ...

  16. Micro-CT Comparison of Posterior Composite Resin Restorative Techniques: Sonicated versus Incremental Fill

    Science.gov (United States)

    2015-06-01

    ... polymerization shrinkage [3-8], marginal leakage [9], issues with accelerated wear [10], unpolymerized resin [11], fracture [12], and difficulty in establishing ... flexural strength, and incomplete adhesion between the resin and tooth surface. This aspect of RBC restorations has not been well documented ... The purpose of this investigation was to evaluate the difference between a conventional, hand-placed incremental application of RBC into ...

  17. Job Embeddedness Demonstrates Incremental Validity When Predicting Turnover Intentions for Australian University Employees

    OpenAIRE

    Brody eHeritage; Jessica Michelle Gilbert; Lynne D. Roberts

    2016-01-01

    Job embeddedness is a construct that describes the manner in which employees can be enmeshed in their jobs, reducing their turnover intentions. Recent questions regarding the properties of quantitative job embeddedness measures, and their predictive utility, have been raised. Our study compared two competing reflective measures of job embeddedness, examining their convergent, criterion, and incremental validity, as a means of addressing these questions. Cross-sectional quantitative data from ...

  18. Incremental versus maximum bite advancement during twin-block therapy: A randomized controlled clinical trial

    OpenAIRE

    Banks, Phil; Wright, Jean; O'Brien, Kevin

    2004-01-01

    The aim of this study was to evaluate the effectiveness of incremental and maximum bite advancement during treatment of Class II Division 1 malocclusion with the Twin-block appliance in the permanent dentition. It was performed at 3 district general hospitals in the United Kingdom with 4 operators. Two hundred three patients, 10-14 years old, were randomized. Control patients had the initial bite taken edge-to-edge for appliance construction with a standard Twin-block. Experimental patients had ...

  19. Performance Analysis of Selective Decode-and-Forward Multinode Incremental Relaying with Maximal Ratio Combining

    KAUST Repository

    Hadjtaieb, Amir

    2013-09-12

    In this paper, we propose an incremental multinode relaying protocol with arbitrary N relay nodes that allows an efficient use of the channel spectrum. The destination combines the received signals from the source and the relays using maximal ratio combining (MRC). The transmission ends successfully once the accumulated signal-to-noise ratio (SNR) exceeds a predefined threshold. The number of relays participating in the transmission is adapted to the channel conditions based on feedback from the destination. The use of incremental relaying yields a higher spectral efficiency. Moreover, the symbol error probability (SEP) performance is enhanced by using MRC at the relays: each relay overhears the signals from the source and all previous relays and combines them using MRC. The proposed protocol differs from most existing relaying protocols in that it combines both incremental relaying and MRC at the relays in a multinode topology. Our analyses for the decode-and-forward mode show that: (i) compared with existing multinode relaying schemes, the proposed scheme can achieve essentially the same SEP performance with a smaller average number of time slots, and (ii) compared with schemes without MRC at the relays, the proposed scheme achieves approximately a 3 dB gain.
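
The stopping rule — relays transmit one at a time while the destination MRC-combines all received copies, halting once the accumulated SNR crosses the threshold — can be sketched with a toy model. Fixed per-branch SNRs stand in for the paper's fading analysis, and all names are illustrative.

```python
def incremental_mrc_slots(source_snr, relay_snrs, threshold):
    """Toy model of incremental relaying with MRC at the destination.

    Slot 1: the source transmits. Then, while the combined SNR is below
    the threshold, the next relay transmits in an extra time slot and the
    destination adds its branch SNR (with MRC over independent branches,
    the combined SNR is the sum of the branch SNRs).
    Returns (slots_used, accumulated_snr).
    """
    total = source_snr
    slots = 1
    for snr in relay_snrs:
        if total >= threshold:
            break  # destination feeds back an ACK; remaining relays stay silent
        total += snr
        slots += 1
    return slots, total
```

Good channel conditions end the transmission after one slot, which is where the spectral-efficiency gain over fixed multinode relaying comes from.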

  20. First UHF Implementation of the Incremental Scheme for Open-Shell Systems.

    Science.gov (United States)

    Anacker, Tony; Tew, David P; Friedrich, Joachim

    2016-01-12

    The incremental scheme makes it possible to compute CCSD(T) correlation energies to high accuracy for large systems. We present the first extension of this fully automated black-box approach to open-shell systems using an Unrestricted Hartree-Fock (UHF) wave function, extending the efficient domain-specific basis set approach to handle open-shell references. We test our approach on a set of organic and metal organic structures and molecular clusters and demonstrate standard deviations from canonical CCSD(T) values of only 1.35 kJ/mol using a triple ζ basis set. We find that the incremental scheme is significantly more cost-effective than the canonical implementation even for relatively small systems and that the ease of parallelization makes it possible to perform high-level calculations on large systems in a few hours on inexpensive computers. We show that the approximations that make our approach widely applicable are significantly smaller than both the basis set incompleteness error and the intrinsic error of the CCSD(T) method, and we further demonstrate that incremental energies can be reliably used in extrapolation schemes to obtain near complete basis set limit CCSD(T) reaction energies for large systems.
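
The incremental scheme expands the total correlation energy in one-body, two-body, ... increments over orbital domains: E ≈ Σi εi + Σi<j (εij − εi − εj) + .... A toy second-order sketch follows; in the real method each `energy_of` call is a CCSD(T) calculation on the union of domains, whereas here it is any callable, and all names are illustrative.

```python
from itertools import combinations

def incremental_energy(domains, energy_of, order=2):
    """Second-order incremental expansion of a correlation energy.

    energy_of(frozenset_of_domains) returns the correlation energy of
    the union of the given domains. First-order increments are the
    one-domain energies; second-order increments are eps_ij - eps_i - eps_j.
    """
    eps = {frozenset([d]): energy_of(frozenset([d])) for d in domains}
    total = sum(eps.values())
    if order >= 2:
        for i, j in combinations(domains, 2):
            pair_energy = energy_of(frozenset([i, j]))
            total += pair_energy - eps[frozenset([i])] - eps[frozenset([j])]
    return total
```

For an energy that is exactly pairwise-additive the truncation at second order is exact; in practice the neglected higher-order increments are what the quoted 1.35 kJ/mol standard deviation measures.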

  1. Optimization of Surface Roughness and Wall Thickness in Dieless Incremental Forming Of Aluminum Sheet Using Taguchi

    Science.gov (United States)

    Hamedon, Zamzuri; Kuang, Shea Cheng; Jaafar, Hasnulhadi; Azhari, Azmir

    2018-03-01

    Incremental sheet forming is a versatile sheet metal forming process in which a sheet metal is formed into its final shape by a series of localized deformations without a specialised die. However, it still has many shortcomings that need to be overcome, such as geometric accuracy, surface roughness, formability, and forming speed. This project focuses on minimising the surface roughness of aluminium sheet and improving its thickness uniformity in incremental sheet forming via optimisation of wall angle, feed rate, and step size. The effect of wall angle, feed rate, and step size on the surface roughness and thickness uniformity of aluminium sheet was also investigated. From the results, it was observed that surface roughness and thickness uniformity varied inversely due to the formation of surface waviness. An increase in feed rate and a decrease in step size produce a lower surface roughness, while a uniform thickness reduction was obtained by reducing the wall angle and step size. Using Taguchi analysis, the optimum parameters for minimum surface roughness and uniform thickness reduction of aluminium sheet were determined. The findings of this project help to reduce the time needed to optimise surface roughness and thickness uniformity in incremental sheet forming.
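
Taguchi analysis of this kind typically converts each orthogonal-array run's measurements into a signal-to-noise ratio (smaller-the-better for surface roughness: S/N = −10·log10(mean y²)) and then picks, for each factor, the level with the best mean S/N. A generic sketch with hypothetical factors and data, not the study's actual array:

```python
import math

def sn_smaller_is_better(values):
    """Taguchi smaller-the-better signal-to-noise ratio, in dB."""
    return -10.0 * math.log10(sum(y * y for y in values) / len(values))

def best_levels(factor_levels, runs, sn_by_run):
    """For each factor, pick the level with the highest mean S/N over
    the orthogonal-array runs that used that level."""
    best = {}
    for factor, levels in factor_levels.items():
        def mean_sn(lvl):
            sns = [sn for run, sn in zip(runs, sn_by_run) if run[factor] == lvl]
            return sum(sns) / len(sns)
        best[factor] = max(levels, key=mean_sn)
    return best
```

Higher S/N always means better here, so the same selection step works for larger-the-better responses once the S/N formula is swapped.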

  2. Development of the Nonstationary Incremental Analysis Update Algorithm for Sequential Data Assimilation System

    Science.gov (United States)

    Ham, Yoo-Geun; Song, Hyo-Jong; Jung, Jaehee; Lim, Gyu-Ho

    2017-04-01

    This study introduces an altered version of the incremental analysis update (IAU), called the nonstationary IAU (NIAU) method, to enhance the assimilation accuracy of the IAU while retaining the continuity of the analysis. Analogous to the IAU, the NIAU is designed to add analysis increments at every model time step to improve continuity in intermittent data assimilation. Unlike the IAU, however, the NIAU method applies time-evolved forcing, computed with the forward operator, as corrections to the model. The NIAU solution is more accurate than that of the IAU, whose analysis is performed at the start of the time window in which the IAU forcing is added, because in linear systems the NIAU solution equals that of an intermittent data assimilation method at the end of the assimilation interval. To retain the filtering property in the NIAU, the forward operator that propagates the increment is reconstructed with only the dominant singular vectors. An illustration of these advantages of the NIAU is given using the simple 40-variable Lorenz model.
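
The core IAU idea — adding the analysis increment in equal slices at every model step rather than all at once, so the trajectory stays continuous — can be sketched for a scalar model. Names are illustrative; the NIAU refinement, propagating each slice with the forward operator, is omitted here.

```python
def iau_run(x0, increment, n_steps, model_step):
    """Incremental analysis update for a scalar model.

    Instead of jumping from x0 to x0 + increment at the analysis time
    (an 'analysis shock'), a slice increment/n_steps is added after
    every model step across the assimilation window."""
    x = x0
    slice_ = increment / n_steps
    trajectory = [x]
    for _ in range(n_steps):
        x = model_step(x) + slice_  # one model step, then one IAU slice
        trajectory.append(x)
    return trajectory

# With a trivial persistence model, the state ramps smoothly to x0 + increment.
ramp = iau_run(0.0, 1.0, 4, lambda x: x)
```

With real dynamics the early slices are themselves advected by the model, which is exactly the behavior the NIAU corrects for by time-evolving the forcing.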

  3. Habituation of the eyeblink response in humans with stimuli presented in a sequence of incremental intensity

    Directory of Open Access Journals (Sweden)

    Fernando P Ponce

    2011-01-01

    Full Text Available In an experiment we examined whether the repeated presentation of tones of gradually increasing intensity produces a greater decrement in the human eyeblink reflex response than the repetition of tones of constant intensity. Two groups of participants matched for their initial level of response were exposed to 110 tones of 100-ms duration. For the participants in the incremental group, the tones increased from 60 to 90 dB in 3-dB steps, whereas participants in the constant group received the tones at a fixed 90-dB intensity. The results indicated that the level of response in the last block of 10 trials, in which both groups received 90-dB tones, was significantly lower in the incremental group than in the constant group. These findings support the data presented by Davis and Wagner (7) for the acoustic response in rats, but differ from several reports of autonomic responses in humans, where the advantage of the incremental condition has not been observed unambiguously. The discussion analyzes theoretical approaches to this phenomenon and the possible involvement of separate neural circuits.

  4. Two-Point Incremental Forming with Partial Die: Theory and Experimentation

    Science.gov (United States)

    Silva, M. B.; Martins, P. A. F.

    2013-04-01

    This paper proposes a new level of understanding of two-point incremental forming (TPIF) with partial die by means of a combined theoretical and experimental investigation. The theoretical developments include an innovative extension of the analytical model for rotational symmetric single point incremental forming (SPIF), originally developed by the authors, to address the influence of the major operating parameters of TPIF and to successfully explain the differences in formability between SPIF and TPIF. The experimental work comprised the mechanical characterization of the material and the determination of its formability limits at necking and fracture by means of circle grid analysis and benchmark incremental sheet forming tests. Results show the adequacy of the proposed analytical model to handle the deformation mechanics of SPIF and TPIF with partial die and demonstrate that neck formation is suppressed in TPIF, so that traditional forming limit curves are inapplicable to describe failure and must be replaced by fracture forming limits derived from ductile damage mechanics. The overall geometric accuracy of sheet metal parts produced by TPIF with partial die is found to be better than that of parts fabricated by SPIF due to smaller elastic recovery upon unloading.

  5. Developmental Word Acquisition through Self-Organized Incremental Neural Network with A Humanoid Robot

    Science.gov (United States)

    Okada, Shogo; He, Xiaoyuan; Kojima, Ryo; Hasegawa, Osamu

    This paper presents an unsupervised approach to integrating speech and visual information without using any prepared (training) data. The approach enables a humanoid robot, Incremental Knowledge Robot 1 (IKR1), to learn the meanings of words. The approach differs from most existing approaches in that the robot learns online from audio-visual input, rather than from stationary data provided in advance. In addition, the robot is capable of incremental learning, which is considered indispensable to lifelong learning. A noise-robust self-organized incremental neural network (SOINN) is developed to represent the topological structure of unsupervised online data. We are also developing an active learning mechanism, called "desire for knowledge", to let the robot select the object for which it possesses the least information for subsequent learning. Experimental results show that the approach raises the efficiency of the learning process. Based on audio and visual data, we construct a mental model for the robot, which forms a basis for constructing IKR1's inner world and builds a bridge connecting the learned concepts with current and past scenes.

  6. A variational formulation for the incremental homogenization of elasto-plastic composites

    Science.gov (United States)

    Brassart, L.; Stainier, L.; Doghri, I.; Delannay, L.

    2011-12-01

    This work addresses the micro-macro modeling of composites having elasto-plastic constituents. A new model is proposed to compute the effective stress-strain relation along arbitrary loading paths. The proposed model is based on an incremental variational principle (Ortiz, M., Stainier, L., 1999. The variational formulation of viscoplastic constitutive updates. Comput. Methods Appl. Mech. Eng. 171, 419-444) according to which the local stress-strain relation derives from a single incremental potential at each time step. The effective incremental potential of the composite is then estimated based on a linear comparison composite (LCC) with an effective behavior computed using available schemes in linear elasticity. Algorithmic elegance of the time-integration of J2 elasto-plasticity is exploited in order to define the LCC. In particular, the elastic predictor strain is used explicitly. The method yields a homogenized yield criterion and radial return equation for each phase, as well as a homogenized plastic flow rule. The predictive capabilities of the proposed method are assessed against reference full-field finite element results for several particle-reinforced composites.

  7. An application of the J-integral to an incremental analysis of blunting crack behavior

    International Nuclear Information System (INIS)

    Merkle, J.G.

    1989-01-01

    This paper describes an analytical approach to estimating the elastic-plastic stresses and strains near the tip of a blunting crack with a finite root radius. Rice's original derivation of the path-independent J-integral considered the possibility of a finite crack tip root radius. For this problem Creager's elastic analysis gives the relation between the stress intensity factor K_I and the near-tip stresses. It can be shown that the relation K_I^2 = E'J holds when the root radius is finite. Recognizing that elastic-plastic behavior is incrementally linear then allows a derivation to be performed for a bielastic specimen having a crack tip region of reduced modulus, and the result differentiated to estimate elastic-plastic behavior. The result is the incremental form of Neuber's equation. This result does not require the assumption of any particular stress-strain relation. However, by assuming a pure power-law stress-strain relation and using Ilyushin's principle, the ordinary deformation-theory form of Neuber's equation, K_σ K_ε = K_t^2, is obtained. Applications of the incremental form of Neuber's equation have already been made to fatigue and fracture analysis. This paper helps to provide a theoretical basis for these methods previously considered semiempirical. 26 refs., 4 figs

  8. Development of the Nonstationary Incremental Analysis Update Algorithm for Sequential Data Assimilation System

    Directory of Open Access Journals (Sweden)

    Yoo-Geun Ham

    2016-01-01

    Full Text Available This study introduces a modified version of the incremental analysis updates (IAU), called the nonstationary IAU (NIAU) method, to improve the assimilation accuracy of the IAU while keeping the continuity of the analysis. Similar to the IAU, the NIAU is designed to add analysis increments at every model time step to improve the continuity in the intermittent data assimilation. However, unlike the IAU, the NIAU procedure uses time-evolved forcing, employing the forward operator, as corrections to the model. The solution of the NIAU is superior to that of the forward IAU, in which the analysis is performed at the beginning of the time window for adding the IAU forcing, in terms of the accuracy of the analysis field. This is because, in linear systems, the NIAU solution equals that of an intermittent data assimilation method at the end of the assimilation interval. To obtain the filtering property in the NIAU, the forward operator used to propagate the increment is reconstructed with only the dominant singular vectors. An illustration of these advantages of the NIAU is given using the simple 40-variable Lorenz model.

  9. Measurement of the Sunyaev-Zeldovich Effect Increment with Large Aperture Sub-mm Telescopes

    Science.gov (United States)

    Zemcov, Michael

    2012-05-01

    Measurement of the Sunyaev-Zeldovich (SZ) effect increment is critical for precision determination of the full spectrum of the SZ spectral distortion, which in turn is necessary for measurement of the relativistic and kinetic SZ effects that are largest shortward of the SZ null at 217 GHz. Maps of galaxy clusters at SZ increment frequencies have the added benefit of relatively high angular resolution, allowing a precise determination of the sub-mm galaxy contamination in clusters, which is a significant foreground to SZ spectral studies. Current and upcoming facilities including SPIRE, SCUBA-2, MUSIC on the CSO, and further in the future next generation instrumentation for CCAT, will provide extremely deep, high angular resolution, multi-band SZ spectrum measurements in many clusters. Such measurements will enable new types of SZ science, including detailed studies of the properties of the intra-cluster medium and line of sight velocity effects. In this talk I will review the status of measurements of the SZ increment, present new results from Herschel, and look forward to what developments we can expect over the coming years.

  10. Silhouette analysis for human action recognition based on supervised temporal t-SNE and incremental learning.

    Science.gov (United States)

    Cheng, Jian; Liu, Haijun; Wang, Feng; Li, Hongsheng; Zhu, Ce

    2015-10-01

    This paper develops a human action recognition method for human silhouette sequences based on supervised temporal t-stochastic neighbor embedding (ST-tSNE) and incremental learning. Inspired by SNE and its variants, ST-tSNE is proposed to learn the underlying relationship between action frames in a manifold, where class label information and temporal information are introduced to better represent frames from the same action class. As for incremental learning, an important step for action recognition, we introduce three methods to perform the low-dimensional embedding of new data. Two of them are motivated by local methods, locally linear embedding and locality preserving projection. These two techniques are proposed to learn explicit linear representations following the local neighbor relationship, and their effectiveness for preserving the intrinsic action structure is investigated. The third is based on manifold-oriented stochastic neighbor projection, which finds a linear projection from the high-dimensional to the low-dimensional space capturing the underlying pattern manifold. Extensive experimental results and comparisons with state-of-the-art methods demonstrate the effectiveness and robustness of the proposed ST-tSNE and incremental learning methods in human action silhouette analysis.

  11. Pornographic image recognition and filtering using incremental learning in compressed domain

    Science.gov (United States)

    Zhang, Jing; Wang, Chao; Zhuo, Li; Geng, Wenhao

    2015-11-01

    With the rapid development and popularity of the network, the openness, anonymity, and interactivity of networks have led to the spread and proliferation of pornographic images on the Internet, which have done great harm to adolescents' physical and mental health. With the establishment of image compression standards, pornographic images are mainly stored in compressed formats. Therefore, how to efficiently filter pornographic images is one of the challenging issues for information security. A pornographic image recognition and filtering method in the compressed domain is proposed by using incremental learning, which includes the following steps: (1) low-resolution (LR) images are first reconstructed from the compressed stream of pornographic images, (2) visual words are created from the LR image to represent the pornographic image, and (3) incremental learning is adopted to continuously adjust the classification rules to recognize new pornographic image samples after the covering algorithm is utilized to train and recognize the visual words in order to build the initial classification model of pornographic images. The experimental results show that the proposed pornographic image recognition method using incremental learning achieves a higher recognition rate while requiring less recognition time in the compressed domain.

  12. The use of single point incremental forming for customized implants of unicondylar knee arthroplasty: a review

    Directory of Open Access Journals (Sweden)

    Pankaj Kailasrao Bhoyar

    Full Text Available Abstract Introduction Implantable devices have an enormous market. These products are generally made by traditional manufacturing processes, but for custom-made implants Incremental Sheet Forming is a promising alternative. Single Point Incremental Forming (SPIF) is a manufacturing process used to form intricate, asymmetrical components. It forms the component by stretching and bending while maintaining the material's crystal structure. The SPIF process can be performed on a conventional Computer Numerical Control (CNC) milling machine. Review This review paper elaborates on the various manufacturing processes carried out on various biocompatible metallic and nonmetallic customised implantable devices. Conclusion The Ti-6Al-4V alloy is broadly used for biomedical implants, but the vanadium in this alloy is toxic, so the alloy is not well suited for implants. The attention of researchers has turned towards nontoxic and suitable biocompatible materials. For this reason, a novel approach was developed in order to enhance the mechanical properties of this material. The development of the incremental forming technique can improve the formability of existing alloys and may meet the current strict requirements for the performance of dies and punches.

  13. Performance and delay analysis of hybrid ARQ with incremental redundancy over double rayleigh fading channels

    KAUST Repository

    Chelli, Ali

    2014-11-01

    In this paper, we study the performance of hybrid automatic repeat request (HARQ) with incremental redundancy over double Rayleigh channels, a common model for the fading amplitude of vehicle-to-vehicle communication systems. We investigate the performance of HARQ from an information theoretic perspective. Analytical expressions are derived for the ε-outage capacity, the average number of transmissions, and the average transmission rate of HARQ with incremental redundancy assuming a maximum number of HARQ rounds. Moreover, we evaluate the delay experienced by Poisson arriving packets for HARQ with incremental redundancy. We provide analytical expressions for the expected waiting time, the packet's sojourn time in the queue, the average consumed power, and the energy efficiency. In our study, the communication rate per HARQ round is adjusted to the average signal-to-noise ratio (SNR) such that a target outage probability is not exceeded. This setting conforms with communication systems in which a quality of service is expected regardless of the channel conditions. Our analysis underscores the importance of HARQ in improving the spectral efficiency and reliability of communication systems. We demonstrate as well that the explored HARQ scheme achieves full diversity. Additionally, we investigate the tradeoff between energy efficiency and spectral efficiency.
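
    The mechanism behind incremental redundancy, mutual information accumulating across HARQ rounds until it exceeds the rate, can be illustrated with a Monte Carlo sketch. The rate, SNR, and maximum number of rounds below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def double_rayleigh_gain(n, rng):
    # double-Rayleigh fading: amplitude is the product of two
    # independent Rayleigh variables; return the power gain
    h1 = rng.rayleigh(scale=1 / np.sqrt(2), size=n)
    h2 = rng.rayleigh(scale=1 / np.sqrt(2), size=n)
    return (h1 * h2) ** 2

def harq_ir(snr_db=10.0, rate=2.0, max_rounds=4, trials=20000, rng=rng):
    snr = 10 ** (snr_db / 10)
    rounds = np.full(trials, max_rounds)
    acc = np.zeros(trials)                 # accumulated mutual information
    undecoded = np.ones(trials, dtype=bool)
    for m in range(1, max_rounds + 1):
        g = double_rayleigh_gain(trials, rng)
        acc += np.log2(1 + snr * g)        # IR: information adds up per round
        newly = undecoded & (acc >= rate)  # decoding succeeds once acc >= R
        rounds[newly] = m
        undecoded &= ~newly
    return rounds.mean(), undecoded.mean()  # avg transmissions, outage

mean_rounds, outage = harq_ir()
```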

  14. Incremental View Maintenance for Deductive Graph Databases Using Generalized Discrimination Networks

    Directory of Open Access Journals (Sweden)

    Thomas Beyhl

    2016-12-01

    Full Text Available Nowadays, graph databases are employed when relationships between entities are in the scope of database queries, to avoid the performance-critical join operations of relational databases. Graph queries are used to query and modify graphs stored in graph databases. Graph queries employ graph pattern matching, which is NP-complete for subgraph isomorphism. Graph database views can be employed that keep ready answers, in terms of precalculated graph pattern matches, for frequently stated and complex graph queries to increase query performance. However, such graph database views must be kept consistent with the graphs stored in the graph database. In this paper, we describe how to use incremental graph pattern matching as a technique for maintaining graph database views. We present an incremental maintenance algorithm for graph database views, which works for imperatively and declaratively specified graph queries. The evaluation shows that our maintenance algorithm scales when the number of nodes and edges stored in the graph database increases. Furthermore, our evaluation shows that our approach can outperform existing approaches for the incremental maintenance of graph query results.
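
    The incremental idea can be shown on a deliberately tiny "view": maintaining all 2-hop paths of a directed graph under edge insertions by combining each new edge only with adjacent existing edges, instead of re-running the full pattern match. This is a generic illustration, not the paper's generalized discrimination network algorithm.

```python
from collections import defaultdict

class TwoHopView:
    """Materialized view of all paths (a, b, c), kept up to date
    incrementally as edges are inserted."""

    def __init__(self):
        self.out_edges = defaultdict(set)   # u -> {v}
        self.in_edges = defaultdict(set)    # v -> {u}
        self.matches = set()                # precalculated (a, b, c) paths

    def insert_edge(self, u, v):
        # the new edge (u, v) can only complete paths ?->u->v or u->v->?
        for a in self.in_edges[u]:
            self.matches.add((a, u, v))
        for c in self.out_edges[v]:
            self.matches.add((u, v, c))
        self.out_edges[u].add(v)
        self.in_edges[v].add(u)

view = TwoHopView()
for e in [("a", "b"), ("b", "c"), ("b", "d")]:
    view.insert_edge(*e)
# view.matches == {("a", "b", "c"), ("a", "b", "d")}
```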

  15. An Optimal Seed Based Compression Algorithm for DNA Sequences

    Directory of Open Access Journals (Sweden)

    Pamela Vinitha Eric

    2016-01-01

    Full Text Available This paper proposes a seed based lossless compression algorithm to compress a DNA sequence which uses a substitution method that is similar to the Lempel-Ziv compression scheme. The proposed method exploits the repetition structures that are inherent in DNA sequences by creating an offline dictionary which contains all such repeats along with the details of mismatches. By ensuring that only promising mismatches are allowed, the method achieves a compression ratio that is on par with or better than existing lossless DNA sequence compression algorithms.
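
    The substitution principle the abstract refers to can be sketched with a generic LZ-style encoder that replaces repeats by (position, length) references to earlier occurrences. This illustrates the general idea only; it omits the paper's seed dictionary and mismatch handling.

```python
def lz_substitute(seq, min_len=4):
    """Replace repeats of at least min_len symbols by (offset, length)
    references to an earlier, non-overlapping occurrence; emit
    literals otherwise."""
    out, i = [], 0
    while i < len(seq):
        best = None
        for j in range(i):  # scan earlier positions for the longest match
            l = 0
            while i + l < len(seq) and seq[j + l] == seq[i + l] and j + l < i:
                l += 1
            if l >= min_len and (best is None or l > best[1]):
                best = (j, l)
        if best:
            out.append(best)        # (offset, length) reference
            i += best[1]
        else:
            out.append(seq[i])      # literal base
            i += 1
    return out

tokens = lz_substitute("ACGTACGTACGTTTT")
# → ['A', 'C', 'G', 'T', (0, 4), (0, 4), 'T', 'T', 'T']
```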

  16. Dynamic index and LZ factorization in compressed space

    OpenAIRE

    Nishimoto, Takaaki; I, Tomohiro; Inenaga, Shunsuke; Bannai, Hideo; Takeda, Masayuki

    2016-01-01

    In this paper, we propose a new dynamic compressed index of O(w) space for a dynamic text T, where w = O(min(z log N log* M, N)) is the size of the signature encoding of T, z is the size of the Lempel-Ziv77 (LZ77) factorization of T, N is the length of T, and M ≥ 3N is an integer that can be handled in constant time under the word RAM model. Our index supports searching for a pattern P in T in O(|P| f_A + log w log |P| log* M (log N + log |P| ...
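
    For intuition, the LZ77 factorization whose size z appears in the space bound can be computed with the quadratic textbook algorithm; real compressed indexes such as the one above compute it in far less space and time.

```python
def lz77_factorize(t):
    """Greedy LZ77 factorization: each phrase is the longest prefix of
    the remaining text that occurs earlier (overlaps allowed), extended
    by one fresh character.  Returns the list of phrases; z = len(result)."""
    phrases, i = [], 0
    while i < len(t):
        length = 0
        for j in range(i):                  # try every earlier start
            l = 0
            while i + l < len(t) and t[j + l] == t[i + l]:
                l += 1
            length = max(length, l)
        end = min(i + length + 1, len(t))   # longest match plus one character
        phrases.append(t[i:end])
        i = end
    return phrases

z = len(lz77_factorize("abababb"))  # phrases: ['a', 'b', 'ababb'], z = 3
```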

  17. Random Access to Grammar-Compressed Strings and Trees

    DEFF Research Database (Denmark)

    Bille, Philip; Landau, Gad M.; Raman, Rajeev

    2015-01-01

    Grammar-based compression, where one replaces a long string by a small context-free grammar that generates the string, is a simple and powerful paradigm that captures (sometimes with slight reduction in efficiency) many of the popular compression schemes, including the Lempel-Ziv family, run...... representations of S achieving O(log N) random access time, and either O(n · αk(n)) construction time and space on the pointer machine model, or O(n) construction time and space on the RAM. Here, αk(n) is the inverse of the kth row of Ackermann's function. Our representations also efficiently support...

  18. An Optimal Seed Based Compression Algorithm for DNA Sequences.

    Science.gov (United States)

    Eric, Pamela Vinitha; Gopalakrishnan, Gopakumar; Karunakaran, Muralikrishnan

    2016-01-01

    This paper proposes a seed based lossless compression algorithm to compress a DNA sequence which uses a substitution method that is similar to the Lempel-Ziv compression scheme. The proposed method exploits the repetition structures that are inherent in DNA sequences by creating an offline dictionary which contains all such repeats along with the details of mismatches. By ensuring that only promising mismatches are allowed, the method achieves a compression ratio that is on par with or better than existing lossless DNA sequence compression algorithms.

  19. Understanding treatment effect mechanisms of the CAMBRA randomized trial in reducing caries increment.

    Science.gov (United States)

    Cheng, J; Chaffee, B W; Cheng, N F; Gansky, S A; Featherstone, J D B

    2015-01-01

    The Caries Management By Risk Assessment (CAMBRA) randomized controlled trial showed that an intervention featuring combined antibacterial and fluoride therapy significantly reduced bacterial load and suggested reduced caries increment in adults with 1 to 7 baseline cavitated teeth. While trial results speak to the overall effectiveness of an intervention, insight can be gained from understanding the mechanism by which an intervention acts on putative intermediate variables (mediators) to affect outcomes. This study conducted mediation analyses on 109 participants who completed the trial to understand whether the intervention reduced caries increment through its action on potential mediators (oral bacterial load, fluoride levels, and overall caries risk based on the composite of bacterial challenge and salivary fluoride) between the intervention and dental outcomes. The primary outcome was the increment from baseline in decayed, missing, and filled permanent surfaces (ΔDMFS) 24 mo after completing restorations for baseline cavitated lesions. Analyses adjusted for baseline overall risk, bacterial challenge, and fluoride values under a potential outcome framework using generalized linear models. Overall, the CAMBRA intervention showed a suggestive reduction in the 24-mo DMFS increment (reduction in ΔDMFS: -0.96; 95% confidence interval [CI]: -2.01 to 0.08; P = 0.07); the intervention significantly reduced the 12-mo overall risk (reduction in overall risk: -19%; 95% CI: -7% to -41%; P = 0.005). Individual mediators, salivary log10 mutans streptococci, log10 lactobacilli, and fluoride level, did not represent statistically significant pathways alone through which the intervention effect was transmitted. However, 36% of the intervention effect on the 24-mo DMFS increment was through a mediation effect on the 12-mo overall risk (P = 0.03). These findings suggest a greater intervention effect carried through the combined action on multiple aspects of the caries process rather than

  20. Quasi-static incremental behavior of granular materials: Elastic-plastic coupling and micro-scale dissipation

    Science.gov (United States)

    Kuhn, Matthew R.; Daouadji, Ali

    2018-05-01

    The paper addresses a common assumption of elastoplastic modeling: that the recoverable, elastic strain increment is unaffected by alterations of the elastic moduli that accompany loading. This assumption is found to be false for a granular material, and discrete element (DEM) simulations demonstrate that granular materials are coupled materials at both micro- and macro-scales. Elasto-plastic coupling at the macro-scale is placed in the context of the thermomechanics framework of Tomasz Hueckel and Hans Ziegler, in which the elastic moduli are altered by irreversible processes during loading. This complex behavior is explored for multi-directional loading probes that follow an initial monotonic loading. An advanced DEM model is used in the study, with non-convex non-spherical particles and two different contact models: a conventional linear-frictional model and an exact implementation of the Hertz-like Cattaneo-Mindlin model. Orthotropic true-triaxial probes were used in the study (i.e., no direct shear strain), with tiny strain increments of 2 × 10^-6. At the micro-scale, contact movements were monitored during small increments of loading and load-reversal, and the results show that these movements are not reversed by a reversal of strain direction, and some contacts that were sliding during a loading increment continue to slide during reversal. The probes show that the coupled part of a strain increment, the difference between the recoverable (elastic) increment and its reversible part, must be considered when partitioning strain increments into elastic and plastic parts. Small increments of irreversible (and plastic) strain and contact slipping and frictional dissipation occur for all directions of loading, and an elastic domain, if it exists at all, is smaller than the strain increment used in the simulations.

  1. Methods of determining incremental energy costs for economic dispatch and inter-utility interchange in Canadian utilities

    International Nuclear Information System (INIS)

    El-Hawary, M.E.; El-Hawary, F.; Mbamalu, G.A.N.

    1991-01-01

    A questionnaire was mailed to ten Canadian utilities to determine the methods the utilities use in determining the incremental cost of delivering energy at any time. The questionnaire was divided into three parts: generation, transmission and general. The generation section dealt with heat rates, fuel, operation and maintenance, startup and shutdown, and method of prioritizing and economic evaluation of interchange transactions. Transmission dealt with inclusion of transmission system incremental maintenance costs, and transmission losses determination. The general section dealt with incremental costs aspects, and various other economic considerations. A summary is presented of responses to the questionnaire

  2. PCA/INCREMENT MEMORY interface for analog processors on-line with PC-XT/AT IBM

    International Nuclear Information System (INIS)

    Biri, S.; Buttsev, V.S.; Molnar, J.; Samojlov, V.N.

    1989-01-01

    The functional and operational descriptions of the PCA/INCREMENT MEMORY interface are discussed. This unit provides: a connection between the analogue signal processor and the PC; nuclear spectrum acquisition up to 2^24 - 1 counts/channel using the increment or decrement method; and data read/write from or to memory via the PC data bus during spectrum acquisition. The dual-ported memory organization is 4096 × 24 bits, and the increment cycle time at a 4.77 MHz system clock frequency is 1.05 μs. 6 refs.; 2 figs
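
    The increment/decrement acquisition scheme can be mimicked in a few lines. Saturating at 2^24 - 1 is an assumption for illustration; the abstract only states the maximum count per channel.

```python
# Software model of a dual-ported spectrum memory: 4096 channels,
# each a 24-bit counter, updated by increment or decrement commands
# while the host can read any channel at any time.
DEPTH = 4096
MAX_COUNT = 2**24 - 1

memory = [0] * DEPTH

def increment(channel):
    # assumed behavior: clamp at the 24-bit ceiling instead of wrapping
    memory[channel] = min(memory[channel] + 1, MAX_COUNT)

def decrement(channel):
    memory[channel] = max(memory[channel] - 1, 0)

for ch in [10, 10, 10, 42]:     # three events in channel 10, one in 42
    increment(ch)
# memory[10] == 3, memory[42] == 1
```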

  3. Comparison of different incremental analysis update schemes in a realistic assimilation system with Ensemble Kalman Filter

    Science.gov (United States)

    Yan, Y.; Barth, A.; Beckers, J. M.; Brankart, J. M.; Brasseur, P.; Candille, G.

    2017-07-01

    In this paper, three incremental analysis update schemes (IAU 0, IAU 50, and IAU 100) are compared in the same assimilation experiments with a realistic eddy-permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. The difference between the three IAU schemes lies in the position of the increment update window. The relevance of each IAU scheme is evaluated through analyses of both thermohaline and dynamical variables. The validation of the assimilation results is performed according to both deterministic and probabilistic metrics against different sources of observations. For deterministic validation, the ensemble mean and the ensemble spread are compared to the observations. For probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system according to reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centred random variable (RCRV) score. The obtained results show that 1) the IAU 50 scheme has the same performance as the IAU 100 scheme; 2) the IAU 50/100 schemes outperform the IAU 0 scheme in error covariance propagation for thermohaline variables in relatively stable regions, while the IAU 0 scheme outperforms the IAU 50/100 schemes in dynamical variable estimation in dynamically active regions; and 3) in cases with a sufficient number of observations and good error specification, the impact of the IAU schemes is negligible. The differences between the IAU 0 scheme and the IAU 50/100 schemes are mainly due to different model integration times and the different instabilities (density inversion, large vertical velocity, etc.) induced by the increment update. The longer model integration time with the IAU 50/100 schemes, especially the free model integration, on one hand allows for better re-establishment of the equilibrium model state, and on the other hand smooths the strong gradients in the dynamically active region.
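
    The CRPS used above for probabilistic validation can be estimated directly from the ensemble members. The sketch below uses the standard empirical form CRPS = E|X - y| - (1/2) E|X - X'|, with illustrative numbers rather than the study's data.

```python
import numpy as np

def crps_ensemble(members, obs):
    """Empirical CRPS of an ensemble against a scalar observation:
    mean absolute error to the observation minus half the mean
    absolute pairwise spread of the members.  Lower is better."""
    x = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(x - obs))
    term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
    return term1 - term2

# a sharp, well-centred ensemble scores lower (better) than a biased one
good = crps_ensemble([0.9, 1.0, 1.1], obs=1.0)
bad = crps_ensemble([2.9, 3.0, 3.1], obs=1.0)
```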

  4. Incremental direct and indirect cost burden attributed to endometriosis surgeries in the United States.

    Science.gov (United States)

    Soliman, Ahmed M; Taylor, Hugh; Bonafede, Machaon; Nelson, James K; Castelli-Haley, Jane

    2017-05-01

    To compare direct and indirect costs between endometriosis patients who underwent endometriosis-related surgery (surgery cohort) and those who have not received surgery (no-surgery cohort). Retrospective cohort study. Not applicable. Endometriosis patients (aged 18-49 years) with (n = 124,530) or without (n = 37,106) a claim for endometriosis-related surgery were identified from the Truven Health MarketScan Commercial and Health and Productivity Management databases for 2006-2014. Not applicable. Primary outcomes were healthcare utilization during 12-month pre- and post-index periods, annual direct (healthcare) and indirect (absenteeism and short- and long-term disability) costs during the 12-month post-index period (in 2014 US dollars). Indirect costs were assessed for patients with available productivity data. Patients in the surgery cohort had significantly higher healthcare resource utilization during the post-index period and had mean annual total adjusted post-index direct costs approximately three times the costs among patients in the no-surgery cohort ($19,203 [SD $7,133] vs. $6,365 [SD $2,364]; average incremental annual direct cost = $12,838). The mean cost of surgery ($7,268 [SD $7,975]) was the single largest contributor to incremental annual direct cost. Mean estimated annual total indirect costs were $8,843 (surgery cohort) vs. $5,603 (no-surgery cohort); average incremental annual indirect cost = $3,240. Endometriosis patients who underwent surgery, compared with endometriosis patients who did not, incurred significantly higher direct costs due to healthcare utilization and indirect costs due to absenteeism or short-term disability. Regardless of the surgery type, the cost of index surgery contributed substantially to the total healthcare expenditure. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  5. Partial and incremental PCMH practice transformation: implications for quality and costs.

    Science.gov (United States)

    Paustian, Michael L; Alexander, Jeffrey A; El Reda, Darline K; Wise, Chris G; Green, Lee A; Fetters, Michael D

    2014-02-01

    To examine the associations between partial and incremental implementation of the Patient Centered Medical Home (PCMH) model and measures of cost and quality of care. We combined validated, self-reported PCMH capabilities data with administrative claims data for a diverse statewide population of 2,432 primary care practices in Michigan. These data were supplemented with contextual data from the Area Resource File. We measured medical home capabilities in place as of June 2009 and change in medical home capabilities implemented between July 2009 and June 2010. Generalized estimating equations were used to estimate the mean effect of these PCMH measures on total medical costs and quality of care delivered in physician practices between July 2009 and June 2010, while controlling for potential practice, patient cohort, physician organization, and practice environment confounders. Based on the observed relationships for partial implementation, full implementation of the PCMH model is associated with a 3.5 percent higher quality composite score, a 5.1 percent higher preventive composite score, and $26.37 lower per member per month medical costs for adults. Full PCMH implementation is also associated with a 12.2 percent higher preventive composite score, but no reductions in costs for pediatric populations. Incremental improvements in PCMH model implementation yielded similar positive effects on quality of care for both adult and pediatric populations but were not associated with cost savings for either population. Estimated effects of the PCMH model on quality and cost of care appear to improve with the degree of PCMH implementation achieved and with incremental improvements in implementation. © Health Research and Educational Trust.

  6. Experimental measurement of enthalpy increments of Th0.25Ce0.75O2

    International Nuclear Information System (INIS)

    Babu, R.; Balakrishnan, S.; Ananthasivan, K.; Nagarajan, K.

    2013-01-01

Thorium has been suggested as an alternative fertile material for a nuclear fuel cycle, and as an inert matrix for burning plutonium and for waste disposal. The third stage of India's nuclear power programme envisages utilization of thorium and plutonium as a fuel in the Advanced Heavy Water Reactor (AHWR) and Accelerator Driven Sub-critical Systems (ADSS). Solid solutions of ThO2-PuO2 are of importance because of the coexistence of Th with Pu during the breeding cycle. CeO2 is used as a PuO2 analog due to the similar ionic radii of the cations and the similar physico-chemical properties of the oxides. ThO2 forms a homogeneous solid solution with the cubic fluorite structure when doped with Ce over the entire compositional range. In the development of mixed oxide nuclear fuels, knowledge of the thermodynamic properties of thorium oxide and its mixtures is extremely important for understanding fuel behavior during irradiation and for predicting the performance of the fuel under accident conditions. Thermodynamic functions such as the enthalpy increment and heat capacity of the thoria-ceria solid solution have not previously been measured experimentally. Hence, the enthalpy increments of the thoria-ceria solid solution Th0.25Ce0.75O2 were measured by inverse drop calorimetry in the temperature range 523-1723 K. The measured enthalpy increments were fitted to polynomial functions by the least-squares method, and other thermodynamic functions such as the heat capacity, entropy, and Gibbs energy functions were computed in the temperature range 298-1800 K. The reported thermodynamic functions for Th0.25Ce0.75O2 constitute the first experimental data, and the heat capacity of (Th,Ce)O2 solid solutions was shown to obey Neumann-Kopp's rule. (author)
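The fitting and differentiation step can be sketched as follows; the polynomial form is one commonly used for enthalpy increments, but the coefficients are hypothetical placeholders, not the measured values:

```python
# Sketch of deriving heat capacity from a fitted enthalpy-increment polynomial.
# Enthalpy increments are often fitted as H(T) - H(298.15) = a*T + b*T^2 + c/T + d;
# the coefficients below are hypothetical, not the measured values for Th0.25Ce0.75O2.

def enthalpy_increment(T, a, b, c, d):
    """H(T) - H(298.15) in J/mol for temperature T in K."""
    return a * T + b * T ** 2 + c / T + d

def heat_capacity(T, a, b, c):
    """Cp(T) = d/dT [H(T) - H(298.15)] = a + 2*b*T - c/T^2."""
    return a + 2.0 * b * T - c / T ** 2

a, b, c, d = 65.0, 5.0e-3, 8.0e5, -2.2e4  # hypothetical fit coefficients

# Cross-check the analytic Cp against a central-difference derivative at 1000 K.
T, h = 1000.0, 1.0e-3
numeric = (enthalpy_increment(T + h, a, b, c, d)
           - enthalpy_increment(T - h, a, b, c, d)) / (2.0 * h)
print(heat_capacity(T, a, b, c), numeric)
```

Entropy and Gibbs energy functions then follow by integrating Cp/T and combining with the enthalpy increment.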

  7. An electromyographic-based test for estimating neuromuscular fatigue during incremental treadmill running

    International Nuclear Information System (INIS)

    Camic, Clayton L; Kovacs, Attila J; Hill, Ethan C; Calantoni, Austin M; Yemm, Allison J; Enquist, Evan A; VanDusseldorp, Trisha A

    2014-01-01

The purposes of the present study were twofold: (1) to determine if the model used for estimating the physical working capacity at the fatigue threshold (PWCFT) from electromyographic (EMG) amplitude data during incremental cycle ergometry could be applied to treadmill running to derive a new neuromuscular fatigue threshold for running, and (2) to compare the running velocities associated with the PWCFT, ventilatory threshold (VT), and respiratory compensation point (RCP). Fifteen college-aged subjects (21.5 ± 1.3 y, 68.7 ± 10.5 kg, 175.9 ± 6.7 cm) performed an incremental treadmill test to exhaustion with bipolar surface EMG signals recorded from the vastus lateralis. There were significant (p < 0.05) mean differences in running velocities between the VT (11.3 ± 1.3 km h-1) and PWCFT (14.0 ± 2.3 km h-1), and between the VT and RCP (14.0 ± 1.8 km h-1), but not between the PWCFT and RCP. The findings of the present study indicated that the PWCFT model could be applied to a single continuous, incremental treadmill test to estimate the maximal running velocity that can be maintained prior to the onset of neuromuscular fatigue. In addition, these findings suggested that the PWCFT, like the RCP, may be used to differentiate the heavy from severe domains of exercise intensity. (paper)

  8. Can Aerosol Direct Radiative Effects Account for Analysis Increments of Temperature in the Tropical Atlantic?

    Science.gov (United States)

    da Silva, Arlindo M.; Alpert, Pinhas

    2016-01-01

In the late 1990s, prior to the launch of the Terra satellite, atmospheric general circulation models (GCMs) did not include aerosol processes because aerosols were not properly monitored on a global scale and their spatial distributions were not known well enough for their incorporation in operational GCMs. At the time of the first GEOS Reanalysis (Schubert et al. 1993), long time series of analysis increments (the corrections to the atmospheric state by all available meteorological observations) became readily available, enabling detailed analysis of the GEOS-1 errors on a global scale. Such analysis revealed that temperature biases were particularly pronounced in the Tropical Atlantic region, with patterns depicting a remarkable similarity to dust plumes emanating from the African continent as evidenced by TOMS aerosol index maps. Yoram Kaufman was instrumental in encouraging us to pursue this issue further, resulting in the study reported in Alpert et al. (1998), where we attempted to assess aerosol forcing by studying the errors of the GEOS-1 GCM without aerosol physics within a data assimilation system. Based on this analysis, Alpert et al. (1998) proposed that dust aerosols are an important source of inaccuracies in numerical weather-prediction models in the Tropical Atlantic region, although a direct verification of this hypothesis was not possible back then. Nearly 20 years later, numerical prediction models have increased in resolution and complexity of physical parameterizations, including the representation of aerosols and their interactions with the circulation. Moreover, with the advent of NASA's EOS program and subsequent satellites, atmospheric aerosols are now monitored globally on a routine basis, and their assimilation in global models is becoming well established. In this talk we will reexamine the Alpert et al. (1998) hypothesis using the most recent version of the GEOS-5 Data Assimilation System with assimilation of aerosols.

  9. Enhancing pattern of gastric carcinoma at dynamic incremental CT: correlation with gross and histologic findings

    International Nuclear Information System (INIS)

    Shin, Hong Seop; Lee, Dong Ho; Kim, Yoon Hwa; Ko, Young Tae; Lim, Joo Won; Yoon, Yup

    1996-01-01

To evaluate the enhancing pattern of gastric carcinomas at dynamic incremental CT and to correlate it with pathologic findings. We retrospectively evaluated the enhancement pattern of stomach cancer on dynamic incremental CT of 78 patients. All the lesions had been pathologically proved after surgery. The enhancement pattern was categorized as good or poor in the early phase; as homogeneous, heterogeneous, or ring enhancement; and by the presence or absence of delayed enhancement. There were 16 cases of early gastric cancer (EGC) and 62 cases of advanced gastric cancer (AGC). The Borrmann types of AGC were 1 (n=1), 2 (n=20), 3 (n=32), 4 (n=8), and 5 (n=1). The histologic patterns of AGC were tubular (n=49), signet ring cell (n=10), and mucinous (n=3). The enhancing patterns were compared with gross and histologic findings, and delayed enhancement was correlated with pathologic evidence of desmoplasia. Good enhancement of tumor was seen in 24/41 cases (58.5%) with AGC Borrmann type 3-5, in 6/21 (28.6%) with AGC Borrmann type 1-2, and in 3/16 (18.8%) with EGC (P<.05). By histologic pattern, good enhancement of tumor was seen in 8/10 (80%) with signet ring cell type, in 21/49 (42.9%) with tubular type, and in 1/3 (33.3%) with mucinous type (P<.05). EGC was homogeneously enhanced in 14/16 cases (87.5%), whereas AGC was heterogeneously enhanced in 33/62 (53.2%) (P<.01). There was no significant correlation between delayed enhancement and the presence of desmoplasia. AGC Borrmann type 3-5 and signet ring cell type tend to show good enhancement, and EGC is more homogeneously enhanced at dynamic incremental CT.

  10. Blood flow patterns during incremental and steady-state aerobic exercise.

    Science.gov (United States)

    Coovert, Daniel; Evans, LeVisa D; Jarrett, Steven; Lima, Carla; Lima, Natalia; Gurovich, Alvaro N

    2017-05-30

Endothelial shear stress (ESS) is a physiological stimulus for vascular homeostasis, highly dependent on blood flow patterns. Exercise-induced ESS might be beneficial for vascular health. However, it is unclear what type of ESS aerobic exercise (AX) produces. The aims of this study were to characterize exercise-induced blood flow patterns during incremental and steady-state AX. We expected blood flow patterns during exercise to be intensity-dependent and bidirectional. Six college-aged students (2 males and 4 females) were recruited to perform 2 exercise tests on a cycle ergometer. First, an 8-12-min incremental test (Test 1) where oxygen uptake (VO2), heart rate (HR), blood pressure (BP), and blood lactate (La) were measured at rest and after each 2-min step. Then, at least 48 hr after the first test, a 3-step steady-state exercise test (Test 2) was performed, measuring VO2, HR, BP, and La. The three steps were performed at the following exercise intensities according to La: 0-2 mmol/L, 2-4 mmol/L, and 4-6 mmol/L. During both tests, blood flow patterns were determined by high-definition ultrasound and Doppler on the brachial artery. These measurements allowed us to determine blood flow velocities and directions during exercise. In Test 1, VO2, HR, BP, La, and antegrade blood flow velocity significantly increased in an intensity-dependent manner (repeated measures ANOVA, p < 0.05), whereas retrograde blood flow velocity did not significantly change. In Test 2, all the previous variables significantly increased in an intensity-dependent manner (repeated measures ANOVA, p < 0.05). Overall, blood flow patterns during incremental and steady-state exercises include both antegrade and retrograde blood flows.

  11. Incremental health care utilization and costs for acute otitis media in children.

    Science.gov (United States)

    Ahmed, Sameer; Shapiro, Nina L; Bhattacharyya, Neil

    2014-01-01

Determine the incremental health care costs associated with the diagnosis and treatment of acute otitis media (AOM) in children. Cross-sectional analysis of a national health-care cost database. Pediatric patients (age <18 years) were analyzed, comparing children with and without a diagnosis of AOM, adjusting for age, sex, region, race, ethnicity, insurance coverage, and Charlson comorbidity index. A total of 8.7 ± 0.4 million children were diagnosed with AOM (10.7 ± 0.4% annually, mean age 5.3 years, 51.3% male) among 81.5 ± 2.3 million children sampled (mean age 8.9 years, 51.3% male). Children with AOM manifested an additional +2.0 office visits, +0.2 emergency department visits, and +1.6 prescription fills (all P < 0.001) per year versus those without AOM, adjusting for demographics and medical comorbidities. Similarly, AOM was associated with an incremental increase in outpatient health care costs of $314 per child annually (P < 0.001) and an increase of $17 in patient medication costs (P < 0.001), but was not associated with an increase in total prescription expenses ($13, P = 0.766). The diagnosis of AOM confers a significant incremental health-care utilization burden on both patients and the health care system. With its high prevalence across the United States, pediatric AOM accounts for approximately $2.88 billion in added health care expense annually and is a significant health-care utilization concern. © 2013 The American Laryngological, Rhinological and Otological Society, Inc.

  12. Stable Myoelectric Control of a Hand Prosthesis using Non-Linear Incremental Learning

    Directory of Open Access Journals (Sweden)

    Arjan eGijsberts

    2014-02-01

Full Text Available Stable myoelectric control of hand prostheses remains an open problem. The only successful human-machine interface is surface electromyography, typically allowing control of a few degrees of freedom. Machine learning techniques may have the potential to remove these limitations, but their performance is thus far inadequate: myoelectric signals change over time under the influence of various factors, deteriorating control performance. It is therefore necessary, in the standard approach, to regularly retrain a new model from scratch. We hereby propose a non-linear incremental learning method in which occasional updates with a modest amount of novel training data allow continual adaptation to the changes in the signals. In particular, Incremental Ridge Regression and an approximation of the Gaussian Kernel known as Random Fourier Features are combined to predict finger forces from myoelectric signals, both finger-by-finger and grouped in grasping patterns. We show that the approach is effective and practically applicable to this problem by first analyzing its performance while predicting single-finger forces. Surface electromyography and finger forces were collected from 10 intact subjects during four sessions spread over two different days; the results of the analysis show that small incremental updates are indeed effective to maintain a stable level of performance. Subsequently, we employed the same method on-line to teleoperate a humanoid robotic arm equipped with a state-of-the-art commercial prosthetic hand. The subject could reliably grasp, carry and release everyday-life objects, enforcing stable grasping irrespective of the signal changes, hand/arm movements and wrist pronation and supination.
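The two ingredients named above, Random Fourier Features and ridge regression with per-sample (incremental) accumulation, can be sketched in a minimal pure-Python form. This is not the authors' implementation; the feature count, bandwidth, regularization, and the one-dimensional toy target are all assumptions:

```python
# Minimal sketch (not the authors' code): incremental ridge regression on
# Random Fourier Features (RFF), which approximate a Gaussian kernel.
import math
import random

random.seed(0)
D = 32            # number of random features (assumed)
SIGMA = 0.3       # Gaussian kernel bandwidth (assumed)
LAM = 1e-3        # ridge regularization (assumed)

# Random projections w ~ N(0, 1/sigma^2) and phases b ~ U(0, 2*pi).
W = [random.gauss(0.0, 1.0 / SIGMA) for _ in range(D)]
B = [random.uniform(0.0, 2.0 * math.pi) for _ in range(D)]

def features(x):
    return [math.sqrt(2.0 / D) * math.cos(W[i] * x + B[i]) for i in range(D)]

# Incrementally accumulated normal equations: A = lam*I + sum z z^T, r = sum z*y.
A = [[LAM if i == j else 0.0 for j in range(D)] for i in range(D)]
r = [0.0] * D

def update(x, y):
    z = features(x)
    for i in range(D):
        r[i] += z[i] * y
        for j in range(D):
            A[i][j] += z[i] * z[j]

def solve(A, b):
    """Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [A[i][:] + [b[i]] for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda i: abs(M[i][col]))
        M[col], M[piv] = M[piv], M[col]
        for i in range(col + 1, n):
            f = M[i][col] / M[col][col]
            for j in range(col, n + 1):
                M[i][j] -= f * M[col][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# Stream samples of a smooth toy target, updating the model one sample at a time.
xs = [i / 200.0 for i in range(200)]
for x in xs:
    update(x, math.sin(3.0 * x))

theta = solve(A, r)
preds = [sum(t * z for t, z in zip(theta, features(x))) for x in xs]
mse = sum((p - math.sin(3.0 * x)) ** 2 for p, x in zip(preds, xs)) / len(xs)
print(mse)
```

Because each sample only adds into A and r, an occasional update with a modest batch of new data is cheap, which is the property the paper exploits.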

  13. Analytic expression of the temperature increment in a spin transfer torque nanopillar structure

    International Nuclear Information System (INIS)

    You, Chun-Yeol; Ha, Seung-Seok; Lee, Hyun-Woo

    2009-01-01

    The temperature increment due to the Joule heating in a nanopillar spin transfer torque system is investigated. We obtain a time-dependent analytic solution of the heat conduction equation in nanopillar geometry by using the Green's function method after some simplifications of the problem. While Holm's equation is applicable only to steady states in metallic systems, our solution describes the time dependence and is also applicable to a nanopillar-shaped magnetic tunneling junction with an insulator barrier layer. The validity of the analytic solution is confirmed by numerical finite element method simulations and by the comparison with Holm's equation.

  14. Switch-mode High Voltage Drivers for Dielectric Electro Active Polymer (DEAP) Incremental Actuators

    DEFF Research Database (Denmark)

    Thummala, Prasanth

Actuators based on dielectric electro active polymers (DEAPs) have attracted special attention in recent years. The unique characteristics of DEAP are large strain (5-100%), light weight (7 times lighter than steel and copper), high flexibility (100,000 times less stiff than steel), low noise operation, and low power consumption. DEAP actuators require very high voltage (2-2.5 kV) to fully elongate them. In general, the elongation or stroke length of a DEAP actuator is of the order of mm. DEAP actuators can be configured to provide incremental motion, thus overcoming the inherent size...

  15. F-22 Increment 3.2B Modernization (F-22 Inc 3.2B Mod)

    Science.gov (United States)

    2015-12-01

Selected Acquisition Report (SAR) RCS: DD-A&T(Q&A)823-474. F-22 Increment 3.2B Modernization (F-22 Inc 3.2B Mod), as of FY 2017 President's Budget. Defense Acquisition Management Information Retrieval (DAMIR), March 23, 2016. F-22 Inc 3.2B Mod, December 2015 SAR. Notes: Delivery Order (DO) 0004 was issued under the overarching F-22 Raptor, Enhancement, Development, and... contract, building on predecessor F-22 software programs.

  16. On the Perturb-and-Observe and Incremental Conductance MPPT methods for PV systems

    DEFF Research Database (Denmark)

    Sera, Dezso; Mathe, Laszlo; Kerekes, Tamas

    2013-01-01

    This paper presents a detailed analysis of the two most well-known hill-climbing MPPT algorithms, the Perturb-and-Observe (P&O) and Incremental Conductance (INC). The purpose of the analysis is to clarify some common misconceptions in the literature regarding these two trackers, therefore helping...... according to the EN 50530 standard, resulting in a deviation between their efficiencies of 0.13% in dynamic, and as low as 0.02% in static conditions. The results show that despite the common opinion in the literature, the P&O and INC are equivalent....
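A minimal simulation contrasting the two update rules on a hypothetical single-diode PV curve (all parameters are invented for illustration, and no dynamic EN 50530 profile is modeled):

```python
# Minimal sketch of the P&O and INC update rules on a hypothetical PV curve.
import math

ISC, I0, VT = 5.0, 1e-9, 1.0   # invented single-diode parameters

def current(v):
    return ISC - I0 * (math.exp(v / VT) - 1.0)

def power(v):
    return v * current(v)

def track_po(v=5.0, step=0.1, iters=500):
    """Perturb-and-Observe: keep direction while power rises, else reverse."""
    direction, p_prev = 1.0, power(v)
    for _ in range(iters):
        v += direction * step
        p = power(v)
        if p < p_prev:
            direction = -direction
        p_prev = p
    return v

def track_inc(v=5.0, step=0.1, iters=500):
    """Incremental Conductance: at the MPP, dI/dV = -I/V."""
    v_prev, i_prev = v, current(v)
    v += step
    for _ in range(iters):
        i = current(v)
        dv, di = v - v_prev, i - i_prev
        v_prev, i_prev = v, i
        if dv == 0.0:
            v += step if di > 0 else -step
        elif di / dv > -i / v:     # left of the MPP -> increase voltage
            v += step
        elif di / dv < -i / v:     # right of the MPP -> decrease voltage
            v -= step
    return v

# Reference MPP by brute-force scan.
v_mpp = max((k * 0.001 for k in range(1, 25000)), key=power)
print(v_mpp, track_po(), track_inc())
```

Run on the same static curve, both trackers settle into a small oscillation around the same maximum power point, which is consistent with the paper's conclusion that the two methods are equivalent.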

  17. The Incremental Information Content of the Cash Flow Statement: An Australian Empirical Investigation

    OpenAIRE

    Hadri Kusuma

    2014-01-01

The general objective of the present study is to investigate and assess the incremental information content of cash flow disclosures as required by the AASB 1026 "Statement of Cash Flows". This test addresses the issue of whether a change in cash flow components has the same relationship with security prices as that in earnings. Several previous studies indicate both income and cash flow statements may be mutually exclusive or mutually inclusive statements. The data to test three hypotheses...

  18. Incremental Adaptive Fuzzy Control for Sensorless Stroke Control of A Halbach-type Linear Oscillatory Motor

    Science.gov (United States)

    Lei, Meizhen; Wang, Liqiang

    2018-01-01

The Halbach-type linear oscillatory motor (HT-LOM) is multivariable, highly coupled, nonlinear, and uncertain, and satisfactory results are difficult to achieve with conventional PID control. An incremental adaptive fuzzy controller (IAFC) for stroke tracking is presented, which combines the merits of PID control, the fuzzy inference mechanism, and the adaptive algorithm. An integral operation is added to the conventional fuzzy control algorithm. The fuzzy scale factor can be tuned online according to the load force and stroke command. The simulation results indicate that the proposed control scheme can achieve satisfactory stroke-tracking performance and is robust with respect to parameter variations and external disturbance.
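For reference, the incremental (velocity-form) PID update that such a controller builds on can be sketched on an invented first-order plant; the gains and plant constants are hypothetical, and the fuzzy/adaptive tuning of the scale factor is omitted:

```python
# Sketch of an incremental (velocity-form) PID loop on an invented
# first-order plant; gains and plant constants are hypothetical.
DT = 0.01          # control period [s]
KP, KI, KD = 1.0, 2.0, 0.05
TAU = 0.5          # plant time constant [s]

y, u = 0.0, 0.0    # plant output, controller output
e1 = e2 = 0.0      # previous two errors
setpoint = 1.0

for _ in range(2000):      # simulate 20 s
    e = setpoint - y
    # Incremental form: only the control increment is computed each period.
    du = KP * (e - e1) + KI * DT * e + (KD / DT) * (e - 2.0 * e1 + e2)
    u += du
    e2, e1 = e1, e
    # First-order plant: tau * dy/dt = u - y (explicit Euler step).
    y += DT * (u - y) / TAU

print(y)
```

The IAFC described above effectively replaces the fixed gains with fuzzy-inferred, online-tuned increments, while retaining this integral action.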

  19. Developing risk prediction models for kidney injury and assessing incremental value for novel biomarkers.

    Science.gov (United States)

    Kerr, Kathleen F; Meisner, Allison; Thiessen-Philbrook, Heather; Coca, Steven G; Parikh, Chirag R

    2014-08-07

    The field of nephrology is actively involved in developing biomarkers and improving models for predicting patients' risks of AKI and CKD and their outcomes. However, some important aspects of evaluating biomarkers and risk models are not widely appreciated, and statistical methods are still evolving. This review describes some of the most important statistical concepts for this area of research and identifies common pitfalls. Particular attention is paid to metrics proposed within the last 5 years for quantifying the incremental predictive value of a new biomarker. Copyright © 2014 by the American Society of Nephrology.
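A classical way to quantify the incremental predictive value of a biomarker is the gain in ROC AUC when it is added to a base model; the review discusses newer metrics, but ΔAUC remains the baseline they are compared against. A self-contained sketch with made-up scores:

```python
# Sketch: quantifying a biomarker's incremental value as the gain in ROC AUC.
# Scores and outcomes below are made up for illustration.

def auc(scores, labels):
    """Probability that a random positive outranks a random negative
    (rank-comparison form of the ROC AUC); ties count 0.5."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
base_model = [0.1, 0.2, 0.3, 0.6, 0.7, 0.4, 0.5, 0.8, 0.9, 0.95]
with_biomarker = [0.1, 0.2, 0.3, 0.35, 0.4, 0.6, 0.7, 0.8, 0.9, 0.95]

delta_auc = auc(with_biomarker, labels) - auc(base_model, labels)
print(auc(base_model, labels), auc(with_biomarker, labels), delta_auc)
```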

  20. Localised and Learnt Applications of Machine Learning for Robotic Incremental Sheet Forming

    DEFF Research Database (Denmark)

    Nicholas, Paul; Zwierzycki, Mateusz; Ramsgaard Thomsen, Mette

    2017-01-01

While fabrication is becoming a well-established field for architectural robotics, new possibilities for modelling and control situate feedback, modelling methods and adaptation as key concerns. In this paper we detail two methods for implementing adaptation, in the context of Robotic Incremental...... Sheet Forming (ISF) and exemplified in the fabrication of a bridge structure. The methods we describe compensate for springback and improve forming tolerance by using localised in-process distance sensing to adapt tool-paths, and by using pre-process supervised machine learning to predict springback...
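The springback-compensation idea can be sketched with the simplest possible learned model, a least-squares line from commanded to measured forming depth; all numbers are invented and the paper's actual learning method is richer than this:

```python
# Sketch: compensating springback with a least-squares model that maps
# commanded forming depth to measured depth. All data points are invented.

def fit_line(xs, ys):
    """Closed-form 1-D least squares: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

commanded = [1.0, 2.0, 3.0, 4.0]   # tool-path depths [mm], invented
measured = [0.9, 1.8, 2.7, 3.6]    # scanned depths after springback [mm], invented

slope, intercept = fit_line(commanded, measured)

def compensated_command(target):
    """Invert the learned model so the formed part hits the target depth."""
    return (target - intercept) / slope

print(slope, compensated_command(3.0))
```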

  1. Economic and fiscal aspects of incremental oil investments in the UKCS

    International Nuclear Information System (INIS)

    Kemp, A.G.; MacDonald, B.; Stephen, L.

    1994-01-01

    In this study a detailed examination has been made of the pre-tax and post-tax returns to seven typical incremental oil investments undertaken within the PRT (Petroleum Revenue Tax) ring fence. It has been demonstrated that such projects are generally of high risk. Risk is present in two senses. The chance of making a loss is generally significant (except with the infill drilling project). The risk is also generally high in the sense that the spread of expected returns is high in relation to the mean expected value. This has been demonstrated by the use of Monte Carlo analysis. (Author)
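The Monte Carlo treatment of project risk can be sketched as follows; production profile, prices, and costs are invented, not taken from the study, and no tax regime is modeled:

```python
# Sketch of Monte Carlo risk analysis for an incremental oil project.
# All economic parameters are invented for illustration.
import random

random.seed(1)
N = 5000
CAPEX = 1100.0           # up-front cost
DISCOUNT = 0.10
YEARS = 8
Q0, DECLINE = 30.0, 0.8  # initial annual output and yearly decline factor
OPEX = 5.0               # cost per unit produced

npvs = []
for _ in range(N):
    npv = -CAPEX
    for t in range(YEARS):
        price = random.gauss(18.0, 5.0)          # uncertain yearly price
        cashflow = Q0 * DECLINE ** t * (price - OPEX)
        npv += cashflow / (1.0 + DISCOUNT) ** (t + 1)
    npvs.append(npv)

mean_npv = sum(npvs) / N
p_loss = sum(1 for v in npvs if v < 0) / N       # chance of making a loss
print(mean_npv, p_loss)
```

The chance of loss and the spread of NPVs relative to the mean are exactly the two risk senses the study distinguishes.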

  2. Individualized 6-mercaptopurine increments in consolidation treatment of childhood acute lymphoblastic leukemia

    DEFF Research Database (Denmark)

    Tulstrup, Morten; Frandsen, Thomas L; Abrahamsson, Jonas

    2018-01-01

    OBJECTIVES: This randomized controlled trial tested the hypothesis that children with non-high-risk acute lymphoblastic leukemia could benefit from individualized 6-mercaptopurine increments during consolidation therapy (NCT00816049). Primary and secondary end points were end of consolidation...... at end of consolidation vs 77 of 389 (20%) in the control arm (P = .08). Five-year probability of event-free survival was 0.89 (95% CI: 0.85-0.93) in the experimental arm vs 0.93 (0.90-0.96) in the control arm (P = .13). The median accumulated length of 6-mercaptopurine treatment interruptions was 7 (IQR...

  3. A gradient surface produced by combined electroplating and incremental frictional sliding

    DEFF Research Database (Denmark)

    Yu, Tianbo; Hong, Chuanshi; Kitamura, K.

    2017-01-01

    A Cu plate was first electroplated with a Ni layer, with a thickness controlled to be between 1 and 2 mu m. The coated surface was then deformed by incremental frictional sliding with liquid nitrogen cooling. The combined treatment led to a multifunctional surface with a gradient in strain...... processed Cu plate without Ni coating, showing a strong effect of the coated layer on the deformation. The experimental results are followed by an analysis of strengthening mechanisms and a discussion of the applicability of the new technique for increasing the durability and lifetime of components exposed...

  4. Simple Moving Voltage Average Incremental Conductance MPPT Technique with Direct Control Method under Nonuniform Solar Irradiance Conditions

    OpenAIRE

    Ali, Amjad; Li, Wuhua; He, Xiangning

    2015-01-01

    A new simple moving voltage average (SMVA) technique with fixed step direct control incremental conductance method is introduced to reduce solar photovoltaic voltage (VPV) oscillation under nonuniform solar irradiation conditions. To evaluate and validate the performance of the proposed SMVA method in comparison with the conventional fixed step direct control incremental conductance method under extreme conditions, different scenarios were simulated. Simulation results show that in most cases...

  5. Proposal for element size and time increment selection guideline by 3-D finite element method for elastic waves propagation analysis

    International Nuclear Information System (INIS)

    Ishida, Hitoshi; Meshii, Toshiyuki

    2008-01-01

This paper proposes a guideline for the selection of element size and time increment in the 3-D finite element method, applied to elastic wave propagation analysis over long distances in large structures. The element size and time increment are determined by quantitative evaluation of the spurious strain caused by spatial and time discretization, which must be zero for an analysis model undergoing uniform motion. (author)
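The paper's own criterion is strain-based, but the common rules of thumb it refines can be sketched: resolve the shortest wavelength of interest with about ten elements and respect the CFL condition. The material values (steel) and frequency below are assumptions:

```python
# Rule-of-thumb sketch (assumed steel properties; the paper's own criterion
# is strain-based): element size from wavelength, time increment from CFL.
import math

E = 200e9        # Young's modulus [Pa], assumed
RHO = 7850.0     # density [kg/m^3], assumed
F_MAX = 1e5      # highest frequency to resolve [Hz], assumed
ELEMENTS_PER_WAVELENGTH = 10

c = math.sqrt(E / RHO)                   # longitudinal wave speed [m/s]
wavelength = c / F_MAX                   # shortest wavelength of interest
element_size = wavelength / ELEMENTS_PER_WAVELENGTH
dt_max = element_size / c                # CFL: wave must not cross an element per step
print(c, element_size, dt_max)
```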

  6. Ultrasonic-Assisted Incremental Microforming of Thin Shell Pyramids of Metallic Foil

    Directory of Open Access Journals (Sweden)

    Toshiyuki Obikawa

    2017-05-01

    Full Text Available Single point incremental forming is used for rapid prototyping of sheet metal parts. This forming technology was applied to the fabrication of thin shell micropyramids of aluminum, stainless steel, and titanium foils. A single point tool used had a tip radius of 0.1 mm or 0.01 mm. An ultrasonic spindle with axial vibration was implemented for improving the shape accuracy of micropyramids formed on 5–12 micrometers-thick aluminum, stainless steel, and titanium foils. The formability was also investigated by comparing the forming limits of micropyramids of aluminum foil formed with and without ultrasonic vibration. The shapes of pyramids incrementally formed were truncated pyramids, twisted pyramids, stepwise pyramids, and star pyramids about 1 mm in size. A much smaller truncated pyramid was formed only for titanium foil for qualitative investigation of the size reduction on forming accuracy. It was found that the ultrasonic vibration improved the shape accuracy of the formed pyramids. In addition, laser heating increased the forming limit of aluminum foil and it is more effective when both the ultrasonic vibration and laser heating are applied.

  7. Blood lactate minimum of rats during swimming test using three incremental stages

    Directory of Open Access Journals (Sweden)

    Mariana de Souza Sena

    2015-09-01

    Full Text Available AbstractThe purpose of this study was to determine the lactate minimum intensity (LMI by swimming LACmintest using three incremental stages (LACmintest3 and to evaluate its sensitivity to changes in aerobic fitness (AF. Twenty Wistar rats performed: LACmintest3 (1: induction of hyperlactacidemia and incremental phase (4%, 5% and 6.5% of bw; Constant loads tests on (2 and above (3 the LMI. Half of the animals were subjected to training with the individual LMI and the tests were performed again. The mean exercise load in LACmintest3 was 5.04 ± 0.13% bw at 5.08 ± 0.55 mmol L-1 blood lactate minimum (BLM. There was a stabilize and disproportionate increase of blood lactate in tests 2 and 3, respectively. After the training period, the mean BLM was lower in the trained animals. The LACmintest3 seems to be a good indicator of LMI and responsive to changes in AF in rats subjected to swim training.

  8. Awareness and its use in Incremental Data Driven Modelling for Plug and Play Process Control

    DEFF Research Database (Denmark)

    Knudsen, Torben; Bendtsen, Jan Dimon; Trangbæk, Klaus

    2012-01-01

In this paper, we focus on the problem of incremental system identification for the purpose of automatic reconfiguration of control systems. We consider the particular case where a linear time-invariant system is augmented with either an extra sensor or an extra actuator and derive prediction error methods for recursively estimating the additional parameters while retaining the existing system model. Next, we propose a novel measure of the "usefulness" of new signals that appear in an existing control loop due to the addition of a new device, e.g., a sensor. This measure, which we refer to as awareness, indicates if there is a relation between the signal provided by the new device and the existing process, as well as what the new device is good for in terms of control performance. Finally, a simulation example illustrates the potentials of the proposed method.
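A scalar sketch of the recursive estimation step, where the gain of one newly added signal is identified while the existing model stays fixed; the system, noise level, and recursion constants are invented, not taken from the paper:

```python
# Sketch: recursive least squares estimating only the parameter of a newly
# added signal, with the existing model held fixed. System values invented.
import random

random.seed(2)
A_KNOWN = 2.0      # existing, already-identified model: y_model = A_KNOWN * u
B_TRUE = 0.5       # unknown gain of the newly added sensor signal

b_est, P = 0.0, 100.0          # parameter estimate and its covariance
for _ in range(200):
    u = random.uniform(-1.0, 1.0)          # existing input
    s = random.uniform(-1.0, 1.0)          # signal from the new device
    y = A_KNOWN * u + B_TRUE * s + random.gauss(0.0, 0.01)

    residual = y - A_KNOWN * u - b_est * s  # prediction error of the new part
    gain = P * s / (1.0 + s * P * s)        # scalar RLS gain (no forgetting)
    b_est += gain * residual
    P = P - gain * s * P

print(b_est)
```

Keeping the existing model fixed and recursing only over the new parameter is what makes the identification incremental rather than a full re-identification.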

  9. Variational formulation for dissipative continua and an incremental J-integral

    Science.gov (United States)

    Rahaman, Md. Masiur; Dhas, Bensingh; Roy, D.; Reddy, J. N.

    2018-01-01

    Our aim is to rationally formulate a proper variational principle for dissipative (viscoplastic) solids in the presence of inertia forces. As a first step, a consistent linearization of the governing nonlinear partial differential equations (PDEs) is carried out. An additional set of complementary (adjoint) equations is then formed to recover an underlying variational structure for the augmented system of linearized balance laws. This makes it possible to introduce an incremental Lagrangian such that the linearized PDEs, including the complementary equations, become the Euler-Lagrange equations. Continuous groups of symmetries of the linearized PDEs are computed and an analysis is undertaken to identify the variational groups of symmetries of the linearized dissipative system. Application of Noether's theorem leads to the conservation laws (conserved currents) of motion corresponding to the variational symmetries. As a specific outcome, we exploit translational symmetries of the functional in the material space and recover, via Noether's theorem, an incremental J-integral for viscoplastic solids in the presence of inertia forces. Numerical demonstrations are provided through a two-dimensional plane strain numerical simulation of a compact tension specimen of annealed mild steel under dynamic loading.

  10. Considerations for Using an Incremental Scheduler for Human Exploration Task Scheduling

    Science.gov (United States)

    Jaap, John; Phillips, Shaun

    2005-01-01

    As humankind embarks on longer space missions farther from home, the requirements and environments for scheduling the activities performed on these missions are changing. As we begin to prepare for these missions it is appropriate to evaluate the merits and applicability of the different types of scheduling engines. Scheduling engines temporally arrange tasks onto a timeline so that all constraints and objectives are met and resources are not overbooked. Scheduling engines used to schedule space missions fall into three general categories: batch, mixed-initiative, and incremental. This paper presents an assessment of the engine types, a discussion of the impact of human exploration of the moon and Mars on planning and scheduling, and the applicability of the different types of scheduling engines. This paper will pursue the hypothesis that incremental scheduling engines may have a place in the new environment; they have the potential to reduce cost, to improve the satisfaction of those who execute or benefit from a particular timeline (the customers), and to allow astronauts to plan their own tasks.

  11. Two-level incremental checkpoint recovery scheme for reducing system total overheads.

    Science.gov (United States)

    Li, Huixian; Pang, Liaojun; Wang, Zhangquan

    2014-01-01

Long-running applications are often subject to failures. Once failures occur, they lead to unacceptable system overheads. Checkpoint technology is used to reduce the losses in the event of a failure. For the two-level checkpoint recovery scheme used in long-running tasks, it is unavoidable for the system to periodically transfer huge memory context to a remote stable storage. Therefore, the overheads of setting checkpoints and the re-computing time become a critical issue which directly impacts the system total overheads. Motivated by these concerns, this paper presents a new model by introducing i-checkpoints into the existing two-level checkpoint recovery scheme to deal with the more probable failures at a smaller cost and a faster speed. The proposed scheme is independent of the specific failure distribution type and can be applied to different failure distribution types. We respectively analyze the two-level incremental and two-level checkpoint recovery schemes under the Weibull and exponential distributions, which best fit actual failure distributions. The comparison results show that the total overheads of setting checkpoints, the total re-computing time, and the system total overheads in the two-level incremental checkpoint recovery scheme are all significantly smaller than those in the two-level checkpoint recovery scheme. At last, limitations of our study are discussed, and open questions and possible future work are given.
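The i-checkpoint idea, cheap checkpoints that record only what changed since the last one, can be sketched independently of any failure model. This is a toy in-memory version, not the paper's system; key deletions are ignored for brevity:

```python
# Toy sketch of full vs. incremental checkpoints over an in-memory state.
# Not the paper's system; deletions of keys are ignored for brevity.

class Checkpointer:
    def __init__(self):
        self.full = {}          # last full checkpoint
        self.increments = []    # i-checkpoints taken since the full one
        self._shadow = {}       # state as of the last checkpoint of any kind

    def take_full(self, state):
        self.full = dict(state)
        self.increments = []
        self._shadow = dict(state)

    def take_incremental(self, state):
        diff = {k: v for k, v in state.items() if self._shadow.get(k) != v}
        self.increments.append(diff)     # much smaller than a full dump
        self._shadow = dict(state)

    def recover(self):
        state = dict(self.full)
        for diff in self.increments:     # replay i-checkpoints in order
            state.update(diff)
        return state

cp = Checkpointer()
state = {"step": 0, "acc": 0.0}
cp.take_full(state)
for step in range(1, 4):
    state["step"] = step
    state["acc"] += step
    cp.take_incremental(state)

state["step"] = 99                        # work lost in a crash
recovered = cp.recover()
print(recovered)
```

Because an i-checkpoint writes only the diff, it can be taken far more often than a full remote checkpoint, shortening the re-computation after the more probable failures.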

  12. Multiobjective memetic estimation of distribution algorithm based on an incremental tournament local searcher.

    Science.gov (United States)

    Yang, Kaifeng; Mu, Li; Yang, Dongdong; Zou, Feng; Wang, Lei; Jiang, Qiaoyong

    2014-01-01

    A novel hybrid multiobjective algorithm is presented in this paper, which combines a new multiobjective estimation of distribution algorithm, an efficient local searcher, and ε-dominance. In addition, two multiobjective problems with variable linkages strictly based on manifold distribution are proposed. The Pareto set of a continuous multiobjective optimization problem is, in the decision space, a piecewise low-dimensional continuous manifold. Regularity-based approaches exploiting this manifold feature build a probability distribution model only from global statistical information of the population, so the information carried by promising individuals is not well exploited, which hampers the search and optimization process. Hence, an incremental tournament local searcher is designed to exploit local information efficiently and accelerate convergence to the true Pareto-optimal front. Furthermore, since ε-dominance is a strategy that helps a multiobjective algorithm obtain well-distributed solutions at low computational complexity, ε-dominance and the incremental tournament local searcher are combined here. The resulting algorithm is the memetic multiobjective estimation of distribution algorithm (MMEDA). The algorithm is validated by experiments on twenty-two test problems, with and without variable linkages, of diverse complexities. Compared with three state-of-the-art multiobjective optimization algorithms, our algorithm achieves comparable results in terms of convergence and diversity metrics.
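The ε-dominance bookkeeping mentioned above can be sketched in a few lines (an illustrative minimization example of ours, not the MMEDA implementation; function names are hypothetical): an archive accepts a candidate only if no stored point already ε-dominates it, and evicts stored points the candidate ε-dominates, which bounds archive size and spreads the retained solutions.

```python
def eps_dominates(a, b, eps):
    """True if objective vector `a` additively eps-dominates `b`
    (minimization): a_i - eps <= b_i for every objective i."""
    return all(ai - eps <= bi for ai, bi in zip(a, b))

def update_archive(archive, candidate, eps):
    """Keep `candidate` only if no archived point eps-dominates it;
    drop archived points that the candidate eps-dominates."""
    if any(eps_dominates(a, candidate, eps) for a in archive):
        return archive
    return [a for a in archive if not eps_dominates(candidate, a, eps)] + [candidate]
```

Points that fall within ε of an archived solution are rejected, which is what yields the well-distributed front at low cost.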

  13. Incremental Knowledge Acquisition for WSD: A Rough Set and IL based Method

    Directory of Open Access Journals (Sweden)

    Xu Huang

    2015-07-01

    Full Text Available Word sense disambiguation (WSD) is one of the trickiest tasks in natural language processing (NLP), as it must take into full account all the complexities of language. Because WSD involves discovering semantic structures in unstructured text, automatic knowledge acquisition of word senses is profoundly difficult. To acquire knowledge about Chinese multi-sense verbs, we introduce an incremental machine learning method that combines the rough set method and instance-based learning. First, the context of a multi-sense verb is extracted into a table; its sense is annotated by a skilled human and stored in the same table. In this way a decision table is formed, and rules can then be extracted within the framework of attribute-value reduction of rough sets. Instances not entailed by any rule are treated as outliers. When new instances are added to the decision table, only the newly added instances and the outliers need to be learned further; thus incremental learning is achieved. Experiments show that this method dramatically reduces the size of the decision table without performance decline.
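A toy sketch of the incremental idea (an instance-based stand-in of our own: it does not perform real rough-set attribute-value reduction, and the data are invented): unambiguous rows of the decision table become rules, contradictory rows are kept as outliers, and an update re-learns only the stored rules, outliers, and new instances rather than the full instance history.

```python
def learn_rules(instances):
    """Extract deterministic rules from a decision table.

    `instances` is a list of (features, sense) pairs, where `features` is a
    hashable tuple of context attributes.  A feature tuple mapping to exactly
    one sense becomes a rule; instances with ambiguous feature tuples are
    kept as outliers (handled instance-based, not by rule)."""
    senses = {}
    for feats, sense in instances:
        senses.setdefault(feats, set()).add(sense)
    rules = {f: next(iter(s)) for f, s in senses.items() if len(s) == 1}
    outliers = [(f, s) for f, s in instances if f not in rules]
    return rules, outliers

def incremental_update(rules, outliers, new_instances):
    """Re-learn only the compressed rules, the outliers, and the new
    instances -- never the full history of annotated instances."""
    base = [(f, s) for f, s in rules.items()]
    return learn_rules(base + outliers + new_instances)
```

The compression effect reported above corresponds to `rules` replacing many identical instances with a single entry, so each update touches only a small working set.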

  14. Incremental validity of positive and negative valence in predicting personality disorder.

    Science.gov (United States)

    Simms, Leonard J; Yufik, Tom; Gros, Daniel F

    2010-04-01

    The Big Seven model of personality includes five dimensions similar to the Big Five model as well as two evaluative dimensions—Positive Valence (PV) and Negative Valence (NV)—which reflect extremely positive and negative person descriptors, respectively. Recent theory and research have suggested that PV and NV predict significant variance in personality disorder (PD) above that predicted by the Big Five, but firm conclusions have not been possible because previous studies have been limited to only single measures of PV, NV, and the Big Five traits. In the present study, we replicated and extended previous findings using three markers of all key constructs—including PV, NV, and the Big Five—in a diverse sample of 338 undergraduates. Results of hierarchical multiple regression analyses revealed that PV incrementally predicted Narcissistic and Histrionic PDs above the Big Five and that NV nonspecifically incremented the prediction of most PDs. Implications for dimensional models of personality pathology are discussed. PsycINFO Database Record (c) 2010 APA, all rights reserved.
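The incremental-validity test used above (hierarchical multiple regression) boils down to the gain in R² when a predictor block is added on top of a baseline block. A self-contained sketch with invented data (not the study's measures), using a tiny pure-Python least-squares solver:

```python
# Illustrative sketch of hierarchical regression: Delta R^2 for an extra
# predictor block (e.g. PV, NV) over a baseline block (e.g. Big Five).
# All data below are invented for demonstration.

def solve(A, b):
    """Gauss-Jordan elimination for small normal-equation systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[col][col] != 0:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def r_squared(X, y):
    """R^2 of ordinary least squares of y on X (intercept column added)."""
    Z = [[1.0] + row for row in X]
    k = len(Z[0])
    A = [[sum(zi[p] * zi[q] for zi in Z) for q in range(k)] for p in range(k)]
    b = [sum(zi[p] * yi for zi, yi in zip(Z, y)) for p in range(k)]
    beta = solve(A, b)
    yhat = [sum(c * v for c, v in zip(beta, zi)) for zi in Z]
    ybar = sum(y) / len(y)
    ss_res = sum((yi - hi) ** 2 for yi, hi in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

def incremental_r2(X_base, X_extra, y):
    """Delta R^2 attributable to the extra predictor block."""
    X_full = [b + e for b, e in zip(X_base, X_extra)]
    return r_squared(X_full, y) - r_squared(X_base, y)
```

A positive `incremental_r2` is the pattern reported above for PV predicting Narcissistic and Histrionic PD scores beyond the Big Five.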

  15. VOLUME INCREMENT OF Nectandra megapotamica (Spreng.) Mez IN A MIXED RAIN FOREST

    Directory of Open Access Journals (Sweden)

    Luis Henrique da Silva Souza

    2009-10-01

    Full Text Available In this work, the growth of the species Nectandra megapotamica (Spreng.) Mez is studied using a sample of eleven dominant trees selected from a rain forest in Nova Prata, Rio Grande do Sul. The mathematical model that best represents the growth trend in volume percentage was iv% = (b0 + b1 · ln d)², fitted individually per tree. The variations of the slope (b1) and intercept (b0) coefficients between trees could be described by the height-diameter ratio (h/d), the diameter increment measured over the last 5 cm of the increment core (id5) sampled at breast height diameter (d), exposition (EXPOS), corrected Basal Area Large (BALcor), and crown length exposed to sunlight (Lc), through the following expressions: b1 = 0.41739 + 4.16179·id5 - 6.29332·h/d + 0.37823·EXPOS - 0.11519·Ir + 5.80419·BALcor - 0.06858·Lc, with a determination coefficient of 0.9979 and a standard error in percentage of the mean of -1.73%; and b0 = 2.62466 - 13.37024·id5 + 22.08329·h/d - 1.33161·EXPOS + 0.34689·Ir - 23.88899·BALcor + 0.25692·Lc, with a determination coefficient of 0.9907 and a standard error in percentage of the mean of 2.71%.

  16. Multiobjective Memetic Estimation of Distribution Algorithm Based on an Incremental Tournament Local Searcher

    Directory of Open Access Journals (Sweden)

    Kaifeng Yang

    2014-01-01

    Full Text Available A novel hybrid multiobjective algorithm is presented in this paper, which combines a new multiobjective estimation of distribution algorithm, an efficient local searcher, and ε-dominance. In addition, two multiobjective problems with variable linkages strictly based on manifold distribution are proposed. The Pareto set of a continuous multiobjective optimization problem is, in the decision space, a piecewise low-dimensional continuous manifold. Regularity-based approaches exploiting this manifold feature build a probability distribution model only from global statistical information of the population, so the information carried by promising individuals is not well exploited, which hampers the search and optimization process. Hence, an incremental tournament local searcher is designed to exploit local information efficiently and accelerate convergence to the true Pareto-optimal front. Furthermore, since ε-dominance is a strategy that helps a multiobjective algorithm obtain well-distributed solutions at low computational complexity, ε-dominance and the incremental tournament local searcher are combined here. The resulting algorithm is the memetic multiobjective estimation of distribution algorithm (MMEDA). The algorithm is validated by experiments on twenty-two test problems, with and without variable linkages, of diverse complexities. Compared with three state-of-the-art multiobjective optimization algorithms, our algorithm achieves comparable results in terms of convergence and diversity metrics.

  17. Effect of chromatic mechanisms on the detection of mesopic incremental targets at different eccentricities.

    Science.gov (United States)

    Bodrogi, Peter; Vas, Zoltán; Haferkemper, Nils; Várady, Géza; Schiller, Christoph; Khanh, Tran Quoc; Schanda, János

    2010-01-01

    Spectral sensitivity functions for the threshold detection of mesopic incremental targets were compared for different target eccentricities (10°, 20°, and 30°) and for different mesopic backgrounds (0.1, 0.5, and 1.0 cd·m⁻²). Relative responsivities of achromatic mechanisms (L + M and rods) and chromatic mechanisms (S and |L-M|) were estimated for each eccentricity and background. Chromatic mechanisms contribute significantly to detection, but their effect is lower at 30°. A new contrast metric (C(CHC2)) is introduced to account for the selective adaptation of the photoreceptors and the effects of the chromatic mechanisms, i.e., the broadening of the range of spectral sensitivity with multiple local maxima and the yellow sub-additivity of detection performance. The C(CHC2) metric is compared with the achromatic contrast metric of the MOVE model (C(MOVE)). For the same target, C(CHC2) generally predicts a higher visibility level than C(MOVE). However, in accordance with visual observations, for grey or yellowish incremental targets appearing at eccentricities of 20° and 30°, the visibility predicted by C(CHC2) is less than that predicted by C(MOVE).

  18. Single point incremental sheet forming investigated by in-process 3D digital image correlation

    Directory of Open Access Journals (Sweden)

    Bernhart G.

    2010-06-01

    Full Text Available Single Point Incremental Forming (SPIF) is a promising sheet metal forming process for prototyping and small batches, in which the blank is formed in a stepwise fashion by a displacement-controlled small-sized tool. Due to the specific strain paths induced by the process and the limited plastic zones in the contact region between the tool and the workpiece, forming diagrams and forming strategies differ from those of classical stamping processes. One major limitation of SPIF is the lack of accuracy of the final parts, owing to poor knowledge of the state of stress during the process, which requires a good description of the material models and a proper choice of the process parameters. In this paper, the SPIF process is experimentally investigated by means of surface 3D digital image correlation during the forming of an AW-5086-H111 grade aluminium alloy. The development of the strain fields encountered in incremental forming is reported, and material formability is evaluated on several formed shapes, taking into account the wide range of straining conditions of this process.

  19. Single point incremental sheet forming investigated by in-process 3D digital image correlation

    Science.gov (United States)

    Decultot, N.; Robert, L.; Velay, V.; Bernhart, G.

    2010-06-01

    Single Point Incremental Forming (SPIF) is a promising sheet metal forming process for prototyping and small batches, in which the blank is formed in a stepwise fashion by a displacement-controlled small-sized tool. Due to the specific strain paths induced by the process and the limited plastic zones in the contact region between the tool and the workpiece, forming diagrams and forming strategies differ from those of classical stamping processes. One major limitation of SPIF is the lack of accuracy of the final parts, owing to poor knowledge of the state of stress during the process, which requires a good description of the material models and a proper choice of the process parameters. In this paper, the SPIF process is experimentally investigated by means of surface 3D digital image correlation during the forming of an AW-5086-H111 grade aluminium alloy. The development of the strain fields encountered in incremental forming is reported, and material formability is evaluated on several formed shapes, taking into account the wide range of straining conditions of this process.

  20. Mechanical Behavior of Red Sandstone under Incremental Uniaxial Cyclical Compressive and Tensile Loading

    Directory of Open Access Journals (Sweden)

    Baoyun Zhao

    2017-01-01

    Full Text Available Uniaxial experiments were carried out on red sandstone specimens to investigate their short-term and creep mechanical behavior under incremental cyclic compressive and tensile loading. First, based on the results of short-term uniaxial incremental cyclic compressive and tensile loading experiments, deformation characteristics and energy dissipation were analyzed. The results show that the stress-strain curve of red sandstone has an obvious memory effect in the compressive and tensile loading stages. The strains at peak stresses and the residual strains increase with the cycle number. Energy dissipation, defined as the area of the hysteresis loop in the stress-strain curves, increases approximately as a power function of the cycle number. A creep test of the red sandstone was also conducted. The results show that the creep curve under each compressive or tensile stress level can be divided into decay and steady stages, which cannot be described by the conventional Burgers model. Therefore, an improved Burgers creep model of rock material is constructed using viscoplastic mechanics; it agrees very well with the experimental results and describes the creep behavior of red sandstone better than the Burgers creep model.
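For reference, the conventional Burgers model mentioned above (a Maxwell and a Kelvin element in series) gives, under a constant stress $\sigma_0$, the standard creep strain (the paper's improved model adds a viscoplastic element whose exact form is not given in this abstract):

```latex
\varepsilon(t)
  = \underbrace{\frac{\sigma_0}{E_M}}_{\text{instantaneous}}
  + \underbrace{\frac{\sigma_0}{\eta_M}\,t}_{\text{steady (viscous) creep}}
  + \underbrace{\frac{\sigma_0}{E_K}\left(1 - e^{-E_K t/\eta_K}\right)}_{\text{decaying (delayed) creep}}
```

where $E_M, \eta_M$ are the Maxwell spring and dashpot constants and $E_K, \eta_K$ the Kelvin ones. The decay and steady stages observed in the tests correspond to the third and second terms, respectively; since all three terms are fixed in form, the conventional model cannot capture the additional behavior that motivates the viscoplastic extension.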

  1. Modeling of Photovoltaic System with Modified Incremental Conductance Algorithm for Fast Changes of Irradiance

    Directory of Open Access Journals (Sweden)

    Saad Motahhir

    2018-01-01

    Full Text Available The first objective of this work is to determine some of the performance parameters characterizing the behavior of a particular photovoltaic (PV) panel that are not normally provided in the manufacturers' specifications. These provide the basis for developing a simple model of the electrical behavior of the PV panel. Next, using this model, the effects of varying solar irradiation, temperature, series and shunt resistances, and partial shading on the output of the PV panel are presented. In addition, the PV panel model is used to configure a large photovoltaic array. Next, a boost converter for the PV panel is designed. This converter is placed between the panel and the load in order to control it by means of a maximum power point tracking (MPPT) controller. The MPPT used is based on incremental conductance (INC), and it is demonstrated here that this technique does not respond accurately when solar irradiation is increased. To address this, a modified incremental conductance technique is presented in this paper. It is shown that this system does respond accurately and reduces the steady-state oscillations when solar irradiation is increased. Finally, simulations of the conventional and modified algorithms are compared, and the results show that the modified algorithm provides an accurate response to a sudden increase in solar irradiation.
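For context, the conventional INC rule that the modified technique builds on can be sketched as follows (a toy simulation with an invented PV curve, not the paper's model or its modified algorithm): at the maximum power point dP/dV = 0, equivalently dI/dV = -I/V, and the reference voltage is stepped toward that condition.

```python
# Conventional incremental-conductance (INC) MPPT sketch with a toy PV curve.

def inc_step(v, i, v_prev, i_prev, step=0.5, tol=1e-3):
    """Return the reference-voltage adjustment for one INC iteration."""
    dv, di = v - v_prev, i - i_prev
    if dv == 0:
        if di == 0:
            return 0.0                 # operating point unchanged
        return step if di > 0 else -step
    g = di / dv                        # incremental conductance dI/dV
    if abs(g + i / v) < tol:           # dI/dV = -I/V: at the MPP, hold
        return 0.0
    return step if g > -i / v else -step

def pv_current(v, isc=8.0, voc=40.0):
    """Toy PV curve; its true MPP is at V = voc * 6**-0.2 (about 27.95 V)."""
    return isc * (1.0 - (v / voc) ** 5)

v, v_prev = 20.0, 19.5
i_prev = pv_current(v_prev)
for _ in range(100):
    i = pv_current(v)
    v_next = v + inc_step(v, i, v_prev, i_prev)
    v_prev, i_prev, v = v, i, v_next   # tracker oscillates around the MPP
```

The fixed step size is exactly what causes the steady-state oscillation, and the misreading of a sudden irradiance rise as a voltage-induced change is the flaw the modified algorithm above targets.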

  2. Mutual-Information-Based Incremental Relaying Communications for Wireless Biomedical Implant Systems.

    Science.gov (United States)

    Liao, Yangzhe; Leeson, Mark S; Cai, Qing; Ai, Qingsong; Liu, Quan

    2018-02-08

    Network lifetime maximization of wireless biomedical implant systems is one of the major research challenges of wireless body area networks (WBANs). In this paper, a mutual information (MI)-based incremental relaying communication protocol is presented where several on-body relay nodes and one coordinator are attached to the clothes of a patient. Firstly, the system model is comprehensively analyzed in terms of channel path loss, energy consumption, and outage probability from the network perspective. Secondly, data transmission is allowed only when the MI value falls below the predetermined threshold. The communication path can be either from the implanted sensor to an on-body relay and onward to the coordinator, or from the implanted sensor to the coordinator directly, depending on the communication distance. Moreover, mathematical models of quality of service (QoS) metrics are derived along with the related subjective functions. The results show that the MI-based incremental relaying technique achieves better performance than our previously proposed protocol techniques on several selected performance metrics. The outcome of this paper can be applied to intra-body continuous physiological signal monitoring, artificial biofeedback-oriented WBANs, and telemedicine system design.
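The distance-dependent path choice described above can be sketched with a log-distance path-loss model (all constants are hypothetical and chosen only for illustration; the MI-threshold gating itself is omitted): the route whose worst hop suffers the smaller path loss is preferred.

```python
import math

def path_loss_db(d, pl0=35.0, n=4.5, d0=0.1):
    """Log-distance path loss in dB at distance d (meters); pl0, the
    exponent n, and the reference distance d0 are illustrative values."""
    return pl0 + 10.0 * n * math.log10(d / d0)

def choose_path(d_direct, d_to_relay, d_relay_coord):
    """Pick the route with the smaller worst-hop path loss: implant to
    coordinator directly, or implant to on-body relay to coordinator."""
    direct = path_loss_db(d_direct)
    relayed = max(path_loss_db(d_to_relay), path_loss_db(d_relay_coord))
    return ("direct", direct) if direct <= relayed else ("relay", relayed)
```

Splitting a long lossy hop into two shorter ones lowers the worst-hop loss, which is why the relay path wins at larger implant-to-coordinator distances and extends network lifetime.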

  3. Comparison of manufacturing of lightweight corrugated sheet sandwiches by hydroforming and incremental sheet forming

    Science.gov (United States)

    Maqbool, Fawad; Elze, Lars; Seidlitz, Holger; Bambach, Markus

    2016-10-01

    Sandwich materials made from corrugated sheet metal provide excellent mechanical properties for lightweight design without using filler material. The increased mechanical properties of these sandwich materials are achieved by the 3-D geometry of the corrugated sheet and the hardening due to pre-forming. In the present study, the manufacturing of corrugated sheet metal consisting of hexagonal bulge patterns through hydroforming and incremental forming is analyzed. Double-layered corrugated sheet metal sandwiches with hexagonal patterns of free-form bulge geometries are investigated through finite element analysis for the maximum increase in stiffness over normal flat sheets. The analysis shows that a bending stiffness increase of up to 13 times over a flat sheet of the same mass is attainable with corrugated sandwiches. Further, it is shown for these corrugated sandwiches that stiffness increases with the height of the corrugation bulge, but that hydroforming imposes restrictions on bulge height, since it is limited by the forming force and the formability of the material. Incremental sheet metal forming can be used to produce sheets with a hexagonal bulge pattern of increased height. Hence, a higher increase in stiffness than with hydroforming is possible, but at the expense of process speed.

  4. Motor unit firing frequency of lower limb muscles during an incremental slide board skating test.

    Science.gov (United States)

    Piucco, Tatiane; Bini, Rodrigo; Sakaguchi, Masanori; Diefenthaeler, Fernando; Stefanyshyn, Darren

    2017-11-01

    This study investigated how the combination of workload and fatigue affected the frequency components of muscle activation and the possible recruitment priority of motor units during skating to exhaustion. Ten male competitive speed skaters performed an incremental maximal test on a slide board. Activation of six muscles from the right leg was recorded throughout the test. A time-frequency analysis was performed to compute the overall, high-, and low-frequency bands of the whole signal at 10, 40, 70, and 90% of total test time. Overall activation increased for all muscles throughout the test (p  0.80). There was an increase in the low-frequency (90 vs. 10%, p = 0.035, ES = 1.06) and a decrease in the high-frequency (90 vs. 10%, p = 0.009, ES = 1.38, and 90 vs. 40%, p = 0.025, ES = 1.12) components of gluteus maximus. Strong correlations were found between the maximal cadence and vastus lateralis, gluteus maximus, and gluteus medius activation at the end of the test. In conclusion, the incremental skating test led to an increase in activation of the lower limb muscles, but only gluteus maximus was sensitive to changes in frequency components, probably because of pronounced fatigue.

  5. Otolith development in larval and juvenile Schizothorax davidi: ontogeny and growth increment characteristics

    Science.gov (United States)

    Yan, Taiming; Hu, Jiaxiang; Cai, Yueping; Xiong, Sen; Yang, Shiyong; Wang, Xiongyan; He, Zhi

    2017-09-01

    Laboratory-reared Schizothorax davidi larvae and juveniles were examined to assess the formation and characteristics of David's schizothoracin otoliths. Otolith development was observed, and the formation period was verified by monitoring larvae and juveniles of known age. The results revealed that lapilli and sagittae developed before hatching, and the first otolith increment was identified at 2 days post hatching in both. The shape of the lapilli was relatively stable during development compared with that of the sagittae; growth of the four areas of the sagittae and lapilli was consistent, although the posterior area grew faster than the anterior area and the ventral surface grew faster than the dorsal surface. Similarly, the summed radius lengths of the anterior and posterior areas of the sagittae and lapilli were related to total fish length linearly and binomially, respectively. Moreover, daily deposition rates were validated by monitoring known-age larvae and juveniles. Lapillus increment width reached a maximum of 1.88±0.0800 μm at the ninth increment and then decreased gradually toward the otolith edge, whereas that of the sagittae increased more slowly. These results illustrate the developmental biology of S. davidi, which will aid in population conservation and fish stock management.

  6. A Novel Classification Algorithm Based on Incremental Semi-Supervised Support Vector Machine

    Science.gov (United States)

    Gao, Fei; Mei, Jingyuan; Sun, Jinping; Wang, Jun; Yang, Erfu; Hussain, Amir

    2015-01-01

    For current computational intelligence techniques, a major challenge is how to learn new concepts in a changing environment. Traditional learning schemes cannot adequately address this problem due to the lack of a dynamic data-selection mechanism. In this paper, inspired by the human learning process, a novel classification algorithm based on an incremental semi-supervised support vector machine (SVM) is proposed. Through analysis of the prediction confidence of samples and the data distribution in a changing environment, a "soft-start" approach, a data-selection mechanism, and a data-cleaning mechanism are designed, which complete the construction of our incremental semi-supervised learning system. Notably, the design of the proposed algorithm effectively reduces computational complexity. In addition, the possible appearance of new labeled samples in the learning process is analyzed in detail. The results show that our algorithm does not rely on a model of the sample distribution, has an extremely low rate of introducing wrong semi-labeled samples, and can effectively make use of the unlabeled samples to enrich the knowledge system of the classifier and improve the accuracy rate. Moreover, our method also has outstanding generalization performance and the ability to overcome concept drift in a changing environment. PMID:26275294
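The confidence-driven data selection can be illustrated with a toy self-training loop (a sketch only: a one-dimensional nearest-centroid classifier stands in for the SVM, and the margin between the two nearest centroids stands in for SVM prediction confidence; all data are invented):

```python
def centroids(labeled):
    """Per-class mean of 1-D samples; `labeled` is a list of (x, label)."""
    sums, counts = {}, {}
    for x, y in labeled:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def predict_with_confidence(c, x):
    """Label of the nearest centroid; confidence is the gap between the
    two nearest centroid distances (a crude proxy for an SVM margin)."""
    dists = sorted((abs(x - m), y) for y, m in c.items())
    conf = dists[1][0] - dists[0][0] if len(dists) > 1 else float("inf")
    return dists[0][1], conf

def self_train(labeled, unlabeled, threshold=1.0, rounds=5):
    """Fold in only unlabeled samples classified with high confidence."""
    labeled, pool = list(labeled), list(unlabeled)
    for _ in range(rounds):
        c = centroids(labeled)
        confident, rest = [], []
        for x in pool:
            y, conf = predict_with_confidence(c, x)
            (confident if conf >= threshold else rest).append((x, y))
        if not confident:
            break
        labeled += confident           # semi-labeled samples join the model
        pool = [x for x, _ in rest]
    return centroids(labeled)
```

The confidence threshold is what keeps the rate of wrongly semi-labeled samples low: ambiguous points near the boundary stay in the pool until the model has moved enough to classify them safely.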

  7. An incremental community detection method for social tagging systems using locality-sensitive hashing.

    Science.gov (United States)

    Wu, Zhenyu; Zou, Ming

    2014-10-01

    An increasing number of users interact, collaborate, and share information through social networks. Unprecedented growth in social networks is generating a significant amount of unstructured social data. From such data, distilling communities where users have common interests and tracking variations of users' interests over time are important research tracks in fields such as opinion mining, trend prediction, and personalized services. However, these tasks are extremely difficult considering the highly dynamic characteristics of the data. Existing community detection methods are time consuming, making it difficult to process data in real time. In this paper, dynamic unstructured data is modeled as a stream. Tag assignments stream clustering (TASC), an incremental scalable community detection method, is proposed based on locality-sensitive hashing. Both tags and latent interactions among users are incorporated in the method. In our experiments, the social dynamic behaviors of users are first analyzed. The proposed TASC method is then compared with state-of-the-art clustering methods such as StreamKmeans and incremental k-clique; results indicate that TASC can detect communities more efficiently and effectively. Copyright © 2014 Elsevier Ltd. All rights reserved.
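The locality-sensitive hashing ingredient can be sketched with MinHash over tag sets (an illustrative sketch, not the TASC implementation; the parameters are arbitrary): users whose signatures collide in some band become candidates for the same community, so membership can be updated incrementally as tag assignments stream in, without all-pairs comparison.

```python
import hashlib

def minhash(tags, num_hashes=16):
    """MinHash signature of a tag set; equal sets always get equal
    signatures, and similar sets collide with probability ~ Jaccard."""
    sig = []
    for seed in range(num_hashes):
        sig.append(min(int(hashlib.md5(f"{seed}:{t}".encode()).hexdigest(), 16)
                       for t in tags))
    return sig

def lsh_buckets(signatures, bands=4):
    """Map each user's signature into per-band buckets; users sharing a
    bucket in any band are candidate members of the same community."""
    rows = len(next(iter(signatures.values()))) // bands
    buckets = {}
    for user, sig in signatures.items():
        for b in range(bands):
            key = (b, tuple(sig[b * rows:(b + 1) * rows]))
            buckets.setdefault(key, set()).add(user)
    return buckets
```

When a user's tag set changes, only that user's signature and bucket entries need recomputing, which is the incremental property that avoids reclustering the whole stream.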

  8. Incremental heating of Bishop Tuff sanidine reveals preeruptive radiogenic Ar and rapid remobilization from cold storage.

    Science.gov (United States)

    Andersen, Nathan L; Jicha, Brian R; Singer, Brad S; Hildreth, Wes

    2017-11-21

    Accurate and precise ages of large silicic eruptions are critical to calibrating the geologic timescale and gauging the tempo of changes in climate, biologic evolution, and magmatic processes throughout Earth history. The conventional approach to dating these eruptive products using the 40Ar/39Ar method is to fuse dozens of individual feldspar crystals. However, dispersion of fusion dates is common and interpretation is complicated by increasingly precise data obtained via multicollector mass spectrometry. Incremental heating of 49 individual Bishop Tuff (BT) sanidine crystals produces 40Ar/39Ar dates with reduced dispersion, yet we find a 16-ky range of plateau dates that is not attributable to excess Ar. We interpret this dispersion to reflect cooling of the magma reservoir margins below ∼475 °C, accumulation of radiogenic Ar, and rapid preeruption remobilization. Accordingly, these data elucidate the recycling of subsolidus material into voluminous rhyolite magma reservoirs and the effect of preeruptive magmatic processes on the 40Ar/39Ar system. The youngest sanidine dates, likely the most representative of the BT eruption age, yield a weighted mean of 764.8 ± 0.3/0.6 ka (2σ analytical/full uncertainty) indicating eruption only ∼7 ky following the Matuyama-Brunhes magnetic polarity reversal. Single-crystal incremental heating provides leverage with which to interpret complex populations of 40Ar/39Ar sanidine and U-Pb zircon dates and a substantially improved capability to resolve the timing and causal relationship of events in the geologic record.

  9. Persistent Symptoms of Dengue: Estimates of the Incremental Disease and Economic Burden in Mexico

    Science.gov (United States)

    Tiga, D. Carolina; Undurraga, Eduardo A.; Ramos-Castañeda, José; Martínez-Vega, Ruth A.; Tschampl, Cynthia A.; Shepard, Donald S.

    2016-01-01

    Dengue is mostly considered an acute illness with three phases: febrile, critical with possible hemorrhagic manifestations, and recovery. But some patients present persistent symptoms, including fatigue and depression, as acknowledged by the World Health Organization. If persistent symptoms affect a non-negligible share of patients, the burden of dengue will be underestimated. On the basis of a systematic literature review and econometric modeling, we found a significant relationship between the share of patients reporting persisting symptoms and time. We updated estimates of the economic burden of dengue in Mexico, addressing uncertainty in productivity loss and incremental expenses using Monte Carlo simulations. Persistent symptoms represent annually about US$22.6 (95% certainty level [CL]: US$13–US$29) million in incremental costs and 28.2 (95% CL: 21.6–36.2) additional disability-adjusted life years per million population, or 13% and 43% increases over previous estimates, respectively. Although our estimates have uncertainty from limited data, they show a substantial, unmeasured burden. Similar patterns likely extend to other dengue-endemic countries. PMID:26976885
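The Monte Carlo uncertainty propagation described above follows a standard pattern, sketched here with entirely hypothetical distributions and figures (not the study's data): draw the uncertain inputs, accumulate the incremental cost, and report the mean with a 95% certainty interval from the sample percentiles.

```python
import random
import statistics

def simulate_incremental_cost(n_cases, runs=10000, seed=42):
    """Monte Carlo mean and 95% interval for the incremental cost of
    persistent symptoms; all distributions below are invented examples."""
    rng = random.Random(seed)
    totals = []
    for _ in range(runs):
        lost_days = rng.triangular(5, 60, 20)     # per-case productivity loss
        daily_wage = rng.triangular(10, 40, 20)   # per-day value of lost work
        extra_exp = rng.triangular(20, 300, 80)   # incremental medical expenses
        totals.append(n_cases * (lost_days * daily_wage + extra_exp))
    totals.sort()
    lo = totals[int(0.025 * runs)]
    hi = totals[int(0.975 * runs)]
    return statistics.mean(totals), (lo, hi)

mean_cost, (lo, hi) = simulate_incremental_cost(1000)
```

Reporting the 2.5th and 97.5th sample percentiles is what produces interval estimates of the form "US$22.6 (95% CL: US$13-US$29) million" quoted above.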

  10. Parent praise to toddlers predicts fourth grade academic achievement via children's incremental mindsets.

    Science.gov (United States)

    Gunderson, Elizabeth A; Sorhagen, Nicole S; Gripshover, Sarah J; Dweck, Carol S; Goldin-Meadow, Susan; Levine, Susan C

    2018-03-01

    In a previous study, parent-child praise was observed in natural interactions at home when children were 1, 2, and 3 years of age. Children who received a relatively high proportion of process praise (e.g., praise for effort and strategies) showed stronger incremental motivational frameworks, including a belief that intelligence can be developed and a greater desire for challenge, when they were in 2nd or 3rd grade (Gunderson et al., 2013). The current study examines these same children's (n = 53) academic achievement 1 to 2 years later, in 4th grade. Results provide the first evidence that process praise to toddlers predicts children's academic achievement (in math and reading comprehension) 7 years later, in elementary school, via their incremental motivational frameworks. Further analysis of these motivational frameworks shows that process praise had its effect on fourth grade achievement through children's trait beliefs (e.g., believing that intelligence is fixed vs. malleable), rather than through their learning goals (e.g., preference for easy vs. challenging tasks). Implications for the socialization of motivation are discussed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  11. Incremental rate of prefrontal oxygenation determines performance speed during cognitive Stroop test: the effect of ageing.

    Science.gov (United States)

    Endo, Kana; Liang, Nan; Idesako, Mitsuhiro; Ishii, Kei; Matsukawa, Kanji

    2018-02-19

    Cognitive function declines with age. The underlying mechanisms responsible for the deterioration of cognitive performance, however, remain poorly understood. We hypothesized that the incremental rate of prefrontal oxygenation during a cognitive Stroop test decreases with ageing, resulting in a slowdown of cognitive performance. To test this hypothesis, we identified, using multichannel near-infrared spectroscopy, the characteristics of the oxygenated-hemoglobin concentration (Oxy-Hb) responses of the prefrontal cortex to both an incongruent Stroop test and a congruent word-reading test. Spatial distributions of the significant changes in the three components (initial slope, peak amplitude, and area under the curve) of the Oxy-Hb response were compared between young and elderly subjects. The Stroop interference time (the difference between the total periods for executing the Stroop and word-reading tests, respectively) approximately doubled in elderly as compared with young subjects. The Oxy-Hb in the rostrolateral, but not caudal, prefrontal cortex increased during the Stroop test in both age groups. The initial slope of the Oxy-Hb response, rather than the peak amplitude or the area under the curve, had a strong correlation with cognitive performance speed. Taken together, it is likely that the incremental rate of prefrontal oxygenation decreases with ageing, resulting in a decline in cognitive performance.

  12. High Birth Weight Is a Risk Factor of Dental Caries Increment during Adolescence in Sweden

    Directory of Open Access Journals (Sweden)

    Annika Julihn

    2014-11-01

    Full Text Available This study aimed to assess whether birth weight is associated with dental caries during the teenage period. In this register-based cohort study, all children of 13 years of age (n = 18,142) who resided in the county of Stockholm, Sweden, in 2000 were included. The cohort was followed until the individuals were 19 years of age. Information regarding dental caries was collected from the Public Health Care Administration in Stockholm. Data concerning prenatal and perinatal factors and parental socio-demographic determinants were collected from the Swedish Medical Birth Register and the National Registers at Statistics Sweden. The final logistic regression model showed that birth weight ≥4000 g, adjusted for potential confounders, was significantly associated with caries increment (DMFT ≥ 1, where D = decayed, M = missing, F = filled, T = teeth) between 13 and 19 years of age (OR, 1.22; 95% CI = 1.09–1.36). The odds ratio increased further, from 1.22 to 1.43, in subjects with birth weight ≥4600 g. On the contrary, subjects with birth weight <2500 g exhibited a significantly lower risk (OR, 0.67; 95% CI = 0.50–0.89) of caries experience (DMFT ≥ 4) at 19 years of age. In conclusion, high birth weight can be regarded as a predictor of dental caries, and in particular, birth weight ≥4500 g is a risk factor for caries increment during adolescence.

  13. Modeling and optimization of surface roughness in single point incremental forming process

    Directory of Open Access Journals (Sweden)

    Suresh Kurra

    2015-07-01

    Full Text Available Single point incremental forming (SPIF) is a novel and promising process for sheet metal prototyping and low-volume production applications. This article focuses on the development of predictive models for surface roughness estimation in the SPIF process. Surface roughness in SPIF has been modeled using three different techniques, namely Artificial Neural Networks (ANN), Support Vector Regression (SVR), and Genetic Programming (GP). In the development of these predictive models, tool diameter, step depth, wall angle, feed rate, and lubricant type have been considered as model variables. Arithmetic mean surface roughness (Ra) and maximum peak-to-valley height (Rz) are used as response variables to assess the surface roughness of incrementally formed parts. The data required to generate, compare, and evaluate the proposed models have been obtained from SPIF experiments performed on a Computer Numerical Control (CNC) milling machine using a Box–Behnken design. The developed models show satisfactory goodness of fit in predicting the surface roughness. Further, the GP model has been used for optimization of Ra and Rz with a genetic algorithm. The optimum process parameters for minimum surface roughness in SPIF have been obtained and validated experimentally, with highly satisfactory results within 10% error.

  14. Incremental heating of Bishop Tuff sanidine reveals preeruptive radiogenic Ar and rapid remobilization from cold storage

    Science.gov (United States)

    Andersen, Nathan L.; Jicha, Brian R.; Singer, Brad S.; Hildreth, Wes

    2017-11-01

    Accurate and precise ages of large silicic eruptions are critical to calibrating the geologic timescale and gauging the tempo of changes in climate, biologic evolution, and magmatic processes throughout Earth history. The conventional approach to dating these eruptive products using the 40Ar/39Ar method is to fuse dozens of individual feldspar crystals. However, dispersion of fusion dates is common and interpretation is complicated by increasingly precise data obtained via multicollector mass spectrometry. Incremental heating of 49 individual Bishop Tuff (BT) sanidine crystals produces 40Ar/39Ar dates with reduced dispersion, yet we find a 16-ky range of plateau dates that is not attributable to excess Ar. We interpret this dispersion to reflect cooling of the magma reservoir margins below ~475 °C, accumulation of radiogenic Ar, and rapid preeruption remobilization. Accordingly, these data elucidate the recycling of subsolidus material into voluminous rhyolite magma reservoirs and the effect of preeruptive magmatic processes on the 40Ar/39Ar system. The youngest sanidine dates, likely the most representative of the BT eruption age, yield a weighted mean of 764.8 ± 0.3/0.6 ka (2σ analytical/full uncertainty) indicating eruption only ~7 ky following the Matuyama-Brunhes magnetic polarity reversal. Single-crystal incremental heating provides leverage with which to interpret complex populations of 40Ar/39Ar sanidine and U-Pb zircon dates and a substantially improved capability to resolve the timing and causal relationship of events in the geologic record.
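The weighted mean quoted for the eruption age is a standard inverse-variance combination of the single-crystal dates. A minimal sketch with hypothetical plateau dates and uncertainties (the real single-crystal data are not reproduced in the abstract):

```python
import math

def weighted_mean(dates, sigmas):
    """Inverse-variance weighted mean and its 1-sigma uncertainty."""
    w = [1.0 / s ** 2 for s in sigmas]
    mean = sum(wi * d for wi, d in zip(w, dates)) / sum(w)
    err = math.sqrt(1.0 / sum(w))
    return mean, err

# Hypothetical plateau dates (ka) for the youngest sanidine population.
dates = [764.6, 764.9, 765.0, 764.7, 764.8]
sigmas = [0.6, 0.5, 0.7, 0.6, 0.5]
mean, err = weighted_mean(dates, sigmas)
print(round(mean, 1), round(err, 2))
```

Combining n dates this way shrinks the uncertainty roughly as 1/sqrt(n), which is why the pooled 764.8 ± 0.3 ka analytical uncertainty is much tighter than any individual crystal date.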

  15. A Novel Electromyographic Approach to Estimate Fatigue Threshold in Maximum Incremental Strength Tests.

    Science.gov (United States)

    Aragón-Vela, Jerónimo; Barranco-Ruiz, Yaira; Casals-Vázquez, Cristina; Plaza-Díaz, Julio; Casuso, Rafael A; Fontana, Luis; Huertas, Jesús F Rodríguez

    2018-04-01

    Evaluation of muscular fatigue thresholds in athletes performing short-duration, explosive exercises is difficult because classic physiological parameters show little variation under these conditions. Therefore, the aim of this study was to develop a new method to estimate the fatigue threshold in single muscles. Our approach is based on electromyographic data recorded during a maximum incremental strength test until the one repetition maximum is reached. Ten men and 10 women performed a half-squat strength test consisting of five incremental intensities relative to the one repetition maximum. Neither heart rate nor blood lactate concentrations showed significant differences across the intensities tested. Surface electromyographic activities of the vastus lateralis, vastus medialis, and rectus femoris were recorded, revealing a break point, corresponding to the fatigue threshold, at 70.74%, 71.48%, and 72.52% of the one repetition maximum in men, respectively. In women, break-point values were 76.66% for the vastus lateralis, 76.27% for the vastus medialis, and 72.10% for the rectus femoris. In conclusion, surface electromyography could be a useful, rapid, and noninvasive tool for determining the fatigue threshold of individual muscles during a maximal half-squat strength test.
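The break-point estimation described above can be sketched as a two-segment piecewise-linear fit that scans candidate break locations and keeps the one minimizing total squared error. The EMG amplitudes below are invented for illustration; the study's actual signal processing is not detailed in the abstract:

```python
def line_sse(xs, ys):
    """Sum of squared errors of the best least-squares line through (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx if sxx else 0.0
    a = my - b * mx
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

def break_point(xs, ys):
    """Intensity at which a two-segment linear fit minimizes total SSE."""
    best = None
    for k in range(2, len(xs) - 1):          # need >= 2 points per segment
        sse = line_sse(xs[:k], ys[:k]) + line_sse(xs[k:], ys[k:])
        if best is None or sse < best[0]:
            best = (sse, xs[k])
    return best[1]

# Hypothetical RMS-EMG amplitudes over %1RM; activity rises faster above ~70%.
intensity = [40, 50, 60, 70, 80, 90, 100]
emg = [0.20, 0.24, 0.28, 0.33, 0.55, 0.78, 1.00]
print(break_point(intensity, emg))  # -> 70
```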

  16. A compensation strategy for geometric inaccuracies of hot incrementally formed parts

    Science.gov (United States)

    Thyssen, Lars; Störkle, Denis D.; Kuhlenkötter, Bernd

    2017-10-01

    Incremental sheet forming (ISF) offers high geometrical flexibility without the need for part-dependent tooling. However, the industrial application of incremental sheet metal forming is still limited by certain constraints, e.g. low geometrical accuracy and the small number of formable alloys. One way to overcome these constraints is to exploit the advantages of metal forming at elevated temperatures. The literature documents the benefits of hot forming, e.g. lower forming forces, a larger number of formable materials, and greater achievable deformation, but also its negative effects. One such effect is that hot-formed parts tend to be smaller than parts formed at room temperature. This paper presents a new approach to compensating for the resulting inaccuracies of hot-formed parts. More precisely, an online compensation strategy is presented that continuously calculates the current part deviation for each point of the tool path according to the actual process parameters and adds a movement in the direction opposite to the deviation.
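The core of the compensation strategy, moving each tool-path point against the observed deviation, reduces to a one-line correction. A minimal sketch with a hypothetical deviation and an assumed proportional gain (the paper's actual deviation model is more elaborate):

```python
def compensate(nominal, measured, gain=1.0):
    """Shift a tool-path point opposite to the observed deviation.

    nominal/measured are (x, y, z) points; the corrected point moves by
    gain * (nominal - measured), i.e. against the deviation. The gain of
    1.0 is an assumption; in practice it would be tuned to the process.
    """
    return tuple(n + gain * (n - m) for n, m in zip(nominal, measured))

# Hypothetical measurement: the hot-formed part came out 0.4 mm low in z,
# so the corrected tool path aims 0.4 mm higher.
corrected = compensate((10.0, 5.0, -3.0), (10.0, 5.0, -3.4))
print(tuple(round(v, 2) for v in corrected))  # -> (10.0, 5.0, -2.6)
```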

  17. Incremental Prognostic Value of ADC Histogram Analysis over MGMT Promoter Methylation Status in Patients with Glioblastoma.

    Science.gov (United States)

    Choi, Yoon Seong; Ahn, Sung Soo; Kim, Dong Wook; Chang, Jong Hee; Kang, Seok-Gu; Kim, Eui Hyun; Kim, Se Hoon; Rim, Tyler Hyungtaek; Lee, Seung-Koo

    2016-10-01

    Purpose To investigate the incremental prognostic value of apparent diffusion coefficient (ADC) histogram analysis over oxygen 6-methylguanine-DNA methyltransferase (MGMT) promoter methylation status in patients with glioblastoma, and the correlation between ADC parameters and MGMT status. Materials and Methods This retrospective study was approved by the institutional review board, and informed consent was waived. A total of 112 patients with glioblastoma were divided into training (74 patients) and test (38 patients) sets. Overall survival (OS) and progression-free survival (PFS) were analyzed with ADC parameters, MGMT status, and other clinical factors. Multivariate Cox regression models with and without ADC parameters were constructed. Model performance was assessed with the c index and receiver operating characteristic curve analyses for 12- and 16-month OS and 12-month PFS in the training set and validated in the test set. ADC parameters were compared according to MGMT status for the entire cohort. Results With ADC parameters included, the c indices and diagnostic accuracies for 12- and 16-month OS and 12-month PFS showed significant improvement, with the exception of the c indices in the models for PFS (P MGMT status. Conclusion ADC histogram analysis had incremental prognostic value over MGMT promoter methylation status in patients with glioblastoma. © RSNA, 2016. Online supplemental material is available for this article.
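The model comparison above hinges on the c index. A small Harrell's concordance computation illustrates it; the survival times and risk scores below are invented, and merely show how adding ADC-derived features to a Cox model could raise the index:

```python
def c_index(times, events, risks):
    """Harrell's concordance index: among comparable pairs, the fraction
    where the higher predicted risk has the shorter observed survival."""
    concordant = ties = total = 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # a pair is comparable if subject i had an event before j's follow-up
            if events[i] and times[i] < times[j]:
                total += 1
                if risks[i] > risks[j]:
                    concordant += 1
                elif risks[i] == risks[j]:
                    ties += 1
    return (concordant + 0.5 * ties) / total

# Hypothetical OS data (months) and risk scores from two Cox models,
# one without and one with ADC histogram parameters.
times  = [6, 9, 12, 15, 20, 24]
events = [1, 1, 1, 0, 1, 0]          # 1 = death observed, 0 = censored
base   = [0.9, 0.5, 0.8, 0.4, 0.3, 0.2]   # clinical + MGMT only
adc    = [0.95, 0.9, 0.7, 0.4, 0.3, 0.2]  # + ADC histogram features
print(round(c_index(times, events, base), 2),
      round(c_index(times, events, adc), 2))  # -> 0.92 1.0
```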

  18. PCIU: Hardware Implementations of an Efficient Packet Classification Algorithm with an Incremental Update Capability

    Directory of Open Access Journals (Sweden)

    O. Ahmed

    2011-01-01

    Full Text Available Packet classification plays a crucial role in a number of network services, such as policy-based routing, firewalls, and traffic billing, to name a few. However, classification can become a bottleneck in the above-mentioned applications if not implemented properly and efficiently. In this paper, we propose PCIU, a novel classification algorithm that improves upon previously published work. PCIU provides lower preprocessing time, lower memory consumption, ease of incremental rule update, and reasonable classification time compared to state-of-the-art algorithms. The proposed algorithm was evaluated and compared to RFC and HiCut using several benchmarks. The results obtained indicate that PCIU outperforms these algorithms in terms of speed, memory usage, incremental update capability, and preprocessing time. The algorithm was furthermore improved and made more accessible for a variety of applications through implementation in hardware. Two such implementations are detailed and discussed in this paper. The results indicate that a hardware/software codesign approach yields a PCIU solution that is slower but easier to optimize and improve within time constraints. A hardware accelerator based on an ESL approach using Handel-C, on the other hand, achieved a 31x speed-up over a pure software implementation running on a state-of-the-art Xeon processor.
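The abstract does not detail the PCIU data structures themselves, so the sketch below shows only the baseline notion the algorithm improves on: a priority-ordered rule table with first-match classification, where rules can be added or removed incrementally without rebuilding any preprocessing structures. All rule fields and addresses are hypothetical:

```python
# NOT the PCIU algorithm -- a generic first-match classifier illustrating
# what "incremental rule update" means for a packet classifier.
class RuleTable:
    def __init__(self):
        self.rules = []  # (priority, src_prefix, dst_prefix, action)

    def add(self, priority, src, dst, action):
        self.rules.append((priority, src, dst, action))
        self.rules.sort(key=lambda r: r[0])   # incremental: no full rebuild

    def remove(self, priority):
        self.rules = [r for r in self.rules if r[0] != priority]

    def classify(self, src_ip, dst_ip):
        # lowest-priority-number rule wins; empty prefix matches anything
        for _, src, dst, action in self.rules:
            if src_ip.startswith(src) and dst_ip.startswith(dst):
                return action
        return "default-deny"

table = RuleTable()
table.add(10, "10.0.", "192.168.", "permit")
table.add(5, "10.0.1.", "", "deny")
before = table.classify("10.0.1.7", "192.168.0.1")  # priority-5 deny wins
table.remove(5)
after = table.classify("10.0.1.7", "192.168.0.1")   # now the permit applies
print(before, after)
```

A linear scan like this is exactly the bottleneck PCIU-style algorithms avoid; the point of the sketch is that updates touch only the rule list, not a precomputed index.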

  19. Incremental validity of positive orientation: predictive efficiency beyond the five-factor model

    Directory of Open Access Journals (Sweden)

    Łukasz Roland Miciuk

    2016-05-01

    Full Text Available Background The relationship between positive orientation (a basic predisposition to think positively of oneself, one’s life, and one’s future) and personality traits is still disputed. The purpose of the described research was to verify the hypothesis that positive orientation has predictive efficiency beyond the five-factor model. Participants and procedure One hundred and thirty participants (mean age M = 24.84) completed the following questionnaires: the Self-Esteem Scale (SES), the Satisfaction with Life Scale (SWLS), the Life Orientation Test-Revised (LOT-R), the Positivity Scale (P-SCALE), the NEO Five-Factor Inventory (NEO-FFI), the Self-Concept Clarity Scale (SCC), the Generalized Self-Efficacy Scale (GSES), and the Life Engagement Test (LET). Results Introducing positive orientation as an additional predictor in the second step of the regression analyses led to better prediction of the following variables: purpose in life, self-concept clarity, and generalized self-efficacy. The effect was strongest for purpose in life (a 14% increment in explained variance). Conclusions The results confirmed our hypothesis that positive orientation shows incremental validity: its inclusion in the regression model, in addition to the five main personality factors, increases the amount of explained variance. These findings provide further evidence for the legitimacy of measuring positive orientation and personality traits separately.
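The two-step (hierarchical) regression behind the incremental-validity claim amounts to comparing R² before and after adding the new predictor. A minimal sketch; the scores below are invented stand-ins for a Big Five composite and positive orientation, not the study's data:

```python
def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for small systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def r_squared(X, y):
    """R^2 of an OLS fit with intercept; X is a list of predictor columns."""
    cols = [[1.0] * len(y)] + X
    k = len(cols)
    A = [[sum(a * b for a, b in zip(cols[i], cols[j])) for j in range(k)]
         for i in range(k)]
    v = [sum(c * t for c, t in zip(cols[i], y)) for i in range(k)]
    beta = solve(A, v)
    pred = [sum(b * col[i] for b, col in zip(beta, cols)) for i in range(len(y))]
    my = sum(y) / len(y)
    sse = sum((yi - pi) ** 2 for yi, pi in zip(y, pred))
    sst = sum((yi - my) ** 2 for yi in y)
    return 1.0 - sse / sst

# Hypothetical standardized scores: Big Five composite (step 1) and
# positive orientation (step 2) predicting purpose in life.
big5 = [0.2, -1.0, 0.5, 1.2, -0.3, 0.8, -0.6, 0.1]
pos  = [0.5, -0.8, 0.9, 1.0, -0.5, 1.1, -1.0, 0.2]
pil  = [0.6, -1.1, 1.0, 1.3, -0.6, 1.2, -1.0, 0.1]
r2_step1 = r_squared([big5], pil)        # traits only
r2_step2 = r_squared([big5, pos], pil)   # traits + positive orientation
print(round(r2_step2 - r2_step1, 3))     # delta R^2 = incremental validity
```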

  20. Prediction of soft tissue deformations after CMF surgery with incremental kernel ridge regression.

    Science.gov (United States)

    Pan, Binbin; Zhang, Guangming; Xia, James J; Yuan, Peng; Ip, Horace H S; He, Qizhen; Lee, Philip K M; Chow, Ben; Zhou, Xiaobo

    2016-08-01

    Facial soft tissue deformation following osteotomy is associated with the corresponding biomechanical characteristics of bone and soft tissues. However, none of the methods devised to predict soft tissue deformation after osteotomy incorporates population-based statistical data. The aim of this study is to establish a statistical model describing the relationship between biomechanical characteristics and soft tissue deformation after osteotomy. We propose an incremental kernel ridge regression (IKRR) model to accomplish this goal. The input of the model is the biomechanical information computed by the Finite Element Method (FEM). The output is the soft tissue deformation derived from paired pre-operative and post-operative 3D images. The model is adjusted incrementally with each new patient's biomechanical information. The IKRR model therefore enables us to predict potential soft tissue deformations for a new patient by using both biomechanical and statistical information. The integration of these two types of data is critically important for accurate simulation of soft-tissue changes after surgery. The proposed method was evaluated by leave-one-out cross-validation using data from 11 patients. The average prediction error of our model (0.9103 mm) was lower than that of several state-of-the-art algorithms. The model is promising as a reliable way to reduce the risk of facial distortion after craniomaxillofacial surgery. Copyright © 2016 Elsevier Ltd. All rights reserved.
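Kernel ridge regression with a per-patient incremental refit can be sketched as follows. This is not the paper's IKRR update rule (the abstract does not specify it); the sketch simply refits the ridge solution on the augmented training set as each new case arrives, with a toy one-dimensional feature standing in for the FEM-derived biomechanical input:

```python
import math

def rbf(a, b, gamma=0.5):
    """Gaussian (RBF) kernel between two feature vectors."""
    return math.exp(-gamma * sum((x - y) ** 2 for x, y in zip(a, b)))

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for small systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

class KRR:
    def __init__(self, lam=0.1):
        self.lam, self.X, self.y, self.alpha = lam, [], [], []

    def add(self, x, y):
        """Incorporate one new patient and refit (naive incremental update)."""
        self.X.append(x)
        self.y.append(y)
        K = [[rbf(a, b) + (self.lam if i == j else 0.0)
              for j, b in enumerate(self.X)] for i, a in enumerate(self.X)]
        self.alpha = solve(K, self.y)      # alpha = (K + lam*I)^-1 y

    def predict(self, x):
        return sum(a * rbf(xi, x) for a, xi in zip(self.alpha, self.X))

# Toy 1D biomechanical feature -> soft tissue displacement (mm), made up.
model = KRR()
for x, y in [((0.0,), 0.1), ((1.0,), 0.9), ((2.0,), 1.6), ((3.0,), 2.4)]:
    model.add(x, y)
print(round(model.predict((1.5,)), 2))
```

An efficient IKRR would update the inverse kernel matrix in place (e.g. via a block-matrix identity) rather than re-solving from scratch; the refit above is the simplest correct baseline.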